The University of Florida College of Dentistry is located in Gainesville, Florida. Established in 1972, the college is the only publicly funded dental school in Florida and is a national leader in education, research and service. www.dental.ufl.edu