
What is Dentistry?

Dentistry is a healing art and science devoted to maintaining oral health. Dentists diagnose and treat problems with a patient’s teeth, gums, and other parts of the mouth. They provide advice and instruction on caring for teeth and gums and on diet choices that affect oral health. The growing recognition that oral care can have a serious impact on systemic health drives the expansion of new professional opportunities each year. Additional training beyond a Doctor of Dental Medicine (DMD) or Doctor of Dental Surgery (DDS) degree allows specialization in fields such as Endodontics, Periodontics, Orthodontics, and many more.


Source: ADEA ExploreHealthCareers, American Dental Education Association, 7 May 2013. Accessed 17 May 2013.

Source: Occupational Outlook Handbook, United States Department of Labor, Bureau of Labor Statistics, 29 March 2012. Accessed 17 May 2013.
