Dental insurance is insurance that pays for treatment by a dentist. It covers the cost of routine dental care, including oral examinations, X-rays, and preventive treatment, as well as accidental damage to your teeth.