(HealthDay)—Having dental insurance doesn't mean people will actually take care of their teeth, a new study indicates.
The findings suggest that patient outreach and education are needed to ensure that people understand the importance of good dental health and actually use their insurance coverage to get care, according to researchers at the University of Maryland School of Dentistry.
The researchers examined data from a 2008 survey of older Americans who either did or did not have dental insurance. They also looked at individual characteristics such as race, gender, marital status, age and health.
Their conclusion: Providing dental coverage to those without insurance who generally don't seek dental care does not necessarily improve the likelihood that they will see a dentist.
Getting these people to seek dental care requires going beyond simply providing insurance. Patient outreach and education are essential, according to the study, which was published online in the February issue of the American Journal of Public Health.
"You can't just hand people coverage and say, 'There, that's better,'" study first author Richard Manski, professor and chief of dental public health, said in a university news release. "You need to offer some inducements, some promotional campaign, to change people's attitudes and beliefs. We hope this starts the process of a new way of thinking about the problem."
But getting people to use dental insurance is not a short-term process, Manski said.
"We need to set long-term goals for such things and understand that dental coverage and use is a long-term issue, so we don't get frustrated that rates of use aren't going up right away," Manski said.
The study findings also apply to other types of health insurance, he said.
"Dentistry and dental coverage is a perfect experimental model for health care," Manski said. "There are lessons to be learned for overall health coverage and use as well."