
My View: Tennessee sealant experience at variance with Pew report

March 04, 2013

By James A. Gillcrist, D.D.S.

I am a long-standing member of the ADA, past president of the Nashville Dental Society and current member of ADA Council on Members Insurance and Retirement Programs. I am the TennCare dental director and former state oral health director for Tennessee. I am a diplomate of the American Board of Dental Public Health. My entire career has been devoted to improving the oral health of underserved Tennesseans with an emphasis on children. Although we have made great strides, there is still much work to do. The opinions I express here are based on my own experience and are my own.

Any time an organization like the Pew Center on the States publishes a national oral health report that draws conclusions and assigns grades to states based on incomplete data, old data, inaccurate data, improper interpretation or application of national oral health objectives, and the use of invalid indicators, that report needs to be challenged. And whenever a respected professional organization like the ADA, the uncontested expert in the field of oral health, fails to respond to a flawed national oral health report portrayed as "scientific," the public assumes the report is up to standard.

In the report, "Falling Short: Most States Lag On Dental Sealants," Pew indicates that it is a nonprofit organization that applies a rigorous analytical approach to improve public policy, inform the public and stimulate civic life. Under External Research Support, Pew states, "The following experts provided valuable guidance by reviewing the research design and methodology featured in this report." Under Acknowledgements, Pew thanks an individual with the National Academy for State Health Policy for guidance in data analysis and another individual with the American State and Territorial Dental Directors for guidance and assistance in data collection. In short, Pew implies its report is "scientific."

The title of the report implies that quantitative measures were used to demonstrate that many states have failed to achieve acceptable oral health standards. Pew asserts that the 50-state report focuses on prevention, examining states' efforts to improve access to sealants for low-income kids. Pew's grading of the states is based on four indicators or benchmarks that it maintains should be a key part of any state's prevention strategy: (1) having sealant programs in high-need schools; (2) allowing hygienists to place sealants in school-based programs without requiring a dentist's exam; (3) collecting data regularly about the dental health of school-children and submitting it to a national oral health database; and (4) meeting a national health objective on sealants. I will respond to these indicators beginning with the last one mentioned.

Healthy People's 2010 oral health objective related to dental sealants set a target of 50 percent as the proportion of all children aged 8 and adolescents aged 14 who received dental sealants. This national objective was not limited to low-income children, but Pew decided to use it as an indicator for a study that purports to improve access to sealants for low-income children. Pew inaccurately characterized this oral health objective as a "minimum threshold." It is a target, not a standard. States should strive to achieve the target and simultaneously improve the proportion of children who receive sealants. It was never envisioned to be used in a punitive manner should the states fail to reach the goal.

Healthy People national oral health objectives are designed to measure progress toward goals and are periodically re-evaluated and revised. It is interesting to note that Healthy People 2020 has set a new target for children aged 6-9 at 28.1 percent and for adolescents aged 13-15 at 21.9 percent—far below the 50 percent established for 2010. Findings from Tennessee's 2008 Oral Health Survey revealed that by 8 years of age, 34.9 percent of children had dental sealants on at least one permanent tooth. Unfortunately, no one at Pew asked Tennessee or other states if they had sealant prevalence findings. Pew's use of Healthy People's 2010 objective as an "absolute" indicator not only ignored the revised 2020 targets, but penalized all states that did not have data from the 2006-2007 school year forward showing over 50 percent of third-graders with sealants.

Pew's indicator related to the percentage of high-need schools with sealant programs assigned points to states based on arbitrarily determined percentages of high-need schools reached by a dental sealant program, without regard to the size of the state's underserved population, its public health staff, the number of eligible public schools, or the cost to state and local public health programs.

Pew's indicator related to collecting and submitting data to the National Oral Health Surveillance System was assessed using publicly available Centers for Disease Control and Prevention data. Out of curiosity, on Jan. 23, after the release of Pew's report, I went to the NOHSS website and found that the Web pages for the State Profiles were most recently updated on June 16, 2009.

States were given zero points for never participating in NOHSS; one point for monitoring sealants but having data only from before the 2006-2007 school year; and two points for monitoring sealants and having recent data. Tennessee received zero points in spite of having verifiable, current and detailed information, presented recently to the ADA, for each year from 2001 through 2012. Regrettably, Pew failed to contact the states and request this information. Instead, it considered only one out-of-date national data source.
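Pew's three-tier rubric can be sketched as a small scoring function. This is a minimal illustration of the scheme as described above, not Pew's actual methodology code; the function name and the "YYYY-YYYY" school-year string format are my own assumptions.

```python
def nohss_sealant_points(participated, most_recent_school_year=None):
    """Score a state under the described rubric:
    0 - never participated in NOHSS
    1 - monitors sealants, but newest data predate the 2006-2007 school year
    2 - monitors sealants and has recent data
    School years are assumed to be "YYYY-YYYY" strings, so plain
    lexicographic comparison orders them correctly.
    """
    if not participated:
        return 0
    if most_recent_school_year and most_recent_school_year >= "2006-2007":
        return 2
    return 1
```

Under this rubric, a state with verifiable 2001-2012 data would score 2, which makes Tennessee's zero all the more striking: for example, `nohss_sealant_points(True, "2011-2012")` returns 2, while never participating returns 0.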

The final indicator used by Pew to grade states was whether a state requires a dentist's exam before allowing hygienists to place sealants. Pew contends that this requirement restricts a hygienist's ability to provide sealants to more children, despite the fact that this hypothesis has never been tested. Further, the article Pew cited to support its contention, though well-designed, was a paper that focused on economics, not barriers to sealant access. In fact, a statement in the article actually contradicts Pew's contention: "Some studies have found that sealant retention rates do not vary between dentists and dental hygienists and that both types of operators take the same time to screen or apply a sealant."

Had Pew collected state sealant utilization information, it could have determined whether a correlation existed between state statutes governing supervision of dental hygienists and sealant utilization. As pointed out to the ADA, the two states (Illinois and Tennessee) that have applied the most school-based sealants require a dentist's exam prior to placement of sealants by hygienists. Since a sealant is a reversible preventive procedure, neither the Tennessee Dental Association nor I believe that this requirement is necessary in public health settings. However, the evidence tends to refute the validity of Pew's indicator.

It seems obvious to those of us who have operated programs at the state and local levels that if you are interested in measuring access to sealants for low-income children, you should focus on two outcome measures that are relatively easy to obtain. The simplest measure to obtain is the percentage of Medicaid and CHIP children by specific age range—6-9 and 10-14 years old—who have received one or more sealants on permanent molar teeth in a given year. This data is gleaned from Current Dental Terminology (CDT) procedure codes reported by dental providers on paid claims. The other important outcome measures are the number of underserved children who have received sealants and the number of teeth sealed by the state's public health school-based sealant program, data that we in Tennessee have already provided to the ADA. Again, this information was never requested or gathered by Pew.
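The claims-based measure described above can be computed in a few lines. The sketch below is a hypothetical example: the sample records and enrollment counts are invented, and filtering on CDT code D1351 ("sealant - per tooth") stands in for a full claims extract, which would also use tooth numbers to restrict the count to permanent molars.

```python
from collections import defaultdict

# Hypothetical paid-claims records: (child_id, age, cdt_code).
claims = [
    ("c1", 7, "D1351"), ("c1", 7, "D1351"),  # two teeth sealed; child counts once
    ("c2", 8, "D0120"),                      # exam only, no sealant
    ("c3", 12, "D1351"),
]
# Assumed Medicaid/CHIP enrollment counts per age band for the year.
enrolled = {"6-9": 2, "10-14": 1}

def band(age):
    """Map an age to the two reporting ranges used in the measure."""
    if 6 <= age <= 9:
        return "6-9"
    if 10 <= age <= 14:
        return "10-14"
    return None

sealed = defaultdict(set)
for child, age, code in claims:
    b = band(age)
    if b and code == "D1351":
        sealed[b].add(child)  # a child with one or more sealants counts once

# Percentage of enrolled children in each band with at least one sealant.
rates = {b: 100.0 * len(sealed[b]) / enrolled[b] for b in enrolled}
print(rates)  # -> {'6-9': 50.0, '10-14': 100.0}
```

The point of the sketch is how little is required: any state Medicaid program that pays dental claims already holds everything needed to produce this utilization rate.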

It is essential that intellectual honesty and objectivity be inherent in the conduct of any scientific study. If an organization later maintains that its report was never intended to be scientific, it should have stated up front that the report was meant to inform policy, not science. Additionally, it is disingenuous for a researcher to omit pertinent state information and demonstrate a lack of transparency. Failing to exercise due diligence in gathering all available firsthand information, and failing to give the states being graded an opportunity to respond and provide critical feedback before results are released, is more than problematic.

Those of us who work in state government are held accountable for the oversight and conduct of the programs we administer, as we should be. However, if the legislative branch believes that our programs are not effective, or perceives a program as an embarrassment to the state, it can act to eliminate or reduce funding for those programs.

Finally, irresponsible reporting can have an adverse effect on the morale of public health staff that we rely upon to deliver oral disease prevention services. These staff members have devoted their entire careers to oral disease prevention only to learn that their best efforts were worth a "D" or an "F," according to Pew.

Dr. Gillcrist wrote this commentary in response to the Pew Center on the States' report released in January.