This post about 5 reasons why early detection of celiac disease is crucial contains affiliate links.
Early detection of celiac disease has baffled researchers for decades, particularly in patients with silent or asymptomatic celiac disease.
Unfortunately, in order to diagnose a patient with celiac disease, the patient must have what is known as “total villous atrophy.” This means the microvilli, the tiny hair-like projections lining the small intestine that are responsible for absorbing nutrients, are flattened and damaged.
Related Article: What is Celiac Disease?
As Dr. Tom O’Bryan says in The Autoimmune Fix, the microvilli should look like shag carpet, but in patients with celiac disease, it looks more like berber carpet.
In this article, I talk about why I believe early detection of celiac disease is crucial to the health of millions of Americans. I also offer up ideas for how we can better detect and diagnose celiac disease, particularly in at-risk populations.
Please note that I am not a doctor. Information included in this article is either cited or my own opinion. Please consult your healthcare provider with questions, and do not self-diagnose celiac disease. I suggest you read this article before you go gluten-free. Read my full disclosures.
5 Reasons Why Early Detection of Celiac Disease is Crucial
Right now, doctors are only able to detect celiac disease once a patient experiences total villous atrophy. The damage is done. But could we have detected the disease sooner? What are the warning signs we should be looking for? And are these limited diagnostic criteria too, well, limiting?
There are several reasons why we need to find a way to detect and diagnose celiac disease early:
(1) We need to arrest disease before it starts
As mentioned, doctors CANNOT diagnose people with celiac disease if they only have partial or mild villous atrophy; a patient must have full villous atrophy in order to “earn” a celiac disease diagnosis. This means millions of people with mild to moderate villous atrophy may be on the path to an eventual celiac disease diagnosis without even knowing it!
This also explains why the vast majority of people diagnosed with celiac disease are diagnosed as adults, often in middle age or even in their senior years. They may have had mild villous atrophy for years, but it took them an average of 8-11 years to officially get diagnosed with celiac disease (that’s often how long it takes for people to begin experiencing chronic symptoms worthy of a doctor visit).
One of many studies suggests a link between osteoporosis and undiagnosed celiac disease. In that study, researchers concluded the following: “Our findings provide direct evidence that reduced bone mineralization occurs in asymptomatic celiac patients before any other symptom becomes evident. Only early diagnosis and treatment of celiac disease can avoid the deterioration of the bone structure observed in all clinical status of celiac disease.”
If we can detect celiac disease early enough, we can prevent diseases such as osteoporosis as well as digestive, skin, joint and brain disorders (to name a few) from occurring.
(2) For children, the damage can be life-altering
Undiagnosed celiac disease in children can lead to growth and development disorders (which are non-reversible) and can put children at greater “risk of malignant diseases and the emergence of mental disorders.”
This is why it’s essential that a growing and developing child is able to properly absorb nutrients, which is virtually impossible to do with a damaged small intestine.
Doctors in Denver, Colorado followed 1,339 children born from 1993-2004. They followed the children for 20 years and regularly tested them for the development of transglutaminase auto-antibodies (tTGA), a marker for celiac disease. They found that 112 children developed celiac disease autoimmunity, and 66 of those patients eventually also met criteria for celiac disease. Of the children diagnosed with celiac disease, 30 percent were asymptomatic.
This study offers several important findings to understanding the critical nature of early detection in children:
(a) The study found that celiac disease was three times as prevalent in this test group as in the U.S. population at large. This suggests either that the prevalence of celiac disease is much higher than previously reported or, more likely, that the occurrence of celiac disease in this generation of children is triple the national average. Knowing that this upcoming generation of children might be more susceptible to celiac disease makes the urgency for early detection even more critical.
(b) The study’s findings also suggest that screening for celiac disease in children should be done throughout adolescence (and even into adulthood), as nearly half of diagnosed participants developed celiac disease after the age of five.
(3) Too many people are undiagnosed
While it’s known that just under one percent of the population, or 1 in 133 people, has celiac disease, what is less well known is that 80 percent of people with celiac disease don’t even know they have it! This adds up to about 1.4 million people in the U.S. who have no clue that celiac disease is raging inside of them.
Without better screening techniques and recognition of celiac disease symptoms beyond digestive disorders, I don’t know that people will be able to detect the disease early enough to prevent it from turning into more serious disorders.
(4) Better long-term outcomes for patients
I believe early detection of celiac disease is critical to offering better long-term outcomes to patients. Early detection is essential to preventing the development of more serious conditions and even early mortality. The sooner the diagnosis, the sooner the patient can limit the damage and begin properly managing their disease via a gluten-free diet.
This is why I believe we need guidelines for monitoring the early onset of celiac disease.
Someone who is genetically predisposed to celiac disease may, for example, get tested for celiac disease as a child or young adult. If the test is negative, they might consider themselves free from celiac disease. However, what we are learning is that even if someone hasn’t tested positive for the disease at one point in their life, it doesn’t mean the disease won’t turn on when they become an adult. (Remember, nearly half of the diagnosed participants in the Denver study developed celiac disease after the age of five, having tested negative before that.)
This is why we must establish formal guidelines for monitoring for celiac disease in high-risk patients. For those genetically predisposed to celiac disease (via a genetic marker or a family history of the disease), I believe testing should occur every couple of years, with close monitoring by a doctor, particularly if symptoms change.
Please note that I have not been able to find any research to support the idea of frequent testing and monitoring for the early onset of celiac disease in predisposed patients; however, it’s generally accepted by researchers that celiac disease can “turn on” later in life.
(5) More funding
Celiac disease is the most prevalent digestive disease in the U.S., yet it receives the least funding of all digestive diseases. We need this to change so researchers can understand just how pervasive the disease has become.
In order to fully understand the reach of celiac disease, we need to expand the parameters of diagnosis beyond full villous atrophy.
Could someone have, say, Stage 1 celiac disease if they have only mild villous atrophy? We stage disease this way in cancer patients; why can’t we do the same with celiac disease patients? Why do we have to wait for total villous atrophy, the equivalent of Stage 4 cancer in this analogy, to diagnose the issue?
I’m not a doctor, so I don’t know the answer to this question, but it’s something I want the celiac disease research community to consider. Let’s not wait for total villous atrophy to make the diagnosis.
We, therefore, need to look at a variety of factors to properly detect and diagnose celiac disease early.
Research published in the American Journal of Gastroenterology (2006) says that the diagnosis of early developing celiac disease should be based on a combination of clinical features, histology, serology, and genetics. I couldn’t agree more.
The first thing we need to consider is serology, or blood screen tests. While the current blood tests for celiac disease are very accurate at detecting the disease in someone with total villous atrophy, they do a poor job of detecting it in someone with partial villous atrophy. In fact, Dr. O’Bryan says the blood test can be wrong seven out of ten times in cases where there is only partial villous atrophy.
Additionally, the blood test is not error-proof even when celiac disease is present. For example, celiac disease patients are more likely than the general population to have an IgA deficiency. When an IgA deficiency is present, it can create a false negative on a celiac disease antibody test even though the person legitimately has celiac disease.
We also need to be cognizant that histology (biopsy of tissue) isn’t perfect either. Celiac disease detection via intestinal biopsy can be marred by errors. In one study, researchers said the “biggest problem” in the diagnosis of celiac disease is proper “interpretation of a biopsy specimen,” as well as analyzing an “adequate” number of samples and “poorly oriented biopsies.”
The American Journal of Gastroenterology (2006) reports that “conventional histology is not anymore a gold standard in the diagnosis” and suggests that the diagnosis criteria be revised.
One way a doctor can initiate early detection of celiac disease is through genetic testing. Ninety percent of celiac disease patients have the HLA-DQ2 gene, while 10 percent have the HLA-DQ8 gene, and in the research cited above, one of these two genes was detected in every patient with early developing celiac disease. If a patient has a celiac gene and mild or moderate villous atrophy is present, doctors can detect the early onset of celiac disease.
It’s important to note, however, that just because you have a celiac gene doesn’t mean you will have celiac disease. It just means you could develop it; some sort of trigger must first “turn on” the gene.
There’s no doubt that the earlier we can detect and diagnose celiac disease, the better the long-term outcomes will be for patients. However, the current testing methods, while viable, need further research and refinement.
We need to consider diagnosis beyond a simple blood test; we need to expand the diagnosis to include those with partial villous atrophy (not just full villous atrophy); and we need to come up with guidelines to monitor those genetically predisposed to the disease (either because they have one of the two celiac genes or because they are related to someone with celiac disease).
I’ve laid out five reasons why I believe early detection of celiac disease is crucial. What do you think? And why do you think early detection has stumped researchers for so long? Please leave a comment to share your thoughts.