A recent Wall Street Journal article raised alarms by concluding that many children who start medication for ADHD will later end up on several psychiatric drugs. It’s an emotional topic that will make many parents, teachers, and even doctors worry: “Are we putting kids on a conveyor belt of medications?”
The article seeks to shine a light on the use of more than one psychiatric medication for children with ADHD. My biggest worry about the article is that it presents itself as a scientific study merely because its authors analyzed a database. It is not a scientific study. It is a journalistic investigation that does not meet the standards of a scientific report.
The WSJ brings attention to several issues that parents and prescribers should think about. It documents that some kids with ADHD are on more than one psychiatric medication, and some are receiving drugs like antipsychotics, which have serious side effects. Is that appropriate? Access to good therapy, careful evaluation, and follow-up care can be lacking, especially for low-income families. Can that be improved? On that level, the article is doing something valuable: it’s shining a spotlight on potential problems.
It is, of course, fine for a journalist to raise questions, but it is not OK for them to pretend that they’ve done a scientific investigation that proves anything. Journalism pretending to be science is both bad science and bad journalism.
Journalism vs. Science: Why Peer Review Matters
Journalists can get big datasets, hire data journalists, and present numbers that look scientific. But consider the differences between journalism and science. Journalistic articles are usually checked by editors and fact-checkers, whose main goals are:
Is this fact basically correct?
Are we being fair?
Are we avoiding legal problems?
But editors are not qualified to evaluate scientific data analysis methods. Scientific reports are evaluated by experts who are not part of the project. They ask tough questions like:
Exactly how did you define ADHD?
How did you handle missing data?
Did you address confounding?
Did you confuse correlation with causation?
If the authors of the study cannot address these and other technical issues, the paper is rejected.
The WSJ article has the veneer of science but lacks its methodology.
Correlation vs. Causation: A Classic Trap
The article’s storyline goes something like this: A kid starts ADHD medication. She has additional problems or side effects caused by the ADHD medications. Because of that, the prescriber adds more drugs. That leads to the patient being put on several drugs. Although it is true that some youth with ADHD are on multiple drugs, the WSJ is wrong to conclude that the medications for ADHD cause this to occur. That simply confuses correlation with causation, a mistake only the most naïve scientist would make.
In science, this problem is called confounding. It means other factors (like how severe or complex a child’s condition is) explain the results, not just the thing we’re focused on (medication for ADHD).
The WSJ analyzed a database of prescriptions. They did not survey the prescribers who wrote those prescriptions or the patients who received them. So they cannot conclude that ADHD medication caused the later prescriptions, or that the later medications were unnecessary or inappropriate.
Other explanations are very likely. It has been well documented that youth with ADHD are at high risk for developing other disorders such as anxiety, depression, and substance use. The kids in the WSJ database might have developed these disorders and needed several medications. A peer-reviewed article in a scientific journal would be expected to adjust for other diagnoses. If that is not possible, as is the case with the WSJ’s database, a journal would not allow the authors to make strong conclusions about cause and effect.
Powerful Stories Don’t Always Mean Typical Stories
The article includes emotional accounts of children who seemed harmed by being put on multiple psychiatric drugs. Strong, emotional stories can make rare events feel common. They also frighten parents and patients, which might lead some to decline appropriate care.
These stories matter. They remind us that each data point is a real person. But these stories are the weakest form of data. They can raise important questions and lead scientists to design definitive studies, but we cannot use them to draw conclusions about the experiences of other patients. These stories serve as a warning about the importance of finding a qualified provider, not as an argument against the use of multiple medications. That decision should be made by the parent or adult patient based on an informed discussion with the prescriber.
Many children and adults with ADHD benefit from multiple medications. The WSJ does not tell those stories, which creates an unbalanced and misleading presentation.
Newspapers frequently publish stories that send the message: “Beware! Doctors are practicing medicine in a way that will harm you and your family.” They then use case studies to prove their point. The title of the article is, itself, emotional clickbait designed to get more readers and advertising revenue. Don’t be confused by such journalistic trickery.
What Should We Conclude?
Here’s a balanced way to read the article. It is true that some patients are prescribed more than one medication for mental health problems. But the article does not tell us whether this prescribing practice is or is not warranted for most patients. I agree that the use of antipsychotic medications needs careful justification and close monitoring. I also agree that patients on multiple medications should be monitored closely to see if some of the medications can be eliminated. Many prescribers do exactly that, but the WSJ did not tell their stories.
It is not appropriate to conclude that ADHD medications typically cause combined pharmacotherapy or to suggest that combined pharmacotherapy is usually bad. The data presented by the WSJ does not adequately address these concerns. It does not prove that medications for ADHD cause dangerous medication cascades.
We have to remember that even when a journalist analyzes data, that is not the same as a peer-reviewed scientific study. Journalism pretending to be science is both bad science and bad journalism.