
The Rise and Fall of Synthetic Food Dyes
By Marina Zhang
In 1856, 18-year-old chemist William Henry Perkin was experimenting with coal tar-derived compounds in a crude laboratory in his attic.
His teacher, August Wilhelm von Hofmann, had published a hypothesis on how it might be possible to make a prized malaria drug using chemicals from coal tar, and as his assistant, Perkin was hoping that he would be the one to discover it.
The experiment was a failure. Rather than the prized drug, Perkin created a thick brown sludge. But when he went to wash out the beakers with alcohol, the sludge left behind a bright purple residue.
The residue became mauve, the world’s first synthetic dye.
Before the invention of synthetic dyes, people obtained colorants from natural materials such as plants, clay, and minerals, or from animals such as insects and squid.
Natural dyes such as those from clay tended to fade quickly, and those that were long-lasting, like natural indigo dyes, used an arduous extraction process that made them expensive.
Perkin’s mauve dye, however, was stable and easy to make.
Mauve dye became an instant hit in the UK and globally. Consumers were seized by “mauve measles.” Everyone wanted a piece of it, including Queen Victoria, a fashion icon of the time, who ordered mauve gowns, hats, and gloves.
Perkin’s discovery and commercial success prompted chemists in Europe to find the next dye in coal tar: Magenta was discovered in 1858, methyl violet in 1861, and Bismarck brown in 1863.
Synthetic dyes would soon be added to everything—clothing, plastics, wood—and food.
Rapid innovation was not without consequences. Many dyes were found to be harmful and were removed within decades of their discovery. More than a century later, the United States has announced the removal of synthetic food dyes from its food supply.
Dyes in Food
For centuries, people have colored food to make it appear more appealing, said Ai Hisano, associate professor at the University of Tokyo specializing in cultural and business history.
Butter, for example, is not always yellow. Depending on the cattle feed, breed, and period of lactation, butter color can fluctuate seasonally from bright yellow in the summer to pale white in the winter.
“Dairy farmers colored butter with carrot juice and extracts of plant seeds, called annatto, to give them a uniform yellow all year round,” Hisano wrote in her article published in the Business History Review.
Butter is made at a factory in Albertville, France, on April 26, 2016 (L); butter melts on toast. Jean-Pierre Clatot/AFP via Getty Images, Scott Olson/Getty Images
Extracting natural colors is a tedious process, and the color of butter could also vary from one dairy farmer to another.
Natural colors, unlike artificial ones, are susceptible to changes in pH, temperature, and moisture. Their hue and intensity can shift, and yellows can fade to pale.
While coloring foods is not a new phenomenon, the practice of mass coloring and controlling food color likely emerged as a result of industrialization, when packaged and processed foods became widely available in the late 19th century in the United States, said Hisano.
“Mass production and industrialization required easier, more convenient ways of making food, and using coal-tar dyes was one of the solutions for creating more standardized food products,” Hisano told The Epoch Times.
Packaged foods lose freshness over time, which can cause them to lose color or look less natural. To compensate, some companies once added compounds such as potassium nitrate and sodium sulfites to meat to preserve its color. These compounds were relatively harmless.
More lurid examples included toxic metals: lead was used to color cheese and candies, and copper arsenate was added to pickles and old tea to make them look green and fresh, with deaths reported from lead and copper adulteration.
Dye companies started producing synthetic food dyes in the 1870s. The first dyes were used to color butter, cheeses, and margarine, and significantly reduced the cost of coloring butter.
When individual farmers made their own coloring solutions, the shades of butter varied significantly. It was only when dyes came from commercial companies that the color of butter became standardized, such that we naturally expect butter to be that particular shade of yellow.
The use of natural food dyes subsequently declined rapidly.
Food regulation began in the 1880s. The Bureau of Chemistry, under the U.S. Department of Agriculture, a branch that would later become the FDA, looked into food adulteration and modification.
Dairy products such as butter and cheese were the first foods for which the federal government authorized artificial coloring.
Synthetic dyes were no more popular with Harvey Wiley, chief chemist of the Bureau of Chemistry, than they are with the current Health and Human Services secretary, Robert F. Kennedy Jr., and FDA commissioner, Dr. Martin Makary, who are now working to ban them.
“All such dyeing materials are reprehensible, both on account of the danger to health and deception,” Wiley wrote in his book “Foods and Their Adulteration,” published in 1907.
Despite his criticisms, by the time the book was written, practically all the butter on the market was artificially colored.
“The object of coloring butter is, undoubtedly, to make it appear in the eyes of the consumer better than it really is, and to this extent can only be regarded as an attempt to deceive,” Wiley wrote, arguing that if the cows were properly fed during winter, they would naturally produce butter of the appealing yellow shade.
“The natural tint of butter is as much more attractive than the artificial as any natural color is superior to the artificial,” Wiley wrote.
In 1906, Congress passed the Food and Drugs Act, prohibiting the use of poisonous or otherwise dangerous colors in food. The FDA was formed on the same day the bill was signed into law.
After the prohibition, the FDA approved seven synthetic food dyes—most of which would be banned in the 1950s after new animal studies indicated their toxicity.
In 1938, new laws were passed requiring all food dyes, whether synthetic or natural, to be listed on product labels.
In its regulation of food colors, the FDA has always scrutinized synthetic dyes more closely than natural ones.
Synthetic food dyes must receive FDA certification before they can be used, while there is no such requirement for natural dyes. The FDA regulates synthetic dyes as food additives, whereas natural dyes can be designated as “generally recognized as safe,” a less stringent authorization pathway.
By the 1950s, as oil and gas replaced coal as the main sources of energy, food dyes were no longer made with coal tar derivatives, but with petroleum-based compounds instead.
These new petroleum-based food dyes are considered very similar in composition and chemistry to their earlier coal tar counterparts, food scientist Bryan Quoc Le told The Epoch Times.
“Petroleum is cheaper, safer, and available in greater quantities,” he said.
The use of synthetic food dyes has increased steadily every decade. Data based on FDA dye certification suggest that food dye consumption has increased fivefold since 1955.
A 2016 study estimated that more than 40 percent of products in grocery stores contain artificial colors.
Many packaged snacks contain synthetic food dyes. Scott Olson/Getty Images, Justin Sullivan/Getty Images
Cancer Concern
Since the introduction of synthetic food dyes, the discussion on their potential health effects has persisted.
In 1890, a German physician reported that coal tar workers had a higher incidence of bladder tumors.
In 1950, many children fell ill after eating Halloween candy containing Orange 1, a then-authorized synthetic food dye. U.S. Rep. James Delaney began holding hearings that prompted the FDA to reevaluate all approved color additives.
The hearing also led to the passing of the Delaney Clause, which prohibits the FDA from approving any food additive that can cause cancer in either humans or animals.
By the time Wiley became the first commissioner of the FDA, experts were already debating which food dyes were riskier than others. Over the ensuing decades, the list of approved dyes was gradually whittled down to the six remaining today, as of the most recent FDA announcement.
Orange 1 and several other approved dyes were removed after evidence of animal carcinogenicity.
The Delaney Clause was what prompted the removal of Red 3 in January.
Professor Lorne Hofseth, director of the Center for Colon Cancer Research and associate dean for research in the College of Pharmacy at the University of South Carolina, is one of the few researchers in the United States studying the health effects of synthetic food dyes.
“Synthetic food dyes are xenobiotics,” Hofseth told The Epoch Times. Xenobiotics are substances foreign to the human body. “Anything foreign to your body will cause an immune reaction. It just will. So if you’re consuming these synthetic food dyes from childhood to your adulthood, over years and years and years and years, that’s going to cause a low-grade, chronic inflammation.”
He tested the effects of food dyes by sprinkling red, yellow, and blue food dyes on cells in his laboratory and observed DNA damage.
“DNA damage is intimately linked to carcinogenesis,” said Hofseth.
His research showed that exposing mice to Red 40 through a high-fat diet for 10 months led to dysbiosis—an unhealthy imbalance in gut microbes—and inflammation indicative of damaged DNA in their gut cells.
“This evidence supports the hypothesis that Red 40 is a dangerous compound that dysregulates key players involved in the development of [early-onset colorectal cancer],” according to Hofseth and colleagues in a 2023 study published in Toxicology Reports.
The mechanism of how food dyes cause cancer remains to be elucidated.
Hofseth speculates that the biological effects of the red and yellow dyes may stem from their being azo dyes.
The gut hosts bacteria that can break down azo compounds into bioactive compounds that may alter DNA. Hofseth believes that if these bioactive compounds impair the gut, they may also contribute to the behavioral problems reported in some children after consuming food dye.
Behavioral Problems
While the link between food dyes and cancer may remain elusive, the link between food dyes and behavioral problems in some children is much more accepted.
Rebecca Bevans, professor of psychology at Western Nevada College, started looking into food dyes after her son became suicidal at the age of 7.
His suicidal ideations went away once food dyes were removed from his diet.
More surprisingly, Bevans noticed her own anxiety dissipated after she removed synthetic red and yellow food dyes from her diet.
“I had a little mini existential crisis at 52,” Bevans told The Epoch Times.
Colored foods are prevalent in supermarkets, attracting children and adults alike. Joe Raedle/Getty Images
Concerns about behavioral effects were first raised in the 1970s by pediatric allergist Benjamin Feingold, who proposed that artificial food colorings and other additives were causing hyperactivity in children.
He proposed the Feingold diet—a diet free of additives—for children. His theories garnered widespread attention in the media, but the medical community and the American Academy of Pediatrics were unmoved at the time.
When Feingold died in 1982, interest in his hypothesis died away.
In 2007, a randomized, double-blind trial by the University of Southampton reignited the discussion on food dyes. Researchers found that children given dyes and preservatives exhibited increased hyperactivity.
In an editor’s note discussing the study, the editors at the American Academy of Pediatrics said that “the overall findings of the study are clear and require that even we skeptics, who have long doubted parental claims of the effects of various foods on the behavior of their children, admit we might have been wrong.”
The University of Southampton study was also cited at the FDA’s press conference when it announced a phasing out of synthetic dyes. However, the study only explored the effects of mixtures of food dyes and included dyes not used in the United States. Therefore, the effects of individual dyes remain unclear.
“We don’t know exactly the mechanism of how these metabolites from these dyes or how these dyes themselves directly affect the brain,” Bevans told The Epoch Times.
She said some studies have shown that children with certain gene variations may be more susceptible.
One explanation is that the dyes cause behavioral problems by harming the gut, since gut bacteria help produce neurotransmitters and regulate brain function. Gut problems can also lead to nutritional deficiencies, which may further impair brain health.
“There’s just a lot of unknowns as to the mechanism of function in the body, but there is enough evidence demonstrating through studies that some individuals have much more negative reactions to these food dyes than others,” Bevans said.
Because of early studies indicating potential health issues, foods in Europe containing dyes such as Yellow 5 and Red 40 must carry a warning label stating that they may cause hyperactivity in some children.
Current Food Dyes Used in the US
Three synthetic food dyes have been banned so far in 2025, leaving six currently in use. Among them, red and yellow dyes account for 90 percent of all the dyes used in the United States.
Though evidence of carcinogenicity for these dyes remains inconclusive, their behavioral effects are better supported by research.
Current research in children suggests that “there may be some small subset of children who appear to have altered behavior if they consume these synthetic food dyes,” Susan Mayne, former head of the FDA’s Center for Food Safety and Applied Nutrition, told The Epoch Times.
Mayne said that current studies are still murky as they do not study individual dyes, but mixtures.
The following are the six dyes the FDA is trying to eliminate by 2026 and their health effects based on research.
Phasing Out Colors
On April 22, the FDA announced a voluntary phasing out of petroleum-based synthetic food dyes in the United States, with plans to have them removed from the food supply by the end of 2026.
Whether this can be done is still unclear.
Mayne said that since it is not a legal requirement, it’s hard to ensure that the dyes are removed by all companies.
The International Association of Color Manufacturers said it will be difficult for companies to switch to all-natural colors by that deadline, noting that reformulating products typically takes around five years.
Natural dyes must be sourced from agricultural products, which places additional agricultural and supply pressures on food companies.
Companies would need to experiment with new formulations, potentially resulting in products that are less visually appealing and have shorter shelf lives—all of which could risk customer loss and increase production costs, said Renee Leber, food science and technical services manager at the Institute of Food Technologists.
“Sometimes we talk about this like it’s a one-to-one substitution. It is not a one-to-one substitution. There are so many things to keep in mind,” Leber told The Epoch Times.
Nevertheless, some large companies, such as Pepsi and Tyson Foods, have announced that they will remove synthetic dyes from their products.
Leber said consumers ultimately decide and shape the market, adding that they will need to be understanding as the food industry goes through this change.
Consumers may be surprised by the changes to shelf life and food prices resulting from the switch in food dyes.
“I think we’re going to need to set new expectations,” Leber said.
“Most companies will try to make sure that they’re bringing their consumers along with them when they start to do this, and to give them the narrative that they are doing this in order to meet this public interest.”