As mentioned in my last issue, I’m taking the time to dive into how food and diet affect mental, physical, and frankly, spiritual performance.
Though this newsletter and most of my public-facing work are dedicated to all things psychiatric drug withdrawal education, I’ve spent my career in food and now work as a private performance chef to pro athletes. It’s a rare, specific niche that requires a skill set different from both nutrition-degree holders and career restaurant chefs. Restaurant chefs, Michelin-worthy or otherwise, have zero understanding of nutrition, elite sports, or performance. They are all physically falling apart and are usually miserable humans. Meanwhile, macro and micronutrient nerds can’t cook. They might be able to char a steak and flex in the mirror, but they can’t execute meal plans without extreme repetition or build flavor and texture in a way that makes you want to eat.
I, on the other hand, have just the right history and education for the job:
- A serious ballet dancer in my youth, I learned early on how nutrition affects performance by developing an eating disorder that led to malnourishment, which in turn led to breaking both my feet at 18. I had the bones of a 70-year-old, and my 4th and 5th metatarsals snapped like twigs.
- That eating disorder led to an obsession with food. It would be another 15 years before I fully divorced myself from the shadow side of that obsession, but the light side infused me with a deep sense of gastronomic curiosity. After college, I went to culinary school and worked in high-end kitchens in Manhattan. If you’re curious about my experience, watch The Bear.
- Over the years, as the eating disorder waxed and waned, I became more interested in nutrition for performance. I took a job creating recipes for a company that specialized in helping Olympic weightlifters cut weight safely. I watched my own body change as I rose through the ranks in CrossFit and weightlifting. Turns out, mass moves mass. I learned to put on muscle and became, as one friend put it, “very hard to knock over.”
- When I went into withdrawal, I stopped competing. My body couldn’t handle the physical stress and my gut was wrecked. Once I stabilized and failed to fix my gut on my own, I reached out to Andy Galpin, who I’d met through CrossFit, and he ran me through a series of tests that identified the issues. I changed my entire culinary strategy based on his recommendations and finally, after twenty years of mental and physical ailments, I felt good.
- In the meantime, Andy introduced me to Joey Votto, MLB all-star and MVP. I was Joey’s private chef for the twilight years of his career, and I put my knowledge into practice. Now, I’m working with multiple 49ers as well as advising a handful of people who are coming off antidepressants. I continue to watch my culinary theories play out in the real world, and whether the goal is physical or mental performance, they seem to work.
But here’s the thing: my experience left me deeply disturbed. For so long, I thought I was doing everything right, yet I was still sick. And when I look around, everyone is sick. The reasons, of course, are complex. Some of it comes down to individual choices, but the more I learn, the more convinced I am that it’s rooted in forces well beyond the individual’s control.
So, I set out to figure out why.
Before I get into my actual food philosophy, it’s vital to understand how we got here. I’d go so far as to say that understanding this history is a prerequisite for making meaningful nutritional change. Learning it should permanently and irrevocably change the way you view food, which will in turn affect which restaurants you frequent, how you grocery shop, and who you listen to in the nutrition space. We will come back to it again and again.
Within the confines of this medium and in the absence of a PhD dissertation, I cannot explore every angle. But understand this: what you put in your mouth, and why you’re (probably) always just a little bit sick, has in some ways already been determined for you by the past. To me, there are four key historical points: World War I, 1937, World War II, and 1985.
Let’s go back.

World War I: Food Becomes a Weapon
The story of how the American diet came to be shaped by external forces begins with World War I. Before the war, food in the United States was relatively unregulated. Farmers grew a variety of crops, and diets were more closely tied to the seasons and what was locally available.
But as the United States entered the war in 1917, the need for food to support both troops and war-torn Europe became a critical national security issue. The government created the U.S. Food Administration, an independent federal agency that controlled the production, distribution, and conservation of food in the U.S. during the war. Future president Herbert Hoover was appointed as the director.
One of the agency’s tasks was to stabilize the price of wheat in the U.S. market. Hoover also introduced concepts such as “Meatless Mondays” and “Wheatless Wednesdays,” urging citizens to forgo certain foods so they could be redirected to the front lines. This was an appeal to the American public for voluntary compliance in the formal absence of rationing. For the first time, the American public was asked to think about food not just as sustenance, but as a patriotic duty.
1937: An Obscure Law Creates Marketing Orders and Checkoff Programs
After World War I but just before World War II, in the depths of the Great Depression, a little-known law was passed: the Agricultural Marketing Agreement Act (AMAA) of 1937.
The AMAA was originally designed to help farmers during the economic turmoil of the 1930s. It allowed producers of certain commodities—such as milk, eggs, and beef—to create “marketing orders” and “checkoff programs,” a complex aspect of U.S. agricultural policy with the overarching goal of ensuring fair marketing conditions for specific commodities by regulating quality standards, packaging, production, and advertising. It is the advertising angle of marketing orders that would go on to bite us all in the ass.
Marketing orders apply only to eligible commodities, which tend to be perishable or produced in large quantities. Commodities aren’t “brands” in the way that “Pepsi” is a brand. Brands have companies, marketing departments, researchers, and lawyers to strategize advertising campaigns that set them apart from competitors like “Coke.”
But beef is beef regardless of the rancher who raises the cattle. A cattleman in Texas can’t really say his beef is better than Nebraska beef; a steak-to-steak comparison only pits producers against each other. Thus, marketing orders and checkoff programs were created to ensure fair conditions for all producers.
The USDA oversees marketing orders for over 30 different commodities, including:
- Dairy products (milk, cheese, butter)
- Fruits and vegetables (citrus, apples, avocados)
- Nuts (almonds, walnuts)
- Specialty crops (honey, spearmint oil, tart cherries)
Some meats (California beef and lamb) are governed by marketing orders, while the rest fall under what are called checkoff programs.
Checkoff programs originated alongside marketing orders in 1937, but they really took off in the 1950s and 60s. The specifics are irrelevant to this argument, but the point is that they are commodity-specific programs funded by mandatory contributions from producers. For example, cattle producers pay a mandatory assessment of $1 per head of cattle sold. The Cattlemen’s Beef Promotion and Research Board receives about $42 million of the approximately $75 million in assessments collected, and that money is used for advertising, research, and promotion of beef.
However, the rules governing advertisements are incredibly restrictive. Checkoff-funded campaigns are prohibited from making comparative claims or disparaging other foods. Originally intended to keep the Texas cattleman from disparaging the Nebraska cattleman, these restrictions have only handcuffed the industry producing the food that actually keeps humans healthy.
For example, a campaign funded by the dairy checkoff program can say, “Milk is high in calcium,” but it cannot say, “Milk has more calcium than almond milk.” Similarly, beef producers can promote beef’s protein content, but they cannot compare it to chicken or plant-based meat substitutes.
You can probably see where this is going…
World War II: More Food Rationing and the Birth of the RDAs
On the heels of the Agricultural Marketing Agreement Act came World War II. By the time it began, the lessons of World War I and the Great Depression had been well learned. Food was a crucial element of the war effort. This time, instead of encouraging voluntary conservation, the government implemented strict rationing systems and sent nutrient-dense food and protein overseas. Sugar, butter, meat, and coffee were all rationed.
In 1943, our old friend Herbert Hoover famously declared to the public that “meats and fats are just as much munitions in this war as are tanks and aeroplanes…the same spirit in the household that we had in the last war can solve this problem.” His words reflected the reality that, during war, food was a strategic resource, one that was just as important as weapons and ammunition. Red meat, a staple in the American diet, hit the black market.
As Dr. Gabrielle Lyon writes in her book, Forever Strong, with less nutrient-dense food available to the general public, government research turned toward “preventing deficiencies and focused explicitly on boosting short-term performance rather than optimizing long-term health.”
In 1941, the National Research Council developed the first set of Recommended Dietary Allowances (RDAs). These were the precursors to today’s nutritional guidelines and were created with a very specific goal in mind: to establish minimum intake levels to prevent nutrient deficiencies in both troops and civilians during times of scarcity.
It’s crucial to understand the purpose of these early RDAs. They were never intended to guide people toward optimal health. Instead, they were designed to establish a baseline—just enough nutrients to keep the population from getting sick with diseases like scurvy (vitamin C deficiency), pellagra (niacin deficiency), or rickets (vitamin D deficiency).
This focus on preventing deficiency rather than promoting health would become a recurring theme in American nutritional policy. Even as the war ended and rationing disappeared, the emphasis on meeting minimum nutritional standards remained embedded in federal dietary guidelines. This foundational philosophy would have profound effects on how nutrition was communicated to the public and would shape future food policies.
Furthermore, with the end of World War II, the U.S. entered an era of economic prosperity. Food production ramped up to meet the needs of a booming population. Women entered the workforce and farming became more industrialized, leading to an abundance of cheap food that was easy to prepare.
Processed foods, which had been developed during the war to feed soldiers, found new markets among busy families. Canned soups, boxed cereals, and frozen dinners quickly became staples of the American diet. The government, meanwhile, continued to promote minimum nutritional standards rather than focusing on the potential health implications of these emerging food products.
1985: Big Tobacco Buys Big Food
Fast forward to the 1980s, a rough time for cigarette manufacturers. With increasing regulation, the rise of anti-smoking campaigns, and mounting legal challenges, companies like Philip Morris and R.J. Reynolds were looking for ways to diversify their portfolios. Their solution was to invest heavily in the food industry.
In 1985, R.J. Reynolds—maker of Camel cigarettes—acquired Nabisco for $4.9 billion. Nabisco’s brands, which included Oreos, Ritz crackers, and Chips Ahoy!, were already household names, so the merger of R.J. Reynolds and Nabisco created a behemoth in both tobacco and snack foods.
At the same time, also in 1985, Philip Morris—the maker of Marlboro cigarettes—purchased General Foods for $5.75 billion. General Foods was one of America’s largest food manufacturers, responsible for brands like Kool-Aid, Jell-O, Maxwell House, and Post cereals. But Philip Morris wasn’t done. In 1988, it acquired Kraft Foods for $12.9 billion, making it the largest food company in the United States.
These acquisitions weren’t just about expanding into food; they were about applying the tobacco playbook to a new product line. Tobacco companies already had a deep understanding of advertising, consumer psychology, and addictive products, and they knew how to manipulate consumer behavior. They also had the best scientists in the world, and they moved those cigarette scientists, the world’s leading addiction researchers, into the food world by the thousands. Those scientists weaponized processed food, turning it into ultra-processed food intentionally engineered to be addictive.
Then, Philip Morris and R.J. Reynolds mobilized the cigarette lobbyists, giving processed food manufacturers both the resources and the motivation to reshape American eating habits. But they needed a little help from the government to give their products a stamp of nutritional legitimacy. Enter the Food Pyramid.
The Food Pyramid was introduced by the USDA in 1992 as a visual guide to healthy eating. At first glance, it seemed innocuous enough, with its broad base of grains and its recommendation for moderate consumption of proteins, dairy, and fats. But the pyramid was the result of intense lobbying by the food industry, particularly grain and processed food manufacturers. Its recommendations to consume 6-11 servings of bread, pasta, and cereals every day were not backed by independent scientific evidence. Instead, they reflected the interests of powerful agricultural and food processing lobbies.
This high-carb, grain-heavy diet stood in stark contrast to the American diet of 50 years before, when red meat was so prized as a source of nutrition that it was traded on the black market. We knew, during wartime, that meat and butter were key to keeping soldiers strong. Why would it be any different for the average American? And yet, in the 1990s, the government began promoting the very foods that, half a century earlier, were eaten in sacrifice to the war effort.
Of course, this just so happened to align perfectly with the product lines of Philip Morris’s new acquisitions, such as Post cereals and Kraft mac & cheese. Similarly, R.J. Reynolds’s Nabisco brands benefited from recommendations that emphasized carbohydrates as the foundation of a healthy diet.

Marketing the Food Pyramid
The introduction of the Food Pyramid was followed by a massive marketing push. Processed food companies jumped at the chance to label their products as “part of a healthy diet,” thanks to the pyramid’s emphasis on grains. This led to an explosion of “low-fat” and “high-fiber” claims on everything from breakfast cereals to snack foods.
But while the pyramid was supposedly about nutrition, its real impact was to validate the consumption of ultra-processed foods. Companies like Kraft, General Foods, and Nabisco spent billions on advertising to reinforce the message that their products were not only convenient but also healthy. The pyramid, which was supposed to guide Americans toward balanced nutrition, ended up serving as a marketing tool for some of the least nutritious foods in the American diet.
As Calley Means, former Big Food and Big Pharma lobbyist, said in his recent talk at the U.S. Senate, “The Food Pyramid was created by the cigarette industry through complete corporate capture and was an ultra processed food marketing document saying we need to eat carbs and sugar. We listen to medical experts in this country, so parents started giving their kids ultra-processed food. Carbohydrate consumption went up over 20% in the next 10 years.”
And because of the Agricultural Marketing Agreement Act of 1937, commodity producers could not advertise against any of it.
The restrictions placed on commodities beginning in 1937 stand in stark contrast to the free-for-all enjoyed by processed food manufacturers in 2024. Companies like PepsiCo, Nestlé, and General Mills spend hundreds of millions to billions of dollars each year marketing their products. For example, in 2018:
- PepsiCo spent nearly $1 billion on advertising sugary drinks and energy drinks alone.
- Coca-Cola spent approximately $4 billion on global advertising.
- General Mills spent $623 million on advertisements and capital investments.
In contrast, when the dairy industry launched the “Got Milk?” campaign in 1993, all of $23 million was allocated from checkoff programs for advertising. That number increased over the years, but in 2016, the dairy, beef, pork, and lamb checkoffs collected just $562 million combined.
Because commodities like beef, milk, and eggs are legally barred from making comparative claims, and because there is so much less money involved, they have virtually no way to compete with the marketing power of ultra-processed food companies. The folks who passed the 1937 Agricultural Marketing Agreement Act could not see far enough ahead to anticipate how the law would damage the health of the most powerful nation on Earth, but we have all become its victims. Today, the average American is bombarded with messages promoting artificially dyed cereals, “fortified” processed grains, and beverages laden with high-fructose corn syrup, while fresh, whole foods are conspicuously absent from the conversation.
Now you know why.
There is, of course, more to the story. But at a high level, the Standard American Diet is a story of war, corporate strategy, and regulatory oversight. It began with the food rationing campaigns of the World Wars, which emphasized patriotism and minimum nutrition rather than optimal health. It continued through the rise of tobacco companies in the 1980s, which applied their expertise in addiction and marketing to create a new generation of ultra-processed foods. And it is sustained today by an obscure 1937 law that limits the ability of whole foods to compete on a level playing field.
So the next time you’re grocery shopping and you’re bombarded by bright packaging and health claims, remember: the food products that dominate the shelves are there not because they’re healthy or nutritious, but because they have the deep pockets and regulatory freedom to out-market the actual food that keeps you healthy.
What you put in your mouth isn’t just a personal choice—it’s the result of a century of strategic decisions made by governments, corporations, and lobbyists, all vying to shape what’s on your plate. Understanding this history is the first step to taking back control of your diet and making choices based on nutrition rather than manipulation.