Anecdote: I went on a detox cleanse last week, and I felt great! I'd been drinking coffee daily for seventeen years, and I managed without it for over a week! (I relented today because I had to get the family up earlier than usual.) I'm hoping I can maintain this great health and uplifted spirits, and keep depression and nervous anxiety at bay, by continuing to eat far more fresh fruit and vegetables.
Was it in school, or while you were recovering from an illness or disease, that you learned about healthful foods? I'm an immigrant, and I have only recently started in earnest to buy organic fruits and vegetables and to increase the number of vegetarian meals on my family's weekly menu. I was motivated by the House of Representatives' passage of the National Food Labeling Uniformity bill. I figured that if Congress voted to pre-empt states' rights on food labeling (e.g., listing known carcinogens or toxic additives on the label) and to have the Food and Drug Administration dumb the labels down, at millions of dollars of extra cost to the FDA, then every American who can read an English-language food label must be deemed capable of understanding what's in the food sold at the supermarket. I didn't attend school here, citizenship classes don't cover this sort of thing, and I don't remember seeing any educational shows on the American broadcast channels leaking over my country's airwaves. So I'm curious: when did you learn about nutritional healing foods and start turning away from processed convenience foods? What prompted you to learn? Do U.S. elementary and high schools cover this?
P.S. Isn't it funny that a political party is all for states' rights and smaller government while the opposing party controls the House and the Senate, but once it gains control of the House and Senate, it starts pre-empting states' rights and expanding government?