Most organic food in the US is USDA Certified Organic. It's a strict certification, and even farms in other countries have to pass USDA-accredited inspections and follow the same rules before they can display the certified organic seal on their foods. Major brands are selling organic food these days; it's not just hole-in-the-wall health food stores anymore. Organic food has nothing to do with being "new and improved" or some sort of upscale way to eat. It has everything to do with eating the way we're supposed to, the way our grandmas probably did. Whole Foods has lots of good stuff and I can't wait to go!