Do you need to take vitamins?

26 September 2012
Angel Barroso

Vitamin and mineral supplements can be beneficial, and taking them makes us feel like we are doing the right thing for our health. Unfortunately, a number of research studies, including the extensive Iowa Women's Health Study, indicate that supplements are not always good for your health. Worse yet, supplementing with certain vitamins or minerals may cause harm in the long term.

For some people, vitamins offer important health benefits. The Dietary Guidelines for Americans recommend specific vitamins for certain groups of people. However, the Mayo Clinic reports that healthy people who take extra vitamins as an "insurance policy" against poor eating habits might actually increase their risk of health problems.

So how do you know what is right for you? The best way is to consult your doctor. Experts at the USDA's Food and Nutrition Information Center (FNIC) recommend informing your doctor if you are taking any dietary supplement, including vitamins. This is especially important because certain supplements may interact with prescription medications or lead to other health problems. If you are taking, or considering taking, a dietary supplement, take the time to ensure that it is beneficial, not harmful, to your health.