When I was younger, my mom would often remind me to take my vitamins. Back then it was always a pain because they didn't taste good, and they didn't seem to improve my health overall since I still got sick whether I took them or not. So do vitamins really improve people's health?
After some research, it turns out that I'm not the only one asking whether vitamins make people healthier. Some studies even claim that taking vitamins may actually cause one's health to decline, or put one at greater risk for certain diseases.
In theory, vitamin supplements are meant to prevent disease by supplying the body with vitamins or minerals it may be deficient in, such as calcium, Vitamin D, or Vitamin C. Although it is true that it is good to supply your body with these nutrients, taking supplements alone will not be enough. Some supplements, such as those for Vitamin D, may also contain such a low dosage that they are not effective. It is much better to eat food that already contains these nutrients. Also, taking more than the recommended daily amount of vitamins can lead to more health problems rather than more nutrients. An overdose of vitamins can cause nausea, diarrhea, and stomach cramps. Some consequences of overdosing on vitamins are much more severe: an overdose of Vitamin A, for example, can lead to liver damage, hair loss, blurred vision, and headaches.
Although it is better to eat food that already contains the nutrients your body needs, supplements can help to a certain extent. For example, for those who do not drink milk or do not drink enough of it, a doctor may advise taking calcium supplements so that they get an adequate amount of the mineral. Overall, however, studies have shown that for a healthy person, taking supplements does little to nothing for their health. Eating healthy is much more effective.