Have you ever been to the doctor or dermatologist and they’ve told you diet has nothing to do with acne?

Why is that? And is it true? Should you listen to everything your doctor says?

Check out my video below where I rant about doctors and pharmaceutical companies, as well as the food industry.

What do you think about doctors? Trust ’em? Hate ’em? Let me know.