I just saw this question come through on Twitter: Is mental health just as important as physical health?

Mental Health IS Physical Health. I don’t understand. The brain is part of the physical body, right?

Why have we not moved past this already? This should not be a conversation. It should just BE. If your mind is unhealthy, your body is unhealthy.

Why do we as a society even ask these questions in the first place?

Is an optometrist just as important as a dentist?

Is there a stigma attached to someone who has pneumonia? Is having a broken leg more acceptable than a thyroid disorder? A headache can be caused by a brain tumor or by anxiety; either way, it’s your physical health being compromised.

Why can’t we all just agree that everything related to the body and mind counts as physical health, and move on to finding ways to help people who need medical assistance FOR WHATEVER their body needs?

HEALTH-WISE? I’m being treated by a doctor for depression, THE SAME as I’m being treated by a doctor for diabetes. One is not more “physical” than the other.

I guess what I’m saying, in the end, is I DON’T GET IT.

Rant over.