Gender Theory

Gender theory is the idea that a person’s feeling of being masculine, feminine, or neither is more important than their physical sexed body, and that those feelings should take precedence in law and in everyday life. Without any public consultation, this belief has taken hold in our institutions – education, health, sport, justice – and is causing widespread and sometimes irreversible harm to children, women, and lesbians.

Here are some FAQs to get you started.

Below you will find information, evidence, facts, studies, testimonies, and support groups that will help you to better understand and counter the harmful effects of transgenderism.