Originally posted in: Secular Sevens
This thread is inspired by another: view original post
I've noticed people wanting schools to talk about sexuality* in the classroom. But why the hell do you want that? Should schools be the ones teaching kids to accept one another for who they are, or is that the parents' responsibility? We learn about sex one way or another, whether through sex-ed in school or by discovering the internet lol. Do you think this is a necessity, a value someone really needs to be taught? Why don't we teach kids how to live in the goddamn wilds, make things, cook food, get fit?
If a school were really supposed to prepare you for the world, we'd never leave it. A school exists so that a citizen of the country can be a productive one who raises its GDP.
*I meant sexuality as in homo, hetero, bi, lesbian, trans. There is nothing wrong with sex-ed; I just don't think schools need to tell students about the above terms. Sex is sex no matter the sexual orientation.
#Offtopic
Edited by SweetTRIX: 1/31/2013 11:12:01 PM

Schools are for teaching academics, which are taught by teachers based on the school's curriculum. Morality has no curriculum, unless you are talking about a religious institution. Thus, teachers should not be responsible for teaching children morals; they aren't being paid to do that. It doesn't matter what the parents are or are not doing: teachers should only be expected to do what they are paid to do. They are not there to be surrogate parents, even though that is sometimes what happens.