
OffTopic

Surf a Flood of random discussion.
originally posted in: Secular Sevens

This thread is inspired by another: view original post

Edited by GrandmasterNinja: 2/1/2013 12:46:35 PM

Why do people think schools should teach you everything?

I've noticed that people want schools to talk about sexuality* within schools. But why the hell do you want that? Should schools be the ones teaching kids to accept one another for who we are, or is it the parents' responsibility? We learn about sex one way or another, whether through sex-ed in school or discovering the internet lol. Do you think this is a necessity, a value one really needs to learn?

Why don't we teach kids how to live in the damn wilds, make things, cook food, get fit? If a school were supposed to be a place that gets you set for the world, we'd never leave it. A school exists so that a citizen of the country can be a productive one who raises its GDP.

*I meant sexuality as in homo, hetero, bi, lesbian, trans. There is nothing wrong with sex-ed; I just don't think schools need to tell students about the above terms. Sex is sex no matter the sexual orientation.
#Offtopic

    I think there's a fine line between teaching the [i]information[/i] about sex, and teaching about the [i]morality[/i] of it. I don't think it's a school's job to teach morals, but I totally understand teaching informative stuff about it in sex-ed, like about the risks involved and how to use various forms of protection.

