All over the news I keep hearing that teen pregnancies are on the rise and are at their highest rate in a decade. Plenty of causes are being blamed. Some people argue it is because abstinence is being taught in place of birth control; others say it is because parents are not around as much these days to guide their kids. My opinion is that teen pregnancy has simply become more accepted. It is everywhere, and we are living in a very sexualized culture. Television shows like The Secret Life of the American Teenager make it seem not so bad, and teenage role models like Jamie Lynn Spears make it look okay. High schools with day care centers, I mean, really? I do think that teaching your kids right from wrong goes a long way, but why does society have to make it so acceptable? What do you think? Should we make it that easy and provide day care centers in our high schools?