OK, so I am a Christian, and I have always been in private/Christian schools (at least they claimed to be; two were somewhere in between, and one was 'fully' Christian). As you have probably figured out, I have been to three schools before (mainly because I moved twice). Anyway, I was homeschooled this year and am going to public school for the first time next year!! I am half excited and half scared!

I see those shows like Degrassi and Life with Derek (which are very different shows, but they are both set in public school), and in Degrassi, BAD things happen. I know it's a drama show, but still... do those things really happen? (FYI, for those of you who haven't seen Degrassi: people take drugs and end up in the hospital, one guy is shot and the killer ends up being killed, people have bulimia and smoke underage (that one I know is true), people get preggo, and there is even a stabbing, where the SWEETEST GUY EVER gets stabbed and there is LOTS of drama from that! and the