I mean especially Hollywood people, and even people in other states outside CA, like Nashville, TN, and all around the world: people treating life like it's one big party, living it up, everything is just so great for them. They hold events, the actors make movies, go to festivals, it's all so amazing. Yet,
for the "average" person, what you hear is "you B ETTER go to college, you BETTER work, you b etter do this , you better do that,"
like you're a f8cking slave, and like this other great lifestyle is for people who are better than you, or h**l, who even KNOWS how they do it. All I know is,
no one ever told THEM to work hard, do this, do that; they made their own paths in life and made it work for them.
So why are we being lied to, in a sense, and being TOLD what to do with our lives and how to live?
Why were we raised like that?