Let me start off by saying that college is not going to be my peak in life, and that’s okay. We live in a society that puts so much emphasis on furthering your education. It’s treated as this super important thing, and society says you MUST do it or you are going to be a bum for life. And yet they charge you an arm and a leg for it.
So here we are, broke as hell, working toward a degree when we might not even know if this is the direction in life we need to be going. This is apparently the most important thing. During these four years, the college makes you feel like this is going to determine your life. That if you don’t make this your job right now, and care about every single thing you do, you are going to go nowhere in life.
Also, in case you were wondering, I am an Art Major. And if you were in this major you would know that “it’s not about the grade” which is why we literally don’t know what our grades are until the end of the semester.
I just cannot stand that these people think this is all our life is going to be. There are more important things in life than your career. Personally, I don’t think my career is going to be anywhere close to the most important thing in my life. There are so many more important things.
Like Jesus. I consider myself a Nondenominational Christian. So for myself, college is not the biggest picture in life by a long shot. These years will not be what wreaks havoc on my soul. I have faith that God is going to take care of me. If that means falling on my face, then that will be a lesson in life that I need to learn.
If you’re reading this, thank you for your time.