4 answers · 1432 views

Does it really matter where you get your degree?

Does it make a difference to get a degree from a community college rather than a big name university? #college #college-selection #degree

Comment from tireyona: It does matter, because you cannot get a good job when you do not have a degree, and if you do have a degree, you can get a good job.




Ellen’s Answer

This is a tough question. My mother used to say that it didn't matter where you got your degree, but it did matter what you did with it. Yes, there will always be some employer or graduate school dean who is impressed by certain schools and prejudiced against others---Ivy League schools vs. state colleges. However, there are so many other factors that add up to success in life that, in the long run, the answer to your question is "No".

Your grades are important (straight A's from a state college vs. a C+ average from an Ivy League school?). Your extracurricular activities while in college say a lot about your social abilities. Your part-time work also says a lot about you; being able to hold down a job while being a full-time student takes energy and drive. Many employers and graduate school admissions deans understand the huge cost of private colleges today and would understand why a bright student might choose a less costly school over a prestigious one.

But the most important factor will be your personal references from your professors and your employers. Your work ethic, your character, how you treat other people, how you handle stress, your enthusiasm for your field of study, these are all independent of the school you attend.

As proof, google some famous people whom you admire and read about where they went to college...you might be surprised!

I hope this perspective helps.

Simeon’s Answer

It doesn't really matter where you got your degree. The further past graduation you are, the less the specific college you attended matters. If you can afford a better college, go for it, but it's not worth taking on massive amounts of debt. As long as you get the degree and can prove your employability to the companies looking for good people, you'll be fine.

Monica’s Answer

The answer is: maybe. It really depends on what industry you're looking to get into and what the norm is among competing job seekers. If you want to be an RN, no, it doesn't. If you want to be a doctor, then yes.

Some hospitals will prefer graduates of a certain alma mater. If you're interested in business, it doesn't matter where you get your undergrad, but achieving higher accolades at a "known" school will help you break through the herd. Ultimately, what matters is that you earned a degree. It'll be your drive that propels you through your desired industry.

Monica recommends the following next steps:

LinkedIn is an awesome tool for research. You can search for a company you're interested in and see the kinds of education its employees have.
Never hesitate to spark a conversation with someone in your desired industry and ask them 2 important questions: 1. Do you enjoy what you do? 2. Would you have changed your major if you knew then what you know now?
Whatever your major - never stop learning.

F’s Answer

To be honest, it depends. I am a strong believer in putting in the effort and then seeing where that lands you. These days, the name of your university is not as important as the skills that university teaches you and how you apply them in the real world. My advice would be to give it your best shot no matter what your educational background is, as the skills you have are worth more than the name of the university you attend.