
Would the college help me look for a job that's in my major, or at least point the way for me to get some experience and insight into my future career?

I want to get a career in dental hygiene or physical therapy, and I would like to know what the job is like from my own perspective, not someone else's. I would want the college that I'm going to to show me some opportunities in the job field where I can earn some experience and maybe a little training. #work-life-balance #savings #health-education



3 answers



Ross’s Answer

Many colleges and universities have career centers that can help you find employment or internships in the field you seek.


Lita’s Answer

When you are researching a major to study, look at a variety of colleges and universities that offer it. When you find a school you are interested in, review the required courses for the major and read the course descriptions for the core (major) classes. This will give you an idea of the knowledge required to do the job in that field. Also look for associations and resource groups (including volunteer opportunities) to gain a better understanding of the job opportunities and the industry.

Jabari’s Answer

Sure. In addition to setting up recruitment events on campus, many colleges and universities also have career placement offices.