
Would it be better to work in the United States or in the United Kingdom?

Hello, I recently studied abroad in the United Kingdom, and I noticed that the currency is worth more over here, people seem to earn more, the minimum wage is higher, education is easier to access, and there is a wide range of living standards. I am from the United States, so maybe someone can give me insight on whether I should strive to work in the United States or the United Kingdom. I am doing a double major: one in Psychology and one in Biology with a concentration in Neurology. #psychology #biology #united-kingdom


Jenn’s Answer

I am American and have worked in the UK. Yes, you get paid more, but nearly everything costs almost twice as much in the UK as in the United States, so it largely becomes a wash. You might get paid less in the U.S., but everything is more affordable relative to your salary. I think the bigger question is: where do you want to live?

Experience working outside the U.S. is valuable for your resume, since multinational and global companies look for it when hiring for global roles. Also be aware that, as a U.S. citizen, you will need a visa to work in another country, and a local company in that country will generally need to sponsor it. You will also still have to file U.S. taxes while paying taxes in the country where you work, which can become a significant expense. There are lots of things to look at before deciding to work abroad. Here are seven simple steps: http://www.forbes.com/sites/alexandratalty/2013/09/13/seven-simple-steps-to-finding-a-job-abroad/13a5711e7b5c.


Hope that helps!
