6/25/2019

Many workers, perhaps especially those raised in the U.S., would balk at the idea of taking compulsory medical exams as a condition of work. U.S. employers are not even permitted to ask a job applicant medical questions or to require a medical exam before making a job offer. Outside the U.S., however, employers are often required to give their employees medical exams. Japan, for example, requires employers to give workers a medical exam upon hiring and additional exams at least once a year thereafter.