A recent survey finds most Americans want employers to pay health insurance costs for workers.
Most Americans believe employers should play a role in providing health-care coverage for their workers, a recent survey shows.
Of more than 3,500 adults surveyed nationwide, 81 percent said employers should pay all or part of their employees' health insurance costs, according to the Commonwealth Fund, a New York-based health-care policy group.
Smaller employers typically cite rising employee health-care costs as their biggest business concern, and many have responded by seeking to reduce their role in providing coverage.
Broken down by political party affiliation, support for employer-provided health-care coverage stood at 88 percent among Democrats, 73 percent among Republicans, and 79 percent among independents.