What % of black adults get insurance at work? What % of black kids get it through a parent's job? Is that racist?
Probably nearly 100% with jobs.
Sorry, I don't have the statistics, so I can't really answer that portion of your question.
But many Americans are fortunate enough to work for companies that provide health insurance. At those companies, it isn't only the black employees who receive coverage; as long as you qualify (for example, by working a certain number of hours), you are eligible.
But just because a job offers insurance, you are not required to accept it. The employer's plan may not work to one's benefit, so some people choose to purchase insurance on their own.
There are some smaller companies that can't afford to provide insurance to their employees, but I'm pretty sure white people work for those companies too.
Does that answer your question?
If these people work at jobs that offer insurance, 100% of them would be covered if they chose that option. You can lead a horse to water, but you can't make it drink.
Since when did a job have to provide insurance? It's a benefit, not a right.