Even when you are a direct employee, your employer is not required to give you health insurance in the U.S. Most jobs that pay more than a certain amount offer health insurance, but not all do -- especially if the work is seasonal.
Take permanent employment. Your friends are doing the right thing. You'll pay less in taxes yourself, you'll get unemployment and worker's comp coverage, and so on. You do need to find out, though, whether your employer will want to lower your pay if you go regular; since he'll be paying payroll taxes for you, his share of worker's comp, and so on, his cost of employing you goes up. But if it works out even, or even just close to even, once you factor in what you'd get from unemployment and such, I'd say go for it. The worker's comp coverage alone is really worth something.