Do companies ever make money off of the health insurance they offer?

It's open enrollment at my husband's work, and the health insurance plan we were looking at is FOUR times more expensive than the exact same plan we can get on the open market. It's gotta be some kind of scam, doesn't it? Like the company gets huge kickbacks from it or something?

Anyone else seen anything like this?


Asked by Erica_Smerica at 5:39 PM on Oct. 26, 2010 in Money & Work

This question is closed.
Answers (4)
  • Not sure about that one. The company I was working for offered it for about $550.00 every two weeks for just me!! When I decided I would still enroll, they proceeded to tell me it was not available in my area. I told them fine, I would travel to the nearest doctor, and they told me no, I could not do that. I left the company after that. Sucks, doesn't it?

    Answer by m-avi at 8:14 PM on Oct. 26, 2010

  • No, not legally.

    Answer by rkoloms at 5:57 PM on Oct. 26, 2010

  • Usually the company contributes to it.

    Answer by mompam at 7:02 PM on Oct. 26, 2010

  • Not that I'm aware of. Usually the company pays a certain percentage of the insurance plan and the employee pays the rest.

    Answer by HotMama330 at 11:56 AM on Oct. 27, 2010
