My mom f***ed my DH and now I'm pregnant by her son.
So as most people know, next year you will have to have health insurance or receive a fine from the government. I've seen that a lot of people are upset by this. Now, everyone knows that, at least in the majority of states, you have to have car insurance. It's the law. And if/when you are caught driving without insurance, you will get a ticket and usually a substantial fine.
So my question is, why is mandating health insurance so different from mandating car insurance? If you don't have health insurance and you go to the ER, you usually can't pay your ER bill, which leads to higher insurance costs for everyone else. It's the same with car insurance: if you have no coverage and you get in a wreck with another person, it raises car insurance costs for everyone too.
(Changed the title so you drama whores would come in here lol)