I'd like to ask ALL liberals/Democrats a question. Where does it say in the Constitution or the Bill of Rights that "health care" is or should be a right? If you think it is, you are a really disturbed individual. I OWE you nothing! I DON'T need to pay for YOUR health care, just like you don't need to pay for mine. I work and pay for my own. So is it equal if I make 100K a year and pay through the nose, while you don't work and get the same health care I do? My business is NOT paying for you.