Bernie Sanders actually said...

The United States of America is the only major country on earth that does not guarantee health care to all people as a right.

Context

Sanders argues for universal health care as a fundamental right.

Date: 01/17/2017

Source: https://www.congress.gov...
