Bernie Sanders actually said...
"The United States of America is the only major country on earth that does not guarantee health care to all people as a right."
Context
Sanders argues that universal health care should be guaranteed as a fundamental right.
January 17, 2017