Bernie Sanders actually said...
Healthcare in America, in my view, must be a human right for every man, woman, and child in this country, and not simply an opportunity for billionaire investors to make huge profits.
Context
Sanders frames healthcare as a fundamental human right and criticizes the role of investor profit in the American healthcare system.
09/12/2024