Now, I don't care to discuss the alleged complaints American Indians have against this country. I believe, with good reason, the most unsympathetic Hollywood portrayal of Indians and what they did to the white man. They had no right to a country merely because they were born here and then acted like savages. The white man did not conquer this country.
"[The Native Americans] didn't have any rights to the land and there was no reason for anyone to grant them rights which they had not conceived and were not using.... What was it they were fighting for, if they opposed white men on this continent? For their wish to continue a primitive existence, their 'right' to keep part of the earth untouched, unused and not even as property, just keep everybody out so that you will live practically like an animal, or maybe a few caves above it. Any white person who brought the element of civilization had the right to take over this continent."

Source: Q&A session following her Address to the Graduating Class of the United States Military Academy at West Point, New York, March 6, 1974.