Quote:
Originally Posted by dksuddeth
that would simply mean that what you are saying is 'white people rule the world, the rest just have to follow'.
Slavery didn't mean that those people didn't have natural rights, just that they were being denied them at the time.
dk, you're looking at history through the lens of present-day belief and morality. For slavers, slavery was entirely moral; it wasn't until people began to say otherwise that it was looked at any other way. It really WAS TRUE that white people ruled the world and the rest just had to follow. And for a while there, the sun never set on the British Empire. And then the world changed and made that NOT TRUE ANYMORE.
Prior to the creation of the United States, the subjugation of the lower classes by upper classes, nobles, and kings was moral, proper, and divinely ordained. Through the philosophical perspective our founders gave us, the one we live in now, that looks exploitative and horrible, but THEN AND THERE, it was simply how things were.
To now say, "Our founders were in touch with something that had always been there through tens of thousands of years of human history but that somehow nobody noticed until Jefferson sat down in 1776 and started writing" is just silly. Doesn't it make more sense to say that our founders created a new view of the relationship between government and the public? If you look at history, isn't that more or less what actually happened?
EDIT: I just want to add that this is one of the most interesting conversations I've had in TP in recent times, and I sincerely thank everyone engaged in it for the real thinking it's prompting.