Just read a book last night, and it reminded me of something I learned in school but forgot somewhere along the way.
After the War of Independence, the Americans formed a new kind of society. The foundation was the belief that the state is formed by the people. So first there were people. Then they agreed to form a society. (Sounds obvious, but at that time it was not!)
The individuals had certain rights before they joined the new state. These rights are listed in the Bill of Rights. They were not granted by a government, nor can they be taken away by one. The only role of the government is to protect those individual rights. If the government fails at that, the people have the right to fire it. I think this is even one reason why the right to bear arms is considered so important in the US.
Did I get it right?