Imperialism
- jonahsaclausa
- Feb 8, 2016
- 1 min read

By definition, the word "imperialism" means the policy of extending a country's power and influence through diplomacy or military force. In Hawaii's case, I don't think imperialism through military force was really a factor in the takeover, because even back then, force wasn't needed to get what people wanted. In my opinion, those involved must have felt that military force wasn't the answer to getting what they believed was necessary to have.