
Knowing What Happened To The Indians

Makes me wish America had never been colonized. The Native Americans had such a wealth of culture and spirituality, and they never chose to fight the colonists until the colonists came to take the land for themselves and craft the evils of technology, metal, and railroads. The lands of the Native Americans were once beautiful and free, but the colonists forged them into a land of austerity where nature is noticed only by outcasts who have lost their way, since few others have the time to appreciate such things. The fire in the colonists' hearts was roused just as blindly as that of the Germans in their pursuit of the Third Reich; they wanted a place for their desires and did not take into consideration the values of men born on a different path. The tremendous stampede went nearly unthwarted, and the American feud quickly cast the Natives as a villainous, blood-hungry people when all they wanted was to ensure that they could at least keep their existence!

Every person with this lost people's blood in their veins seems to carry a humble and benign nature that is only angered when their most important possessions are threatened. Even in their spirits, carried on through the bloodline, anyone should see that these were people forced into undying war, and just for being different. These true people were eradicated along with the values of the whole continent, and their existence now is one of much discord, isolation, and resignation.
FunnyLookingCorpse 18-21 Jul 11, 2012
