Monday, July 11, 2011

Has wrestling lost its significance in mainstream American wrestling companies?

Since about 1998, mainstream pro wrestling companies in the US have pushed the actual wrestling to the back of their minds and the dramatics to the front. WWE is the most notable example: it kept a strong focus on in-ring wrestling until 1998 or 1999, when it shifted its emphasis to theatrics. This philosophy of entertainment before wrestling continues in WWE today and will more than likely never go away, since most fans enjoy it that way. WCW started to lose its focus on actual wrestling near the end, then piled poor writing on top of it, which led to its demise. TNA was amazing until it caught the "sports entertainment" bug in 2009 and has slowly drifted downwards since, though it may yet try to make wrestling its main focus again. So I ask: has wrestling lost its significance? If so, do you think it could gain it back?
