Why is Football So Important in America?
January 5, 2017
It’s that time of year when the NFL is the main topic at the dinner table during the holidays. That being said, why is football so important in America but not in other countries? The fact that we religiously celebrate this sport while other countries barely give it a thought shows just how uniquely American football really is. Many people believe that football derived from the English sport of rugby, and the modern game largely did; however, the ancient Greeks played similar ball games centuries earlier, with virtually no rules at all.
Football took hold as America’s most popular and important sport in the late 1800s and has long since been used to express American values, passions and national identity. If the popularity of the sport is not an indication of its importance, then the paychecks of its participants are. Professional football players earn more than people who serve our country because they are mainstream entertainment, and people pay to see talent. While college football players don’t get paid to play, most of the time they attend the college of their choice on a scholarship, usually earned through their talent for the game. I don’t think football provides importance or significance to our lives so much as we marvel at it for entertainment.
America is home to many talents, and football just so happens to be the most important one to us. We celebrate the sport and even treat some of its competitions, games and events as personal holidays. We take pride in our favorite sport, and we show it through the games we attend and the endless news about our winning teams. America is truly the mother of all things football.