Film: An American Tradition

Hollywood is an American tradition. From the era of classic silent films to the action-packed CGI thrillers of the 21st century, movies have always captured Americans' hearts. During times of war, social unrest, or economic upheaval, movies have helped people make it through. They can provide necessary social commentary, and the ability to laugh, cry, and sometimes just forget. Alfred Hitchcock even once said: "The only way to get rid of my fears is to make films about them."

By the early 1920s, Hollywood was already becoming world famous, producing movies that are now considered classic films. Soon after, Hollywood began to be known as the home of famous actors and the center of the film industry, a reputation it has carried through the years. Even today, Hollywood is still world-renowned for its celebrities and glitzy inhabitants. People line up in droves outside movie theatres to catch the latest Hollywood blockbuster, hoping it becomes the newest classic film. Every year, thousands of people move there hoping to make it big and realize their greatest dreams. The movie industry is as American as apple pie, baseball, and democracy.

Not even Charlie Chaplin himself, one of the earliest and most famous movie stars, could predict how big movies would become. He once said: "Movies are a fad. Audiences really want to see live actors on stage." Today, however, we face the opposite crisis: live acting is vanishing, and film is quickly replacing it.

People need movies. When life gets difficult, sometimes there needs to be an escape, and art has always served as an outlet for the masses. Classic movies can do exactly this. Film crosses all sorts of boundaries, uniting many different kinds of people. At times, it serves as an excuse to forget our problems for a while. At other times, it helps us love and grow together.

The world-famous actress Ingrid Bergman once said: "No form of art goes beyond ordinary consciousness as film does, straight to our emotions, deep into the twilight of the soul." This held true then and still holds true today. Americans love film and will continue to love it as an American art form.

Since the beginning of the film industry, people everywhere have fallen in love with classic movies that teach us how to live and love. Film has become an American art form, with Hollywood at its epicenter. The lore surrounding Hollywood and the film industry continues to grow every year. As long as Hollywood produces classic films, people will still watch and love them.
