Does Hollywood Understand Relationships?
Today we are going to discuss Hollywood and how it portrays relationships. Romantic comedies have been a staple of the movies since the industry first began. Because of this, much of what romance-themed movies portray has become ingrained in our society. The story of two destined lovers meeting and falling in love is a powerful one that resonates deeply within us. That is why these stories will always be popular.

That being said, is what is portrayed in the movies accurate or misleading? Are the messages we are constantly bombarded with helping or hurting our mindset when we think about romantic relationships? Does Hollywood truly understand relationships? I believe the answer depends on which era you look at.

It is my belief that at one time Hollywood had a very good understanding of relationships and how they work, but at some point it lost its way. Hollywood used to accurately portray how men and women fall in love and properly court each other, but has since forgotten those principles. It used to accurately portray the most important factor in determining the success of a relationship, but has now gotten it backwards. It used to accurately portray the roles and behaviors that sustain and balance a healthy relationship, but has now blurred the lines.

Over the next few weeks I will look at the missteps that have led Hollywood astray from what it once portrayed so accurately: the truth about romantic relationships and how they work in the real world.

Until next time.