I’m so sick of movies today. Why does every movie have to encourage bad behavior? It’s like every movie has a couple who are confused, cheat on each other, and pretend that’s love.
I’m disgusted.
Because nobody in Hollywood has ever been in love. They just sexually harass each other until they find someone they don’t mind being sexually harassed by on a regular basis, then get married for the prestige, and then get divorced when they realize they hate each other more than they like making money off paparazzi photos.