The fall TV season is arriving and the Emmys are just around the corner. Both will likely provide the media with new opportunities to preach to ordinary Americans about how diverse Hollywood is. But what those claims won’t tell you is that Hollywood is one of the most racist, sexist places to work.
Entertainment Tonight has already bragged about just how diverse and progressive the 2017 Emmy nominations list is. “In total, 25 people of color were nominated across the Emmy’s 18 onscreen acting awards,” Stacy Lambe wrote in July. The fall TV season is also an annual reminder of Hollywood’s influence. Several new shows are slated to target President Trump and conservatives, while others, like Will and Grace, will promote identity politics.
But those shows won’t admit one essential fact: for all of Hollywood’s attempts to tell viewers how to live, sexism, racism, and bigotry in all their forms run rampant throughout the industry. So why should the audience allow stars any moral high ground?
On a regular basis, the media lauds Hollywood and entertainment as an industry that promotes values. Politicians, celebrities, and journalists alike use it as a pulpit to preach from. George Clooney, while accepting his Oscar in 2006, praised Hollywood for being “out of touch” because the industry promoted civil rights and LGBT rights when no one else did.
It seems that Hollywood is indeed “out of touch,” but not in a good way. Between the #OscarsSoWhite controversy over the lack of diversity in the Academy, which comedian Chris Rock skewered, the constant allegations of sexual harassment made by actresses, and the insane racism and sexism that actors are subjected to, Hollywood needs help. On September 1, 2017, actress Chloe Bennet complained that Hollywood “was racist” because it forced her to change her name to hide her Asian heritage. In 2009, Nicole Kidman testified before Congress that Hollywood “contribut[ed] to violence against women by portraying them as sex objects.”
No matter how many complaints celebrities, politicians, and moguls level at Hollywood, TV shows and movies are still used as a bully pulpit to preach to the public.
In a speech given at DreamWorks studios in 2013, President Obama addressed the entertainment industry: “We have shaped a world culture through you. And the stories that we tell transmit values and ideals about tolerance and diversity, overcoming adversity, and creativity, that are part of our DNA. As a consequence of what you’ve done, you help shape the world culture.”
Obama was far from alone. Former Vice President Joe Biden argued that the television series Will and Grace helped people understand what it meant to be homosexual. Even CNN’s Don Lemon, while accepting his GLAAD award in 2017, praised the entertainment industry for promoting equality and the LGBTQ community.
The New York Times praised Hollywood in 2009 for “helping images of black popular life emerge … and creating public spaces in which we could glimpse who we are and what we might become.” By allowing African-Americans to be part of the “rarified summit of the Hollywood A-list,” the Times argued, Hollywood in fact gave us an African-American president in real life.
In 2006, while giving an acceptance speech at the Oscars, actor George Clooney raved about Hollywood: “I would say that we are a little bit out of touch in Hollywood every once in a while. I think it’s probably a good thing. We’re the ones who talk about AIDS when it was just being whispered. And we talked about civil rights when it wasn’t really popular. This Academy, this group of people, gave Hattie McDaniel an Oscar in 1939 when blacks were still sitting in the backs of theaters.”
Yet many actors and actresses, along with singers, critics, and other Hollywood celebrities, have spoken out against the very business that adopted them. They have complained that Hollywood is one of the most racist, sexist, homophobic, and violent industries in the world.
Of the 50 top-billed actors in Hollywood on Box Office Mojo, 17 (34%) have repeatedly spoken out against the major sins of their industry. Eight of those actors were African-American, and six were women.
Actors may talk about how much the industry has changed toward people of color, yet complaints that Hollywood remains racist keep coming.
Actor Kal Penn launched an explosive tweet storm against Hollywood in March 2017, accusing directors of asking him to put on an “authentic Indian accent” (even though he was born in America) and calling the directors of Sabrina the Teenage Witch “such dicks.”
He also said, “Thank God things have changed since then.” Really? Have they? Three months later, in June 2017, Variety reported that eight television series with “non-white and/or non-male leads or ensemble casts” had been cancelled. Maureen Ryan wrote that what was troubling about the industry was its tendency to abandon attempts at increasing diversity, with most people willing to “very easily and even reflexively turn a blind eye to the return of the status quo.”
And despite the #OscarsSoWhite controversy, the Los Angeles Times reported in 2016 that Oscar voters were, in fact, still 91% white and 76% male.
In June 2017, BuzzFeed argued that nothing, absolutely nothing, in the film industry has changed since 1957. Its reasoning? “Most big Hollywood movies still safely confine themselves” to the “straight white norm,” and praising the few mainstream films that buck that stereotype, such as Wonder Woman and Moonlight, “does more harm than good.”
Think it’s better now, at least for women in the entertainment industry? Think again. Both on screen and in the business itself, women are subjected to harassment and violence, repeatedly.
BET has been sued this year for firing a female programming executive while she was on medical leave for breast cancer. Does intersectionality exist in the entertainment industry? Apparently not. According to the lawsuit, the executive was subjected to a culture, fostered by the male leaders at both BET and Viacom, “in which women are systematically discriminated against, harassed, intimidated, attacked, and exploited.”
To add to this list of wrongs, self-styled ‘feminist’ director Joss Whedon was recently exposed by his ex-wife as someone who used his feminist reputation as cover for multiple affairs with his actresses, all behind his unknowing wife’s back.
The recent smash hit It completely cut Stephen King’s child orgy scene, which appears in the original novel. Why Hollywood continues to give the story a pass when the book contains a graphic, underage gang rape is mystifying, to say the least.
In 2010, Jessica Alba walked out of a screening of her own film, The Killer Inside Me, in which a main character is shown “punching and kicking two female characters, played by Alba and Kate Hudson.” At the same screening, angry audience members “confronted” director Michael Winterbottom, “demanding to know why it was necessary to show such gratuitous violence toward women.”
The portrayal of women in film has been bad enough that someone testified to Congress about it. In 2009, actress Nicole Kidman appeared before Congress to “accuse Hollywood of contributing to violence against women by portraying them as sex objects.” This isn’t the 1940s, when The Fountainhead could give a positive portrayal of rape and no one responded. This is the 21st century. If you believe the media, we as a country are supposed to be above all that. And yet movies and television still stoop to pandering with scenes depicting violence, rape, and sexual assault against women.
Teen Vogue just published a piece on September 13 questioning why the movie industry keeps releasing films about white male serial killers. The line-up included Ted Bundy, who “sexually assaulted and murdered at least 36 women and girls”; Zac Efron will star as Bundy.
The Guardian summed up the situation perfectly in 2016, when it said, “When women are targeted for violence, that violence is overwhelmingly sexual.”
Hollywood doesn’t just glorify violence in fiction; it has a tendency to excuse rape and assault in real life. Casey Affleck won the 2017 Oscar for best actor even though multiple women accused him in 2010 of “sexually terrorizing female colleagues.” Acclaimed director Roman Polanski actually pled guilty to rape 40 years ago and fled to Europe to avoid sentencing. And yet he received a standing ovation at the Oscars, and Whoopi Goldberg told everyone it wasn’t “rape-rape.”
In July 2017, actress Zoe Kazan told a story of being sexually harassed on film sets. She said, “There’s no HR department … We have our union, but no one ever resorts to that, because you don’t want to get a reputation for being difficult.” She also had a producer ask her “if I spat or swallowed.” Unacceptable and unprofessional behavior is overlooked in Hollywood.
Feminists constantly target Hollywood for its blatant, unchecked sexism as well. Hollywood is one of the most sexist industries in America.
In an interview with Marie Claire, Natalie Portman admitted, “Compared to men, in most professions, women make 80 cents to the dollar. In Hollywood, we are making 30 cents to the dollar.” In May 2017, actress Jessica Chastain spoke out against the representation of women at the Cannes film festival. She said the experience was “quite disturbing” and that, contrary to the films presented at Cannes, “women have their own point of view.”
Actress Charlize Theron also recently criticized the industry for “rarely allowing women to command big-budget films.”
This isn’t just the criticism of a few individuals. The Huffington Post reported in 2016 that while 50% of the graduating classes at film and art schools were female, those graduates rarely make it to Hollywood or the entertainment industry, because “[a]n overwhelming majority of film studio heads were white males, and the same was true for senior management.”
The issue was so serious that in 2015 the ACLU called for a federal investigation into why female directors are almost never hired in Hollywood. As Melissa Goodman of the ACLU said, the statistics “alone suggest there’s a real problem here.”
In July 2017, the Media, Diversity, and Social Change Initiative at the University of Southern California’s Annenberg School for Communication and Journalism released a study on the representation of women and minorities in film. “Of the 4,583 speaking characters analyzed from 2016’s top 100 films, 31.4 percent were female, a number that is basically unchanged since 2007,” it reported.
While Obama and others praise Hollywood for teaching the world about “America’s values,” what they fail to realize is that Hollywood is not teaching -- or living -- the right values. Instead, the entertainment industry is consistently promoting a culture of racism, sexism, and violence against women.