Sunday 17 March 2013

Racism In Hollywood? What’s Happening To African Americans? Published by @Lanredavies

Even in 2013, racism is a major issue, so it’s no surprise that Hollywood – the business that calls the shunned “BLACK-listed” – still puts color first. According to comedian Steve Harvey, “Hollywood is more racist than America is…And television should look entirely different.” In my opinion, it’s because of how most African Americans are portrayed: as if the only way to make it in the Hollywood industry is to be cast on a reality show, get involved (physically) with an already established actor and/or musician, or become a rapper. Let’s get past this. Scroll down to see what else veteran Steve Harvey had to say, then give me your opinion on the subject.
