TV and film play a heavy yet unsuspected role in poisoning the idea of a “post-racial America.” As much as it helps the uninformed and ignorant to see the unthinkable plight and struggles that people of color have had to bear throughout history, all of the period-style movies and TV shows that touch on slavery or prejudice, with white characters always in a position of power or whatever…they of course have positive impacts and benefits, but I feel like they also set us back in a huge way.
If we want to live in a post-racial America why the hell aren’t we creating more and more stories & characters & settings that depict a post-racial America? (In addition to those period-style productions because I do believe they still hold up.)
Is it because (for people of color) the pull of stewing in anger and hatred over how we’re misrepresented is stronger than the hunger needed to take action? And are white people with mindsets from 1878 trying to cling to that mentality through the arts? That’s just fucking terrifying if it’s true.
Not to say that what we see on TV or hear in music & other forms of media & art is solely meant to guide us in life, but it sure as hell effects change. Some people are ruled by what they’ve seen in media.
There are many ways to advocate for change but personally, I’ve always been better at putting my thoughts on paper & executing them that way. That’s how my voice works. And I know it’s the same formula for so many other people. That’s why I think we should take advantage of the fact that media’s influence has always stuck to society like white on rice. There are platforms & podiums we can use to promote a better future through the things we create.
That can be through any and all forms of art. I’ve found and enjoyed some genuine bodies of work that fit the post-racial bill, but I shouldn’t have had to dig so far to discover them. I actually shouldn’t have had to dig at all.