It’s scary how all of entertainment is being subsumed under one company
Whatever happened to Independent Movies? For a while it seemed like they were gonna be the next really big thing and change cinema for the better, but it seems like Indie Movies just kinda fizzled out, haven’t heard anything about them in years.
my favorite trend is how movies with a non-dystopic vision are increasingly filmed outside the U.S. while pretending to take place in the U.S. (like how Hallmark’s stand-in for small town USA is the Vancouver suburbs), while any movie needing to show a decaying shithole with rusted-out infrastructure in a sad sack world gets filmed in the US.
and I am here for it, waiting for my fellow citizens to notice we live in a dystopian set piece while they watch footage of other countries, thinking it’s our own.
dune pls