Best Documentaries About Hollywood
For decades, filmmakers have punctured the myths and celebrated the legacies of Hollywood's Dream Factory. Documentarians have chronicled the history of the studio system, told the stories of industry trailblazers, and reflected on the art of creating with shadows and light.
Here are noteworthy documentaries that reveal essential truths about Hollywood.
