In my opinion, the film "The Birth of a Nation," coming out this early in the film industry's history, set the tone and the stereotypes, or standards, for other directors and writers, allowing them to display racism as if there were nothing wrong with it. I have been trying to think of a specific movie, but honestly the truth is that we see this every day in the media and on TV. People of color were never given major roles, and when they were, they played a caretaker, a prisoner, or someone aggressive. If you were a white kid back then, you probably weren't around many Black people, if any at all, so if the only time you saw them in action was from the perspective of a racist society, you would feel that this was normal and believe you had no reason to doubt a portrayal that wasn't real. When African Americans created films that tried to kill some of those untrue stereotypes, white audiences didn't flock to see them, so the message never got across; the audience for those films was made up of people who already understood it. The actual film "The Birth of a Nation" starts off immediately degrading African Americans: it shows them "coming to America" and being bought and sold, which implies that whites are superior to Blacks.