What are you babbling about? The US was like 90+% white till fairly recently. Same for England (actually, that may still be the case). India, Japan, Hong Kong, all have had thriving movie cultures with nary a white person in sight.

It was 100 years of 99% white characters, or ethnic characters being played by white actors. It’s not even been 10 years of this inclusion casting, some of it for the better and some of it forced. You’ll survive.
Get out of your bubble. Movies reflect the cultures that make them. Hollywood isn't, and never has been, 99% white. The question is: are they making movies NOW that actually reflect society? I'd argue that no, they are not. They are amplifying fringe groups at the expense of the actual majority population, and that disconnect often shows up at the box office.