1. The overall plot is overdone and boring. AI machines gaining sentience is probably the most common trope in sci-fi. It's been done in Ex Machina, Terminator, 2001, BBC's Humans, and countless other movies and books.
At this point you could say that about just about anything that's made. And those examples you gave are all great creative works in their own right. I don't care if they cover the same theme of AIs gaining sentience; I can enjoy each unique execution of the same concept without any diminishment. I feel the same about Westworld's potential.
2. There is no main character. The lead female is not a human and is just at the precipice of gaining sentience. There is no identifying with her. How do you identify with a sim? The people in the "real world" are just a supporting cast so far. The only "guest" that gets any decent screen time is basically a troll who likes to trap his sims in a closed room and starve them to death.
Why does the main character have to be a human? It seems like the point of this show will be that we will identify more and more with the hosts as the show progresses. She starts off as this idealistic character that chooses to see the beauty in the world rather than the ugliness, but we know for damn sure her world view is about to change. That makes for interesting character development.
3. There is no context for the setting of the "real world". This show takes place so far in the future that the "real world" people are completely unrelatable. This is in contrast to 2001, Humans, et al. In those works of fiction the setting is the near future, so we can compare and contrast our current morals with those of the characters.
Wasn't your earlier point that the existence of those other works makes this one redundant? Yet when this show does something different, you call that a flaw too. I don't really understand what you're trying to say.
This problem will hopefully be fixed in future episodes.
Well... yeah, it's only the first episode. It gives you a taste of what's happening, and teases more to come. That's what a good premiere should do, right?
4. The actual story beats of the episode were boring. Ed Harris murdering some Macy's mannequins did nothing for me. They replace them on the next reboot. There is no permanence to this and therefore no reason I should care. If it's meant to establish Ed Harris as a bastard, well, maybe his friends all beat real slaves and he's instead a good guy for only being a bastard in this fake environment with no consequences.
I don't know, I still found it pretty confronting to see that kind of cruelty in a person. Yes, things are rebooted, yes they are just 'robots', but does that make his crimes any less shocking? That's the point they're trying to make. What kind of person seeks that kind of thing out? All in the name of finding some deeper layer to a 'game'?
5. The setting of the "fake world" is like some kind of silly tourist trap. Why would anyone pay to go here? If you want to visit a tourist western town you can go to any number of real towns in the west. If you want to virtually rape sims, surely in this super-advanced future you can find somewhere more interesting to do so. This is probably a silly complaint, but I just don't like this Back to the Future 3 style of western town setting.
Again, I think it's intentional and that's kind of the point. They literally call it a 'park' i.e. a theme park. It's meant to be stereotypical and kind of like a video game, where you can feel immersed in the world, have total freedom to do what you want, but not suffer from any real consequences. It's literally a super-advanced theme park for tourists.
A sci-fi Pinocchio theme would be fairly original.
Uh... really? Not that I'm against it, but I don't think that concept is particularly original, certainly no more than the AI gaining sentience trope you cited.
Everything in the "real world" was more interesting to me, when it should probably be the other way around.
You can like any part of the show you want to. Why should it be a particular way? I enjoyed seeing both worlds.
Is it? Those moral questions are sorta the entire point of the show. At what point does something stop being a disposable tool?
Exactly. This is part of why I love media that deals with these issues. It forces us to ask these kinds of questions. At what point is a robot human enough that it should have rights similar to a human's? How do we define consciousness? Do we have free will? Aren't humans (and all other living things) really just incomprehensibly complex and intricate robots? Where do we draw the line for our empathy toward autonomous beings?
These are questions we need to start asking, because one day (who knows when exactly), we're going to need answers to them.