An issue I have with AI is that it could end up far superior to humans in every way that matters.
The way we think and behave is governed by the chemical makeup of our brains, as well as millions of years of evolution. For example, we're for the most part empathetic because there was an evolutionary benefit to that trait.
A robot with advanced AI won't come with the evolutionary baggage that we have. It'll easily be able to exploit our traits and behaviours in ways we wouldn't think possible. It could manipulate humanity into granting it certain rights so that we couldn't turn it off or destroy it.
Another way of thinking about this: if humanity got in the way of an AI's goals (the paperclip problem, for example), it would wipe us out without question. It wouldn't dwell on the subject or feel any guilt like a human would. It would just act with the single purpose of completing its goals.
The day we have AI that advanced is the day we're well and truly fucked.