Do PC gamers want games that will properly take advantage of higher end GPUs (aside from simply higher resolutions and framerate), or do they want every game ever to be able to run well on a 750Ti for the next ten years?
Yes these specs seem high, and maybe this isn't going to be a game that pushes technical boundaries. But you can't have it both ways.
Uh. Well... I think you answered your own question. And, actually, the two options aren't mutually exclusive: we can have it both ways, and in many cases we have. AC Unity doesn't seem like the kind of game that will push the boundaries of PC gaming the way Crysis did a few years ago. This doesn't look like a case of "we need better hardware to push beyond the current limits": it looks more like Ubisoft decided to dump the game on PC without even trying to refine it for the platform, because, hey, we can always compensate by throwing more money at newer hardware, right?
The thing is, PC gaming offers something that consoles simply can't: scalability. You can (or at least should be able to) tweak a game to reach the compromise between performance and visual quality that you find satisfactory. Not every PC gamer has a top-of-the-line, cutting-edge $3000 rig. Say you have a 660 Ti, still a very capable mid-to-high-tier GPU. You should be able to scale down the graphics to get better performance on your machine, just like you can crank every graphical setting to the max if you have two 980s in SLI and performance is never an issue.
Scalability is the key here. A good port should scale well enough to serve both ends of the spectrum. Within reason, of course... it's not like I expect every game to run on a Voodoo2.
As for these requirements in particular... I'll wait and see. On the one hand, not even the first AC runs great on modern PCs, despite having much more modest system requirements. Ubisoft games, and especially AC games, have always had shoddy performance. On the other hand, we've seen how "real" these requirements turn out to be in many recent games. Developers and publishers inflate their numbers to... I don't know why, honestly. Maybe to convince people to buy the console version of their games? Maybe to make their games look more advanced than they really are? I dunno. But more often than not, even the Recommended requirements are pure bullshit.