
XBOX360 supports 1080p

FiRez

Member
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423&p=2

ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.

This is a quote from the Steve Ballmer interview on Engadget.com:

Yesterday Sony unveiled the PS3, and at least on paper it seems to outclass the Xbox 360 in terms of teraflops and support for 1080p, and it’s going to have a next-generation optical drive which the 360 isn’t going to have—

No… we just haven’t announced anything yet. Sony may have announced a non-standard drive last night. We just haven’t announced anything.
 
Capabilities and support are 2 different things... 360 is certainly capable of 1080p, but that doesn't mean MS will support it. They don't feel that there is enough consumer demand for it.

No performance hit? I don't believe that for a second. That's NEVER been true...

After reading a couple tech papers... it certainly seems to be true.
 
timlot said:
Unless they plan to add HDMI or DVI it won't matter, since component cables can't handle the bandwidth.


I will personally guarantee you that Xbox 360 will support HDMI/DVI with HDCP through some HD cable pack. If they don't, I will eat my shoe (and be very shocked/disappointed).
 
I sorta took this for granted, but really, I am not going to have a 1080p TV for like, five years.

Still, makes you wonder why MS didn't simply clarify that.
 
Give me a friggin' break.

If the XBOX 360 has a pretty major graphics capability that can be enabled "with no performance penalty," why the eff isn't it standard???
 
mashoutposse said:
Give me a friggin' break.

If the XBOX 360 has a pretty major graphics capability that can be enabled "with no performance penalty," why the eff isn't it standard???

uh, it is standard.
 
dark10x said:
Could you possibly point me to any references?

http://forums.gaming-age.com/showthread.php?t=48238

Keep in mind, ATI is not a stranger to adding memory to a chipset, but remember this is “smart” memory.

The Xbox 360 Smart 3D Memory is a relatively small piece of DRAM sitting off to the side of the GPU, yet on the same substrate. The Smart 3D Memory weighs in at only 10MB. Now the first thing that you might think is, “Well what the hell good is 10MB in the world of 512MB frame buffers?” And that would be a good line of questioning. The “small” 10MB of Smart 3D memory that is currently being built by NEC will have an effective bus rate between it and the GPU of 2GHz. This is of course over 3X faster than what we see on the high end of RAM today.

Inside the Smart 3D Memory is what is referred to as a 3D Logic Unit. This is literally 192 Floating Point Unit processors inside our 10MB of RAM. This logic unit will be able to exchange data with the 10MB of RAM at an incredible rate of 2 Terabits per second. So while we do not have a lot of RAM, we have a memory unit that is extremely capable in terms of handling mass amounts of data extremely quickly. The most incredible feature that this Smart 3D Memory will deliver is “antialiasing for free” done inside the Smart 3D RAM at High Definition levels of resolution. (For more on just what HiDef specs are, you can read here.) Yes, the 10MB of Smart 3D Memory can do 4X Multisampling Antialiasing at or above 1280x720 resolution without impacting the GPU. So all of your games on Xbox 360 are not only going to be in High Definition, but all will have 4XAA applied as well.

The Smart 3D Memory can also compute Z depths, occlusion culling, and also does a very good job at figuring stencil shadows. You know, the shadows in games that will be using the DOOM 3 engine, like Quake 4 and Prey?
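Some rough arithmetic on why that 10MB number is both small and workable (my own back-of-the-envelope figures, assuming 32-bit color plus 32-bit depth/stencil per sample, nothing official):

```python
# Rough framebuffer sizes vs. the 10MB of eDRAM.
# Assumption (mine): 32-bit color + 32-bit depth/stencil = 8 bytes per sample.
EDRAM_MB = 10
BYTES_PER_SAMPLE = 8

def framebuffer_mb(width, height, msaa_samples):
    """Approximate color + depth storage for one render target, in MB."""
    return width * height * msaa_samples * BYTES_PER_SAMPLE / (1024 ** 2)

for w, h, aa in [(1280, 720, 1), (1280, 720, 4), (1920, 1080, 4)]:
    size = framebuffer_mb(w, h, aa)
    verdict = "fits" if size <= EDRAM_MB else "has to be handled in chunks"
    print(f"{w}x{h} @ {aa}xAA: ~{size:.1f} MB -> {verdict}")
```

So a plain 720p target fits outright; the 4xAA cases would need the buffer to be worked on in chunks, which is where that 2 Tbit/s internal bandwidth would earn its keep.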
 
mashoutposse said:
Give me a friggin' break.

If the XBOX 360 has a pretty major graphics capability that can be enabled "with no performance penalty," why the eff isn't it standard???

Isn't it really only referring to the 4xAA on a 1080p image and not the graphical power to create the image in the first place?
 
AnandTech:

Because of the extremely large amount of bandwidth available both between the parent and daughter die as well as between the embedded DRAM and its FPUs, multi-sample AA is essentially free at 720p and 1080p in the Xbox 360. If you're wondering why Microsoft is insisting that all games will have AA enabled, this is why.

ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.
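Rough sanity check on the "essentially free" claim, using the 2 terabits per second figure from the quote above (the per-sample size, read-modify-write traffic, and 4x overdraw are my own pessimistic assumptions):

```python
# 4xAA sample traffic at 60 fps vs. the quoted eDRAM bandwidth.
# Assumptions (mine): 8 bytes per sample (color + Z), read + write per sample, 4x overdraw.
EDRAM_GBPS = 2000 / 8            # "2 terabits per second" -> roughly 250 GB/s
BYTES_PER_SAMPLE_RMW = 8 * 2     # read-modify-write of color + Z
OVERDRAW = 4
FPS = 60

def aa_traffic_gbps(width, height, samples):
    per_frame = width * height * samples * BYTES_PER_SAMPLE_RMW * OVERDRAW
    return per_frame * FPS / 1e9

for w, h in [(1280, 720), (1920, 1080)]:
    traffic = aa_traffic_gbps(w, h, 4)
    print(f"{w}x{h} 4xAA @ 60fps: ~{traffic:.0f} GB/s "
          f"({traffic / EDRAM_GBPS:.0%} of ~{EDRAM_GBPS:.0f} GB/s)")
```

Even with deliberately generous overdraw there's an order of magnitude of headroom, which is what lets them call the AA itself free; the squeeze is fitting the AA'd buffer into 10MB, not pushing the samples around.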
 
sonycowboy said:
Isn't it really only referring to the 4xAA on a 1080p image and not the graphical power to create the image in the first place?

Yes, and I was referring to that. If AA can be done for free, it should be standard. Jedimike posted an article stating that it is in fact standard, so the point is moot anyway...
 
No… we just haven’t announced anything yet. Sony may have announced a non-standard drive last night. We just haven’t announced anything.

Microsoft is in no position to call out another company for jumping before a standard is set.
 
timlot said:
Unless they plan to add HDMI or DVI it won't matter, since component cables can't handle the bandwidth.
That's not true. In fact, it's the opposite. Digital cables are limited in terms of bandwidth to their digital specification. Analog cables don't have the same kind of limits. People have been running 2048x1536 over VGA cables for quite some time. A VGA cable is nothing more than a component cable with HDB15 connectors on the end. Good component cables will go to 2K x 2K without any problem, if not even higher.

A type-A HDMI port corresponds to a single-link DVI port. It can't handle resolutions much over 1600x1200 at 60 Hz. You need a dual-link port for that (either dual-link DVI or else type-B HDMI, which nobody supports). Perhaps you can do it by combining the PS3's dual HDMI type-A ports.
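For anyone who wants the arithmetic behind that single-link limit, here's a rough pixel-clock check (the 165 MHz single-link ceiling is the DVI/HDMI spec figure; the ~20% blanking overhead is my own round number):

```python
# Rough single-link pixel-clock check.
# 165 MHz is the single-link TMDS ceiling; ~20% blanking overhead is a rough assumption.
SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.20

def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Very rough pixel clock estimate, active pixels plus blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1280, 720), (1920, 1080), (1600, 1200), (2048, 1536)]:
    clk = approx_pixel_clock_mhz(w, h, 60)
    verdict = "single-link is fine" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {verdict}")
```

By that rough math, 1080p at 60 Hz itself actually squeaks under the single-link ceiling; it's resolutions like the 2048x1536-over-VGA example above that genuinely need dual-link.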
 
No 1080p or HDMI or DVI

ET: All Xbox 360 games have to support 720p at a minimum but will have standard definition modes as well, some games are supporting 1080i. What about 1080p? Some new displays can handle 1080p…

TH: We have developed a box that supports all the devices that are out on the market right now. And we'll continue to look if there are other things that are being developed, we'll continue to consider those things.

ET: So, to be clear, yes it can do 1080p, or no it cannot?

TH: It does not support 1080p. It supports all of the TV sets that are out on the market right now. All the sets that people are using to play games right now.

ET: What about DVI or HDMI? Those are popular connections on HDTVs these days.

TH: We believe that we have the right set of outputs right now to meet the requirements of people who have HDTV sets today. We're continuing to look at what's going on in the future, and as things change, we've developed a very flexible system, so we can adapt to the different demands that are out there.

http://www.extremetech.com/article2/0,1558,1817031,00.asp
 
TH: It supports all of the TV sets that are out on the market right now. All the sets that people are using to play games right now.

In other words, they only listed the standard features. 1080p support in HDTVs is very uncommon right now, but the Xbox 360 supports it.
 
anyone see this?

http://www.theinquirer.net/?article=23325

NVIDIA COULD not contain all the information about G70 forever and we finally found a way to learn some more about it.

It turns out that the G70, also known as the Geforce series 7, is going to be a very similar chip to the just announced Playstation 3 chip. We wrote about this here. Nvidia is still keeping quiet about the final spec of the chip but we know it's a 90 nanometre chip and that it's likely that it will have at least 10MB of cache memory. ATI's Xbox 360 chip has exactly the same 10MB as you need to have enough buffer to render HDTV picture quality games. That's what the new consoles are all about.

We also learned that the G70 is getting ready and that Nvidia is just a month or so away from releasing the chip. We expect that the G70 won't have 10MB of cache like its Playstation chip brother, as this is just too expensive to make at this point. ATI also has two versions of the next generation chip - one called R520 for graphics cards and a second, codenamed R500, which is the Xbox 360 chip, very similar to the R520 but powered with 10MB of eDRAM cache memory.

The Playstation 3 chip runs at 550MHz and this can give you at least some idea about Nvidia's G70 clocks.

Stay tuned, as we believe that we can dig out even more information about this chip here at the show. µ
 
ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.

http://www.firingsquad.com/features/xbox_360_interview/default.asp

FiringSquad: You said earlier that EDRAM gives you AA for free. Is that 2xAA or 4x?

ATI: Both, and I would encourage all developers to use 4x FSAA. Well I should say there’s a slight penalty, but it’s not what you’d normally associate with 4x multisample AA. We’re at 95-99% efficiency, so it doesn’t degrade it much is what I should say, so I would encourage developers to use it. You’d be crazy not to do it.


FiringSquad: Microsoft has announced 1080i support, but are there any plans to add support for 1080p?

ATI: I think 720p and 1080i are the sweet spot that developers are going for and that’s what we’re going to see in the next few years, for the next five years really as the main resolutions. It will be a while before 1080p becomes standard. I think 720p would be the best to go for, and 1080i is supported as well of course. So hopefully developers will be doing, or at least the best would be 720p, 4xAA. You’d get a terrific image there.

Interesting...
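To put that "95-99% efficiency" in frame-time terms (trivial arithmetic, and my own framing of what the percentage means, not ATI's):

```python
# What a 1-5% efficiency loss means in milliseconds per frame (my own framing of the figure).
for fps in (30, 60):
    frame_ms = 1000.0 / fps
    for efficiency in (0.95, 0.99):
        penalty_ms = frame_ms * (1.0 - efficiency)
        print(f"{fps} fps ({frame_ms:.1f} ms/frame): "
              f"{(1 - efficiency):.0%} penalty ~= {penalty_ms:.2f} ms")
```

Somewhere between a fifth of a millisecond and under two milliseconds per frame, which squares with "you'd be crazy not to do it."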
 
FiRez said:
In other words, they only listed the standard features. 1080p support in HDTVs is very uncommon right now, but the Xbox 360 supports it.

...the eff? Right, and I mean, RIGHT before that statement, he says in no uncertain terms that "It does not support 1080p."
 
accord 4 me said:
anyone see this?

http://www.theinquirer.net/?article=23325

NVIDIA COULD not contain all the information about G70 forever and we finally found a way to learn some more about it.

It turns out that the G70, also known as the Geforce series 7, is going to be a very similar chip to the just announced Playstation 3 chip. We wrote about this here. Nvidia is still keeping quiet about the final spec of the chip but we know it's a 90 nanometre chip and that it's likely that it will have at least 10MB of cache memory. ATI's Xbox 360 chip has exactly the same 10MB as you need to have enough buffer to render HDTV picture quality games. That's what the new consoles are all about.

We also learned that the G70 is getting ready and that Nvidia is just a month or so away from releasing the chip. We expect that the G70 won't have 10MB of cache like its Playstation chip brother, as this is just too expensive to make at this point. ATI also has two versions of the next generation chip - one called R520 for graphics cards and a second, codenamed R500, which is the Xbox 360 chip, very similar to the R520 but powered with 10MB of eDRAM cache memory.

The Playstation 3 chip runs at 550MHz and this can give you at least some idea about Nvidia's G70 clocks.

Stay tuned, as we believe that we can dig out even more information about this chip here at the show. µ

Already proven wrong. PS3's RSX has no eDRAM...period. Too bad...so sad. ;)

And I was gonna post the same ATI thing too. That was discussed on B3D yesterday. It doesn't seem like it will support 1080p, and it's even more questionable with 4xMSAA. That said...who really cares? That's only gonna matter to the 99th percentile anyway. PEACE.
 
Pimpwerx said:
Already proven wrong. PS3's RSX has no eDRAM...period. Too bad...so sad. ;)

And I was gonna post the same ATI thing too. That was discussed on B3D yesterday. It doesn't seem like it will support 1080p, and it's even more questionable with 4xMSAA. That said...who really cares? That's only gonna matter to the 99th percentile anyway. PEACE.

That's what I figured, just thought that I would throw that out there because I couldn't remember seeing this in the specs. Of course, it's not like they couldn't still do it.
 
The ATI part can do 1080p, just not the final MS-designed output chip. I suppose that, unless they are already manufacturing the chip, it could be changed to allow for it, as it currently only outputs/scales to 1080i at most.
 
I think it's kind of bad for M$ to make 720p the standard. For one, most current HDTVs can't display 720p natively (fixed-pixel HDTVs with 720p are the fastest-growing segment of HDTV sales, but they aren't the highest selling - and obviously the majority of HDTVs sold previously can't do it).

Most HDTVs convert 720p to 1080i. While new TVs probably do this well, there's still the potential to introduce artifacts, especially in some of the older displays. This isn't really a big concern, but 1080i does look higher res. Maybe not as stable, but it does have higher apparent res.

If Sony is planning on doing games internally at 1080p, and then outputting at 1080i - they will have some more detail than 720p games (and it would be noticeable).

Does anyone remember how the DC's super-sampling worked? Was it 480p internally, and then 'blurred' to get cheap AA when output at 480i via output hardware? If that's the case, maybe Sony will be doing something similar with 1080i? That would give them higher apparent resolution, plus some cheap AA. It might not be as good as 4xMSAA, but will it actually be that noticeable at 1080? 1080 should be enough resolution to hide most aliasing to begin with, and if PS3 can pull off what the DC does, it might just hide the rest? A little bit of anti-aliasing + higher res could be an advantage over Xbox 360 output?

Any have any comments on this?


[edited for marginally better grammar]
 
The difference between 720p on a native 720p display versus a 720p signal being upconverted to 1080i is pretty much non-existent, IMO. Granted, I've only seen it run that way a few times. Technically, the difference between the native resolution of the display and the resolution of the signal before conversion should produce a bit more detail. Still, the scaling chip inside the X360 does the conversion on its own if you choose to output 1080i from a native internal resolution of 720p, supposedly.

There's also no reason that devs could not create all of their assets for 1080i/p and then have the conversion process take it from there. As I understand it, the embedded DRAM on the Xenos can deal with framebuffers that exceed 720p by splitting the data up and handling it internally, piece by piece. The only thing that looks like it's stopping anything on a comparative scale with RSX is that its video output can put out 1080p, while the MS-designed display chip/scaler in the X360 would need to be altered to allow for output at 1080p.
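Rough numbers on that piece-by-piece handling, reusing the 10MB figure and assuming 32-bit color plus 32-bit Z per sample (my arithmetic, not anything ATI has published):

```python
import math

# How many eDRAM-sized chunks an AA'd framebuffer would have to be split into.
# Assumptions (mine): 10MB of eDRAM, 32-bit color + 32-bit Z/stencil per sample.
EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 8

def chunks_needed(width, height, msaa_samples):
    buffer_bytes = width * height * msaa_samples * BYTES_PER_SAMPLE
    return math.ceil(buffer_bytes / EDRAM_BYTES)

for w, h, aa in [(1280, 720, 4), (1920, 1080, 4)]:
    print(f"{w}x{h} @ {aa}xAA: ~{chunks_needed(w, h, aa)} chunks of eDRAM work")
```

So 1080p with 4xAA would mean more than twice as many chunks as 720p with 4xAA, with whatever per-chunk overhead that implies.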
 
How many TVs out there can do 1080p?

How many people own such TVs?

Is 1080p that much a jump over 720p or 1080i?

Is there any point to talking about 1080p other than fanboyism?
 
For this generation, it's a good sacrifice to make.

There IS a difference between the resolutions, most graphics whores can see it. But it's an inferiority that the 360 has to live with. It's really not gonna make or break the console...but I just find it hilarious that the "HD is the future" crap Allard has been saying all along has been shoved back up his youknowwhat.
 
Tekky said:
How many TVs out there can do 1080p?

How many people own such TVs?
It's been announced that the Xbox 2 will support VGA output... I know of many monitors that support 1080p. I'm looking at a 24" widescreen monitor now for $500 that supports it... I know you won't get the big-screen effect, but the picture quality can't be beat!
 
MightyHedgehog,

The difference between 720p on a native 720p display versus a 720p signal being upconverted to 1080i is pretty much non-existent, IMO. Granted, I've only seen it run that way a few times.

I haven't seen enough examples either, so I can't really make a determination. The following will detail some of my worries though.

Technically, the difference between the native resolution of the display and the resolution of the signal before conversion should produce a bit more detail. Still, the scaling chip inside the X360 does the conversion on its own if you choose to output 1080i from a native internal resolution of 720p, supposedly.

In the case of converting from a higher native resolution to a lower display one, you get some anti-aliasing (I think this is super-sampling). The output hardware does interpolation to approximate the picture. This basically softens (blurs) the picture, thereby reducing aliasing. I think it's a relatively inexpensive computation, and is what the DC did? Normally you would want to go from a resolution that's higher in both directions, but I think the DC was 480p to 480i, and therefore only done horizontally? Also, I think the DC had special output hardware to do this, it wasn't using the GPU for this?

In the case of going from a lower native resolution to a higher output resolution, however (which is what the XBOX 360 would be doing when going from 720p to 1080i either internally or in the display itself), things get a little tricky. In this case, the scaler has to interpolate (guess) new pixel values that were never actually rendered in order to fill in the extra resolution. I'm under the impression that there are many ways to do this, with varying results. At the minimum, I believe it's more computationally expensive, and more likely to result in artifacts (mostly aliasing). I don't know if the 360 GPU does this conversion itself, or if there is dedicated hardware (I assume)... but that brings up a few potential problems. Either there is potentially more aliasing, or if there is no dedicated output hardware, more aliasing and extra resource use. No matter what, the level of detail is going to be less than native 1080p/i.

In the case of PS3, there would be higher actual detail, less aliasing (due to the higher res, and lack of conversion artifacts), and less extra resource usage? Also, if I'm correct about the DC's horizontal super-sampling and PS3 does the same, some cheap AA.
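To make the "guessing pixels" point concrete, here's a toy sketch of a simple linear upscale of the row count; real scalers use better filters, but the principle is the same (the code is purely illustrative, not how any actual scaler chip works):

```python
# Toy vertical upscale: blend neighbouring source rows to manufacture extra output rows.
# Purely illustrative; real scaler chips use fancier filters, but the idea is the same.
def upscale_rows(rows, out_count):
    src_count = len(rows)
    out = []
    for i in range(out_count):
        pos = i * (src_count - 1) / (out_count - 1)  # map output row into source coords
        lo = int(pos)
        hi = min(lo + 1, src_count - 1)
        frac = pos - lo
        out.append([(1 - frac) * a + frac * b for a, b in zip(rows[lo], rows[hi])])
    return out

# Four one-pixel-wide "scanlines" stretched to six: the new rows are just blends.
print(upscale_rows([[0], [10], [20], [30]], 6))
# [[0.0], [6.0], [12.0], [18.0], [24.0], [30.0]]
```

The "new" rows are nothing but blends of what was already there; no detail appears that wasn't rendered, which is exactly the gap versus a native 1080-line image.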

There's also no reason that devs could not create all of their assets for 1080i/p and then have the conversion process take it from there. As I understand it, the embedded DRAM on the Xenos can deal with framebuffers that exceed 720p by splitting the data up and handling it internally, piece by piece.

While they could, it would probably be the exception - not the rule, since it's not mandated, and would be more expensive due to multiple passes for frame rendering, and then extra conversion? Granted, the downconversion might not matter since it's handled by dedicated output hardware, I assume, but the extra passes don't sound all that promising.

The only thing that looks like it's stopping anything on a comparative scale with RSX is that its video output can put out 1080p, while the MS-designed display chip/scaler in the X360 would need to be altered to allow for output at 1080p.

I think my replies above cover my worries.
 
As far as I understand, and that's not a lot :lol, the image scaler/video output chip is free of resource drain on the rest of the system...it's free, being a fixed-function chip that does nothing but scale and create the final output signal for the display. Whether devs support creating content and software-specific support for 1080i/p is up to them, yes. That's also true of devs on the PS3, as far as I understand. 1080p isn't mandatory, or am I wrong? If it is, and the system can't handle the bandwidth to display all of that, it's gonna mean lessened potential effects and even less framerate consistency, depending on the developer, I'd think. If the scaler is of high enough quality, the output to 1080p (if it were possible via the MS-designed output chip/scaler) wouldn't be all that bad, IMO. Developers will, at some point, have to consider the fact that the vast majority of the HDTVs out there, and those being sold currently, are only capable of 480i/p and 1080i...so they'd hopefully do some work to tweak their visuals for the best image quality. I'm not concerned about AA...well, maybe on the PS3, since we haven't heard anything concrete about the GPU and its AA solutions. X360 should be more than fine, though.
 
Funny seeing all the usual suspects bashing 360 cause it "can't" do 1080p, when most likely none of you have a set even capable of it.
 
Funny seeing all the usual suspects bashing 360 cause it "can't" do 1080p, when most likely none of you have a set even capable of it.

Not sure if I'm one of the usual suspects, but I'm not concerned with whether it can or can't output 1080p (though I do plan to have a 1080p set within the next 2-3 years). I'm just wondering whether it's likely they will render stuff internally at 1080p or 1080i. 1080p would be preferable, as it skips a number of issues seen to some extent on early PS2 games, but the brunt of my concern is with lower detail levels and extra conversion artifacts when going from 720p to 1080i (versus native 1080i or p).
 
PanopticBlue said:
Funny seeing all the usual suspects bashing 360 cause it "can't" do 1080p, when most likely none of you have a set even capable of it.


In fact, I'd bet that no one that posts here has one.
 
As far as I understand, and that's not a lot, the image scaler/video output chip is free of resource drain on the rest of the system...it's free, being a fixed-function chip that does nothing but scale and create the final output signal for the display.

Yeah, that's what I suspected. I'm not really familiar with the overall architecture of most consoles, but I thought that's how the DC and PS2 were.

Whether devs support creating content and software-specific support for 1080i/p is up to them, yes. That's also true of devs on the PS3, as far as I understand. 1080p isn't mandatory, or am I wrong? If it is, and the system can't handle the bandwidth to display all of that, it's gonna mean lessened potential effects and even less framerate consistency, depending on the developer, I'd think.

I could be wrong, but I thought there were some comments that all PS3 games would be 1080p.

From the descriptions of the RSX I've read, there appears to be a fill-rate advantage over Xenos (that's the name of the 360 GPU, isn't it?), so I'm not sure if it's necessarily an issue. It appears that it was designed with 1080p (actually * 2) in mind.

If the scaler is of high enough quality, the output to 1080p (if it were possible via the MS-designed output chip/scaler) wouldn't be all that bad, IMO.

In the home theater and A/V market, quality scalers to 1080p are few and far between, at least ones at reasonable prices. Though I'm not sure of the details of 720p to 1080p, the issue is with ones going from 480i/p to 1080p.

Regardless, I'm not all that concerned about actual output of 1080p; I'm concerned with the lower detail and the upscaling artifacts when upconverting from 720p versus native 1080i/p.

Developers will, at some point, have to consider the fact that the vast majority of the HDTVs out there, and those being sold currently, are only capable of 480i/p and 1080i...so they'd hopefully do some work to tweak their visuals for the best image quality. I'm not concerned about AA...well, maybe on the PS3, since we haven't heard anything concrete about the GPU and its AA solutions. X360 should be more than fine, though.

I would assume the AA Microsoft has talked about is applied before output scaling, so some aliasing could still happen when upconverting to 1080i, but hopefully it won't really be noticeable. The lower detail may be, however.

I agree that developers may end up moving to internal 1080p at some point; I just hope the extra processing for the multiple passes doesn't take too much of a toll.

It will be interesting to see what Sony's solution (if any) is for AA. Will it do horizontal super-sampling ala DC? Something more that uses extra resources? Something built in we don't know about? Nothing?
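If it did work like the DC, I'd picture something like this toy sketch: render wider than the output and average horizontal neighbours on the way out (my guess at the scheme, not anything Sony has described):

```python
# Toy horizontal super-sampling: render 2x wide, average adjacent pixel pairs on output.
# A guess at a DC-style scheme, not anything Sony has actually described.
def downsample_row_2x(row):
    assert len(row) % 2 == 0, "render width must be twice the output width"
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

# A hard vertical edge rendered at 2x width (0 = dark, 100 = bright)...
wide_row = [0, 0, 0, 100, 100, 100, 100, 100]
# ...comes out with the edge softened by one blended pixel: cheap horizontal-only AA.
print(downsample_row_2x(wide_row))  # [0.0, 50.0, 100.0, 100.0]
```

Whether the PS3 actually does that, proper multisampling, or nothing at all is exactly the open question.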
 
I saw something about PS3 doing 1080p for all games too, but to me that seems excessive at this point in the game. Granted, 1080p might be fairly common at the end of next gen, but at this point the vast majority of gamers are still using RCA cables for a 480i output. I expect this to continue for most of next generation as well. The jump to 720p should be enough to keep the masses happy until next-next gen starts to rear its ugly head. 1080p support in all games would be nice, but not if it affects the overall graphical quality of effects or framerate in-game for the other 90% of people that still won't have 1080p sets by the end of next generation.
 