
Gamespy: Inside the new Xbox (speculative?)

1.) There's not much info in this Gamespy article... it's a new article, but mostly old info.

2.) Not much detail.

3.) Some things are probably wrong. Not that I know or anything (I don't).


Want greater detail, even if the info is outdated and not 100% accurate?

here you go:
http://forums.xbox-scene.com/index.php?showtopic=231928

Jun 23 2004, 12:49 AM

Xenon Hardware Overview

By Pete Isensee, Development Lead, Xbox Advanced Technology Group

This documentation is an early release of the final documentation, which may be changed substantially prior to final commercial release, and is confidential and proprietary information of MS Corporation. It is disclosed pursuant to a nondisclosure agreement between the recipient and MS.
“Xenon” is the code name for the successor to the Xbox® game console from MS. Xenon is expected to launch in 2005. This white paper is designed to provide a brief overview of the primary hardware features of the console from a game developer’s standpoint.

Caveats
In some cases, sizes, speeds, and other details of the Xenon console have not been finalized. Values not yet finalized are identified with a “+” sign, indicating that the numbers may be larger than indicated here. At the time of this writing, the final console is many months from entering production. Based on our experience with Xbox, it’s likely that some of this information will change slightly for the final console.

For additional information on various hardware components, see the other relevant white papers.

Hardware Goals
Xenon was designed with the following goals in mind:

•Focus on innovation in silicon, particularly features that game developers need. Although all Xenon hardware components are technologically advanced, the hardware engineering effort has concentrated on digital performance in the CPU and GPU.

•Maximize general purpose processing performance rather than fixed-function hardware. This focus on general purpose processing puts the power into the Xenon software libraries and tools. Rather than being hamstrung by particular hardware designs, software libraries can support the latest and most efficient techniques.

•Eliminate the performance issues of the past. On Xbox, the primary bottlenecks were memory and CPU bandwidth. Xenon does not have these limitations.

Basic Hardware Specifications

Xenon is powered by a 3.5+ GHz IBM PowerPC processor and a 500+ MHz ATI graphics processor. Xenon has 256+ MB of unified memory. Xenon runs a custom operating system based on MS® Windows NT®, similar to the Xbox operating system. The graphics interface is a superset of MS® Direct3D® version 9.0.
CPU

The Xenon CPU is a custom processor based on PowerPC technology. The CPU includes three independent processors (cores) on a single die. Each core runs at 3.5+ GHz. The Xenon CPU can issue two instructions per clock cycle per core. At peak performance, Xenon can issue 21 billion instructions per second.

The Xenon CPU was designed by IBM in close consultation with the Xbox team, leading to a number of revolutionary additions, including a dot product instruction for extremely fast vector math and custom security features built directly into the silicon to prevent piracy and hacking.

Each core has two simultaneous multithreading (SMT) hardware threads, for a total of six hardware threads available to games. Not only does the Xenon CPU include the standard set of PowerPC integer and floating-point registers (one set per hardware thread), the Xenon CPU also includes 128 vector (VMX) registers per hardware thread. This astounding number of registers can drastically improve the speed of common mathematical operations.

Each of the three cores includes a 32-KB L1 instruction cache and a 32-KB L1 data cache. The three cores share a 1-MB L2 cache. The L2 cache can be locked down in segments to improve performance. The L2 cache also has the very unusual feature of being directly readable from the GPU, which allows the GPU to consume geometry and texture data from L2 and main memory simultaneously.
Xenon CPU instructions are exposed to games through compiler intrinsics, allowing developers to access the power of the chip using C language notation.
GPU

The Xenon GPU is a custom 500+ MHz graphics processor from ATI. The shader core has 48 Arithmetic Logic Units (ALUs) that can execute 64 simultaneous threads on groups of 64 vertices or pixels. ALUs are automatically and dynamically assigned to either pixel or vertex processing depending on load. The ALUs can each perform one vector and one scalar operation per clock cycle, for a total of 96 shader operations per clock cycle. Texture loads can be done in parallel to ALU operations. At peak performance, the GPU can issue 48 billion shader operations per second.

The GPU has a peak pixel fill rate of 4+ gigapixels/sec (16 gigasamples/sec with 4× antialiasing). The peak vertex rate is 500+ million vertices/sec. The peak triangle rate is 500+ million triangles/sec. The interesting point about all of these values is that they’re not just theoretical—they are attainable with nontrivial shaders.

Xenon is designed for high-definition output. Included directly on the GPU die is 10+ MB of fast embedded dynamic RAM (EDRAM). A 720p frame buffer fits very nicely here. Larger frame buffers are also possible because of hardware-accelerated partitioning and predicated rendering that has little cost other than additional vertex processing. Along with the extremely fast EDRAM, the GPU also includes hardware instructions for alpha blending, z-test, and antialiasing.

The Xenon graphics architecture is a unique design that implements a superset of Direct3D version 9.0. It includes a number of important extensions, including additional compressed texture formats and a flexible tessellation engine. Xenon not only supports high-level shading language (HLSL) model 3.0 for vertex and pixel shaders but also includes advanced shader features well beyond model 3.0. For instance, shaders use 32-bit IEEE floating-point math throughout. Vertex shaders can fetch from textures, and pixel shaders can fetch from vertex streams. Xenon shaders also have the unique ability to directly access main memory, allowing techniques that have never before been possible.

As with Xbox, Xenon will support precompiled push buffers (“command buffers” in Xenon terminology), but to a much greater extent than the Xbox console does. The Xbox team is exposing and documenting the command buffer format so that games are able to harness the GPU much more effectively.

In addition to an extremely powerful GPU, Xenon also includes a very high-quality resize filter. This filter allows consumers to choose whatever output mode they desire. Xenon automatically scales the game’s output buffer to the consumer-chosen resolution.

Memory and Bandwidth
Xenon has 256+ MB of unified memory, equally accessible to both the GPU and CPU. The main memory controller resides on the GPU (the same as in the Xbox architecture). It has 22.4+ GB/sec aggregate bandwidth to RAM, distributed between reads and writes. Aggregate means that the bandwidth may be used for all reading or all writing or any combination of the two. Translated into game performance, the GPU can consume a 512×512×32-bpp texture in only 47 microseconds.

The front side bus (FSB) bandwidth peak is 10.8 GB/sec for reads and 10.8 GB/sec for writes, over 20 times faster than for Xbox. Note that the 22.4+ GB/sec main memory bandwidth is shared between the CPU and GPU. If, for example, the CPU is using 2 GB/sec for reading and 1 GB/sec for writing on the FSB, the GPU has 19.4+ GB/sec available for accessing RAM.

Eight pixels (where each pixel is color plus z = 8 bytes) can be sent to the EDRAM every GPU clock cycle, for an EDRAM write bandwidth of 32 GB/sec. Each of these pixels can be expanded through multisampling to 4 samples, for up to 32 multisampled pixel samples per clock cycle. With alpha blending, z-test, and z-write enabled, this is equivalent to having 256 GB/sec of effective bandwidth! The important thing is that frame buffer bandwidth will never slow down the Xenon GPU.

Audio
The Xenon CPU is a superb processor for audio, particularly with its massive mathematical horsepower and vector register set. The Xenon CPU can process and encode hundreds of audio channels with sophisticated per-voice and global effects, all while using a fraction of the power of a single CPU core.

The Xenon system south bridge also contains a key hardware component for audio—XMA decompression. XMA is the native Xenon compressed audio format, based on the WMA Pro architecture. XMA provides sound quality higher than ADPCM at even better compression ratios, typically 6:1–12:1. The south bridge contains a full silicon implementation of the XMA decompression algorithm, including support for multichannel XMA sources. XMA is processed by the south bridge into standard PCM format in RAM. All other sound processing (sample rate conversion, filtering, effects, mixing, and multispeaker encoding) happens on the Xenon CPU.

The lowest-level Xenon audio software layer is XAudio, a new API designed for optimal digital signal processing. The Xbox Audio Creation Tool (XACT) API from Xbox is also supported, along with new features such as conditional events, improved parameter control, and a more flexible 3D audio model.
Input/Output

As with Xbox, Xenon is designed to be a multiplayer console. It has built-in networking support including an Ethernet 10/100-BaseT port. It supports up to four controllers. From an audio/video standpoint, Xenon will support all the same formats as Xbox, including multiple high-definition formats up through 1080i, plus VGA output.

In order to provide greater flexibility and support a wider variety of attached devices, the Xenon console includes standard USB 2.0 ports. This feature allows the console to potentially host storage devices, cameras, microphones, and other devices.

Storage
The Xenon console is designed around a larger world view of storage than Xbox was. Games will have access to a variety of storage devices, including connected devices (memory units, USB storage) and remote devices (networked PCs, Xbox Live™). At the time of this writing, the decision to include a built-in hard disk in every Xenon console has not been made. If a hard disk is not included in every console, it will certainly be available as an integrated add-on component.

Xenon supports up to two attached memory units (MUs). MUs are connected directly to the console, not to controllers as on Xbox. The initial size of the MUs is 64 MB, although larger MUs may be available in the future. MU throughput is expected to be around 8 MB/sec for reads and 1 MB/sec for writes.

The Xenon game disc drive is a 12× DVD, with an expected outer edge throughput of 16+ MB/sec. Latency is expected to be in the neighborhood of 100 ms. The media format will be similar to Xbox, with approximately 6 GB of usable space on the disc. As on Xbox, media will be stored on a single side in two 3-GB layers.

Industrial Design
The Xenon industrial design process is well under way, but the final look of the box has not been determined. The Xenon console will be smaller than the Xbox console.
The standard Xenon controller will have a look and feel similar to the Xbox controller. The primary changes are the removal of the Black and White buttons and the addition of shoulder buttons. The triggers, thumbsticks, D-pad, and primary buttons are essentially unchanged. The controller will support vibration.

Xenon Development Kit
The Xenon development environment follows the same model as for Xbox. Game development occurs on the PC. The resulting executable image is loaded onto the Xenon development kit and remotely debugged from the PC. MS® Visual Studio® version 7.1 continues as the development environment for Xenon.

The Xenon compiler is based on a custom PowerPC back end and the latest MS® Visual C++® front end. The back end uses technology developed at MS for Windows NT on PowerPC. The Xenon software group includes a dedicated team of compiler engineers updating the compiler to support Xenon-specific CPU extensions. This team is also heavily focused on optimization work.
The Xenon development kit will include accurate DVD emulation technology to allow developers to very precisely gauge the effects of the retail console disc drive.

Miscellaneous Xenon Hardware Notes

Some additional notes:
•Xenon is a big-endian system. Both the CPU and GPU process memory in big-endian mode. Games ported from little-endian systems such as the Xbox or PC need to account for this in their game asset pipeline.

•Tapping into the power of the CPU is a daunting task. Writing multithreaded game engines is not trivial. Xenon system software is designed to take advantage of this processing power wherever possible. The Xbox Advanced Technology Group (ATG) is also exploring a variety of techniques for offloading graphics work to the CPU.

•People often ask if Xenon can be backward compatible with Xbox. Although the architecture of the two consoles is quite different, Xenon has the processing power to emulate Xbox. Whether Xenon will be backward compatible involves a variety of factors, not the least of which is the massive development and testing effort required to allow Xbox games to run on Xenon.


more recent info:
http://news.teamxbox.com/xbox/7421/Xbox-2-Patent




Xbox 2 Patent
By: César A. Berardini - "Cesar"
Jan. 4th, 2005 11:38 am
Although there were no Xbox 2 announcements at last year’s E3, a few tidbits regarding the next generation hardware were revealed by an effusive J Allard. In an interview with CVG, Allard talked about the “unbelievable amount of raw computing power” that will be available in the Xbox successor and how next generation software will take advantage of it.

Particularly, Allard described a new technique called “procedural synthesis”: special programs that create realistic bricks, trees, and other environmental objects, freeing game artists from spending long hours creating repetitive geometry.

Today, we are bringing you a detailed analysis of a recent patent application made by Microsoft, which was approved on December 30th by the United States Patent and Trademark Office.

The patent not only exposes a real-life implementation of this “procedural synthesis” technique but also corroborates some of the leaked Xbox 2 specs.

The patent relates to a “System and method for parallel execution of data generation tasks,” which in layman's terms refers to specialized hardware that uses different processing units to simultaneously perform a series of tasks. The system described in the patent can be implemented on any kind of platform, and the document mentions both the PC and Microsoft's Xbox as examples.

To our surprise, the first drawing in the patent looks almost identical to some of the leaked Xbox 2 schematics that were released last year.



The patent describes a system comprising a CPU module (102), a GPU module (104), and some additional components, but focuses mostly on the interaction between the CPU and GPU modules.

The CPU module, which basically acts as a multi-core chip, has different CPUs, and the system can assign different functions to each core. In one of the implementations, CPU 1 is a host processing unit while the other CPUs act as “geometry processing” units. The following is an example application mentioned in the document (the reference numbers are from the patent drawing):

In a typical gaming application, the host CPU 1 (108) performs the high-level tasks associated with the game, such as receiving a player's input, performing scene management, performing the computations used to simulate the physical phenomena represented by the application, performing any artificial intelligence provided by the game, and so on. The CPUs 2 to n (110, . . . 112) perform more fine-grained processing associated with a game.


This basically means that the host CPU (108) plays a role similar to what a CPU does in current consoles; it handles the game code and everything related to game physics, AI, etc. In the next generation, there will be enough computational power to handle the traditional CPU tasks and still have resources left over for other types of work. The following is the description of what, specifically, the new invention covers:

In one application, these CPUs (110, . . . 112) generate geometry data associated with one or more objects in the scene. For instance, as will be described, each of these processors may include logic for performing procedural geometry. Such logic receives input data defining tasks to be performed, and then executes such tasks to provide output geometry data (e.g., a collection of vertices). To provide merely one example, a game designer could provide procedural logic to generate geometry data associated with individual leaves on a tree. Such procedural logic would receive a relatively limited amount of information associated with such a task, such as a location of an individual leaf, a direction of any simulated wind in the scene, and so on. Based on this information, the procedural logic could generate vertices that define an individual leaf on the tree. The CPUs that perform geometry-related tasks are referred to as geometry-generating CPUs.


This last paragraph sounds a lot like the “procedural synthesis” technique that Allard discussed last year. The patent then gives other potential applications, such as the generation of higher-order surfaces, LOD processing, and even GPU commands, which means that a CPU could run code that was originally meant to be executed by the GPU.

The diagram, and the different examples described in the patent, also confirm information revealed in the Xenon white paper leaked last year that claimed that “the Xbox Advanced Technology Group (ATG) is also exploring a variety of techniques for offloading graphics work to the CPU.”

Finally, the patent continues describing the interaction between both modules and the necessary hardware and programming instructions to avoid bottlenecks and take maximum advantage of the system’s bandwidth to handle the geometry data between the CPU and GPU modules.

After reading the whole patent and analyzing the drawings, we have no doubt this new invention will be implemented in some form in the next Xbox.
 
Bacon said:
So what happens when people buy the version without the hard drive and can't play Halo 3..?

Games won't require the HDs, or most won't. As I read it, the HD will present itself as a caching opportunity for most games, but that's just a bonus, not a requirement. Maybe some games will need it, but you gotta remember the memory cards could hold a fair bit of downloadable game data that may have gone on the HD otherwise. I think the HD will be more of a convenience than a necessity for 90+% of Xenon games (maybe some MMORPGs will need the HD).

border said:
But there is no hardware to support it....so will all the audio encoding have to be done in-software?

According to gamespy, yes, at least on Xbox 360. I'm not sure how much performance that'll take.

Gotta get used to typing Xbox 360, I think. X360 is better.
 
Bacon said:
So what happens when people buy the version without the hard drive and can't play Halo 3..?

The game will still play just fine. However, if you want A) faster load times, B) downloadable content, C) custom soundtracks, and D) game saving ability, etc., you'll have to pick up some sort of add-on storage. I imagine developers will create the games so they work without a HD present, but work even better with a HD present :)
 
gofreak said:
Cheers for the info. I did a little google, and this is what I make out:

1x DVD = 1.35MB/sec

So, 12x DVD = 16.2MB/sec (Xbox 360)

1x Bluray = 4.5MB/sec

So I guess PS3 would need a 4x drive to beat Xbox 360's transfer rate..(?)

Yeah and even if a 4X bluray drive exists when PS3 goes into production, it'll be insanely expensive and untested.
 
I guess we'll know more later, but if MS is going for the entertainment hub approach, then they are going to have to make it compatible with existing products. I wouldn't think that using a USB storage device would be a huge concern for MS. The only exploits I could think of are like the game save exploit (Bond AUF). I seriously doubt MS would make that mistake twice.

Basically... I don't see how an external storage device automatically equates to a hackable machine. You're just transferring .mp3 and .wma files or game saves.
 
sol5377 said:
The game will still play just fine. However, if you want A) faster load times, B) downloadable content, C) custom soundtracks, and D) game saving ability, etc., you'll have to pick up some sort of add-on storage. I imagine developers will create the games so they work without a HD present, but work even better with a HD present :)

Um, Halo is going to be terrible if they aren't loading things in the cache.
 
"CPU - Xenon's CPU has three 3.0 GHz PowerPC cores. Each core is capable of two instructions per cycle and has an L1 cache with 32 KB for data and 32 KB for instructions. The three cores share 1 MB of L2 cache. Alpha 2 developer kits currently have two cores instead of three."

Two instructions? If we're talking about floating point instructions, that's 18 gigaflops.
Anyway, I'm not too impressed with these specs. PCs coming in 2006 will already own Xenon graphics.
 
Nerevar said:
I'm not disputing it wasn't widely used, but it was part of the standard and therefore was public knowledge. Therefore you should have known you were getting a TV that wasn't fully HD compliant. Considering the relative ease of sideconverting content from 720p to 1080i, the fact that companies didn't include the capacity to do this in their televisions (and still don't because consumers don't know any better) is retarded to me.

As I said, at the time it wasn't an issue because set-top boxes and HDTV cable boxes can easily convert it to 1080i, and there was literally nothing at the time using 720p. I guess that's the reason why some manufacturers still don't do it - simply because a conversion can be done in the converter box.

I'm interested in finding out how the newer HD sets with ATSC tuners already built-in handle it. It would make much more sense to just have a signal run 720p straight from the tuner than to include hardware in the set to convert it. I honestly don't know how they are handling it now.
 
I was hoping for some sort of largish internal scratch file in the 360 that games could use to ease loading times. I *hate* loading times.
 
bill0527 said:
I'm interested in finding out how the newer HD sets with ATSC tuners already built-in handle it. It would make much more sense to just have a signal run 720p straight from the tuner than to include hardware in the set to convert it. I honestly don't know how they are handling it now.

It's different for every set. Sony LCDs, for example, aren't true 1280 x 720 - they're like 1366 x 750 or something crazy like that. So everything that is input goes through an analog conversion. Some sets pass the digital signal directly to the screen. And the set doesn't have to have an ATSC tuner built in to do sideconversion. I have an RP LCD tv that lacks a tuner but accepts both 1080i and 720p signal and does the conversion itself.
 
Elios83 said:
"CPU - Xenon's CPU has three 3.0 GHz PowerPC cores. Each core is capable of two instructions per cycle and has an L1 cache with 32 KB for data and 32 KB for instructions. The three cores share 1 MB of L2 cache. Alpha 2 developer kits currently have two cores instead of three."

Two instructions? If we're talking about floating point instructions, that's 18 gigaflops.
Anyway, I'm not too impressed with these specs. PCs coming in 2006 will already own Xenon graphics.

Two cores, not instructions. Each core has a VMX unit, likely packing 8 flops/clock. So in the dev kits, that'd be 48 GFlops. In the final system, it'd be 72 GFlops.
 
Bacon said:
Um, Halo is going to be terrible if they aren't loading things in the cache.

Just like every single PS2 and Gamecube game (and alot of Xbox games) before it? Factor in faster drive speed and more power and I don't think load times will increase all that much for non-HD Xenon users :D
 
sol5377 said:
Just like every single PS2 and Gamecube game (and alot of Xbox games) before it? Factor in faster drive speed and more power and I don't think load times will increase all that much for non-HD Xenon users :D

Have you ever played a first party gamecube game? :P
 
gofreak said:
Two cores, not instructions. Each core has a VMX unit, likely packing 8 flops/clock. So in the dev kits, that'd be 48 GFlops. In the final system, it'd be 72 GFlops.

Ok thanks, but in the Gamespy article it is written:

"Each core is capable of two instructions per cycle".
So 6 instructions per cycle.
 
gofreak said:
Have you ever played a first party gamecube game? :P

Yes, but the VAST majority of games this gen haven't had any sort of additional cache to rely on and most have load times that are just fine. Ideally a built-in HD would be nice, but if you really think about it (I mean really analyze it from every angle) I think you'll realize that having the HD be an optional add-on that enhances games (but is not required by games) is the best approach for Microsoft.
 
Good read in this thread. I have purposely been avoiding any Xbox successor info until now. Since the GDC is in effect, I expect to start getting some real info, as opposed to rampant speculation. I guess we won't know the real deal until E3. I'm glad I decided to wait until the next gen before I started shopping for an HDTV. I will have to look into a nice Dolby Digital Plus receiver though. ;) Dolby Digital and DTS 5.1 are more than enough for games when used properly. Anyway, keep the info and educated speculation coming.
 
Blimblim said:
Mostly true, some stuff they got wrong or did not detail enough though.
Funny how today it's mostly stuff from the leaked specs or we all knew about already, and tomorrow it will be about something that will be mentionned at the GDC. I wonder if MS approved the part released today, as the one from tomorrow will most definitely be greenlighted by MS.
Ijoel: Xenon will scale to whatever resolution you ask.
I already did, pay attention
I think maybe the two "nn" in mentionned or the "greeenlighted" commment. Hmmmm two nn's ................NiNtendo .......... no ............greenlighted.................celery with every xenon.............no...... I give up
 
android said:
I think maybe the two "nn" in mentionned or the "greeenlighted" commment. Hmmmm two nn's ................NiNtendo .......... no ............greenlighted.................celery with every xenon.............no...... I give up
Yeah well English is not my 1st language, any spelling error is not intentional ;)
 
The reason behind not having an internal drive is more marketing than anything.

When people compare the price today of the XBOX, PS2, GCN they compare the retail price of the box. Yet today the PS2 and the GCN are basically useless without a memory card that costs about $20.

So not including the HD or a Memory card in the XBOX 2 allows MS to advertise the XBOX 2 at $299.99 when in reality (just like the other consoles) the real price is more. So it will be the consumers choice to pick up a Memory card for say $25 or a HD at say $50.
 
Elios83 said:
Ok thanks, but in the Gamespy article it is written:

"Each core is capable of two instructions per cycle".
So 6 instructions per cycle.

I think they're confusing that with either two threads per core, or each core issuing two instructions simultaneously. Undoubtedly each core will have a VMX unit, which is capable of 8 flops/cycle.
 
gofreak said:
Methinks there will be an "information release" (for want of a better term) re. Xbox 2 at GDC tomorrow? Specs? Specs would make sense at a show like this..

edit - and just to get it out of the way: assuming 8 flops/clock, 3Ghz tri-core CPU = 72Gflops? ;)

More accurately, each core = 1* VMX + 1*FPU units

Assuming FMADD ops,

VMX ~ 8 Flops per cycle
FPU ~ 2 Flops per cycle

1 Core ~ 10 Flops per cycle
3 Cores ~ 30 Flops per cycle

3 Cores @ 3 GHz ~ 90 GFlops
 
j^aws said:
More accurately, each core = 1* VMX + 1*FPU units

Assuming FMADD ops,

VMX ~ 8 Flops per cycle
FPU ~ 2 Flops per cycle

1 Core ~ 10 Flops per cycle
3 Cores ~ 30 Flops per cycle

3 Cores @ 3 GHz ~ 90 GFlops

But that's like including the VMX unit in Cell flops calculations ;) :D Oh, ok, fair enough :P
 
Nerevar said:
I hate to tell you, but you're wrong. The FCC has not done anything regarding mandating whether stations must broadcast in 1080i or 720p - they only adopted the ATSC standard (which includes both 1080i and 720p). The FCC approved the digital television standard on December 24, 1996, many years before you got your HDTV. In fact, ABC and Fox both broadcast in 720p (whereas CBS and NBC do 1080i). My point is that you're bitching about not being able to get HD content when you willingly and knowingly didn't get a TV capable of displaying all HD resolutions. Just pointing out that you're complaining about something which is, by all accounts, your own fault.

So Nerevar, tell me again about how forward looking you were getting a TV that displays ALL HD resolutions? How is 1080p/24 looking on your set these days? 1080p/30? What, it doesn't display 1080p at all?? Jesus, that was dumb of you! Boy are you gonna be sad when 1080p source material starts to come out!!!111!

By the way, even the newest direct-view (read: CRT) sets don't do 720p, except for a few professional monitors. When you pick CRT, you pick 1080i over 720p, for the most part. This is a part of the display technology selection process based on what you like in your image, and has nothing to do with not being "forward looking enough". As long as CRTs have the best PQ, 720p-1080i conversion will be around for a long time. I don't think CRT HD monitor owners have any reason to fear. :)
 
Okay, about this optional hard drive bullshit: Are developers going to be forced to make the game use the hard drive for caching if it's present? Because that would be really asstastic if they didn't. It would be sort of like the N64 RAM Expansion Pak. Some games had an optional high-res mode, but the vast majority didn't. Everyone who's a hardcore gamer is going to buy the hard drive anyway; they may as well just not make it optional. I demand that all games utilize the drive for caching purposes.
 
Inumaru said:
So Nerevar, tell me again about how forward looking you were getting a TV that displays ALL HD resolutions? How is 1080p/24 looking on your set these days? 1080p/30? What, it doesn't display 1080p at all?? Jesus, that was dumb of you! Boy are you gonna be sad when 1080p source material starts to come out!!!111!

1080p isn't part of the ATSC tuner standard. Sorry. Therefore, a "true" HDTV doesn't have to display 1080p content. Trust me - I've had this argument with KLee before, except from the other side. Broadcasters aren't moving to 1080p until a new standard has been adopted (and considering our current pace - I'm not too worried about it).

Inumaru said:
By the way, even the newest direct-view (read:CRT) sets don't do 720p, except for a few professional monitors. When you pick CRT, you pick 1080i over 720p, for the most part. This is a part of the display technology selection process based on what you like in your image, and has nothing to do with not being "forward looking enough". As long as CRT's have the best PQ, 720p-1080i conversion will be around for a long time. I don't think CRT HD monitor owners have any reason to fear.

I'll try using your tone for this response:
Gee, you think? So my LCD RP isn't displaying 1080 scanlines on its locked 1280x720 fixed-pixel display?
Oh, right, it's got a simple side-converting chip in there that can display that. And, what's that, you can go the other way too, so a CRT can accept a 720p signal and convert it to 1080i to display? I never realized that! Or, I'm an educated consumer who investigates this sort of stuff before he drops $1000+ on a piece of technology.
 
Error Macro said:
Okay, about this optional hard drive bullshit: Are developers going to be forced to make the game use the hard drive for caching if it's present? Because that would be really asstastic if they didn't. It would be sort of like the N64 RAM Expansion Pak. Some games had an optional high-res mode, but the vast majority didn't. Everyone who's a hardcore gamer is going to buy the hard drive anyway; they may as well just not make it optional. I demand that all games utilize the drive for caching purposes.

It's always like that with optional peripherals.
If the hard disk is optional, developers will have to code their games as if it weren't there.
 
Elios83 said:
It's always like that with optional peripherals.
If the hard disk is optional, developers will have to code their games as if it weren't there.

And despite that, when you run a game entirely from the Xbox HD it runs much faster. My point being that games can experience speed ups without explicit code. Caching doesn't have to be explicit for the developers, it can be handled automatically by the OS.
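The "caching without explicit code" idea can be sketched as a read-through cache. This is a toy model with hypothetical names — it bears no relation to how the actual Xbox OS implements its caching:

```python
import os
import shutil

class ReadThroughCache:
    """Read files from slow 'disc' storage, transparently copying them to a
    fast 'hdd' cache directory on first access. The game just calls read();
    the speed-up happens without any game-side code."""

    def __init__(self, disc_dir, cache_dir=None):
        self.disc_dir = disc_dir
        self.cache_dir = cache_dir   # None models "no hard drive installed"

    def read(self, name):
        src = os.path.join(self.disc_dir, name)
        if self.cache_dir is None:
            with open(src, "rb") as f:           # slow path: straight from disc
                return f.read()
        cached = os.path.join(self.cache_dir, name)
        if not os.path.exists(cached):
            shutil.copyfile(src, cached)         # first access fills the cache
        with open(cached, "rb") as f:            # later accesses hit the hdd
            return f.read()
```

Either way the game gets identical bytes back; whether the fast path exists depends only on what hardware is plugged in, which is exactly the "optional peripheral" behavior being discussed.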
 
Elios83 said:
Ok thanks but in the Gamespy article is written:

"Each core is capable of two instructions per cycle".
So 6 instructions per cycle.

Instructions per cycle != flops per cycle.

Each core = VMX + FPU.

Each core is dual-issue and 2-way SMT: one instruction can be issued to the VMX unit and the other to the FPU unit, on two separate threads simultaneously. See above...
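To make the instructions-vs-flops distinction concrete, here's a back-of-envelope peak calculation. The VMX width, the FMA assumption, and the 3.2 GHz clock are all illustrative guesses, not figures from the article:

```python
# Why "2 instructions per cycle" != "2 flops per cycle".
# All figures below are assumptions for illustration, not confirmed specs.
CORES = 3
VMX_FLOPS_PER_INSTR = 4 * 2   # assumed 4-wide vector fused multiply-add = 8 flops
FPU_FLOPS_PER_INSTR = 1 * 2   # assumed scalar fused multiply-add = 2 flops
CLOCK_HZ = 3.2e9              # assumed clock, not stated in the article

# Dual issue: one VMX instruction + one FPU instruction per cycle per core.
flops_per_cycle_per_core = VMX_FLOPS_PER_INSTR + FPU_FLOPS_PER_INSTR  # 10
peak_gflops = CORES * flops_per_cycle_per_core * CLOCK_HZ / 1e9

print(flops_per_cycle_per_core)  # 10 flops/cycle from just 2 instructions
print(peak_gflops)               # 96.0
```

So "6 instructions per cycle" across the chip could mean anywhere from 6 flops to dozens, depending entirely on what those instructions are — which is the point being made above.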
 
Nerevar said:
1080p isn't part of the ATSC tuner standard. Sorry. Therefore, a "true" HDTV doesn't have to display 1080p content. Trust me - I've had this argument with KLee before, except from the other side. Broadcasters aren't moving to 1080p until a new standard has been adopted (and considering our current pace - I'm not too worried about it).



I'll try using your tone for this response:
Gee, you think? So my LCD RP isn't displaying 1080 scanlines on its locked 1280x720 fixed-pixel display?
Oh, right, it's got a simple side-converting chip in there that can display that. And, what's that, you can go the other way too, so a CRT can accept a 720p signal and convert it to 1080i to display? I never realized that! Or, I'm an educated consumer who investigates this sort of stuff before he drops $1000+ on a piece of technology.

Good Lord. You're so wrong. 1080p IS part of the ATSC standard:

[image: atsc2.gif - first ATSC picture format table]

[image: atsc3.gif - second ATSC picture format table, with the 1080-line 24p and 30p entries]

Sorry if I don't "trust" you. There are the facts. Notice the '24p' and the '30p' in the second table. Your HDTV doesn't have to display 1080p to be considered an HD set, but an ATSC tuner does have to be able to decode those formats. And yes, my point is that obviously you don't have a set that displays 1080p. Almost no one does. But ESPN has publicly said they're considering moving their pre-production/camera equip to 1080p/60, and there has been some talk of HDNET carrying 1080p/24/30 in the future, so it's not like it's never going to happen.

Careful about touting yourself as having all the facts if you don't have every "i" dotted, etc. The original poster said that he had an STB that converts 720p to 1080i, as most in his situation do. If they didn't already buy a separate ATSC tuner, then they probably have a cable or satellite STB, and the sideconversion is done for them, for $5 a month. I just get a little tired of your continual rants about how great you are since your set sideconverts 1080i to 720p. Big fucking deal. It's a non-issue. Stop trying to rub people's nose in it who have HD monitors without HD tuners. The idea that they have no foresight, or made an idiotic purchase, is rubbish.
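For reference, the picture formats in those tables can be listed out in one place. This is reproduced from memory of ATSC A/53 Table 3, so double-check it against atsc.org before quoting it in an argument:

```python
# ATSC A/53 Table 3 picture formats (vertical lines -> list of (frame rate, scan)),
# reproduced from memory of the spec -- verify against atsc.org.
# "30i" is 30 interlaced frames/sec, i.e. what everyone calls "1080i/60 fields".
ATSC_TABLE_3 = {
    1080: [(24, "p"), (30, "p"), (30, "i")],
    720:  [(24, "p"), (30, "p"), (60, "p")],
    480:  [(24, "p"), (30, "p"), (60, "p"), (30, "i")],
}

def supported(lines, rate, scan):
    """True if (rate, scan) at the given line count is in the broadcast spec."""
    return (rate, scan) in ATSC_TABLE_3.get(lines, [])

print(supported(1080, 24, "p"))  # True  -- 1080p/24 IS in the standard
print(supported(1080, 60, "p"))  # False -- 1080p/60 is not (yet)
```

Which is what both sides here are half-right about: 1080p exists in the tuner spec, but only at film-friendly frame rates, not 60.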
 
Love_In_Rio wrote:
the article says the alpha 1 developer's kit has 3 PowerPC cores and the alpha 2 developer's kit has 2... so? probably the final hardware will end up having only two cores???

Titanio wrote in reply:
No, the alpha kits have 2 cores, final hardware will have 3.

They seem to have gleaned this info from developers, basically. Editors at other sites are saying it's for real, but they're annoyed at Gamespy for publishing it, because they didn't first (evidently this info has made its way around journo circles, but only Gamespy has been brave enough to stand up and publish it - I guess they think an official announcement is incoming, and thus feel safer putting it out there now).


http://www.beyond3d.com/forum/viewtopic.php?t=21038

interesting.
 
This is all fine and dandy, but I can't actually be the only one who doesn't care for specs. Can I?

Seriously, show the system and the games already. That goes to Microsoft, Sony, and Nintendo.
 
Inumaru said:
Good Lord. You're so wrong. 1080p IS part of the ATSC standard:

That's a pretty ugly graph to quote (I seem to recall Gateway of all places having a cleaner one :) ) but you're absolutely right. 1080p24 and 1080p30 are both part of the 18-format ATSC standard, and there will likely be increasing pressure in the future for 1080p60 to join that spec.

What some of the other guys here might not realize is that 1080i support on direct-view TVs isn't "because it's better", it's "because it's technologically easier". (Although even then there isn't a true 1920x1080 set on the market anyway.) Driving a true 1280x720 image is so hard for CRT technology that it makes more sense to just include a chipset to upconvert the image, and earlier/cheaper TVs settled for skipping the tuner and not supporting 720p at all. (As mentioned in various parts of this thread and other threads, most other TV technology types are just the opposite, finding 720p easier to support than 1080i.)

ESPN has publicly said they're considering moving their pre-production/camera equip to 1080p/60, and there has been some talk of HDNET carrying 1080p/24/30 in the future, so it's not like it's never going to happen.

That would be interesting for satellite folks, but bandwidth is definitely becoming an issue otherwise. Bastardized 1080i (the 1440x1080 stuff usually seen on American HD channels) and full 720p have similar bandwidth requirements, whereas progressive 1080 is a real jump. For the time being I think digital convergence and game consoles are going to have to spearhead that drive over the next ten years.
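The bandwidth claim is easy to sanity-check with raw (uncompressed) pixel rates. The 1440x1080 width below is the common anamorphic broadcast variant, used here as a stand-in for the squeezed-1080i formats being described:

```python
# Rough uncompressed pixel rates behind the bandwidth argument.
# Interlaced formats send one half-height field per 1/rate seconds, so
# 1080i moves half the pixels per second of 1080p at the same field rate.
def pixels_per_sec(w, h, rate, interlaced=False):
    per_image = w * h // (2 if interlaced else 1)
    return per_image * rate

p720  = pixels_per_sec(1280, 720, 60)                     # 55,296,000
i1080 = pixels_per_sec(1920, 1080, 60, interlaced=True)   # 62,208,000
i1080_anamorphic = pixels_per_sec(1440, 1080, 60, interlaced=True)  # 46,656,000
p1080 = pixels_per_sec(1920, 1080, 60)                    # 124,416,000

print(p1080 / i1080)  # 2.0 -- full 1080p/60 doubles the raw pixel rate
```

So 720p/60 and anamorphic 1080i really do land in the same ballpark, while 1080p/60 is a genuine doubling over full 1080i — consistent with "a real jump".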
 
Crazy moogle, the Sony Qualia 004, the Faroudja DVP1080HD/DILA-1080p combo and the JVC DLA-HD2K are all shipping products that natively display 1080p/60

Oh yeah... a 1X BRD-ROM drive is 54 Mbit/s, a 1X BRD-RE drive is 36 Mbit/s...


1080p isn't part of the ATSC tuner standard. Sorry. Therefore, a "true" HDTV doesn't have to display 1080p content. Trust me - I've had this argument with KLee before, except from the other side. Broadcasters aren't moving to 1080p until a new standard has been adopted (and considering our current pace - I'm not too worried about it).

I don't remember having any argument with you Nerevar??

You are a knowledgeable fellow who really knows his shit!!

Perhaps I was being a bitch that day... it is known to happen :D


At any rate, here are the *CURRENT* ATSC 1080 x 1920 DTV standards, straight from the horse's mouth :)

http://www.atsc.org/document_map/interfaces.htm#1920 x 1080

SMPTE 274M: 1920 x 1080 Scanning and Interface
This standard defines a family of raster-scanning systems for the representation of stationary or moving two-dimensional images sampled temporally at a constant frame rate and having an image format of 1920 x 1080 and an aspect ratio of 16:9. This standard specifies:

R'G'B' color encoding
R'G'B' analog and digital interfaces
Y'P'BP'R color encoding and analog interface
Y'C'BC'R color encoding and digital interface
An auxiliary component A may optionally accompany Y'C'BC'R; this interface is denoted Y'C'BC'RA.

SMPTE 295M: 1920 x 1080 50 Hz - Scanning and Interfaces
This standard defines a family of raster scanning systems for the representation of stationary or moving two-dimensional images sampled temporally at a constant frame rate and having an image format of 1920 x 1080 and an aspect ratio of 16:9. This standard specifies:

R'G'B' color encoding
R'G'B' analog and digital interfaces
Y'P'BP'R color encoding and analog interface
Y'C'BC'R color encoding and digital interface
An auxiliary component A may optionally accompany Y'C'BC'R; this interface is denoted Y'C'BC'RA.

SMPTE RP 211: Implementation of 24P, 25P, and 30P Segmented Frames for 1920 x 1080 Production Format
This practice defines the changes to SMPTE 274M to implement various 1920 x 1080 progressive systems in their segmented frame format: 24sF, 25sF, and 30sF. Only the changes to the appropriate clauses of SMPTE 274M are contained herein. The same clause, table, and figure numbering system, as used in SMPTE 274M, is employed in this practice.

It is news to me that 1080p 50Hz and 60Hz will be supported by the ATSC; it was always my understanding that it was:

1080i/60

1080p/24

1080p/30

Then again, ATSC standards are updatable by design so with DirecTV announcing they will support AVC H.264 in the near future, it doesn't surprise me that the ATSC is including the 1080p loophole...

At any rate, this is TRANSMITTED HDTV, unlike Masked ROM BRD, which can and will have the option to output native 1080p/60fps to a compatible display via HDMI...


Do you see this Black Sony BRD-ROM player right here?

[image: blurayplayer.jpg - prototype Sony Blu-ray player shown at CES]


That was a prototype player being displayed at CES that was playing 1080p (not "i") video output to a Qualia 004 SXRD projector (which outputs video @ 1080p/60, BTW) via HDMI....it is going to happen...

Although it is true there are few 1080p digital displays out there and even fewer (three) that will accept/display 1080p at 60fps....more are on the way from JVC, Panasonic, Samsung, Toshiba, Hitachi and others.....in fact JUST TODAY SpatiaLight announced they will supply 1080p LCOS panels to LG Electronics

So imminent is 1080p/60 that there is an entire boutique industry of video processors which upscale/deinterlace video and output it as 1080p/60:

http://www.lumagen.com/new_products.htm

http://www.gennum.com/ip/press/Gennum_Optoma_press.pdf

http://www.algolith.com/index.php?id=137&L=0

I own a CenterStage CS2 video processor...great product http://www.focusinfo.com/products/centerstage/centerstage.htm

http://www.faroudja.com/products/DVP-1010_brochure.pdf
 
PCs coming in 2006 will already own Xenon graphics.
How so? Developers will always be able to push console graphics further than PC graphics, for the simple fact that everyone has the same hardware on a console.
There will be some PC games that are similar, but I wouldn't say OWN.
 
border said:
But there is no hardware to support it....so will all the audio encoding have to be done in-software? How much of a performance hit will that be then?
I think it takes like a small fraction of one core for in excess of 100 channels. Most sound processing is not exactly an intensive task relative to the amount of CPU power this thing has.

Don't expect PS3 to have dedicated audio hardware either - with both machines having cpus that offer a ton of DSP-like power with a much better flexibility, adding custom DSPs just wouldn't make much sense.
 
Xbox 360's online stuff sounds exciting, but nothing entirely new.

However, I hope that camera will be utilized in games like Unreal Tournament. I can imagine characters not only using face mapping technology, but perhaps the ability to have the face animate with footage from the camera. Or if that's too difficult, have a tiny box over each player showing live (albeit choppy) video of the player.
 
How much you wanna bet the Xbox 360 logo has some kind of circle showing some kind of connection being made... It's Xbox 360... community!

I really hate that name.
 
Kleegamefan said:
Crazy moogle, the Sony Qualia 004, Qualia 004, the Faroudja DVP1080HD/DILA-1080p combo and the JVC DLA-HD2K are all shipping products that natively display 1080p/60

Yeah I know (we talked about this in an OT thread), but you also know that 1080p60 isn't a part of the common ATSC spec yet. The question becomes whether the currently released super-high-end sets will be kosher when the ATSC finally gets around to ratifying everything and slapping the industry with a heads-up.

It is news to me that 1080p 50Hz and 60Hz will be supported by the ATSC, it was always my understanding that it was: 60, 24, 30

50hz is a surprise to me too (is this some sort of PAL/NTSC merger guideline?) but since 24 and 30 are already in the books, and 60 is going to be forced sooner or later, why not 50, right?

That was a prototype player being displayed at CES that was playing 1080p (not "i") video output to a Qualia 004 SXRD projector (which outputs video @ 1080p/60, BTW) via HDMI....it is going to happen...

No doubt. Price versus quality is a war that will continue to go on, and we'll definitely see more 1080p60 sets, but without the content end being there... next gen probably won't support 1080p. I'm not entirely convinced 1080p is going to be set for the new disc formats, and broadcast 1080p seems out of the question for non-satellite unless some frequency shuffling happens.

If PS3/XB2 push forward on this 720p plan, expect it to get pushed even harder than 480p ever was. But 1080p will probably need a similar push another generation down the line. I'm not sold on 1080p60 rushing into the mass consumer HD sets anytime soon if it has no benefit. (I would love to be proven wrong, though, as 1080p60 is the only 1080 standard that should really be game supported).
 
The End said:
As a developer, I could see plenty of situations where I wouldn't want to use custom soundtracks (MGS3, RE4, etc)

I know lots of gamers that listen to their own music no matter what the game is. There's no harm in making custom soundtracks a TCR because gamers will listen to whatever they want anyways. Might as well provide that functionality right within the console and make those gamers happy.

But yeah, it does kinda suck for developers. Many more gamers will now probably just listen to their own music as opposed to the game soundtrack because it will be that much easier to do.
 
Nerevar said:
I'm not disputing it wasn't widely used, but it was part of the standard and therefore was public knowledge. Therefore you should have known you were getting a TV that wasn't fully HD compliant. Considering the relative ease of sideconverting content from 720p to 1080i, the fact that companies didn't include the capacity to do this in their televisions (and still don't because consumers don't know any better) is retarded to me.

Stop acting like everyone is a nerd who studies resolutions and whatnot and knows that 720p is common knowledge. I'd be surprised if even 0.001% of the population knows this.
 
WOW! The online stuff about the profiles etc. sounds absolutely amazing! MSFT is really pushing the boundaries with online. I am so glad they are too. It's like a dream come true for me. I also love how they are really attacking the software side of things... something that has needed to be done for a while now. I love the whole centralized idea: one gamertag, one profile, etc. I just hope their hardware is not too far behind the others.

I am thinking, knowing Nintendo, MSFT won't be behind Revolution in terms of hardware performance at all. Nintendo is all about efficiency and elegance and making money; I imagine they will be right in line with MSFT's Xbox360. Sony, on the other hand, sounds like they are attacking the one thing they got beat at this gen: hardware performance. This makes sense if you think about it... Sony wants to crush their competition and win at every single thing. They have the sales wrapped up, so next gen they want to be able to say they also have the best hardware.

Worst case for MSFT, I think, will be if Nintendo and Sony both put a hard drive in their systems by default, use a lot more RAM (say 512 or 1 gig instead of Xbox360's measly 256), and support Blu-Ray or HD-DVD playback. To be honest, I think Sony may just do that. Nintendo probably not. But it will be a tough sell for Xbox360 after that. I certainly would prefer the other 2 systems. I have to have a hard drive. Kind of pissed MSFT isn't putting one in by default, and I also think 256 megs of RAM is a monumental mistake that is going to completely bite them in the ass in a big way.
 
Fafalada said:
I think it takes like a small fraction of one core for in excess of 100 channels. Most sound processing is not exactly an intensive task relative to the amount of CPU power this thing has.

This isn't just normal sound mixing we're talking about - it's encoding for a proprietary audio output. Considering DD5.1 encoding is so computationally intensive and timing-critical that it's not feasible to use on any current system without a hardware encoder, I would definitely question offloading that process onto an already busy system bus and CPU, no matter how many cores it has.

If true, it will be interesting to see how the process works without throwing a wrench in the CPU pipeline like we saw back with the N64. I mean, the PS2 did something similar, but that was working through the vector units. (I readily confess that I don't know exactly how streamlined these multicore processors will be versus dedicated, independent audio hardware though.)
 
Crazymoogle said:
This isn't just normal sound mixing we're talking about - it's encoding for a proprietary audio output.
I am aware of that ;)

Considering DD5.1 encoding is so computationally intensive and timing-critical that it's not feasible to use on any current system without a hardware encoder, I would definitely question offloading that process onto an already busy system bus and CPU, no matter how many cores it has.
If current systems had extra CPU resources of that magnitude, you wouldn't need hardware encoders either. The VU0 & EE core can encode DD5.1 just fine - it just takes up the majority of its processing to do it, which makes it useless for games.
A single Xenon core is at least 10x faster than that, and you have 3 of them.
So even if something took 100% of EE time, that would only be ~3% of XCPU time.
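That back-of-envelope can be checked directly. Note the 10x-per-core speedup is Fafalada's assumption, not a measured figure:

```python
# Back-of-envelope check of the audio budget claim.
# Assumption (from the post above): each Xenon core ~10x an EE, 3 cores total.
EE_FRACTION_FOR_DD51 = 1.00   # worst case: DD5.1 encode eats an entire EE
SPEEDUP_PER_CORE = 10
CORES = 3

# The same workload as a fraction of total Xenon CPU time.
xcpu_fraction = EE_FRACTION_FOR_DD51 / (SPEEDUP_PER_CORE * CORES)

print(round(xcpu_fraction * 100, 1))  # 3.3 -- roughly "~3% of XCPU time"
```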

(I readily confess that I don't know exactly how streamlined these multicore processors will be versus dedicated, indepedent audio hardware though.)
You have the option to lock L2 cache which effectively allows the cores to work as streaming processors (ala PS3 SPUs). So the processors should be very effective at any kind of multimedia processing.
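The streaming pattern that a locked L2 region enables is essentially double buffering: process one chunk of data while the next chunk streams in behind it. A toy sketch of the control flow (purely illustrative — no real cache-locking API is modeled here):

```python
# Double-buffered streaming sketch: the pattern a locked L2 region enables.
# One buffer is processed while the next chunk "streams in" behind it.
def stream_process(source, chunk_size, process):
    out = []
    nxt = source[:chunk_size]        # "prefetch" the first chunk
    pos = chunk_size
    while nxt:
        # Swap buffers: work on cur while nxt is (conceptually) in flight.
        cur, nxt = nxt, source[pos:pos + chunk_size]
        pos += chunk_size
        out.append(process(cur))     # compute never waits on a cold fetch
    return out

print(stream_process([1, 2, 3, 4, 5], 2, sum))  # [3, 7, 5]
```

On real hardware the "next chunk" would be filled by DMA or prefetch into the locked cache region rather than a list slice, but the compute-overlaps-transfer structure is the same idea.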
 