ATI reveals Longhorn tidbits in presentation slides.

marsomega

Member



Although it isn't really ATI's role to do so, the graphics chip maker revealed some information about Microsoft's next operating system at a press conference in Munich. ATI told us that the retail version of Longhorn will ship with Windows Graphics Foundation 1.0 (WGF), a successor to the current DirectX 9.0 that is required to display the Aero Express, Aero Glass and Diamond desktop themes. Under Longhorn, the Windows desktop will be transformed into a 3D environment. Longhorn should also include WGF 2.0, the equivalent of DirectX 10, whose specifications ATI and Microsoft are developing together. Of course, NVIDIA also says it is working with Microsoft on the WGF 2.0 specifications. ATI also told us that the first version of Longhorn should only require support for the Basic Driver Model, while the Advanced Driver Model would be introduced with Longhorn's Service Pack 1. It would seem that the Advanced Driver Model, whose details are still unclear, requires new hardware to be supported. ATI says it will provide Longhorn-compatible drivers as soon as Microsoft launches the operating system.


:lol Sell your Geforce cards it's all over. :lol

(just having fun... :) )

http://www.clubic.com/actualite-17464-quand-ati-nous-en-apprend-plus-sur-longhorn.html


EDIT
For those of you doubters, here's Sun's Project Looking Glass. Their next generation desktop.

http://wwws.sun.com/software/looking_glass/index.html
http://wwws.sun.com/software/looking_glass/demo.html <==== must see.

EDIT EDIT EDIT

I'm pretty much sold on this. While everyone has compelling arguments, I think the reason I'm still sold on it is that I've dealt with programming GUIs in C++ and the MFC classes. Also, the pitch I heard wasn't that it's all going to be in cool 3D; the pitch that was explained to me first, and that sold me, was that everything would be moving toward the 3D cards, making the whole Windows GUI DirectX and ditching the old way of programming the GUI. The new hardware 3D aspects of it are really just a bonus on top of working with something much easier and less confusing than the Windows GUI.


I think mostly everyone here overlooked the programming aspect and went straight for the visual aspect of it. If there's one thing that Java showed everyone, it's that you don't need a bastardly complex GUI system like Windows' to design an intuitive, graphically pleasing GUI. Programming a GUI in VC++ was like pulling teeth with rusted pliers; you could almost taste the mix of rust and blood in your mouth. On the other hand, Java showed that even a beginner in an introductory Java course can create a nice-looking GUI for their simple programs. It's one of the most powerful aspects of Java, and the only way it could be the same for Windows would be to rip the whole Win GUI system out.
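For what it's worth, here's roughly the kind of thing that intro-course claim refers to: a complete Swing GUI in a couple dozen lines, no message pumps or window classes. (Class and widget names here are my own illustration, not from any particular course.)

```java
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;
import java.awt.BorderLayout;

// A minimal Swing GUI: a label and a button, wired with one click handler.
// Compare this with the WNDCLASS/RegisterClass/WndProc boilerplate the
// equivalent Win32/MFC version needs.
public class HelloGui {
    public static JPanel buildPanel() {
        JPanel panel = new JPanel(new BorderLayout());
        JLabel label = new JLabel("Hello, world");
        JButton button = new JButton("Click me");
        // One line to attach behavior; no message loop to write.
        button.addActionListener(e -> label.setText("Clicked!"));
        panel.add(label, BorderLayout.CENTER);
        panel.add(button, BorderLayout.SOUTH);
        return panel;
    }

    public static void main(String[] args) {
        JPanel panel = buildPanel();
        System.out.println(panel.getComponentCount()); // label + button
        // In a real app you'd show it in a window:
        // SwingUtilities.invokeLater(() -> {
        //     JFrame f = new JFrame("Demo");
        //     f.add(panel); f.pack(); f.setVisible(true);
        // });
    }
}
```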

I see WGF 1.0 not as a breakthrough in how you use Windows; it's more of a breakthrough in how you design and program GUIs. OpenGL was the homebrew favorite, as it's pretty easy to learn (once you get past programming a Windows "window" for it, which is bastardly difficult enough that most people skip that part and just copy some generic window code). DirectX, on the other hand, has evolved to where we have homebrew programmers making their own 3D visuals (see Humus at Beyond3D). What if you extended that to the Windows GUI? In fact, since DirectX is so much more accessible than the Windows GUI (by buttloads), why not just add the option to bypass the Windows GUI completely? Hell, why not rip it out, bury it, and pretend it never happened?
 
They're killing the DirectX name? :o

and holy crap at that sidebar! They have it tilted with the image :o
 
If longhorn has a 3D gui, wouldn't that be hard on graphic cards? I mean, the video card would have to be rendering 3D all the time, that can't be good.
 
Pimpbaa said:
If longhorn has a 3D gui, wouldn't that be hard on graphic cards? I mean, the video card would have to be rendering 3D all the time, that can't be good.

Buy stock in GPU cooling technology companies :) But seriously, it's not going to be a big deal.
 
Can anyone explain why an OS really needs some intensive 3D rendering horsepower?
Milhouse31 said:
You didn't know they were killing DirectX to go with WGF?
DirectX --> Xbox

Launching in Fall 2005...

WGFBox
 
Pimpbaa said:
If longhorn has a 3D gui, wouldn't that be hard on graphic cards? I mean, the video card would have to be rendering 3D all the time, that can't be good.

The amount of rendering necessary will be FAR less than what is required for even today's games. And what else does your graphics card do while you're browsing the internet and all that stuff? It's just making use of an idle resource to make things look spiffier; Macs have been doing it for a while now.


And yes, pretty much all of us are going to have to upgrade to totally new computers when Longhorn comes out.

One thing I find extremely interesting is the close relationship between ATI and MS. It seems like ATI's work on Xenon is giving them MAJOR leverage when it comes to Windows as well which could make nVidia totally obsolete, but also provide a true homogeneous environment between Xenon and the PC.
 
Damn, I didn't know anything about this WGF stuff either.
I'm pretty excited about Longhorn, but I'm sure MS will find ways to fuck it up.
 
rastex said:
One thing I find extremely interesting is the close relationship between ATI and MS. It seems like ATI's work on Xenon is giving them MAJOR leverage when it comes to Windows as well which could make nVidia totally obsolete, but also provide a true homogeneous environment between Xenon and the PC.


While I have no particular hate for nVidia, they had become exceedingly arrogant. If you've dealt with their developer relations people you'd notice that they had become almost as 'holier than thou' as 3Dfx had back when they used to dominate. I'm just glad to see them be humbled, if only for a little while.
 
Phoenix said:
When they rename all the headers, tools, and constants then I will call it something else.

All that stuff will probably stick around for a while. They'll probably clean out the old legacy functions, since pixel shaders/vertex shaders are mandatory under WGF...
 
"More useless bloat. "

actually, if they're offloading the entire GUI onto the graphics card, it should leave more of the processor free to do other things, instead of having this ugly static 2D Windows interface all done on the CPU and just handed to the graphics card to display.

It IS more bloatware, but really, it'll just be using that ]-[ardcore video processor that previously went unused, so the overhead on the CPU, IMO at least, should stay within reason.

It's like that pic of the interface viewing a picture folder. Getting the CPU to fetch all those thumbnails, texture them, and display them in 3D like that would take an insane amount of resources, but as things are going, for a graphics card that's a small matter. The cards themselves are also getting more and more access to the rest of the system, and it'll only go further down that route in the future, so I wouldn't be surprised if the gfx card could stream those thumbnails and GUI elements straight from disk all by itself, never bothering the CPU.
 
mr2mike said:
"More useless bloat. "

actually, if they're offloading the entire GUI on the graphics card, it should leave more of the processor to do other things, instead of having this ugly static 2D windows interface all done on the CPU and just left to the graphics card to display.

It IS more bloatware, but really, it'll just be using that ]-[ardcore video processor that previousely went unused, so the overhead on the CPU, IMO at least, should stay within reason.

It's like that pic of the interface viewing a picture folder. getting the CPU to fetch all those thumbnails and then texturing them and displaying them in 3D like that, it'd take an insane amount of essources, but as things are going, for a graphics card that's not only a small matter, but the cards themselves are starting to have more and more access to the rest of the system, and it'll only go further down that route in the future, so I wouldn't be surprised if the gfx card could stream those thumbnails and GUI elements straight from disk all by itself, never bothering the CPU.

Ya, when I go to search, instead of seeing some animated dog I'll see some Doom 3 demon with 8xAA/16xAF.

No thanks.
 
It is bloat, but useless?! I don't think so.


I'm dying for a Windows where all the graphical hoopla won't slow down my computer. A 3D hardware-accelerated Windows will run faster than the shit we're using now. Especially for those with wallpapers! (Kinda ironic: people turn off all the graphical stuff in WinXP but use a 1280x1024+ res wallpaper.)

I'd say bring it on! And give us some nice skinning possibilities as well, without a hacked dll. (Oh, pixel shaders on the desktop :D)
 
While I have no particular hate for nVidia, they had become exceedingly arrogant. If you've dealt with their developer relations people you'd notice that they had become almost as 'holier than thou' as 3Dfx had back when they used to dominate. I'm just glad to see them be humbled, if only for a little while.


IAWTP :)
 
This is most definitely not useless. The speed difference between rendering menus and windows with a graphics card instead of the CPU is amazing. I can't see why anyone would want to handicap their computers by rendering even simple desktop graphics using only the CPU.
 
"Ya When I goto search, instead of seeing some animated dog I will see some Doom 3 demon with 8aa/16af.

No thanks."


a Doom 3 demon WOULD be better than the damn animated dog.

But this isn't about accelerating little animated helpers, it's about offloading the entire interface off of the CPU.

Think of it this way:

This text that you're reading right now is interpreted and drawn by the CPU. The only time the video card comes into the picture is to send the completed desktop image to the screen.

On the other hand, if you've got a graphics card taking care of your frontend, the CPU would receive the text, but instead of doing anything with it, it could shoot it straight to the graphics card, which would render that text into a texture, paste that texture onto a flat polygon that would be the Internet Explorer window, and then shoot it off to the screen without bothering the processor.
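To make that idea concrete, here's a toy sketch of the "render the window into a texture, then paste it onto the desktop" step, done in plain Java with offscreen images standing in for GPU textures. All names are illustrative; this is not Longhorn's actual pipeline, just the compositing concept.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// The window's contents are rendered ONCE into an offscreen image (the
// "texture"). Composing the desktop is then just pasting that image at the
// window's position -- the step a GPU would do by texturing a flat quad,
// without re-running any of the text layout.
public class Compositor {
    static BufferedImage renderWindowTexture(String text, int w, int h) {
        BufferedImage tex = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = tex.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, w, h);        // window background
        g.setColor(Color.BLACK);
        g.drawString(text, 10, 20);    // the window's text content
        g.dispose();
        return tex;
    }

    static BufferedImage compositeDesktop(BufferedImage windowTex, int x, int y) {
        BufferedImage desktop = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = desktop.createGraphics();
        g.setColor(Color.DARK_GRAY);
        g.fillRect(0, 0, 640, 480);    // desktop background
        g.drawImage(windowTex, x, y, null); // the "texture the quad" step
        g.dispose();
        return desktop;
    }

    public static void main(String[] args) {
        BufferedImage tex = renderWindowTexture("Hello from the browser", 200, 100);
        BufferedImage desktop = compositeDesktop(tex, 50, 50);
        // A pixel inside the pasted window is the window's white background.
        System.out.println(desktop.getRGB(200, 130) == Color.WHITE.getRGB());
    }
}
```

Moving the window around only re-runs `compositeDesktop`, never `renderWindowTexture`, which is the whole point: the expensive drawing happens once, and repainting the screen is cheap paste work the GPU is built for.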
 
What if you don't own a grafx card?
 
If you don't have a GFX card, that means your rig is so old you probably have a hard time running Win 3.1. Don't try to install Longhorn.
 
I have Intel Extreme Graphics 2.
 
Milhouse31 said:
All those stuff are probably gonna stick for awhile. They will probably clean old legacy function since pixelshader/vertexshader are mandatory under WGF, ....


They may just leave it in there and simply define new functions, etc. Like in Java 2: the Vector class is obsolete and inefficient, but it's still left in there for compatibility with older software.

Hajaz said:
soooo

will my x800xt run longhorn fine?

Actually, all video cards currently on the market may be useless, though. It all depends on the difference between WGF 1.0 and WGF 2.0 with respect to the shader model. WGF 2.0 is aimed specifically at video games, while WGF 1.0 takes care of the Windows interface. We already know games on Longhorn will be using the unified shader model (SM4.0); what we don't know is whether that will apply to WGF 1.0 as well. At least I'm not sure.

If both do, then they will need a whole graphics card line, from entry-level to enthusiast, for Longhorn. This makes more sense of why NVIDIA would cancel the NV50. ATI has been very vocal about wanting a unified shader model, so working with Microsoft on the specs for WGF 1.0 and 2.0 doesn't come as a surprise. NVIDIA is completely uninterested in the technology, and word has it they've been trying to persuade Microsoft to see it their way. The cancellation of the NV50 (if true; it's not official yet) leads me to believe they gave up and are now focusing on a brand new chip and its derivatives so they'll have a whole line ready for Longhorn.

Society said:
Ya When I goto search, instead of seeing some animated dog I will see some Doom 3 demon with 8aa/16af.

No thanks.
:lol :lol :lol :lol


For those of you doubters, here's Sun's Project Looking Glass. Their next generation desktop.

http://wwws.sun.com/software/looking_glass/index.html
http://wwws.sun.com/software/looking_glass/demo.html <==== must see.
 
Sun, like anything else related to Java, can't design shit. Interesting though.

I can't wait for Longhorn. It's going to be a huge change. XAML alone has me excited.
 
The Sun thing is theoretically interesting, but in reality, that shit is so painful to look at I'd never want to use it. On the other hand, Longhorn looks gorgeous.
 
Anyone complaining about "bloat" must be a Linux user, because there's nothing on Linux to bloat it with.

All this "bloat" Microsoft is providing will benefit you in one way or another. I really don't even see the bloat, aside from the sidebar, which can be toggled on or off.
 
I remember seeing that Sun demo for the first time and being impressed. So I showed it to my g/f, who is a software engineer and a GUI designer, and she was pretty much unimpressed. Her biggest gripe was: what's the point? As she dissected it, I saw what she was saying, and yeah, I sorta feel the same. You gain almost nothing from the new Sun interface. Sure it's flashy, and maybe the notes on the back are cool, but other than that it's all useless as far as UI goes. Transparent windows? I have those already. 3D windows that you can rotate: what do you really gain? I honestly have yet to see an effective use for a 3D interface thus far. I can see the advantage of offloading stuff from the CPU to the GPU, but a lot of the actual interface itself is pretty pointless.
 
God's Hand said:
Anyone complaining about "bloat" must be a Linux user because there's nothing on the Linux to bloat it with.
Uh, you haven't visited Slashdot in the past four years. ;)
 
mr2mike said:
The sun thing is theorically interesting, but in reality, that shit is so painful to look at I'd never want to use it. On the other hand, longhorn look gorgeous.


looks very similar to longhorn to me.
 
Marty Chinn said:
Her biggest gripe was what was the point. As she disected it, I saw what she was saying and ya, I sorta feel the same. You gain almost nothing from the new Sun interface. Sure it's flashy, and maybe the notes on the back is cool, but other than that, its all useless as far as UI goes.

Exactly. After watching that Sun demo, I couldn't help but feel disgusted. It's a total cop-out on their part: "Here's a great new OS in the works, now all you hackers out there can make something cool for us!" And they don't bring any ideas or better ways to interact to the table. 3D is all fine and dandy, but at this point no one has made anything that uses it to improve how we do things. I don't see a revolution in UI coming from the current keyboard-and-mouse world we live in, but instead from touch panels and voice control. Tablet PCs sounded like such a great idea, but unfortunately they're hampered by all the baggage of the past. I really hope MS uses all the power they have in the market to try something different that actually improves the experience. They've got the R&D and the market share; the only question is whether they have the will and the passion. If Longhorn is just a more resource-intensive Windows to sell more hardware, I'll be very disappointed.
 
mr2mike said:
"More useless bloat. "

actually, if they're offloading the entire GUI on the graphics card, it should leave more of the processor to do other things, instead of having this ugly static 2D windows interface all done on the CPU and just left to the graphics card to display.

It IS more bloatware, but really, it'll just be using that ]-[ardcore video processor that previousely went unused, so the overhead on the CPU, IMO at least, should stay within reason.

It's like that pic of the interface viewing a picture folder. getting the CPU to fetch all those thumbnails and then texturing them and displaying them in 3D like that, it'd take an insane amount of essources, but as things are going, for a graphics card that's not only a small matter, but the cards themselves are starting to have more and more access to the rest of the system, and it'll only go further down that route in the future, so I wouldn't be surprised if the gfx card could stream those thumbnails and GUI elements straight from disk all by itself, never bothering the CPU.

Using the video card like that is just superfluous and won't help one iota when you're trying to run multiple applications. Just imagine having Photoshop and Dreamweaver open in separate windows at the same time with a bit of the desktop showing in the background; is your video card going to be stencil-buffering the application windows so that you can see your clock in 3D?
 
"looks very similar to longhorn to me."

ss10.jpg

longbig.jpg


Looking Glass looks more cluttered; sideways windows just look like a transparent, space-consuming, wallpaper-spoiling pixely mess. The 'thick' windows with their contents written on the side are an eyesore.

longhorn = smooth and clean.

"Using the video card like that is just superfulous(sp?) and won't help 1 ioata when you're trying to run multiple applications."

So you'd rather have the CPU keep doing everything? With a video-accelerated desktop, you at least take the drawing of the interface out of the CPU's hands. If you want to stencil-buffer windows to keep the 3D clock in view, and your 3D card can handle it, then MORE POWER TO YA. Personally, I'd turn transparencies off in the settings, as there's sure to be a way to do that.
 
"So you'd rather have the CPU keep on doing everything? at least with a video accelerated desktop, you at least take the drawing of the interface out of the CPU's hands. if you want to stencil buffer windows to keep the 3d clock in view, and your 3d card can handle it, then MORE POWER TO YA. personally, I'd turn transparencies off in the settings, as there's sure to be a way to do that."

If I'm running Photoshop, or better yet Maya, then I don't want a 3D 32-pipeline pixel-shaded mail truck skidding onto my screen to tell me a new e-mail arrived in Outlook. I want those resources to be available to the applications I am working in, not for a Windows-sponsored fireworks display.
 
Like I said, it's 99% assured that you'll be able to turn off crap like that; every Windows version for the last 10 years has let you disable the graphical fluff.

So right now, you're whining just for the sake of whining.
 
Jeffahn said:
"So you'd rather have the CPU keep on doing everything? at least with a video accelerated desktop, you at least take the drawing of the interface out of the CPU's hands. if you want to stencil buffer windows to keep the 3d clock in view, and your 3d card can handle it, then MORE POWER TO YA. personally, I'd turn transparencies off in the settings, as there's sure to be a way to do that."

If I'm running Photshop, or better yet Maya, then I don't want a 3D 32-pipeline pixel-shaded Mail Truck skidding onto my screen to tell a new e-mail arrived in Outlook. I want those resources to be available to the applications I am working on, not for a Windows sponsored fireworks display.

I seriously doubt that's what will be happening.

Look at the Mac. Works fine, yes? Well, it's already been doing this for more than two years. It's a step up performance-wise, not down.

Besides, right now you've already got 2D interface rendering clogging the CPU, and the CPU is actually what matters more for those programs.
 
mr2mike said:
"looks very similar to longhorn to me."

ss10.jpg

longbig.jpg


Looking glass looks more clutter-ish, sideways windows just look like a transparent, space consuming, wallpaper spoiling pixely mess. The 'thick' windows with their contents written on the side are an eyesore.

Looking Glass is a proof of concept for developing a 3D OS UI, and it goes whole-hog to make everything 3D because it's a proof of concept. This Longhorn screenshot isn't of the same scope; it appears to just be an Explorer application for viewing pictures. That Longhorn shot is far from smooth and clean. Neither Sun (which has a similar CD-library interface) nor Microsoft understands that the best way to display this information is NOT 3D. Displaying large volumes of data layered in 3D is not intuitive and adds nothing to the usability of the application.
 
Zaptruder said:
I seriously doubt that's what will be happening.

Look at the mac. Works fine yes? Well it's already been doing this for more than 2 years. It's a step up, performance wise, not down.

Yes, no, kinda. It's not as simple as 'just render it in 3D'.

There are a couple of large issues:

1) Hardware drivers: If you think having good drivers was a big deal before, wait until you can't type text because some video memory driver issue makes your text calls return error values. Even on OS X 10.4 this is a problem, because the more you want to accelerate, the better the drivers have to be. In addition, not all of Aqua is hardware accelerated. Moving your entire OS UI into textures and polygons is no small challenge.

2) Market drivers: There really isn't a compelling application on the OS that begs to be done in 3D. Now, I did like a few of the concepts Looking Glass brought out (like being able to put notes on the back of a web browser window), but there aren't yet many concepts that lend themselves well to being done in 3D. Does it look cool? Sure. Does it work better than it did in 2D? NO. Why? Because controlling things in 3D isn't particularly intuitive either. The metaphor of displaying things 'as they are in real life' actually doesn't work well, because people don't CONTROL them like they do in real life. A 3D OS interface actually needs a better controller than the mouse/keyboard combo in order to be most effective and easy to use.

And there is one small issue

1) There is no point talking about clogging 2D pipelines and such, because the 2D pixel-pushing engines on video cards are more than fast enough to do any rendering necessary.

Then there are two EVIL issues

1) Support: Once we go to an (optional) 3D interface, we have some new pain-in-the-ass things to deal with. First up, not all UIs will look the same, because not all hardware will support the same features.

2) You're being led by the nose to upgrade. There is a reason why ATI, nVidia, Matrox, etc. are all happy to work with Microsoft and make the OS as cool as humanly possible: it will obsolete all your hardware and you will have to upgrade. Now, not only do gamers need to get fast 3D accelerators, but much of the Windows population will as well. It's a sweetheart deal. Everyone is going to be quick to upgrade to get a bunch of features they won't really ever use :)
 