And why not?
> There are multiple blurred reflections in the demo.
Because the reflections in that demo are just one ray cast along the reflection vector. That's why they're perfect mirrors: a single ray is very cheap to compute and test for intersection.
Whereas in path tracing, you fire many rays in a cone around that reflection vector, and averaging them gives you blurred (glossy) reflections.
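(For anyone curious, the distinction is easy to sketch. Below is a toy illustration of the two approaches — my own hypothetical code, not anything from the demo. A mirror is a single ray along the exact reflection vector; a glossy reflection averages many rays sampled in a cone around it, with `half_angle` standing in for roughness. It assumes some `trace(origin, direction)` callback that returns an RGB color.)

```python
import math
import random

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def reflect(d, n):
    # Mirror reflection of incoming direction d about unit normal n.
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def sample_cone(axis, half_angle):
    # Uniformly sample a direction inside a cone around `axis`.
    cos_max = math.cos(half_angle)
    cos_t = random.uniform(cos_max, 1.0)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = random.uniform(0.0, 2.0 * math.pi)
    # Build an orthonormal basis (u, v, axis) and rotate the local sample into it.
    a = (1.0, 0.0, 0.0) if abs(axis[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize((a[1] * axis[2] - a[2] * axis[1],
                   a[2] * axis[0] - a[0] * axis[2],
                   a[0] * axis[1] - a[1] * axis[0]))
    v = (axis[1] * u[2] - axis[2] * u[1],
         axis[2] * u[0] - axis[0] * u[2],
         axis[0] * u[1] - axis[1] * u[0])
    return tuple(sin_t * math.cos(phi) * ui + sin_t * math.sin(phi) * vi + cos_t * wi
                 for ui, vi, wi in zip(u, v, axis))

def mirror_reflection(trace, origin, d, n):
    # The cheap version: one ray along the exact reflection vector -> perfect mirror.
    return trace(origin, reflect(d, n))

def glossy_reflection(trace, origin, d, n, half_angle, samples=64):
    # The expensive version: average many rays in a cone -> blurred reflection.
    r = reflect(d, n)
    cols = [trace(origin, sample_cone(r, half_angle)) for _ in range(samples)]
    return tuple(sum(c[i] for c in cols) / samples for i in range(3))
```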
> Are there any leaks on when the new cards will be revealed? I mean, they surely can't wait till October. Even if they launch at the end of October, it would be nice to know if waiting would be wise. :/
I would believe it's sometime this month, maybe around Nvidia's launch or even later. Rumors are that the cards are to launch in October, with a possibility that it may slide into early November. Yet I think the plan was always October, since AMD said they would launch the cards before the consoles...
> I would believe it's sometime this month, maybe around Nvidia's launch or even later. Rumors are that the cards are to launch in October...
That would be such a stupid decision on AMD's part, especially if they do have a response to the 3080. People have been waiting for a reasonable upgrade path since the 1080 Ti came out; the 3080 is going to accommodate this demand, and because of the hype and better value it will probably pull in a lot of people who would normally buy a 3070-level card.
It's being teased already, so you will hear something soon....
https://wccftech.com/amd-radeon-rx-6000-big-navi-graphics-card-teased-in-fortnite/
The 3070 Ti may be a product for you.
I seriously think people are gonna get raped in the ass by the 3070 if they buy it for 4K or downsample.
My 8 GB in my Vega is on the edge now, and that's HBM2.
> That would be such a stupid decision on AMD's part, especially if they do have a response to the 3080.
Reactions like this puzzle me.
> Regardless, the hit that the 2080 is taking at 4K is quite notable.
I'm surprised he confuses allocated RAM and real usage.
Regardless, the hit that the 2080 is taking at 4K is quite notable.
Exactly what would happen one month after GPU release? Everyone who wanted to upgrade would manage to upgrade, and it would be too late for the 6000 series?
How that turned out...
Avengers with ultra textures eats ~7.5 GB of VRAM at 2560x1080, and ~14 GB of system RAM (the first game that uses 16 GB).
Yeah, 8 GB won't be enough for anything higher than 1440p native.
The difference is probably more about memory bandwidth and ROPs than the amount of VRAM. Doom caches a lot of stuff.
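(To put rough numbers on the bandwidth/ROPs point, here's some back-of-the-envelope arithmetic using public spec-sheet figures; these are peak rates only, and real throughput depends on the workload.)

```python
# Back-of-envelope throughput comparison, 2080 vs 2080 Ti (spec-sheet numbers).
cards = {
    # name: (ROPs, boost clock in GHz, memory bandwidth in GB/s)
    "RTX 2080":    (64, 1.710, 448),
    "RTX 2080 Ti": (88, 1.545, 616),
}

for name, (rops, clock_ghz, bw) in cards.items():
    # Peak pixel fill rate: one pixel per ROP per clock.
    fill_gpix = rops * clock_ghz
    print(f"{name}: ~{fill_gpix:.0f} Gpixel/s fill, {bw} GB/s bandwidth")

# 4K has ~2.25x the pixels of 1440p, so a card that is fill- or
# bandwidth-bound (rather than VRAM-capacity-bound) drops off at 4K too.
pixels_4k = 3840 * 2160
pixels_1440 = 2560 * 1440
print(f"4K/1440p pixel ratio: {pixels_4k / pixels_1440:.2f}x")
```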
> AMD is left only as the bargain basement option for poor gamers who cannot afford a proper GPU (Nvidia only).
This is a stupid hot take, lol.
Yeah, this graph is bad.

> I'm surprised he confuses allocated RAM and real usage.
He didn't confuse anything. There are real limits to VRAM, and sometimes you hit them. It doesn't mean the game is unplayable, but it does mean at the very least lower averages or, more usually, stutter (from streaming and swapping). 8 GB for Doom Eternal at 4K is NOT enough for a great experience. Same story for other games. Too many people plug their ears because they don't want to hear it, lest it make the new cards look gimped (as they are).
> How is the 2080 getting notably slower at 4K not "real usage"?
If real usage is not over 8 GB, the hit is not because of RAM, though.
> I seriously think people are gonna get raped in the ass by the 3070 if they buy it for 4K or downsample. My 8 GB in my Vega is on the edge now, and that's HBM2.
From what I understand, if you're VRAM limited your performance drops off a cliff. Looking at that, that's an acceptable drop in performance for the 2080. It doesn't look like it's VRAM limited, but it may be limited in other ways.

> From what I understand, if you're VRAM limited your performance drops off a cliff.
The 2080 Ti does have significantly more ROPs and TMUs.
It's not a choice of either "okay" or "dead"; there are steps in between. Depending on other factors you see different effects. It should be obvious that missing 0.5-1 GB isn't the same as needing another 5 GB.
> How is the 2080 getting notably slower at 4K not "real usage"?
I never said there wasn't a drop.
> It should be obvious that missing 0.5-1 GB isn't the same as needing another 5 GB.
How is that obvious?
I thought that the frame would need to be drawn twice if you're trying to use 9 GB or 13 GB on an 8 GB card. Whether it's 9 or 13 GB, it still needs two passes to draw the frame.
There's a hierarchy to what's loaded and what's used, then a hierarchy to where it comes from and to everything along the chain, and then engine-level decisions about whether lower-res assets get swapped in when your VRAM is choking (like what happened with FF Remake on PS4 in spring), and so on. We have lots of simplistic discussions on this forum, but there's A LOT going on behind the scenes, and these things have been well thought out. It would make no sense in 2020 for game performance to completely plunge when running into a small VRAM choke; just at the rendering level, with culling and all that, it would be a very primitive way to make games. It may have been true a decade+ ago, but it is not true any longer.
Nonetheless, not having enough VRAM can still be detrimental to the experience; it's just not a complete catastrophe.
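(A toy illustration of that idea — entirely hypothetical and far simpler than a real engine's residency/streaming system: when a texture doesn't fit the budget, the manager falls back to a lower mip instead of failing or redrawing the frame.)

```python
class ToyResidencyManager:
    """Toy VRAM residency: give each requested texture the highest mip
    that still fits the budget; drop to lower mips under pressure
    instead of failing (real engines are far more sophisticated)."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0.0

    def request(self, name, full_size_mb, max_mip=4):
        # mip 0 = full res; each mip level is ~1/4 the bytes of the previous.
        for mip in range(max_mip + 1):
            size = full_size_mb / (4 ** mip)
            if self.used + size <= self.budget:
                self.used += size
                return mip, size
        return max_mip, 0.0  # worst case: skip / lowest-res placeholder

vram = ToyResidencyManager(budget_mb=8192)
for tex in range(30):
    mip, _ = vram.request(f"texture_{tex}", full_size_mb=340)
    if mip > 0:
        print(f"texture_{tex} falls back to mip {mip}: lower res, not a crash")
```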
> These rumors never pan out. Big Navi barely beating two-year-old hardware? I don't believe it. With Raja gone, I expect this to be competing with the 3080.
Yeah, AMD is much better off without him. He seems to be bad luck, since Intel has had major issues since he joined the company.
> There's a hierarchy to what's loaded and what's used, then a hierarchy to where it comes from...
You seem to be having a difficult time explaining something that "should be obvious," tbh.
In regards to the 3070 8 GB vs. the 1080 Ti/2080 Ti 11 GB: a lot of you are assuming that just because there is more memory it will bottleneck the 3070, but you are failing to acknowledge how big of an advantage this could provide.
It is quite possible that it could more than make up the difference when it comes to the 8 GB vs. 11 GB deficit, and the gap may even end up being non-existent for the 3080 10 GB.
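(Some napkin math on how far I/O alone can stretch a smaller VRAM pool, using the theoretical PCIe 4.0 x16 peak of ~32 GB/s; sustained rates will be lower.)

```python
# How much data can you re-stream per frame, in the best case?
pcie4_x16_gbs = 32.0   # ~32 GB/s theoretical peak for PCIe 4.0 x16
fps = 60.0
per_frame_gb = pcie4_x16_gbs / fps
print(f"PCIe 4.0 x16 budget: ~{per_frame_gb * 1024:.0f} MB per frame at {fps:.0f} fps")

# Versus the capacity gap under discussion:
gap_gb = 11 - 8  # 1080 Ti / 2080 Ti vs 3070
frames_to_cover = gap_gb / per_frame_gb
print(f"Re-streaming a {gap_gb} GB working-set gap takes ~{frames_to_cover:.1f} "
      f"frames of the *entire* bus, ignoring everything else using it")
```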
> I was wrong.
Fixed.
That helps you get assets into VRAM, but it does nothing for VRAM itself. If you need to hold 1.5 L of water but you have a 1 L bottle, it doesn't matter how much faster one tap is than the other; you still need a bigger bottle.
It's the same mistake people make with compression technology: it helps with bandwidth, but more bandwidth by itself isn't enough to totally compensate for a lack of space.
Yes, but if you can fill, empty, and refill the one-liter bottle at a quicker rate than filling the 1.5 L bottle, that difference may be irrelevant. Not to mention that Ampere could take advantage of PCI-Express 4.0 in a way that Turing couldn't.
I am not saying you're not correct, but I am open to seeing it with my own eyes rather than dismissing it outright.
> No need to get frustrated. I'm simply trying to understand. Is there any way you can describe how you tell the difference?
Low-res hierarchical geometry, not blurred reflections, dude. I know what I'm looking at and what is possible on these machines, and it's not that.
If people use their brains they will realise that the 3080 is in fact the 3070, so that claim makes total sense.
Again, use your brain.
Now, this actually isn't doing much beyond RT reflections, and there is a bit of IQ reduction going on inside the reflections (geometry and resolution being scaled down), but it still looks quite good. I'd imagine this is what most PS5/XSX ray tracing will look like, and IMO that's still quite amazing.
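(That in-reflection IQ reduction is a common trick: secondary rays get a cheaper version of the scene. Here's a hypothetical sketch of the idea — `RayType` and `select_lod` are my own names, not any engine's API.)

```python
from enum import Enum, auto

class RayType(Enum):
    PRIMARY = auto()     # camera rays: full-detail geometry and shading
    REFLECTION = auto()  # secondary rays: cheaper scene representation

def select_lod(ray_type, distance, roughness):
    """Pick a geometry/texture LOD (0 = full detail, higher = coarser)."""
    lod = int(distance // 50)  # coarser with distance, as usual
    if ray_type is RayType.REFLECTION:
        # Reflections tolerate less detail: bump the LOD, and let rougher
        # surfaces (blurrier reflections) hide even coarser geometry.
        lod += 1 + int(roughness * 2)
    return min(lod, 5)

# A mirror-ish surface 30m away still drops one LOD level for reflections;
# a rough surface drops more, since the blur hides the simplification.
print(select_lod(RayType.PRIMARY, 30, 0.0))     # 0
print(select_lod(RayType.REFLECTION, 30, 0.0))  # 1
print(select_lod(RayType.REFLECTION, 30, 0.8))  # 2
```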
> I never said there wasn't a drop.
The context of this discussion is "is 8 GB sufficient?" We already have examples demonstrating that it is not.
> ...but are failing to acknowledge how big of an advantage this could provide.
This is more of a "maybe this will make it suck less," which is quite a naive take, given that even RAM-to-GPU-RAM transfer hurts the 8 GB 2080.
A tiny bit faster. I would still get all other features.
Chuckle.
Don't forget DF uber-exclusive objectivity karma; it improves satisfaction by up to 97% and leather-jacket skill by 6% (stacks up to 18%).
On a serious note, Biggest Navi (which is about 505 mm² or 485 mm², vs. the 3090's 627 mm²) is said to beat the 3070 so decisively that a 3070 Ti as a counter to it is not viable. It's also likely to come with 16 GB.
> In regards to the 3070 8 GB vs. the 1080 Ti/2080 Ti 11 GB: a lot of you are assuming that just because there is more memory it will bottleneck the 3070...
Did they mention anything about a minimum storage spec to take advantage of DirectStorage and RTX IO?
> Did they mention anything about a minimum storage spec to take advantage of DirectStorage and RTX IO?
No, there are still a lot of unknowns. But I think it's premature to assume that memory size is the only factor. Even Steve from GamersNexus mentioned that there were other factors.