So I'm getting stuttering (framerate drops from 60 to 57-58) all the time in frigging Rayman Origins on a 4670K + 280X. It ran perfectly on my old C2D with an ATI 4670. Why the shit could this be happening? :/
You using keyboard and mouse?
The voltage you want to be looking at is VCore. VID is something else that you can feel free to ignore.
The temps are super low, looking great.
To let the CPU throttle down at idle, just enable Intel SpeedStep (EIST) in the BIOS.
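If you want to double-check that SpeedStep is actually kicking in once it's enabled, here's a rough sketch (my own example, assuming Python 3 with the third-party psutil package installed) that samples the reported core clock while the machine sits idle; with SpeedStep working, the low samples should drop well below the chip's rated clock instead of staying pinned there.

```python
# Rough sanity check that SpeedStep/EIST is downclocking the CPU at idle.
# Assumes Python 3 with the psutil package installed (pip install psutil).
import time
import psutil

samples = []
for _ in range(10):
    freq = psutil.cpu_freq()      # current/min/max clock in MHz, as reported by the OS
    samples.append(freq.current)
    time.sleep(1)                 # leave the machine idle while this runs

print(f"max reported:  {freq.max:.0f} MHz")
print(f"lowest sample: {min(samples):.0f} MHz")
print(f"average:       {sum(samples) / len(samples):.0f} MHz")
# With SpeedStep enabled and the system idle, the lowest sample should sit
# well below the rated clock (often around 800-1600 MHz) rather than pinned at it.
```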
Do I need a really good video card if I want to be able to watch 4K shows on my computer? Or is that all solely on the monitor I buy?
Pre-rendered video is easy peasy for any modern PC. No monster GPU required.
I see. What about the monitor?
Yes, you need a 4k monitor.
I'm on NCIX and they seem pretty expensive compared to standard LCDs. Almost a $300-400 jump in price.
that's the way the cookie crumbles, my friend
Would you say it is worth it to go to 4k gaming right now? Or is it better waiting a couple years when the 4k monitor prices die down a little?
If you have the funds, sure. Plan on $1000ish for video cards at the least.
4K is a complete waste of money right now.
Hey now, it's not that expensive for a lot of people. Vitriol like that is best left for other threads.
I am considering building a PC when the Maxwell GPUs come out... I am new to this, so I don't know how much money is enough for a strong PC capable of 1080p/60fps for the next 5 years... The budget I have is between $2300 and $2600... How good of a PC can this money get me?
Generally the only good way to "futureproof" is to not overspend, and to build a foundation that makes it easy to swap in a new GPU a few years down the line. The $1000 build in the OP with a 4790K is pretty much what you're looking for.
Huh, I didn't know that could be considered vitriol. I stand by what I said: it's not a worthwhile purchase. On the gaming side, I'd wager you need a monstrous rig to run new and upcoming "next-gen" games at 4k@30fps+. On the movie side, regardless of how they're filmed, most movies are edited on a 2K "digital intermediate", meaning that's their quality ceiling. For example, The Hobbit movies are shot at 5K but edited at 2K, so all that extra resolution is lost to the consumer. So there goes that. 4K could be useful if you're going to do technical work like 3D rendering or graphic design.
Then again, you are recommending the guy multiple $1K video cards.
Anything you want. You can make an amazing single-GPU ITX box with that kind of money that'll play anything you throw at it for years.
(Spend $1800 and save $700 for a GPU upgrade in two years.)
"Lots of good reasons for 4K, he didn't say 4K gaming."
Well, he did: "Would you say it is worth it to go to 4k gaming right now? Or is it better waiting a couple years when the 4k monitor prices die down a little?"
21:9 is the real wave of the future.
Agreed. Just waiting for them to throw out some monitors at work so I can set up my own fake super wide.
Call me when it's at 120Hz, and then I'll agree.
Yeah, I'm going to jump in at 120Hz + VBLANK support.
Team green! Freesync isn't really making strides.
Down to my last one.
US48 only, PayPal.
R9 290 Asus DirectCU II, $300 *No mining & original box*
Also 2 AMD Never Settle Gold codes, $25 each.
Asus VG248QE 144Hz, modded to remove the AG coating for more vibrant colors. The screen is now glossy, not matte. $180 shipped.
R9 290X Lightning, $520. Basically new, have had it for a month, works great.
What is it doing? I am currently testing it and have been using it with the current drivers, and it seems like some glitching, not necessarily artifacting. Running other programs such as Heaven and 3DMark, I see none of this.
I want that VG248QE if it is in good working order. Paypal?
The FG2421 is very tempting, but it's too expensive for me (upwards of $550 IIRC), and it seems a lot of people have flaws in their panels.
Spreadsheets. If you need to see a lot of numbers, 4k is the way to go.
Photo editing. Lots of high-end photographers use 4k monitors to view and edit photos.
Early adopter. Some people like having the latest and greatest even if it costs a premium. It's these people that make it worthwhile to make 4k screens, so the rest of us can enjoy them later when 4k ramps up. If no one buys them when they're expensive, they'll stop making them and we'd be stuck at 640x480 forever.
Lots of good reasons for 4K, he didn't say 4K gaming.
Ok guys! Someone at the AVS Forums PM'd me about something hidden in the new Nvidia drivers. He claimed that it allowed him to run 4k resolution, obviously on a 4k set, at 60fps via HDMI. However, as we all know, the HDMI on our GPUs is only 1.4, which is limited to 4k at 30fps. So I happened to check into this, and I seriously need someone to check my sanity, because it is working!!!!
I don't know how or what, but when you go to Screen Resolution under the Nvidia Control Panel, it shows Ultra HD and the 4k resolution. When I picked that, it showed the refresh rate at only 30, but I scrolled down and there was a separate listing of PC resolutions, and one of them was 3840x2160 at 60. So I selected this and sure enough the resolution changed to 4k. I figured no way a game would play at 60fps, so I booted up Fraps, and since it is my wife's system and she only had Elder Scrolls, I booted that up and sure as fuck it played at 4k at 60fps. I had to turn some stuff down as she only has a 770, but I am able to get 60fps at 4k through the HDMI port on the GPU. My Sony 4k was updated a while ago so its HDMI 4 port is capable of running at HDMI 2.0 specs.
How is this working, guys? It goes against what everyone was told, that HDMI 1.4 was not capable of doing this, yet here I am playing a game at 4k at 60fps. Can anyone that has a 4k set and PC, along with the beta drivers, test this please? Make sure you scroll down to the PC resolutions.
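Before trusting the TV's info screen, it might be worth confirming what mode Windows itself thinks it is sending. Here's a minimal sketch (my own example, Windows-only, plain Python 3 with ctypes, using the standard GDI GetDeviceCaps indexes); note it only reports what the OS believes it is outputting, not what actually reaches the panel, which is what the UFO motion test below is for.

```python
# Quick check of the display mode Windows believes it is outputting
# (resolution + refresh rate). Windows-only, Python 3 standard library.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

user32.SetProcessDPIAware()               # so HORZRES/VERTRES aren't DPI-scaled

HORZRES, VERTRES, VREFRESH = 8, 10, 116   # standard GDI GetDeviceCaps indexes

hdc = user32.GetDC(0)                     # device context for the primary display
width  = gdi32.GetDeviceCaps(hdc, HORZRES)
height = gdi32.GetDeviceCaps(hdc, VERTRES)
hz     = gdi32.GetDeviceCaps(hdc, VREFRESH)
user32.ReleaseDC(0, hdc)

print(f"Windows reports {width}x{height} @ {hz} Hz")
# This only shows what the OS thinks it is sending; whether the TV is actually
# receiving and displaying 60 distinct frames is a separate question.
```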
Are you *sure* a 60fps image is being transmitted to the screen? Run the blur test: http://www.testufo.com/#test=framerates
The reason Sony was able to update their HDMI ports to 2.0 is that they intentionally put in overkill signal-processing chips to ensure they could handle the extra bandwidth. Nvidia has not, so I don't see how this is possible without effectively overclocking the signal processors for the HDMI port on the GPU (maybe similar to overclocking a screen's controller for >60Hz?). My guess is a 60Hz image is generated but fed to the display at 30Hz.
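For anyone who wants to sanity-check the bandwidth argument, here's a rough back-of-the-envelope sketch (my own numbers, using the commonly quoted HDMI TMDS limits and the standard 4400x2250 total 4K timing, not anything measured in this thread). Plain 4K60 RGB needs roughly double what HDMI 1.4 can carry, while 4K30 RGB, or 4K60 with 4:2:0 chroma subsampling, squeezes under the 1.4 ceiling; subsampled 4:2:0 output would be one plausible explanation for a 4K 60Hz mode showing up on a 1.4-era port.

```python
# Back-of-the-envelope TMDS clock check for 4K over HDMI.
# The figures below are the commonly cited ones, not measurements from this thread.

HDMI14_MAX_TMDS_MHZ = 340.0   # HDMI 1.4: 340 MHz TMDS clock (~10.2 Gbit/s raw)
HDMI20_MAX_TMDS_MHZ = 600.0   # HDMI 2.0: 600 MHz TMDS clock (~18 Gbit/s raw)

# CTA-861 4K timing: 3840x2160 active inside a 4400x2250 total (incl. blanking).
H_TOTAL, V_TOTAL = 4400, 2250

def tmds_clock_mhz(refresh_hz, chroma="RGB/4:4:4"):
    """Required TMDS clock in MHz for 8-bit 4K at a given refresh rate."""
    clock = H_TOTAL * V_TOTAL * refresh_hz / 1e6
    if chroma == "4:2:0":
        clock /= 2   # 4:2:0 packs two pixels per TMDS clock on HDMI
    return clock

for hz, chroma in [(30, "RGB/4:4:4"), (60, "RGB/4:4:4"), (60, "4:2:0")]:
    clk = tmds_clock_mhz(hz, chroma)
    fits14 = "fits" if clk <= HDMI14_MAX_TMDS_MHZ else "exceeds"
    fits20 = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds"
    print(f"4K{hz} {chroma:9s}: {clk:5.0f} MHz -> {fits14} HDMI 1.4, {fits20} HDMI 2.0")

# 4K30 RGB (~297 MHz) and 4K60 4:2:0 (~297 MHz) fit under the 340 MHz 1.4 limit,
# while 4K60 RGB needs ~594 MHz, i.e. HDMI 2.0-class bandwidth.
```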
I just ran that test you linked and I get two rows, one at 60fps and one at 30fps. I also tested it with games. I have a new Panasonic 4k AX800 in my gaming room; the sole reason I bought it was that it has a DisplayPort to allow 4k gaming at 60fps, and I've played plenty of Elder Scrolls on it at 4k at 60fps. With this beta driver I am using my wife's computer in our living room, which has our Sony 4k set, and it plays just as smooth at 60fps at 4k. Albeit I had to turn stuff down because she has a 770, but it sure works.
How is this really possible and not documented or brought up in the release notes for the beta driver?
Maybe it's not supported and someone snuck it into the driver?