
"I Need a New PC!" 2014 Part 2. Read OP, your 2500K will run Witcher 3. MX100s! 970!


Genio88

Member
So guys, I'm almost ready to pull the trigger on a Gigabyte GA-Z97P-D3 + i7 4770K to replace the FX 8350. Am I doing the right thing?
 

mkenyon

Banned
Crossfire issues (specifically micro-stuttering) are largely resolved. You're not going to have optimal performance on day one, but that's true for any multi-card setup.
This is true as long as the game isn't DX9; those still have issues, as do a number of other games.

[Frame time charts: Arkham and Guild Wars 2, R9 results and frames over 50 ms]

I get a kind of screech sound through my headset when my PC powers down (headset connected via front panel USB), but nothing when the headset is disconnected. Related?
Yeah, as Kennah pointed out, that's pretty normal. Front panel USB/Audio connectors are notorious for picking up random noise as they're not shielded. So if you have something like capacitors discharging, you'll hear it on them.
 

SHADES

Member
Hey, what's a good utility for tying a forced fan speed to a temperature until it drops to your desired level? MSI Afterburner isn't working. My GTX 660 (MSI Twin Frozr) is really overheating, hitting 90°C in some cases. It's a humid week, yes, but the central AC is on and I have no idea why it's getting so hot. My case does suck when it comes to airflow, but it never got this bad, and I've dusted everything out too. I hope the card isn't dying.

Get one of those little silica gel packets that come with shoes/TVs/white goods and put it in the bottom of your case; it should help with the humidity.
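
If Afterburner's fan curve won't apply, you can at least watch the temperature yourself while you troubleshoot. Here's a minimal watchdog sketch (assumes nvidia-smi is on your PATH; the threshold and poll interval are made-up example values, not recommendations):

[CODE]
# Minimal GPU temperature watchdog (sketch). Assumes nvidia-smi is on PATH.
# THRESHOLD_C and POLL_SECONDS are arbitrary example values.
import subprocess
import time

THRESHOLD_C = 80
POLL_SECONDS = 5

def gpu_temp_c() -> int:
    """Read the current GPU core temperature via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

while True:
    temp = gpu_temp_c()
    if temp >= THRESHOLD_C:
        print(f"GPU at {temp}C - time to force the fan up or check airflow")
    time.sleep(POLL_SECONDS)
[/CODE]

It only alerts; the actual fan control still has to happen in Afterburner (or whatever utility ends up working for you).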
 

Akai__

Member
So it doesn't matter much that I bought a SteelSeries Rival to decrease some of the input lag?

I'm not sure if that's even a thing?
Maybe if you have a USB 1.0 port. :p
Input lag basically means how long it takes until your action on the PC is displayed on your monitor. In your case it's 5ms. On normal TVs, the input lag can be much higher and very noticeable.

If anything of the above is incorrect, feel free to correct me guys.

Okay, make fun of me for not reading the OP in detail haha. Going with the Define R4. Thanks for the suggestion!

Sorry :p

Make sure you post a before and after comparison. Also, impressions on whether it's silent enough for you would be great for other users.
 

Irobot82

Member
I have a few questions PC GAF.

1. If a monitor supports DisplayPort 1.3 (when it is finalized), does that mean it inherently supports 1.2a (Adaptive-Sync)?

2. Intel, AMD and Nvidia have said they plan on phasing out the old display connections by 2013-2015. Do you believe the AMD R9 300s and Nvidia 800s will be the first series to only support DisplayPort and HDMI?


Thanks.
 

Pachimari

Member
So I just talked with a guy at the store. Is it correct that I should connect my monitors with DisplayPort or DVI because they give a much better picture than VGA?
 

Stubo

Member
What brand of AMD cards would you suggest?
Sapphire, MSI, Asus, Gigabyte

Beware of coolers that have been re-purposed from other cards; as mentioned previously, the Gigabyte 290's cooler comes straight from the 780, and I think one of the Asus coolers does too? I'm not very well read on this.

Checking reviews for the specific card you're looking at is always a good idea!
So I just talked with a guy at the store. Is it correct that I should connect my monitors with DisplayPort or DVI because they give a much better picture than VGA?
DisplayPort, DVI and HDMI are digital - in short, yes! Definitely use one of these.
 

mkenyon

Banned
What brand of AMD cards would you suggest?
MSI, Gigabyte, ASUS. The warranty transfers so they're easier to resell. Sapphire has great non-reference coolers though, and Club3D seems to have a high number of cards that OC really well.
I have a few questions PC GAF.

1. If a monitor supports DisplayPort 1.3 (when it is finalized), does that mean it inherently supports 1.2a (Adaptive-Sync)?

2. Intel, AMD and Nvidia have said they plan on phasing out the old display connections by 2013-2015. Do you believe the AMD R9 300s and Nvidia 800s will be the first series to only support DisplayPort and HDMI?


Thanks.
1. No.

2. No one has any idea except AMD and NVIDIA.
 

Irobot82

Member
MSI, Gigabyte, ASUS. The warranty transfers so they're easier to resell. Sapphire has great non-reference coolers though, and Club3D seems to have a high number of cards that OC really well.

1. No.

2. No one has any idea except AMD and NVIDIA.

Lame. How can you support 1.3 and not 1.2a? :-(
 

Pachimari

Member
Oh, I guess I gotta order 2 cables then. Just to be real sure, a wired mouse like the SteelSeries Rival is better to play games with than a wireless mouse, right?
 

The Llama

Member
Something something G700S

Probably the only wireless mouse I'd want to use for gaming (I have a G700 vanilla)

Just got one of these on sale at Best Buy for $60 and I love it. I used to be an MX500/MX518 devotee (used those mice for 6+ years), and this is even better.
 

riflen

Member
I have a few questions PC GAF.

1. If a monitor supports DisplayPort 1.3 (when it is finalized), does that mean it inherently supports 1.2a (Adaptive-Sync)?

2. Intel, AMD and Nvidia have said they plan on phasing out the old display connections by 2013-2015. Do you believe the AMD R9 300s and Nvidia 800s will be the first series to only support DisplayPort and HDMI?


Thanks.

1. 1.2a? Yes. Adaptive-Sync? No. Adaptive-Sync is an optional feature of 1.2a. Monitor manufacturers will choose to support it or not.

2. Possibly. You can always adapt HDMI and DP to DVI.
 

mkenyon

Banned
To further complicate things, here's what can cause input lag or a sluggish response (a rough worked example follows the list):

1) Any sort of V-Sync solution, as input is almost always polled when a frame is created. When DX essentially throws out some of those frames, you're going to have lag.

2) The actual display, as there can be a delay between what you are physically doing and the time it takes to show that frame on screen.

3) Slow frame times, as your input is being polled less frequently.

4) Large spikes in frame time can cause intermittent latency as you go from low frame times to 50ms or more.

5) The input device itself.
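
To put rough numbers on a few of those sources, here's a toy end-to-end latency budget (all figures are illustrative values picked for the sketch, not measurements):

[CODE]
# Toy end-to-end input latency budget (illustrative values only).
mouse_poll_ms  = 1000 / 125       # 125 Hz USB polling -> up to 8 ms
frame_time_ms  = 1000 / 60        # rendering at 60 FPS -> ~16.7 ms
vsync_wait_ms  = frame_time_ms    # with V-Sync a finished frame can wait a full refresh
display_lag_ms = 5                # panel processing time (varies per monitor)

worst_case = mouse_poll_ms + frame_time_ms + vsync_wait_ms + display_lag_ms
print(f"Rough worst case: {worst_case:.1f} ms")   # ~46 ms
[/CODE]

A frame time spike (point 4) adds on top of that, which is why occasional 50ms+ frames feel like intermittent input lag.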
That's why I used the word "largely." I feel like any DX9 game should be fine on a single GPU.
At 4K?
 
Anyone know where I can find a list of commands that do things? For example, I just found out that if I put "-w" in a shortcut target then I can run games in a window. I want to know more commands like that. Anyone have a site that has a list?
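
Launch options like that are engine-specific; Source-engine games, for example, document flags such as -windowed, -w/-h for resolution, and -novid. As a minimal sketch, here's how you might pass them from a script (the executable path is hypothetical):

[CODE]
# Launching a game with command-line flags from Python (sketch).
# The path below is hypothetical; the flags are common Source-engine
# launch options (-windowed, -w/-h for resolution, -novid skips intro movies).
import subprocess

subprocess.Popen([
    r"C:\Games\example\game.exe",  # hypothetical executable path
    "-windowed",
    "-w", "1280",
    "-h", "720",
    "-novid",
])
[/CODE]

A shortcut target with the same flags appended does the same thing without any scripting.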
 

tarheel91

Member
To further complicate things, here's what can cause input lag or a sluggish response:

1) Any sort of V-Sync solution, as input is almost always polled when a frame is created. When DX essentially throws out some of those frames, you're going to have lag.

2) The actual display, as there can be a delay between what you are physically doing and the time it takes to show that frame on screen.

3) Slow frame times, as your input is being polled less frequently.

4) Large spikes in frame time can cause intermittent latency as you go from low frame times to 50ms or more.

5) The input device itself.

At 4K?

Maybe everything except for Witcher 2.
 

SHADES

Member
That's the capacitors discharging.

Yeah, as Kennah pointed out, that's pretty normal. Front panel USB/Audio connectors are notorious for picking up random noise as they're not shielded. So if you have something like capacitors discharging, you'll hear it on them.

Thanks gents.

Next question: is the GeForce logo on the GTX 670 FTW supposed to be illuminated? I could have sworn the logo was bright green when the seller demonstrated the GPU, but in my system it's not illuminated.

The card itself is awesome, can't knock it (a near 50% improvement in DayZ), and the warranty transfer with EVGA was a piece of cake, so it's just this little niggle I'm pondering ATM.

Also, does the 670 clock speed always sit at idle (374MHz) until more power is needed in-game?
 

riflen

Member
Thanks gents.

Next question: is the GeForce logo on the GTX 670 FTW supposed to be illuminated? I could have sworn the logo was bright green when the seller demonstrated the GPU, but in my system it's not illuminated.

The card itself is awesome, can't knock it (a near 50% improvement in DayZ), and the warranty transfer with EVGA was a piece of cake, so it's just this little niggle I'm pondering ATM.

Also, does the 670 clock speed always sit at idle (374MHz) until more power is needed in-game?

EDIT: I don't think your GPU has an illuminated GEFORCE logo. The first to use this was the TITAN in early 2013.

Yes, modern GPUs save power and extend lifetime by reducing clock speeds in low-usage scenarios. I'd be surprised if the clock was that low while playing a game though. How/when are you checking the core clock speed?
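
If you want a quick way to see the core clock alongside load, nvidia-smi can log both (a sketch, assuming nvidia-smi is on your PATH; ten samples is an arbitrary example):

[CODE]
# Log the current GPU core clock and utilization once per second (sketch).
# Assumes nvidia-smi is on PATH.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=clocks.current.graphics,utilization.gpu",
         "--format=csv,noheader"]

for _ in range(10):
    print(subprocess.check_output(QUERY).decode().strip())
    time.sleep(1)
[/CODE]

Run it while the game is up; if the clock stays near idle under real load, that's worth investigating.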
 

Irobot82

Member
1. 1.2a? Yes. Adaptive-Sync? No. Adaptive-Sync is an optional feature of 1.2a. Monitor manufacturers will choose to support it or not.

2. Possibly. You can always adapt HDMI and DP to DVI.

Can you show me an article that says it is an optional feature of 1.2a? All I can find says it's the only thing in 1.2a that's different from 1.2.
 

Dries

Member
Lately a strange smell has been emerging from my case. It smells like plastic, but not a burnt smell. My temps are also fine. It happens at completely random times. What could this be?
 

SHADES

Member
EDIT: I don't think your GPU has an illuminated GEFORCE logo. The first to use this was the TITAN in early 2013.

Yes, modern GPUs save power and extend lifetime by reducing clock speeds in low-usage scenarios. I'd be surprised if the clock was that low while playing a game though. How/when are you checking the core clock speed?

OK, thanks. I could have sworn it was illuminated when the guy opened up his rig; I guess it might have just been a well-placed fan LED. Oh well, I'm just glad there aren't any issues.

The 670 jumps up to full throttle in game; it's just me, as my only other Nvidia card is a GTX 750 Ti SC, and when monitoring that via EVGA Precision X it's always at full clock speed.

Thanks for answering my noob questions :)
 

Gambit61

Member
There's one like 5 min from me... Might go take a look. You want me to grab you a bundle for cost + tax + shipping? Probably about $125?

Thanks for the offer! I'm just wondering if it would be a decent upgrade from my X4 955 before I jump on it. Right now I keep getting weird frame drops in games even though CPU usage usually sits around 50 - 60 percent and GPU is never maxed out.

Edit: Hmm, an OC forum says that motherboard wouldn't be good for overclocking an i5.
 

riflen

Member
Can you show me an article that says it is an optional feature of 1.2a? All I can find says it's the only thing in 1.2a that's different from 1.2.
There's a lot of work needed to revise a standard like DisplayPort. It's not going to be updated just to add a single feature. The standard itself has to be purchased, so I can't confirm it beyond all doubt, but there are several articles that state the function is optional.

Anandtech

To that end, the VESA sends word today that they have done just that with the latest update to the DisplayPort 1.2a standard. Adaptive-Sync (not to be confused with NVIDIA’s Adaptive V-Sync), the eDP feature that allows for variable refresh monitors, has been added to the DisplayPort 1.2a standard as an optional feature.

It's also implied in the press-release from VESA themselves:

VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync.

It doesn't really matter if it's optional in 1.2a or required in order to support 1.2a. Going forwards, the feature will be optional. You cannot say to manufacturers, "To support DP 1.4, you must implement Adaptive Sync." The feature may not be of any use to the target market for the product, or the product might be designed to be entry-level and feature a very basic scaler to reduce cost.

You should think about Adaptive Sync as an optional feature of DisplayPort 1.2a and above. That is to say, the feature will always be optional as there will be additional cost involved for the manufacturer in supporting it.
 
EDIT: I don't think your GPU has an illuminated GEFORCE logo. The first to use this was the TITAN in early 2013.

Yes, modern GPUs save power and extend lifetime by reducing clock speeds in low-usage scenarios. I'd be surprised if the clock was that low while playing a game though. How/when are you checking the core clock speed?

The GTX 690 was the first to use the LED Visualizer.
 

Irobot82

Member
There's a lot of work needed to revise a standard like DisplayPort. It's not going to be updated just to add a single feature. The standard itself has to be purchased, so I can't confirm it beyond all doubt, but there are several articles that state the function is optional.

Anandtech



It's also implied in the press-release from VESA themselves:



It doesn't really matter if it's optional in 1.2a or required in order to support 1.2a. Going forwards, the feature will be optional. You cannot say to manufacturers, "To support DP 1.4, you must implement Adaptive Sync." The feature may not be of any use to the target market for the product, or the product might be designed to be entry-level and feature a very basic scaler to reduce cost.

You should think about Adaptive Sync as an optional feature of DisplayPort 1.2a and above. That is to say, the feature will always be optional as there will be additional cost involved for the manufacturer in supporting it.

Thanks for the more in-depth information. I hope they won't charge a large premium for it.
 

khaaan

Member
Hey guys, stupid question but my first build is almost completed and I need one more thing to finish it up.

I'm trying to install a wireless PCI adapter (Linksys WMP45G) and my motherboard (ASRock Z97 Extreme3) has some available PCI 3.0 slots BUT none of the notches seem to match up. Pics for reference:


Sorry for the poor pic, but from my understanding the mobo has two PCI 3.0 and one PCI 2.0 slot. Is there anything I can do or am I out of luck?

Edit: Trying to get the pic to work, bear with me :p
 

riflen

Member
Thanks for the more in-depth information. I hope they won't charge a large premium for it.

According to AMD, the prices will likely vary in line with the ability of the panel more than the presence of Adaptive Sync support. Here's a recent interview with AMD's Gaming Scientist where he talks about pricing.

http://youtu.be/8uoD8YKwtww?t=1h50s

If you're thinking about buying a setup now with the hope of using Adaptive-Sync in the future, be aware that the only supported GPUs right now are the R7 260/260x and R9 290/290x. All future AMD GPUs will support it however.

Hey guys, stupid question but my first build is almost completed and I need one more thing to finish it up.

I'm trying to install a wireless PCI adapter (Linksys WMP45G) and my motherboard (ASRock Z97 Extreme3) has some available PCI 3.0 slots BUT none of the notches seem to match up. Pics for reference:


Sorry for the poor pic, but from my understanding the mobo has two PCI 3.0 and one PCI 2.0 slot. Is there anything I can do or am I out of luck?

Don't confuse PCI-E with PCI. Your board has 2 x PCI-E 3.0, 1 x PCI-E 2.0 and 3 x PCI slots. The Wi-Fi adapter is PCI and will only fit in one of the three PCI slots.
 

kennah

Member
Hey guys, stupid question but my first build is almost completed and I need one more thing to finish it up.

I'm trying to install a wireless PCI adapter (Linksys WMP45G) and my motherboard (ASRock Z97 Extreme3) has some available PCI 3.0 slots BUT none of the notches seem to match up. Pics for reference:

[image: 2014_07_01_16_09_28.jpg]

Sorry for the poor pic, but from my understanding the mobo has two PCI 3.0 and one PCI 2.0 slot. Is there anything I can do or am I out of luck?

Edit: Trying to get the pic to work, bear with me :p
Looks like your board has three PCI slots. It should fit fine in the bottom one.

And upload your pic somewhere else.

Edit for fixed pic. It's fine. Don't worry about the missing pin.
 

mkenyon

Banned
Hey guys, stupid question but my first build is almost completed and I need one more thing to finish it up.

I'm trying to install a wireless PCI adapter (Linksys WMP45G) and my motherboard (ASRock Z97 Extreme3) has some available PCI 3.0 slots BUT none of the notches seem to match up. Pics for reference:



Sorry for the poor pic, but from my understanding the mobo has two PCI 3.0 and one PCI 2.0 slot. Is there anything I can do or am I out of luck?

Edit: Trying to get the pic to work, bear with me :p
PCI =/= PCI-Express; they're totally different pin configurations. It will not work in a PCI-E slot.
 