Can you guys educate me on why Samsung RAM is preferred?
It's a 30nm process node vs. I think 40nm for all other RAM. That means the chips are physically smaller, and they run cooler and faster at lower voltage.
It would be like putting a cheese grater to your eyeballs.
Save up for a month or two. Get the right screen.
Hey guys, thought I'd plug in my Samsung 32-inch LA32M61B to see how games look on it. I have it connected via HDMI through my 560 Ti.
I have both displays working, however I can't seem to get games running on it. The TV apparently has a resolution of 1366x768, but I ran Chivalry at 800x600 and nothing happened. Hotline Miami, however, runs fine. What's the issue?
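For anyone who wants to sanity-check what modes Windows will actually offer that TV, here's a minimal Python/ctypes sketch against the Win32 EnumDisplaySettingsW call (a hypothetical debugging aid, not something from this thread; the truncated DEVMODEW below only carries the fields the script reads). If 800x600 isn't in the list, the game is requesting a mode the display won't accept, which would explain the black screen.

```python
# List every display mode Windows will offer the primary display.
# Run this on the machine with the TV attached (set the TV as primary,
# or pass its device name, e.g. "\\\\.\\DISPLAY2", instead of None below).
import ctypes
from ctypes import wintypes

class POINTL(ctypes.Structure):
    _fields_ = [("x", wintypes.LONG), ("y", wintypes.LONG)]

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW: only up to the display fields we actually read.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPosition", POINTL),                 # display arm of the union
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)

modes, i = set(), 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz}Hz")
```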
If you want to get a good look at the panel, just look at your iMac. It is literally the same one.

Thank you for the recommendation on the Crossover 27q. I am trying very hard to contain my drool while watching videos of it.
Installed the new hardware today. Took way longer than it should have owing to the stupid proprietary front panel wiring, but luckily I was able to work around it. The case is not ideal, but my next priority is the GPU.
I'm currently working with a GTX 460 (768MB)...it has been a very solid card, especially considering that I only paid $80 for it new. However, I would like to pick up a newer card in the near future, and I'd like to spend $150-$200.
So...when is a good time to buy, and which models should I be looking at? My monitor is 1080p.
NoRéN said:
I really don't understand VRAM usage at all. I'm running my GTX 670 (2GB) at 2560x1440 and the usage is really inconsistent: Crysis 2 used 2GB, Sleeping Dogs used 1GB, and Chivalry used 600MB. Is there any actual way to know how much VRAM is really required?
I think I'd prefer to stick with Nvidia.
Performance per dollar has to be accounted for though. Some of the AMD GPUs are a far better proposition.
I just saw a 670 for $400, which seems like a great deal and not far off from a 680. I'm basing my next system off that GPU, as a matter of fact.
On a side note - I decided to go with the H100i water cooler, and thanks to mkenyon's always-handy advice, the Corsair 500R. I really liked this case, and a side bonus is that the top has a mounting spot for dual fans, or basically one of their H100-series water coolers.
The only remaining question regards airflow. If I'm mounting the radiator with its fans up top, is it best to run those fans as intake or exhaust? And if that changes things, should I flip any of the other case fans, like the one in the back? This all seems simple, but airflow is serious business /headache
Are people having good experiences with the current generation of amd cards? Solid driver support and all that? I didn't have a great experience with my last amd card, but it has been awhile.
He was saying $150-200 for a new card. Don't know if he would like to double the budget.

Their drivers are not up to par with Nvidia's, but when you talk meat-and-potatoes performance, they are excellent options.
The only thing I would say for Nvidia is that they have good driver support, plus PhysX and custom AA solutions that AMD hasn't quite matched. Performance is higher, but at a price.
That's why I recommend a 670 if you're going to go for an Nvidia card, unless you can splurge and spend the extra $50-150 on a nice 680.
Most are (as you'll probably hear from this forum), but if you really want Nvidia, then the 660 is a decent card. Anything cheaper than the 660, like the 650 Ti or plain 650, is not really worth it; same with Radeon's 7770 and 7750, unless you need the 7750 for its low power draw.
Problems with the 660 are that it is lacking in RAM (not a true 2GB card), somewhat lacking in memory speed, and doesn't overclock as well as the 7850. The 78XX cards overclock really well, and the 7850 gains the most over stock because it has the lower base clock of the two.
That's because Min/Max/Average is a really bad way to measure performance. For example, here's 3 minutes of gameplay in Firefall. First chart is min/max/average, second chart is a plotline of the FPS.

Huh, that's interesting. I've just read that I should be looking at 1GB+ of VRAM for my next card. I was fine getting the 768MB model originally because the difference between it and the 1GB model was generally very insubstantial in benchmarks. I also looked at min/average charts and wasn't seeing any alarming FPS drops.
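For what it's worth, if you want to see actual VRAM numbers while you play rather than guess, an overlay like MSI Afterburner will chart it, or on a driver recent enough to ship nvidia-smi you can just poll it. A rough sketch, assuming nvidia-smi is on your PATH:

```python
# Poll VRAM usage once a second while a game runs (Ctrl+C to stop).
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(x) for x in out.split(", "))
    print(f"VRAM: {used} MiB / {total} MiB")
    time.sleep(1)
```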
Back in the OP, it recommends the SS Sensei as the best mouse. Does anyone have experience using a claw grip on it?
Long shot: any lefty claw grip users?
I currently have a Logitech G3, but I need a new mouse at work; this Dell BS is destroying my hand.

I have med-small hands, and I definitely cannot claw grip my Sensei. The Spawn/Xornet is the way to go with claw, but as you are left handed, that's definitely not a good fit. I'll do some research for you and let you know if I find some good stuff.
So, built the new PC. No problem. Everything was fine. Booted up and installed Windows. Installed my programs and even went to overclock. But something was wrong. The CPU was idling around 70 degrees C. When running Prime95, it hit 102. Looking online, it said that an improper installation of the heatsink or poor application of the thermal paste can cause a substantial increase in temperature. Well, opened it up, took off the heatsink and...
Yeah, forgot to take off the sticker. No metal to metal connection. No heat dissipation. Thermal paste did nothing.
However, after removing it, idle temps were around 38-40, and stress testing overclocked at 4.3 hit 85. Will try for 4.5 later.
HUZZAH!
That's because Min/Max/Average is a really bad way to measure performance. For example, here's 3 minutes of gameplay in Firefall. First chart is min/max/average, second chart is a plotline of the FPS.
Minimum 83 FPS, pretty good right? Wrong. Here's a plotline of the exact same data that shows, in milliseconds, how long it took to render each frame.
There are a huge number of frames that are taking over 16.7ms (60fps) to render, with some reaching 70ms (14fps). These are the kind of stutters that one would encounter when running out of VRAM.
A much more accurate measurement of 'average' experience is 99th percentile frame latency, which tells us the time within which 99% of all frames were rendered.
In instances where there aren't major stutters, this should generally line up with average FPS. But that's the thing: average FPS completely misses situations where things are going wrong. It only polls data once a second, so it can average out 70ms frames with 8ms frames over that second and still show you 80fps.
The more I'm researching this stuff, the more I can't believe that FPS has been a standard for so long. It's sooooooo bad.
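To make that concrete with toy numbers (not the actual Firefall capture, just an illustration of the arithmetic):

```python
# 74 smooth frames at 8 ms plus 6 hitches at 70 ms adds up to ~1 second,
# so a once-a-second FPS counter reports ~79 fps while the 99th-percentile
# frame time exposes the 70 ms (14 fps) stutters.
frame_times_ms = [8.0] * 74 + [70.0] * 6   # ~1012 ms of "gameplay"

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 99th-percentile frame latency: the time within which 99% of frames finished.
ordered = sorted(frame_times_ms)
p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]

print(f"average FPS: {avg_fps:.0f}")                # ~79 fps, looks fine
print(f"99th percentile frame time: {p99:.1f} ms")  # 70 ms, i.e. a 14 fps hitch
```

Tools like FRAPS can dump per-frame times to a log, so you can run exactly this kind of analysis on real captures instead of FPS counters.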
Aren't those temperatures still a bit high?
I have a Hyper 212 EVO. Should it be lower? As mentioned, it's OC'd to 4.3 (i7 3770K), but the Vcore is on auto right now as opposed to being set manually. Don't know if that makes a difference.
Frequency matters very little compared to voltage when it comes to temperatures.
Turn off auto voltage, because I guarantee it is feeding your processor way too much.
I kind of wonder why Corsair has so consistently put out overpriced garbage. The 600T/650D have a bad design.
*looks over at new 600T case*
What's wrong with the 600T?
That is exactly the sort of stuff I was wondering about when comparing the 660 Ti and 7950 on average FPS.
I can also confirm that Cooler Master mice are the best claw-grip mice I've ever used, period.
Curious about the SS Sensei though...
A buddy of mine (pro gamer) who I've been playing with for years kept pushing me to get a Xai, then a Sensei. I finally made the switch and he helped me adjust all of the settings on the mouse to fit my use of it, similar to a golfer getting a custom set of clubs. He would watch me stream gameplay and then make CPI X/Y/Z adjustments based on where my cursor was going in game.
I'm not kidding when I say my KDR more than doubled in T:A and Warsow pubs. My effectiveness in our T:A matches went up dramatically. I went from the bottom tier skill on our team to right in the middle.
This adjustability is why I suggest the Sensei over every other mouse. The customization makes it an objectively better mouse than any other one out there.
My $0.02 from the Gaming mice thread. Keep in mind the mouse I otherwise use is the Xornet/Spawn. In fact, I have a Xornet here at work.
If I didn't have 1000 other things to buy, I'd replace the Xornet with another Sensei.

Same news here as well. I own the MX518, G400, and various other Logitech mice, but I can never click faster than with the Xornet. The switches on this mouse are so good.
Since you own both mice, that speaks volumes for Xornet... I do imagine Sensei to be a bit restrictive(?) with that form factor.
Just a heads up that a couple of computer peripherals are on sale today, including the SteelSeries Sensei for $57.99.
So apologies for the noob question, but one of my friends was saying that OCing day one on a new CPU is a bad idea, and that I should be letting it burn in a little at first. Thoughts?

Meh, it's fine. I'd say an EXTREME OC with hours of stress testing and benches can be quite rough on a processor, but it's not like a car engine where things are actually still sealing during a break-in period. I'm not an engineer, so maybe there's something I don't know, but this sounds like one of those 'dude who equates all PC stuff to autos once said it on a forum and used analogies to prove his point, so people bought it and repeated it a ton' type situations.
Bad fans, bad airflow design, fan controller that dies constantly, inconsistent tooling and paint.
I've gone through two personally, and have built in one for a friend. My second one I cut up to do what I wanted it to do, as I loved the aesthetics and some of the design philosophy. In the end, I just had to move to something that was a bit better.
Compared to the Switch 810 and Shinobi XL, which are the same price, it's just not very good. Heck, it's not very good compared to the Scout, 500R, Arc Midi, Define R4, yada yada.
The 550D doesn't hold a candle to the R4 or Ghost.
That's the core of the issue.

This can't be true at all.
The 550D is built like an absolute beast, and it has all the amenities for silent performance plus fantastic airflow. I've never used the R4 so I can't speak to it, but the 550D scores as high or higher in all the reviews that I've seen.
It's also an average of $30-$40 more than the R4. That's the only negative, imo. I happened to catch a deal that brought it down to R4 levels, though, so I'm good lol.
That's because Min/Max/Average is a really bad way to measure performance. For example, here's 3 minutes of gameplay in Firefall. First chart is min/max/average, second chart is a plotline of the FPS.
Minimum 83 FPS, pretty good right? Wrong. Here's a plotline of the exact same data that shows, in milliseconds, how long it took to render each frame.
There are a huge number of frames that are taking over 16.7ms (60fps) to render, with some reaching 70ms (14fps). These are the kind of stutters that one would encounter when running out of VRAM.
A much more accurate measurement of 'average' experience is 99th Percentile Frame Latency, which shows us how long it took to render 99% of all frames.
In instances where there aren't major stutters, this should generally line up with average FPS. But that's the thing, average FPS completely misses situations where things are going wrong. It only polls data once a second, yet it could be averaging out 70ms frames with 8ms frames over a second to show you 80fps.
The more I'm researching this stuff, the more I can't believe that FPS has been a standard for so long. It's sooooooo bad.
I have med-small hands, and I definitely cannot claw grip my sensei. Spawn/Xornet is the way to go with claw, but as you are left handed, that's definitely not a good fit. I'll do some research for you and let you know if I find some good stuff.
After upgrading my PC, I've decided to put my old stuff up for sale. All prices include shipping.
i5-750 - Used for a couple of years now and still works fine. $95 (Intel heatsink never used, BTW)
ASUS P7P55D-MB - Same as above, two years now. $45 (comes with what you see in the pic)
If buying both together, $130.
Corsair 500R case (comes with all fans and screws) - $95 (local pickup only: if you live in the Chicago suburbs of Skokie, Niles, or Morton Grove, or are willing to drive up there)
It has a broken USB 3.0-to-USB 2.0 adapter cable. It still works, though (I bent the pins back myself). Some light scratches as well; it was bought and used for a few months last year.