GAF I need your help! My son wants a PC to play Minecraft and Fortnite. What's a good low-mid range PC?

This thread has convinced me to get a gaming laptop for myself. Far cheaper than a beefy rig, saves space and it would be perfect for the games I want to play.

Thanks, GAF!
 
Well, when I built and tested my nephews' PCs (Intel 9th/10th/11th gen) with games (Minecraft was one), the Windows Task Manager frequencies said otherwise. The clock immediately drops to base under any mild multi-core load, except for the 9th gen K model on a top-tier mobo chipset set to high performance in the BIOS.

Then you did something wrong.
You were probably using Windows Power Saving mode, or whatever it's called these days.
Use Afterburner with all cores in the OSD to monitor what the CPU is actually doing in real time.
Task Manager might have just been fluffing your balls.
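
If you want a scriptable cross-check on top of the OSD, a rough psutil logger like this works (my sketch, not part of Afterburner; note per-core readings are Linux-only, on Windows psutil reports one package-wide frequency, so keep Afterburner/HWiNFO for the real per-core picture):

```python
# Rough clock logger to cross-check Task Manager / Afterburner readings.
# Requires: pip install psutil
# Caveat: per-core frequencies are only available on Linux; on Windows,
# psutil.cpu_freq(percpu=True) returns a single package-wide entry.
import time
import psutil

for _ in range(10):                      # sample once a second for ~10s
    freqs = psutil.cpu_freq(percpu=True)
    print("  ".join(f"{f.current:4.0f}MHz" for f in freqs))
    time.sleep(1)
```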


Cuz under load, especially something as light as a gaming load, your CPU shouldn't be at base clocks, it should be boosting... whether it's PL1 boost or PL2 boost depends on the CPU, but even non-K CPUs in PL1 will boost as high as they can across as many cores as they can while staying within PL1... and gaming isn't intensive enough for the CPU to generate enough heat for PL1 to downclock to like 2GHz... that's ridiculous, man.
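
To make the PL1/PL2 behaviour concrete, here's a deliberately dumbed-down toy model (made-up numbers for a hypothetical 65W chip; real silicon tracks a rolling power average over the turbo window plus thermal/current limits, so treat this as a sketch, not Intel's actual algorithm):

```python
# Toy model of PL1/PL2 boost under a sustained all-core load.
# All numbers are illustrative, not specs for any particular CPU.
PL1, PL2, TAU = 65.0, 117.0, 28.0        # watts, watts, seconds
BASE, PL1_CLK, PL2_CLK = 2.5, 4.0, 4.4   # GHz

def all_core_clock(t: float) -> float:
    """Approximate all-core clock t seconds into a heavy load."""
    if t < TAU:
        return PL2_CLK   # short-term budget: full boost at up to PL2 watts
    return PL1_CLK       # sustained: still boosting, just capped by PL1 watts
    # BASE (2.5GHz) is only the guaranteed floor at worst-case power draw
    # (think heavy AVX); an ordinary gaming load sits far above it.

for t in (0, 10, 27, 60, 600):
    print(f"t={t:>3}s  ~{all_core_clock(t):.1f}GHz")
```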


I've got a 2500K, and a 10100 on an H510 motherboard, so bottom of the barrel right here next to me.
I know how the 2500K works... it doesn't give a shit.
The 10100, which I accidentally bully sometimes by sending it CPU renders, still won't go down to base clock; it boosts but locks its wattage.
And a CPU render is a much, much higher load than any gaming load you could throw at a CPU.

Just checked to make sure.
And yup, it'll boost across all cores in PL1.
[screenshot: all cores boosting within PL1 during the render]

^Trying to render a 10-meter glass ball with "bubbles" in it using bidirectional path tracing... a very, very heavy workload... it just locked itself to PL1 but still boosted to 4.1; I was expecting 4.0.
I had already stopped the render to take the screenshot, cuz it's a shitty CPU.



P.S.
The 2500K was just a dear to me and I could never get rid of it, so it's my media center.
The 10400 is only there to support a GPU as it's part of my mini render farm... sometimes I accidentally use CPU rendering, so the jobs get sent to it.
 
If he has a monitor, he can swap the 13400F for a 13600K and it'd be fine. It's only an extra 50 bucks, bringing the total to $1150 without the monitor.
I disagree. The Ryzen 7600X being on an active socket, AMD having no quality-control issues, and it having all P cores, all of which base clock above 4GHz, make it ideal for Minecraft and Fortnite.

And that's not taking into account that, with settings dialled up, Minecraft Java Edition can heavily load the CPU and the mobo's power draw alongside a discrete GPU. Which is also why a reputable PSU isn't optional with Minecraft.

I mentioned over a year ago killing two 750W EVGA 80+ Bronze PSUs with Minecraft, after replacing a much older 650W Bronze SLI PSU with the EVGA freebie that came as part of the cost of getting a GPU through COVID scarcity. Only to end up spending another £120 on an 850W Fractal Platinum after donating my old SLI one to a build.
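
Back-of-envelope numbers for why those units died (illustrative figures, not measurements from my builds):

```python
# Rough PSU headroom check. Numbers are illustrative, not from a spec sheet.
cpu_peak_w = 180        # CPU package power with Minecraft Java dialled up
gpu_peak_w = 220        # GPU average board power
rest_w     = 75         # mobo, RAM, drives, fans
spike      = 1.8        # modern GPUs transient-spike well above average draw

sustained = cpu_peak_w + gpu_peak_w + rest_w
transient = cpu_peak_w + gpu_peak_w * spike + rest_w
print(f"sustained ~{sustained}W, millisecond spikes ~{transient:.0f}W")
# ~475W sustained but ~650W spikes: a cheap "750W" unit riding near its
# limit trips or degrades, which is why reputable 850W-class units survive.
```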

Looking through that build, most parts are the cheapest options, but with the build being for a first-timer, you really want reputable-brand parts for the PSU, case, NVMe SSD, mobo and RAM for a build without gotchas, IMO. And as only two companies sell CPUs and GPUs, it pretty much needs to be a build of well-known parts.

One of my nephews insisted on a case that was all looks from some nobody at a budget price (£35), and the awkwardness of building and rebuilding to avoid the case's gotchas wasted so much of my time that, valuing that time, it would have been cheaper to buy his entire budget build. And I still needed to buy an extension cable - with another two-day delay - for the mobo's secondary power lead, to route all the PSU cables from the PSU enclosure.
 
That's hilarious you'd suggest that, given I built my first PC with my own dad (an 80286, at about 11 years old), back when preformatting a 10MB HDD took 12 hours - mapping the bad sectors - before formatting.

And rendering isn't a similar workload to gaming, not even close... and I didn't mean "loading", I meant with the game fully loading the system.

Minecraft with settings dialled up, like draw distance and voxel subdivision level settings, can load all 24 threads on my system.
 

Please, not the "been building my whole life" schtick from someone who still failed to set his CPU up correctly so it's NOT sitting at base clocks under load.
We are all experienced builders here, but we also all make mistakes sometimes, especially with newer technologies...


And you are correct, gaming is not similar to rendering.
Rendering is a much, much harder and higher workload on the CPU.
Which is why the voltage on an unlocked system while rendering will be much higher than when gaming.
Gaming is light work; rendering is trying to melt your CPU, as it will use every free cycle it's got. There's no waiting like in gaming, it's straight punishment.


If you don't believe me, then how come every single benchmark points to the opposite of what you are saying?
If I was to hazard a guess... you made a mistake somewhere, cuz your CPUs shouldn't be hanging out at base clocks when they are gaming.

They even test different workloads across different thread counts, and the CPU is constantly boosting:
[chart: boost clock analysis across workloads and thread counts]
 
IMO, you can look at what I used to do: get a used workstation from Craigslist or FB Marketplace, something like that, then drop a cheap GPU into it. Shockingly good value; limited upgrades in the future, but definitely some. I got away with a GTX 980 and an 8000-series 6-core CPU for years. I really didn't have any issues at all. Not the sexiest, and the big towers are super heavy for some reason, but it's cheap and it works. Definitely the easiest way to get into PC gaming.
 
As someone who owns a Ryzen 9900X3D, with a 7800X3D in my other computer, I'd say Ryzen is a stupid buy for budget builds. Especially when the OP is giving his current computer to his son in a few years.
 
For Minecraft and Fortnite (twitch shooters) specifically, it isn't; especially Minecraft, where the game - not the server - bottlenecks on single-core clock and favours P cores. But assuming it can be done in budget as a better foundation, with a 3060 12GB (and WiFi) versus a 5060 8GB, I still think the Ryzen is the better budget option, as the Intel CPU socket is obsolete and will bottleneck any better GPUs, and the GPU options get limited by the lack of a reliable PSU and wattage when going too budget.

The eventual build I did was just over budget by $25, so what's to be gained by not taking the Ryzen option? If the answer is fake frames, then for those games that isn't an advantage.


[screenshot: the build list]
 
This for Fortnite and Minecraft ? You guys are crazy XD
The good brotha may start out playing just Fortnite and Minecraft, but he might get the itch to play something more demanding. And it's never a problem to overbuild if the budget allows. Get the biggest bang for your buck within the intended budget.
 
OP, this is similar to my build, except I have an older GPU, a 3060 Ti, and I just upgraded my CPU from a 3600 to a 5700X and went from 16GB to 32GB.
You will also want to get a few case fans. I got a kit with 5 ARGB fans: 3 in the front pulling air in through the mesh, one up top as exhaust, and one in the back behind the CPU. The CPU uses a tower cooler, which pulls the air from the front through it and out the back.

I've been building PCs since 1999 and upgrading since the 286 days of the 1980s.

People here will say you need top of the line, 4K and a $1k GPU. Nonsense; double nonsense for a kid. My kids played The Sims 4, Garry's Mod and Minecraft (with a server) a decade ago on a prebuilt Dell i3 with an older hand-me-down 1060.


This PC will play everything nearly maxed at 1080p, using a 1080p monitor (IPS looks great, and 1080p means you don't need to spend as much on a GPU).
I get 70-80 fps in Cyberpunk with my 5700X and 3060 Ti with full ray tracing on (with DLSS Balanced).

You do not need 4K, and it's for a kid; they'd be happy with 720p.
You make do with what you can afford.

You can fiddle with the GPU to make it more or less expensive. The CPU/mobo/RAM is AM4, which means it's affordable, and the silicon is mature and at its peak, so stability and overclocking are better. AM5 is newer, but it also means more expensive.

PCPartPicker Part List

CPU: AMD Ryzen 7 5700X 3.4 GHz 8-Core Processor ($138.00 @ Amazon)
CPU Cooler: ID-COOLING SE-214-XT ARGB 68.2 CFM CPU Cooler ($17.98 @ Amazon)
Motherboard: Asus PRIME B550M-A WIFI II Micro ATX AM4 Motherboard ($114.74 @ Amazon)
Memory: G.Skill Ripjaws V 32 GB (2 x 16 GB) DDR4-3200 CL16 Memory ($51.99 @ Newegg)
Storage: Crucial P3 Plus 1 TB M.2-2280 PCIe 4.0 X4 NVME Solid State Drive ($61.95 @ iBUYPOWER)
Video Card: MSI VENTUS 2X PLUS OC GeForce RTX 5060 Ti 16 GB Video Card ($499.99 @ B&H)
Case: Thermaltake Versa H18 MicroATX Mini Tower Case ($54.99 @ Amazon)
Power Supply: EVGA 700 BR 700 W 80+ Bronze Certified ATX Power Supply ($79.98 @ Amazon)
Monitor: Acer Nitro KG271 M3biip 27.0" 1920 x 1080 180 Hz Monitor ($139.99 @ Amazon)
Keyboard: Logitech Wireless Combo MK270 Wireless Standard Keyboard With Optical Mouse ($27.99 @ Best Buy)
Total: $1187.60
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2025-05-29 08:51 EDT-0400
 
..

And you are correct, gaming is not similar to rendering.
Rendering is a much, much harder and higher workload on the CPU.

..
Gaming is an isochronous workload: you don't get a prepass to build an optimal execution plan, the way a transaction-processing monitor or a renderer does.

If that is your assertion, then it's a waste of time discussing this. Games have a render window; miss it and you stall or tear, and a multi-core inefficiency cascade follows. So yes, they don't work multi-core as hard as renderers with optimal execution plans, and games inevitably end up single-core and L1/L2-cache bound in real situations where the CPU and graphics are pushed; synthetic testing won't necessarily reflect that.
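
To put rough numbers on the render-window point (hypothetical per-frame costs, purely to show the Amdahl's-law shape):

```python
# Why games end up single-core bound: a fixed render window plus a serial
# critical path. Hypothetical per-frame costs, for illustration only.
frame_budget_ms = 1000 / 60   # 16.7ms window per frame at 60fps
serial_ms   = 6.0             # single-threaded sim/logic on the critical path
parallel_ms = 24.0            # work that can be spread across cores

for cores in (2, 4, 8, 16):
    frame_ms = serial_ms + parallel_ms / cores
    status = "makes window" if frame_ms <= frame_budget_ms else "stall/tear"
    print(f"{cores:>2} cores: {frame_ms:5.1f}ms/frame  ({status})")
# Past ~4 cores the serial 6ms dominates and the extra cores mostly idle,
# unlike a renderer, whose pre-planned tiles scale with no per-frame deadline.
```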
 
Overkill, and you forgot the monitor, keyboard/mouse and such. I recommended a 5700X, it's $139, plus a 5060 Ti 16GB, 32GB of RAM and a 1080p 27" monitor, all for $1187. Going AM5 is expensive, as it means $200 for a CPU minimum (yours is $300), when AM4 will last at least 5 years, perhaps a decade, and the kid will be old enough to get his own by then.
AM5 also means more expensive RAM and motherboard.

I say get AM4; you can get mobo/RAM/CPU for under $300 (the price of just your build's CPU), then use the extra cash on a better GPU. A 1080p monitor means maxing out almost everything.
I know the common theme here and elsewhere: "what about upgrading?" Why would you need to? An AM4 build is more bang for your buck. By the time it's even relevant, AM5 will be super cheap and AM6 or 7 or Intel whatever will be out.
 
Are you talking CAD like me, or USD? In GBP that CPU is cheap at £150, and AFAIK the OP has a monitor and will have all the extras on hand, and probably isn't going to fret over a Windows 11 Pro license from a CDKeys-type place, etc.

What you built seems good IMO; I'd still take the newer AM5 for the extra 700MHz per core, better caches and better power efficiency, over the lack of gen-on-gen improvement from the 3060 to the 5060 for this use case.
 

None of Intel's CPUs since, like, Sandy Bridge will stop boosting and drop to base clocks when gaming, even when laden with threads.
So the nonsense that a 13400 is too slow because its base clock is 2.5GHz is bullshit.
It'll boost to at least 4GHz.

If your CPUs stopped boosting and went to base clocks when gaming, you were doing something wrong.

I'd love you to post any evidence of this behavior with Intel CPUs, especially the 9th/10th/11th gen CPUs you attest you actually "tested"...
With 12th/13th/14th, don't even bother trying; I've tested way too many of these CPUs, I know that is untrue.





EDIT: And please don't post like Factorio or some shit that is single-threaded, so the other cores/threads are effectively parked.
 
I missed the part about him having a monitor and such; that changes things, as it gives more budget to work with. Yeah, in the States the cheapest AM5 is $200 on sale, down from $299. It's way more money here, but if a monitor isn't a concern then I'd agree going with a better CPU would be fine. If running at 1080p, that 3060 will be plenty and can always be upgraded later down the road.


Is the 5060 really not that much better? I haven't been keeping up, as my 3060 Ti hasn't missed a beat since I got it during COVID (EVGA Step-Up when prices were sky-high; lucked out on that).
 
I guess you didn't read the OP, because socket compatibility is irrelevant. This computer isn't being kept in the long run.

The compromises you made to fit in a 7600X are essentially pointless when there's a 4% difference at most in gaming between a 13600K and a 7600X. Meanwhile, a 13600K is 30% faster in productivity.

OP could just spend an extra 50 to go to a 13600K and still spend less money than your build. You're also more than likely to be GPU-limited, since low-end GPUs will be in use, making it essentially the same. More importantly, the extra cores will come into play for streaming, running background processes, etc. Speaking from experience, if you attempt to run background processes alongside gaming on a 7800X3D, performance suffers a lot. How much more so on a 6-core CPU.

Buying a 3060 at $400 is a big, big scam regardless of the 12GB of VRAM. If VRAM is the primary concern, a shift to the 9060 XT or 7700 XT makes more sense than paying scam prices for a 3060.

All in all, at the low end, Intel offers better value for money. However, if you're looking for a PC you can grow with, then it's worth the extra investment in AM5. Since that growth factor is not in play, paying the AMD tax is just dumb.
 
With high AVX2 usage, which modern games do use, and with what AVX2 BIOS settings - and what entry-level mobo chipset?
 

Mate, use Minecraft or Fortnite so it's relevant to our discussion... hell, use Cyberpunk if you want, as it's heavily multi-threaded.
If you have to work your balls off to trigger this behavior, then I think it's safe to say said behavior is NOT normal, so someone (you) must have made a mistake setting up the PC, as Fortnite and Minecraft shouldn't chug along unless the game is so light that the CPU doesn't even need to boost up.
 
Even on Linux?

Windows out of the box is configured by default to use one process shared across all root Explorer windows - to avoid edge cases with higher-latency async disk IO that can cause issues for servers, etc.

Windows configured for one process per (root) Explorer window alleviates background tasks stalling the primary thread of a game or foreground application, as has been the case since NT 3.51, IIRC.
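
For reference, that's the Folder Options > View > "Launch folder windows in a separate process" toggle; if you'd rather flip it from a script, this is the registry value behind it (a sketch - Explorer needs restarting to pick it up):

```python
# Flip Explorer's "launch folder windows in a separate process" option
# (the same switch as Folder Options > View). Restart Explorer to apply.
import winreg

path = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, path, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "SeparateProcess", 0, winreg.REG_DWORD, 1)
```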

The 3060 still has DLSS 3/4, so unless it's an RX 9070, I'm not sure AMD on such a budget warrants going lower than that.
I still stick with the general rule that the CPU and mobo should be the most expensive components other than the GPU, and that combined, the CPU/mobo/RAM should cost no less than the GPU on a mid-range or lower build. But I hear what you are saying; it's certainly an equally viable route, although for the Minecraft Java game client I would still go P cores all the way.
 

So the general rule for budget builds is that the most expensive parts in the PC should be... the most expensive parts in a PC?
What else is someone doing a budget build gonna spend money on after the GPU, CPU and mobo... 256GB of RAM?

 

Java and Bedrock can't play together.
They can, but it requires more effort than just running up a basic Java server, and it typically only works easily in a LAN setup.

I was able to get the Switch version to connect to my daughter's local RPi-served world, but it was a ball ache. It only works by proxying old services that the PaperMC etc. server cfg files redirect, and it still needed the WiFi settings on the Switch altered specifically too - and the Switch was then only suited to playing Minecraft until I reverted the settings - IIRC.
 