The 5-year Cycle - PC Build Winter 2022

Matt2000

An Unfortunate Discovery
DONOR
Joined
Feb 17, 2006
Messages
17,828
Location
Four from the top and two from the third row, UK.
Car(s)
'12 MX-5 PRHT, '02 Freelander, '90 Disco 1 Bobtail
Thought I may as well put this all in one thread instead of spreading it around, as things are progressing more quickly than I thought.

It's now almost 5 years since I built this 8th gen. i7 PC, and I've only really had one issue in that time: the AIO. In 2019 it decided to leak (the night before the roadtrip), fortunately not causing any damage to the components. Deepcool replaced the AIO with a newer 'leak proof' model, but after less than 2 years the RGB in the pump lost its mind. Needless to say, I won't be buying one of their coolers again.

So as a reminder, the specs of this machine are:

Corsair Crystal 570X case
Corsair HX750i PSU
ASUS ROG Maximus X HERO
Intel Core i7-8700K
DeepCool/GamerStorm Captain 240 Pro 240mm AIO cooler
32GB G.Skill TridentZ RGB DDR4 3466MHz
1TB Samsung 970 EVO NVMe M.2 SSD
Nvidia GeForce RTX 3080 (purchased in 2021)
Deepcool RGB front fans

The only other issue is with the ASUS Aura Sync RGB, it works but it keeps resetting itself to red. So much so that for the last few months I've just left it red. I'm moving away from that in the hope that an alternative system is better. Could be a disaster.

The reasons for the upgrade: I want to get the best out of the GPU now that I have a 120Hz 4K display, the old i5-4690K server is struggling with the tasks I give it, and my parents are still using a Core 2 Duo with Windows 7 that sorely needs replacing. They will get the i5 everything, the 8th gen. i7 will be moved into the server case (and will have the RAID/SCSI controllers shoved up it), and the new parts will go in my existing case. I quite liked the idea of a new case, but I wouldn't have much use for the 570X (it's way too big to be used for my parents' machine) and the new case I like is £385. Ouch.

So the new spec:

Corsair HX1000i PSU
ASUS ROG-Strix Z690-F Gaming
Intel Core i7-12700K
Thermalright deflection plate as recommended by GamersNexus (because why not?)
Corsair H150i Elite LCD 360mm AIO cooler
32GB Corsair Vengeance RGB DDR5 5600MHz RAM - to be later upgraded to 64GB
2TB Samsung 980 Pro NVMe M.2 SSD
GPU and case re-used.
New Windows 11 key.

I planned this for a Christmas break build and it still may be, I was also waiting for the 13th gen. i7s and Z790 boards to come out but the prices are way too high for me. I kicked things off earlier than expected by getting the SSD with £80 cashback and the PSU with 12.5% off.



Then in the last week I collected a load more parts. The motherboard was purchased when it was at its lowest ever price of £290 (still a lot more than I paid for the Maximus X Hero, but way less than a Z790), the RGB RAM came from Corsair with a 10% discount code I was given when I bought the PSU, and the CPU was bought at its lowest price on Amazon. I could've waited, but the bonus Amazon reward points on my credit card end in a week.

The CPU frame also came from Amazon. It was £15, but there's no way I was waiting for AliExpress. They only had the black one, which is perfect for me.



When I ordered the RAM, I also got some case parts. I got a spare triple fan/360mm radiator mount plate to go at the top as per this Corsair forum post, and a closing plate for the PSU cover that I didn't even know existed.

570X_Shroud_Installed.jpg
570X_Shroud_InBox.jpg


I don't like the way it provides a flat wall rather than guiding the air from the bottom fan neatly up to the motherboard and GPU, so I may 3D print a ramp to channel the air up above the cover. However, this will at least give me something to mount it to, and it was cheap.

I just need the cooler now, like I said there was no intention to get everything so soon. I'll reinstall the original Corsair RGB fans if they connect up to the Commander Core controller I have, otherwise I'll keep these ones and plug them into the motherboard. The iCue software still allows syncing with ASUS Aura for the motherboard and GPU, I just hope it's more reliable.

As far as Windows goes, I have a new W11 key and a Pro ISO downloaded from the Volume Licensing Service Center (there is an advantage to working in IT sometimes); all the other machines will keep their installations, as the hardware is moving as a whole. The Windows 7 machine will stay exactly as it is with its legit W7 Pro copy and will go in the attic for later use. I've thrown away too many computers, and I miss my old Core 2 Quad Q9550.

Next update will be when I buy the cooler at a good price I suppose.
 
Last edited:
I've only really had one issue in that time and that's with the AIO. In 2019 it decided to leak (the night before the roadtrip), fortunately not causing any damage to the components. Deepcool replaced the AIO with a newer 'leak proof' model but after less than 2 years the RGB in the pump lost its mind. Needless to say, I won't be buying one of their coolers again.
When I was speccing out 5150 III I thought about going with an AIO, but knowing that it would lose fluid over time and the pump would eventually die, plus the risk of it developing a leak, was too much for my paranoid mind; that's why I went with a Noctua.

Are you going to try using liquid metal or stick with MX5 or some such?
 
When I was speccing out 5150 III I thought about going with an AIO, but knowing that it would lose fluid over time and the pump would eventually die, plus the risk of it developing a leak, was too much for my paranoid mind; that's why I went with a Noctua.

Are you going to try using liquid metal or stick with MX5 or some such?
I've never really felt the worry about it leaking, before or after the actual leak! If anything the leak showed me that the coolant doesn't come flooding out and that it theoretically doesn't cause any damage if it gets on components. Obviously a loss in cooling and subsequent overheating is bad but the system is designed to safely handle such a situation in extreme cases and my CPU wasn't damaged either. I just wouldn't recommend Deepcool/Gamerstorm.

I looked into liquid metal but at this level I don't think it's worth it over premium thermal compound, I've never made a mistake building a PC before and the chance of making a mess and shorting stuff out with liquid metal seems way too high. It's like de-lidding, worth it if you're serious and constantly requiring the maximum but not worth the risk if you aren't. I won't be. Hopefully the CPU frame will make a nice difference.
 
And if your AIO's pump should die, it's not very likely to instantly kill your hardware. Everything is packed with sensors and safety limits these days.
Without looking up the expected running hours of a pump in an AIO versus the average case fan, many air coolers (especially the ones boxed with a CPU) aren't capable of passively cooling a CPU either, so if their fan dies the situation wouldn't be that different.
 
I had an AIO pump die and it just shut down.
 
Water should be used to heat homes and exhaust heat from cars. My opinion: keep water away from my PC hardware.

These days a decent air cooler is maintenance free, which is perfect.
 
Shuffle Act I

I had some annual leave to use up, so I took a 5-day weekend last week and got this all sorted. I've kind of been putting off posting about it, but here we go. The motherboard shuffle starts with my existing server machine, the old Z97 PRO GAMER and i5-4690K.

Here is what I'm starting with:


This is actually the second Z97 PRO GAMER I've had because the first one died. It was cheaper and easier to just buy another of the same, I may have been using onboard RAID at the time and that's what made me switch to Storage Spaces.


In its new case. This is a much cheaper Fractal Design Core 2500 case but it's fine for its new purpose.


This is replacing an old Core 2 Duo Windows 7 machine that my parents were still using. There wasn't much wrong with that machine as such, although I found out when replacing it that one of the fans had been making a screeching noise for a week and they didn't bother to tell me. Windows 7 is obviously outdated now and it was always noisy, plus I'd quite like to put it into storage before it gets worked to death.

An oddity with this case is its SSD placement. As you can see in the picture above, there is no SSD in any of the bays. You can't see the M.2 slot but there isn't one fitted to there either. It's actually mounted to the back of the main plate, in the same position as the PSU. As such, the PSU completely covers the fitting screws and it isn't quick release. What an odd design. If I ever take it out, I will just stick it in a bay.

If it looks like the cooler will be a close fit in the picture above, that's because it is! The case advertised itself as accepting a certain sized cooler and the cooler measured just short of that, when the case was open it looked like it wouldn't fit but it does. Just.



I wasn't sure if the case fans would be noisy, but they aren't. They're probably quieter than the BeQuiet! fans I have in the server case. It's now completely silent and very quick, unburdened with the Dell RAID controller and all of the software I once had on there.

Next up, moving the motherboard from my main gaming PC to the server case, complete with 240mm AIO attached.
 
Shuffle Act II

Time to move the innards from my then current gaming PC to my server. This i7 is probably overkill for a server machine, but having used it for a week I can say that it makes a massive difference for stuff like video capture. The old i5 was maxing out the processor but this machine is barely using 15% of the CPU.

Not many photos really, I just swapped the parts and they worked. Moving the radiator while the pump was installed wasn't really an issue.

I did take a photo pointing out which SATA port was the hot swap for when I rebuild it in the server case.


Cards reinstalled. I had to drop the LTO4 tape drive and SCSI card because the drive didn't physically fit with the radiator, I didn't really use it much as it was so loud. If this machine ever gets squirreled away I can use it again.


The RGB is still on but just the basic hardware colours, I'm glad I don't have to see it flickering though...

Now I can finally get to the new build.
 
Act III - New Build

OK so finally the new build. Just a recap of what I bought:

Corsair HX1000i PSU
ASUS ROG-Strix Z690-F Gaming motherboard
Intel Core i7-12700K
Thermalright deflection plate
Corsair H150i Elite LCD 360mm AIO cooler
32GB Corsair Vengeance RGB DDR5 5600MHz RAM - to be later upgraded to 64GB
2TB Samsung 980 Pro NVMe M.2 SSD

I actually started this before doing anything else, as I wanted to get the CPU frame fitting out of the way while I had full concentration. This is the Thermalright frame and I followed the instructions from the Gamers Nexus video.

Stock mount was removed and CPU installed...


...then the frame fitted carefully. I've had no issues with it so I guess I did it right.


I also fitted the SSD, and had to fit the PSU earlier than planned: it made sense to build my parents' PC with the old HX750i and put the HX1000i into this machine while it still had the old components. Then I found out that the ATX connection didn't match and the new PSU was supposed to be used with Type 4 cables only, so I changed the lot. Not too bad.

Next was to fit the new radiator and fans. The previous 240mm radiator was fitted at the top as shown above in the new case, so I wanted to try this first. I had bought a second triple fan mounting plate for the Corsair case and it fitted.


However, with the motherboard installed the top heatsinks didn't provide enough space for the fans. It looks like I probably wouldn't have been able to fit any radiator and fans up there.

I fitted the rad to the front instead and fitted the other three fans in this configuration.


The coolant pump has to be mounted that way up as the screen can only fit one way and the firmware doesn't yet allow every display option to be rotated. It means that the hoses are kind of in the way of the screen...

Everything installed, the first test startup went fine.


The top and rear fans are the old Corsair ones I got with the case, replacing the Aura controlled GamerStorm fans. I thought they would work fine with the iCue Commander Core that came with the AIO, but they don't. They're 3-pin so can't be speed controlled by this and the RGB isn't compatible either. I immediately ordered a 3-pack of the ML120 fans that come with the AIO.

I tried the XMP profile in the motherboard to get the most out of the 5600MHz DDR5 memory, but it didn't work. I've left it running at 4000MHz for now, which is plenty fast enough, and I'll look at it again later.
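For a sense of what running at 4000MHz instead of the rated 5600MHz gives up in theory, here's a back-of-the-envelope sketch using the usual peak-bandwidth formula (transfer rate x 8 bytes per 64-bit channel x 2 channels); real-world throughput is always lower than this.

```python
# Theoretical peak bandwidth for dual-channel DDR5 (decimal GB/s).
# Peak = transfers/s x 8 bytes per 64-bit channel x number of channels.

def ddr_peak_gb_s(mt_per_s: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    return mt_per_s * bytes_per_transfer * channels / 1000

xmp = ddr_peak_gb_s(5600)       # rated XMP speed
fallback = ddr_peak_gb_s(4000)  # speed it's actually running at
print(f"5600 MT/s: {xmp:.1f} GB/s, 4000 MT/s: {fallback:.1f} GB/s")
```

So the fallback speed costs roughly 25 GB/s of theoretical peak, which is why "plenty fast enough for now" is a reasonable call.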

Cable management was ignored for a day while I waited for the new fans, but I could already tell it was going to be fun. The iCue controller is in the top left, in an unused 3.5" HDD bay.


New fans arrived and were fitted in place of the old Corsair ones. I immediately noticed an issue, the fans weren't recessed like the old ones and got very close to the coolant hoses.


I found some spare plastic and made a little protection plate to keep them safe, I highly doubt that this would make any noticeable difference to cooling but I can drill holes in it if I have to.


With that done I finally tidied up the cables. It was a tight squeeze and the vertical cover isn't actually fully closed, but it's all in.


All done and back together. I still really like the look of this case. The sync between iCue and Aura works well for the motherboard, GPU and front logo RGB. I removed the LED from the side Corsair logo and will look for a way to connect it up to Aura.


As you can see, I made good use of such an advanced CPU cooler.

View: https://www.youtube.com/watch?v=miXw0szDrhk
Today I ordered another 32GB of RAM as the price was good; I'd rather do it now than try to buy the same RAM in 6 months' time.
 
Shuffle Act I




I wasn't sure if the case fans would be noisy, but they aren't. They're probably quieter than the BeQuiet! fans I have in the server case. It's now completely silent and very quick, unburdened with the Dell RAID controller and all of the software I once had on there.

Next up, moving the motherboard from my main gaming PC to the server case, complete with 240mm AIO attached.
When I glanced at this, I was wondering where the other half of the cooler was ;)
 
Looks better with four RAM modules!



I was a little concerned when I installed it, as it didn't boot at first; I just got a blank screen. The DRAM LED on the board was orange, indicating that was the hold-up. I tapped the spacebar and it booted up while I wasn't looking. I think it was showing the memory amount change message on the internal GPU HDMI port, as I had the multi-monitor iGPU setting turned off at the time; I've since changed that setting. It works fine anyway.
 
Hard to believe it has been 8 months already. I'm pleased to say that this has been a very reliable beast and it was worth every penny. The increase in performance has been very welcome, as has the increase in performance of my server and parents' machines by repurposing my old desktop and server machines respectively.

Two main sources of frustration - Windows 11 and Corsair iCue.

W11 is fine most of the time but has its problems. Firstly, the Start menu design is still not great: I like the folders with more apps in them, but why can't I make the whole thing bigger like I could in W10? I'm on a damn 4K screen. Why do they only show 4 apps on the folder preview, and why can't I make the icons bigger? Secondly, ever since building it I've been having issues with dragging and dropping. I'll drag something to an application and it won't drop; it stays stuck to the cursor until I hit escape and try again. I'm sure it isn't the mouse, as I didn't have this happen before.

Corsair iCue still crashes. Thankfully the whole machine hasn't frozen for a long time (fingers crossed) but it'll just crash for no reason and the fans/pump go to default noisy mode. Oddly, the most recent upgrade changed the pump speed profile so 'quiet' is now louder.

Oh yes, my 2TB Samsung 980 Pro was caught up in the firmware problems early this year, having been supplied with the bad firmware. Thankfully there was enough media coverage for me to notice and get it updated.

Hardware wise I've had one Samsung SSD die on me - my most recent SATA SSD - a 2TB 870 EVO. This was replaced under warranty after a slightly arduous process that involved a paid call to Amsterdam before I could submit an RMA request. The process took about 10 days overall, which is good.

I've also added one more SSD, the new 4TB Corsair MP600 CORE XT M.2 NVMe. This uses another of the four M.2 slots on this board, and although it isn't clear in the manual whether they're all Gen4, this single slot above the double slots is. The MP600 outperforms the 980 PRO in a sequential read/write benchmark, which I wasn't expecting.

Anyway, the real reason I posted this is because I've finally found the real best use for the LCD screen on the coolant pump.

View: https://youtu.be/fTBYIeBH2bM
 
This uses another of the four M.2 slots on this board, and although it isn't clear in the manual whether they're all Gen4, this single slot above the double slots is.
According to the tech specs all of them are indeed PCIe 4.0 x4, and the user manual which I downloaded clearly states that as well:

1689354296608.png
 
It certainly does; I'm not sure what I was reading, but I know I was in a rush to check as I was at work. I'm pretty sure I was confusing the M.2 sockets with the PCIe G5 and G3 slots. I knew the Corsair SSD was backwards compatible so I didn't go looking much further, and it says Gen4 on the heatsink.

Sequential read is over 6GB/s so that'll do nicely!
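As a sanity check on that figure: a Gen4 x4 link tops out just under 8 GB/s once the 128b/130b encoding overhead is taken off, so 6+ GB/s sequential reads sit comfortably within spec. A rough sketch of the arithmetic:

```python
# Theoretical one-way bandwidth of a PCIe link in GB/s (decimal).
# PCIe 4.0 runs at 16 GT/s per lane with 128b/130b encoding.

def pcie_peak_gb_s(gt_per_s: float, lanes: int, encoding: float = 128 / 130) -> float:
    return gt_per_s * lanes * encoding / 8  # bits -> bytes

gen4_x4 = pcie_peak_gb_s(16.0, 4)  # ceiling for a Gen4 x4 M.2 slot
print(f"Gen4 x4 ceiling: {gen4_x4:.2f} GB/s")  # ~7.88 GB/s
```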
 
Paranoid that having one of the SSDs stuck on the case with 3M VHB tape contributed to its failure, I decided to sort out the SSD mounting situation in the case. Sounds easy. This is how they looked before, the 870 2TB being held on with VHB tape:



I don't know why Corsair chose to use these big plastic 2.5" mounts on this model of case, they're pretty terrible. The mounting system is the same as the more modern cases, so I found that I could use the simple mounts from the 1000D and a few other cases. They're made for current cases, so I'll just order them online.

1693090632417.png

Corsair's online shop didn't have enough and neither did Scan, my usual supplier. I can't think why this is such a rare part but apparently it is. I ordered 3 and they were put on back order, eventually arriving at the start of the month. The SSDs do need to be screwed to these but screws aren't included. Fortunately I have lots of screws and finding 12 wasn't hard, it just seems dumb. I needed to change the SATA cable for the 870 2TB as it was originally a 90 degree one, and it wouldn't fit with the SSD in the new position. The final result is much neater though.



Now I'm really tempted to replace that spinning rust with an SSD too. It's a 6TB WD Blue and it's OK at its job, which is as storage for older photos and videos. As it's rarely used it will spin down and then it just takes time to spin back up again when accessing certain software. I'd replace it with an 8TB Samsung 870 QVO, which will get me an extra 2TB of space along with faster access and no HDD noise. Expensive, but I think it would be worth it.

1693091117701.png


Given the money, I'd put these in my server too so I no longer need to have mechanical disks in there. Maybe I can get a bulk discount for ten. :hmm:
 
TL;DR - I've bought two more 4TB Corsair MP600 CORE XT SSDs to replace the 6TB HDD.

I was looking at the SSD options last night as the idea becomes more realistic in my head, I think there's actually a better option than buying the 8TB Samsung.

The best trustworthy price I can find for it is £369.99 on Amazon, but SSD prices are a bit all over the place at the moment and for less than that I can get two 4TB Corsair MP600 CORE XT M.2 SSDs, the same one I bought in July. In barely a month it has dropped in price so it's now cheaper than the Prime Day sale price.
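Put in price-per-TB terms the comparison is straightforward. The £369.99 Samsung figure is from above; the per-unit MP600 price in the sketch below is a placeholder to illustrate the maths, not the price actually paid.

```python
# Cost-per-TB comparison: one 8TB drive vs two 4TB drives.

def price_per_tb(price_gbp: float, capacity_tb: float) -> float:
    return price_gbp / capacity_tb

samsung_8tb = price_per_tb(369.99, 8)  # 8TB 870 QVO price from the post
# Hypothetical per-unit price for the 4TB MP600 CORE XT (placeholder figure).
corsair_pair = price_per_tb(2 * 170.00, 2 * 4)
print(f"8TB 870 QVO: £{samsung_8tb:.2f}/TB, 2x 4TB MP600: £{corsair_pair:.2f}/TB")
```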

I ended up going fully down the rabbit hole. Initially I had made this same post earlier (I deleted it) assuming that I could just get a 2.5" enclosure to house two of these NVMe SSDs and run them on a SATA interface. It turns out I'd never looked that up before, and it isn't possible; it can only be done with SATA M.2 SSDs, and I don't want to be limited by those. There are some dual M.2 enclosures that use a U.2 interface, but that would require an add-on card to provide the connection and it would be messy. There is also the Sabrent EC-T3DN dual M.2 to Thunderbolt 3 external enclosure, which is a good solution but is very expensive.

I stopped and looked at the logical disks I have. A 1TB secondary photos disk with 350GB free space and a 5TB videos secondary disk with 2.5TB of free space. On my 1TB 850 EVO, a primary photos disk with 150GB free space. My motherboard has space for another two M.2 NVMe SSDs, so if I buy two more 4TB Corsair NVMe disks I can cover the uses here without having to worry about logical disks. I can combine my photos onto one SSD, get more storage and a big performance increase all in one go. With the 4TB disk I have as my primary video storage, 4TB for secondary video storage should be plenty for now. So that's what I've done, ordered two more Corsair SSDs to rid this machine of spinning rust entirely. The irony is that I will be removing one of the 2.5" SSDs, after sorting out the mounting situation.
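A quick check that the consolidation plan fits, using the free-space figures above (used space = capacity minus free, decimal units; the used amounts are approximate):

```python
# Used space per logical disk, derived from the free-space figures quoted (TB).
used_tb = {
    "secondary_photos": 1.0 - 0.35,  # 1TB disk, 350GB free
    "primary_photos": 1.0 - 0.15,    # 1TB 850 EVO, 150GB free
    "secondary_videos": 5.0 - 2.5,   # 5TB disk, 2.5TB free
}

photos_total = used_tb["secondary_photos"] + used_tb["primary_photos"]

# Both photo disks combine onto one 4TB SSD; videos go on the other.
assert photos_total <= 4.0
assert used_tb["secondary_videos"] <= 4.0
print(f"Combined photos: {photos_total:.2f} TB, videos: {used_tb['secondary_videos']:.2f} TB")
```

Around 1.5TB of photos and 2.5TB of videos, so both land on 4TB drives with plenty of headroom for growth.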

For the server, this becomes a bit more complicated. I could still buy 8TB Samsung QVOs, but NVMe disks would be better. There are PCIe M.2 cards that hold up to four M.2 SSDs, so one of those could become a good option in the future. I'm currently using one of the two PCIe 3.0 x16 slots on the old Maximus X board in that machine, but it just has a couple of 128GB M.2 SSDs on it; if I put the 1TB 850 EVO I'm removing from the main PC in there, I can retire those and will have two PCIe 3.0 x16 slots free.

As I didn't do it last time, here are the benchmark comparisons for the 2TB 980 PRO and the 4TB MP600 CORE XT. Interesting results.

Samsung 980 PRO 2TB
CrystalDiskMark_20230827181121.png


Corsair MP600 CORE XT 4TB
CrystalDiskMark_20230827180621.png
 
Be careful with those cards: at least some of them need a feature called bifurcation, so you'd have to check whether the board/slot you're planning on using supports it.
Thanks, I saw that on some of the products but I don't know anything about it. I may find that it's just easier to accept the SATA speeds and slight loss in value for the server, it would still be a significant improvement over the mechanical disks.
 