What’s it like to try to build your own deep learning workstation? Is it worth it in terms of money, effort, and maintenance? And once it’s built, what’s the best way to utilize it? Chris and Daniel dig into these questions today as they discuss Daniel’s recent workstation build. He built a workstation for his NLP and speech work with two GPUs, and it has been serving him well (minus a few things he would change if he did it again).
Daniel’s workstation components:
- CPU – AMD YD292XA8AFWOF Ryzen Threadripper 2920X
- CPU cooler – Noctua NH-U12S TR4-SP3, Premium-Grade CPU Cooler for AMD sTRX4/TR4/SP3
- Motherboard – GIGABYTE X399 AORUS PRO
- Memory – Corsair Vengeance LPX 16GB modules (two 2-packs, four modules in total), 64GB total
- Storage 1 – Samsung (MZ-V7S1T0B/AM) 970 EVO Plus SSD 1TB
- GPU 1 – RTX 2080 Ti
- GPU 2 – Titan RTX
- Case – Lian Li PC-O11AIR
- Power Supply – Rosewill Hercules
- Case fan(s) – Cooler Master 8mm
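As a quick sanity check on the build above, a short sketch can tally the memory the parts list implies. This is illustrative, not from the episode; the VRAM figures are the cards' published specs (the RTX 2080 Ti ships with 11 GB GDDR6, the Titan RTX with 24 GB).

```python
# Illustrative tally of the workstation's memory capacities.
# RAM: four Corsair Vengeance LPX 16GB modules (two 2-packs).
ram_modules_gb = [16, 16, 16, 16]

# VRAM per card, from NVIDIA's published specifications.
gpu_vram_gb = {"RTX 2080 Ti": 11, "Titan RTX": 24}

total_ram = sum(ram_modules_gb)         # system memory in GB
total_vram = sum(gpu_vram_gb.values())  # combined GPU memory in GB

print(f"System RAM: {total_ram} GB")
print(f"Combined VRAM: {total_vram} GB")
```

Note that the two cards have different VRAM sizes, which matters in practice: naive data-parallel training across both GPUs is limited by the smaller card's 11 GB.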