Interested in training your own AI models? Want to speed up inference on a dual-GPU, locally hosted AI home server? This video is a must-watch. You can run A LOT of containers against a small fleet of GPUs like this. I cover the complete hardware build and drop a ton of mounting, cooling, PCIe, and GPU tips and tricks for this EPYC server sporting 512GB of DDR4. Ollama makes use of multiple GPUs automatically, so I am building a monster rig that we will be testing out. 👇 All Parts Used Linked Below 👇
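If you want to sanity-check the multi-GPU setup yourself, here is a minimal sketch (assuming NVIDIA drivers and Ollama are already installed; device indices are examples for a four-card rig):

```shell
# List every GPU the driver can see - all four 3090s should show up here
nvidia-smi --query-gpu=index,name,memory.total --format=csv

# Ollama spreads model layers across available GPUs automatically;
# CUDA_VISIBLE_DEVICES can restrict which cards it uses if needed
CUDA_VISIBLE_DEVICES=0,1,2,3 ollama serve

# In another terminal, run a model and watch VRAM fill across the cards
ollama run llama3
watch -n 1 nvidia-smi
```

These are hardware-dependent commands, so exact output will vary with your driver version and model choice.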
Written piece and GPU Rack Modification Instructions
[ Link ]
GPU Rack Frame [ Link ]
Gigabyte MZ32-AR0 Motherboard [ Link ]
RTX 3090 24GB GPU (x4) [ Link ]
Kritical Thermal GPU Pads [ Link ]
256GB (8x32GB) DDR4 2400 RAM [ Link ]
PCIe4 Risers (x4) [ Link ]
AMD EPYC 7702p [ Link ]
iCUE H170i ELITE CAPELLIX [ Link ]
(sTRX4 fits SP3, and the retention kit comes with the CAPELLIX)
ARCTIC MX4 Thermal Paste [ Link ]
CORSAIR HX1500i PSU [ Link ]
4i SFF-8654 to 4i SFF-8654 (x4) [ Link ]
HDD Rack Screws for Fans [ Link ]
Be sure to 👍✅Subscribe✅👍 for more content like this!
Join this channel [ Link ]
Thanks for watching!
Digital Spaceport Website
🌐 [ Link ]
🛒Shop (Channel members get a 3% or 5% discount)
Check out [ Link ] for great deals on hardware and merch.
*****
As an Amazon Associate I earn from qualifying purchases.
When you click on links to various merchants on this site and make a purchase, this site can earn a commission. Affiliate programs and affiliations include, but are not limited to, the eBay Partner Network.
Other Merchant Affiliate Partners for this site include, but are not limited to, Newegg and Best Buy. I earn a commission if you click on links and make a purchase from the merchant.
*****
0:00 Intro
1:09 Which motherboard
5:49 GPU rack frame
6:59 Which power supply
10:16 Water cooling
12:08 Wattage vs other servers
13:43 How much did it cost
20:37 Conclusions