For many years I have been running a NAS at home. It all started with a Synology DS209+ on the desk around 2010. Then an HP MicroServer N40L popped up in the garage; for this one, you can check this site. Then, to increase capacity, I added a more recent HP MicroServer Gen10 based on the AMD Opteron(tm) X3216 APU. Unfortunately, this one suffers from poor heat dissipation, so I had to add a fan in front of it. In addition, performance became bad when running an Ubuntu VM to host some containers, even with enough memory. Time for v3 had arrived!
My next-gen NAS had to meet the following criteria:
small footprint, ideally to fit in a 12U rack
minimum 4 magnetic 3.5" disks and 2 2.5" SSDs
ideally 2 half-height PCIe slots available (1 for a SAS adapter, 1 for a 10G network card)
good performance: 4-core CPU minimum, up to 32 GB RAM
cheap :)
After googling for some time and checking many forums, I couldn't find a nice enclosure that matched all those criteria. Until I came across a full DIY solution: a 3D-printed enclosure!!!
My holy grail can be found here
Having a 3D printer, I decided to go for this box, which looked incredible and met or exceeded most of my criteria.
After some discussions I went for the following changes: a 0.6 mm nozzle and 0.3 mm layers to speed up the printing. This was somewhat against the author's recommendation, but I wanted to try, and in the end I was very happy with the result without needing to sand it. The quality of the Prusa MK3S is really great. I adapted some print/filament parameters, picked the recommended PLA 3D 850, and here we go!
As an indication, print time for the drive chamber went from 49 to 25 hours.
I used:
PLA 3D 850 for the drive and power supply chambers and the top
Prusament PETG red carmine for the fan grills
Prusament PETG black for the door, hinges, fan holder, SFX power supply holder
ICE white PLA for motherboard support, SSD holder
ICE flex black for drive mounting system
The 80-page document that comes with the .stl files is absolutely great! Many tips and indications to ensure success. I didn't fail a single piece, except for some trouble with my old flex filament, which had absorbed a bit of humidity. I was truly amazed by the engineering packed into this design, from the butterfly pins to the optional magnets to close the front door.
The door and grills were printed on a PEI sheet, giving a very nice surface finish.
Now that the enclosure was ready, I had to "fill" it. For a reasonable TDP, I decided to go for a simple AMD setup based on an AM4 B450 Mini-ITX motherboard, as the enclosure supports this motherboard form factor. Only one drawback: there is just one PCI Express slot.
Compute components:
CPU: AMD Ryzen 3 2200G Wraith Stealth Edition (3.5 GHz)
Motherboard: Gigabyte B450 I AORUS PRO WIFI
Memory: Corsair Vengeance LPX Series Low Profile 32 GB (2x 16 GB) DDR4 3200 MHz CL16
Power supply: be quiet! SFX Power 2 400W + 8 pin/4 pin ATX adapter
Fans: 3* Noctua NF-B9 redux
Power button, front USB ports, fan splitter cable, etc.
Storage components:
OEM LSI SAS 9211-8I LSISAS2008-IT
ZeusRAM SAS 8G
Seagate 128 GB SAS SSD
Seagate 500GB SAS HDD
OCZ Vertex 60 GB for the FreeNAS OS
Using iozone, which is installed by default with FreeNAS, I ran some tests in different setups to see the impact of stripe versus RaidZ2, with or without an SSD cache, and with or without a ZIL device.
Of course, stripe mode is not recommended for "production" or personal data storage, so I focused the tests on RaidZ2, which offers good performance while providing a high level of data protection (remember, RAID is not backup...).
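To make the stripe-versus-RaidZ2 trade-off concrete, here is a minimal capacity sketch (my own illustration, not taken from the benchmarks), assuming a single vdev built from five 1 TB drives like the ST1000NM0001 units listed further down; it deliberately ignores ZFS metadata and padding overhead:

```python
# Simplified usable-capacity model for a single ZFS vdev.
# parity = 0 models a stripe, 1 models RaidZ1, 2 models RaidZ2.
# Assumption: metadata/padding overhead is ignored.
def usable_tb(disks: int, disk_tb: float, parity: int) -> float:
    if disks <= parity:
        raise ValueError("need more disks than parity devices")
    return (disks - parity) * disk_tb

print(usable_tb(5, 1.0, 0))  # stripe: 5.0 TB, but any single failure loses the pool
print(usable_tb(5, 1.0, 2))  # RaidZ2: 3.0 TB, survives any two disk failures
```

RaidZ2 gives up two disks' worth of space in exchange for surviving two simultaneous drive failures.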
The following devices were present during the performance tests:
da0: <SEAGATE ST200FM0002 0003> Fixed Direct Access SPC-3 SCSI device
da5: <SEAGATE ST1000NM0001 0002> Fixed Direct Access SPC-4 SCSI device
da1: <SEAGATE ST1000NM0001 0002> Fixed Direct Access SPC-4 SCSI device
da4: <SEAGATE ST1000NM0001 0002> Fixed Direct Access SPC-4 SCSI device
da6: <SEAGATE ST1000NM0001 0002> Fixed Direct Access SPC-4 SCSI device
da3: <SEAGATE ST1000NM0001 0002> Fixed Direct Access SPC-4 SCSI device
da2: <STEC ZeusRAM C025> Fixed Direct Access SPC-4 SCSI device
Tests were done with 32 GB of RAM and a minimum of services running.
Four test setups were compared to see the benefits of the SSD/ZeusRAM and find the best compromise.
iozone was run locally in IOPS mode to avoid any networking bottleneck.
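The original runs aren't shown with their exact command line, so the sketch below assembles a plausible local, IOPS-oriented invocation from standard iozone flags (-a automatic mode, -O report operations per second, -g maximum file size, -q maximum record size); treat the sizes as assumptions:

```python
# Sketch: build an iozone command line for a local, IOPS-oriented run.
# -a: automatic mode (sweeps file and record sizes)
# -O: report results in operations per second instead of kB/s
# -g / -q: cap the file size and record size of the sweep (assumed values)
def iozone_cmd(max_file: str = "512m", max_rec: str = "16384k") -> list[str]:
    return ["iozone", "-a", "-O", "-g", max_file, "-q", max_rec]

print(" ".join(iozone_cmd()))  # -> iozone -a -O -g 512m -q 16384k
```

Running locally on the pool's mount point keeps the network stack out of the measurement entirely.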
Long story short: the SSD doesn't add much benefit as long as there is enough RAM available, while the ZeusRAM adds significant performance in both read and write tests.
The pictures below show the IOPS for different file sizes (up to 524 MB) and different record sizes (4 KB to 16384 KB).
performance relies only on processor cache and memory cache
slight improvement for medium files
smoother and more consistent results
In all those cases the CPU didn't exceed 15%, which leaves headroom for additional tasks like running VMs.
The SSD will be replaced by a magnetic disk, considering the weak improvement.
The 48 graphs coming out of the iozone tests all have the same shape, showing consistent behavior without hitting CPU/memory limits. As soon as the ZeusRAM disk is used, the graphs look much smoother.
All results are below: