Last week I ordered a Synology DS1815+ for the home lab. All the pieces arrived this past Friday. A good bit of my weekend was spent rebuilding the home lab to include the new Synology and vSphere 6.
My lab has been running for a couple of years, and I have tested a bunch of parts and pieces of different technologies – VMware Horizon, Citrix XenDesktop, etc. Last week I upgraded my VCSA to vSphere 6 and had originally planned to just upgrade my hosts and drive on; instead, I decided to do a complete rebuild.
Here is a list of the parts I ordered:
- Synology America DiskStation 8-Bay Network Attached Storage (DS1815+)
- 4 x Seagate 2TB NAS HDD SATA 6Gb/s 64MB Cache 3.5-Inch Internal Bare Drive (ST2000VN000)
- 4GB Memory for Synology DiskStation DS1815+ DDR3-1600 SODIMM RAM Module (PARTS-QUICK BRAND)
The Synology did not come with any documentation outside of the quick start guide, but it was very easy to set up. I installed the 4 x 2 TB Seagate NAS drives in slots 1-4. I had a couple of 1 TB Western Digital Red drives, which I installed in slots 5 and 6. I also had a couple of Crucial m4 64GB SSDs (CT064M4SSD2) which were not being used, so I added these to the Synology in slots 7 and 8 to use as an SSD read-write cache. I will probably upgrade these to a couple of larger SSDs once the budget allows, but for now they will work just fine. (Didn’t take long to fill up all 8 slots, did it?)
After the drives were installed, I connected the Synology to my network and powered it on. After 60 seconds or so the Synology pulled a DHCP address; I connected to http://DHCP_Address:5000/ and followed the setup wizard. I configured the 4 x 2 TB Seagate NAS drives in a volume using Synology Hybrid RAID (SHR) with 1-disk fault tolerance; this is where I will host the NFS datastores for my vSphere lab. The 2 x 1 TB Western Digital Reds are configured in a separate RAID 1 volume, which I am going to use for general storage – things like my pictures, music, and such. I created the SSD cache using the 2 Crucial 64GB SSDs and attached the cache to the SHR volume. The Synology was already running DSM 5.1, but it was a couple of updates behind, so I updated it to the most recent release – DSM 5.1-5022 Update 4.
I created two NFS datastores on the SHR volume. This post on Configuring Synology for NFS for ESXi host access helped with this.
Once the Synology was up and running, I installed ESXi 6.0 on my two lab hosts. One of these, a Shuttle build, has been my primary lab host for some time. The other is an HP MicroServer which I had been using as my NAS box. Both of these have a 4-port Intel 82571EB Gigabit Ethernet Controller. The ESXi 6.0 installation is straightforward; the only issue I had was that the on-board NIC in the Shuttle is not recognized by ESXi. I seem to remember having to install a VIB for it back when I deployed ESXi 5.5. I don’t really need the on-board NIC since I have the 4 ports on the Intel card, so I did not worry about getting it working. I may tackle it later.
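If you run into a similar missing-NIC situation, a quick way to see which adapters ESXi actually detected is from the host shell. Both commands below are stock ESXi tools; an unrecognized NIC simply won't show up in the output:

```shell
# List the physical NICs ESXi detected, with driver and link info
esxcfg-nics -l

# The esxcli equivalent of the same listing
esxcli network nic list
```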
On the ESXi hosts I configured the VMkernels to use for NFS and mounted the NFS shares on the Synology. I created two separate VLANs and assigned IP addresses to the Synology LAN 3 and LAN 4 connections, one on each VLAN. Each ESXi host has a VMkernel on each VLAN. I mounted each of the two datastores on a separate VLAN; this balances the NFS traffic across two 1 GbE interfaces on both the Synology and the ESXi hosts (NFS 4.1 is not currently supported by the Synology).
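The VMkernel and mount setup above can also be scripted from the ESXi shell. This is only a sketch – the vSwitch name, VLAN IDs, IP addresses, export paths, and datastore names below are illustrative, not my actual values:

```shell
# Create a port group on the first NFS VLAN and tag it
esxcli network vswitch standard portgroup add --vswitch-name vSwitch1 --portgroup-name NFS-VLAN10
esxcli network vswitch standard portgroup set --portgroup-name NFS-VLAN10 --vlan-id 10

# Add a VMkernel interface on that port group with a static IP
esxcli network ip interface add --interface-name vmk1 --portgroup-name NFS-VLAN10
esxcli network ip interface ipv4 set --interface-name vmk1 --ipv4 192.168.10.21 --netmask 255.255.255.0 --type static

# Repeat the steps above for the second VLAN (e.g. VLAN 20 / vmk2),
# then mount one datastore over each path to balance the two links
esxcli storage nfs add --host 192.168.10.10 --share /volume1/NFS01 --volume-name NFS01
esxcli storage nfs add --host 192.168.20.10 --share /volume1/NFS02 --volume-name NFS02

# Verify both datastores mounted
esxcli storage nfs list
```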
I downloaded and installed the NFS VAAI plugin from Synology. The instructions for installing the VIB can be found here.
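For reference, installing a VIB from the ESXi shell looks roughly like this. The file name is a placeholder for whatever version Synology currently ships, and the host should be in maintenance mode before the reboot:

```shell
# Copy the VIB to the host, then install it; esxcli requires the
# absolute path to the file. A reboot is needed for it to take effect.
scp nfsplugin.vib root@esxi01:/tmp/
esxcli software vib install -v /tmp/nfsplugin.vib

# After the reboot, confirm the plugin is present
esxcli software vib list | grep -i nfs
```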
I deployed a new Windows Server 2012 R2 VM to be used for my lab domain controller and set up my lab.local domain.
Then I deployed the VCSA using the web installer. For some reason the first deployment failed: the appliance was imported, but the customization (hostname, IP, etc.) did not apply.
I deleted the failed VCSA appliance and ran the deployment again. This time it worked without issue.
Once the VCSA was deployed, I created a datacenter and added my ESXi hosts. The only thing I had saved from the original lab was my Windows Server 2012 R2 template. I deployed a couple of other infrastructure VMs – a SQL Server and a file server – as well as vMA 6. I also deployed a View Connection Server, a Security Server, and a Windows 7 desktop for remote access – TGFVK (Thank God for vExpert Keys).
So far the Synology is performing well and everything is running great. I have not done any real benchmarking yet, but I am regularly pushing over 1,000 write IOPS with average latency under 1 ms.
As part of the new lab build I updated my JumpSquares with my new lab jumps.
Now to start messing with vSphere 6. First stop, SSL certificate replacement…