NAS best practices

13 May 2022

By Grant Caley, UK chief technologist, NetApp

In the modern data centre, enterprise NAS plays an important role not just in providing highly scalable user file services, but also in providing application file storage for a range of mission-critical applications such as AI, VMware, Oracle and SAP HANA. As a starting point, you should consider and ensure the following:

1) Protocol: Your NAS is the Swiss Army knife of file storage and should support as wide a set of protocols as possible. Essential are SMB (2, 2.1, 3 and 3.1.1, plus legacy v1 to help with migrations) and NFS (3, 4.1 and pNFS). Also consider support for iSCSI, FC, FCoE, NVMe over Fabrics/RDMA and S3 object access. Simultaneous file access via both SMB and NFS opens up collaboration options, and this is often needed in the education and engineering sectors.

2) Performance: As you look to deploy a NAS not just for file services but, more importantly, for application usage, performance is a key factor. By default, today’s NAS systems are usually flash-based, and you have the option of SSDs or low-latency NVMe drives. If you want to take application file performance to the next level, look at end-to-end NVMe, which dramatically improves performance by optimising the protocol path between NAS and server.

3) Scale: Whilst your NAS requirements may start small, it is worth considering your future expansion from day one.

Having a NAS that provides a single OS, whatever the size of requirement, reduces management complexity and improves overall service management. Scale can be delivered in multiple ways:

a. Vertical scaling, where you replace controllers with larger versions while keeping the underlying storage, and grow your storage capacity as you need to expand. You may need disk storage options such as NVMe, SSD, QLC or even traditional HDD.

b. Horizontal scaling: choose an architecture that enables the clustering of controllers, so that you can expand by adding more of them. Important here is the ability to mix different controller types and storage options, and to move workloads non-disruptively across cluster members.

c. If you require a scale-out filesystem, such as for HPC, EDA and other demanding workloads, ensure you choose a NAS that can deliver this potentially multi-petabyte need while still supporting your other workload types.

d. Being able to securely virtualise and multi-tenant your NAS enables secure scalability; this should also support RBAC, MFA and similar controls.

4) Cost: When choosing a NAS, options such as deduplication, compression and dynamic thin provisioning should be table stakes. But you can also enable tiering to S3, targeting either on-premises S3 or any of the multiple public cloud S3 offerings. Tiering can often reduce your on-premises storage capacity requirement by up to 80%. Another often-overlooked factor in controlling costs is automation. Your NAS should be fully API-driveable and integrate with whatever framework you want, whether that is Ansible, Terraform or an SDK.
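As a rough illustration of the tiering savings mentioned above, the sketch below estimates the on-premises capacity left after cold data is tiered to S3. The function and the worked numbers are purely illustrative; the 80% cold-data fraction is simply the upper bound the article cites, not a guaranteed outcome.

```python
def on_prem_capacity_after_tiering(total_tb: float, cold_fraction: float) -> float:
    """Estimate on-premises capacity needed once cold data is tiered to S3.

    cold_fraction is the share of data that is cold and can be tiered
    (the article cites savings of often up to 80%, i.e. 0.8).
    Illustrative arithmetic only -- real savings depend on your workload.
    """
    if not 0.0 <= cold_fraction <= 1.0:
        raise ValueError("cold_fraction must be between 0 and 1")
    return total_tb - total_tb * cold_fraction

# Illustrative example: a 100 TB estate with 80% cold data
# needs only 20 TB of on-premises capacity after tiering.
print(on_prem_capacity_after_tiering(100.0, 0.8))  # → 20.0
```

In practice you would feed this from real capacity reports rather than a guess, but even a back-of-the-envelope figure like this helps size a tiering business case.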

5) Protection: Any NAS you deploy should offer high-availability protection, support redundant hardware and multipathing to shelves and disks, and integrate features such as hardware-backed write journaling.

- Enterprise NAS also needs, at a minimum, zero-performance-impact, space-efficient instant snapshots. These provide the first level of business continuity and should integrate with the likes of Oracle RMAN, SAP, Kubernetes and, of course, users’ Windows desktops.

- Backups are the next requirement. These should be offsite, and should either be a native NAS feature (for example, to on-premises or cloud S3) or integrated via a backup tool such as Rubrik or Veeam.

- Disaster recovery is also essential, with options to replicate either asynchronously or synchronously, so that you can select the service level you need to protect against. If you require zero data loss and near-instant failover in the event of a site outage, then a metro clustering option is also essential to consider.

A final element to consider is whether the NAS can detect, protect against and alert on ransomware attacks, as well as offering immutability of files and, importantly, of backups.
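The snapshot and API-driven automation points above can be sketched as a small REST client. Everything vendor-specific here is an assumption: the endpoint path (`/api/storage/volumes/{uuid}/snapshots`) and the request field names are modelled loosely on ONTAP-style REST APIs for illustration only, so check your NAS vendor's API reference before relying on them.

```python
import json
import urllib.request


def snapshot_payload(name: str, comment: str = "") -> dict:
    """Build a snapshot request body.

    Field names ('name', 'comment') are assumptions modelled on
    ONTAP-style REST APIs, not any specific vendor's schema.
    """
    body = {"name": name}
    if comment:
        body["comment"] = comment
    return body


def create_snapshot(base_url: str, volume_uuid: str, name: str, token: str) -> dict:
    """POST a snapshot request to a hypothetical NAS REST endpoint."""
    # Hypothetical endpoint path, for illustration only.
    url = f"{base_url}/api/storage/volumes/{volume_uuid}/snapshots"
    req = urllib.request.Request(
        url,
        data=json.dumps(snapshot_payload(name)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

A scripted call like this, scheduled before an OS patch or application upgrade, is the simplest form of the snapshot-based first-level business continuity described above; tools such as Ansible or Terraform wrap the same kind of API call.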

6) Multi-cloud: If your NAS, with all its enterprise features, is also available in AWS, Azure and GCP, then you have options to replicate to the cloud, back up to the cloud and deploy your applications in the cloud, with the same storage efficiencies, data protection, scale and security as on-premises.

7) Management: Finally, consider whether you can manage your NAS across a hybrid multi-cloud environment. It should proactively provide AI-driven predictive risk analysis and recommendations, as well as integrating with systems such as ServiceNow and other frameworks. You should also be able to deliver hybrid multi-cloud NAS provisioning, protection and replication, extended via a range of advanced data management options.
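The proactive risk analysis described above boils down to checks like the one sketched here: scan an inventory of volumes (as any NAS management API would return it) and flag those approaching full, so remediation can happen before users notice. The dictionary shape (`name`, `size_gb`, `used_gb`) and the 85% threshold are assumptions for illustration, not any specific vendor's schema or default.

```python
def volumes_at_risk(volumes: list[dict], threshold_pct: float = 85.0) -> list[tuple]:
    """Flag volumes whose used capacity meets or exceeds a threshold.

    `volumes` is a list of dicts with 'name', 'size_gb' and 'used_gb'
    keys -- an assumed shape for illustration, not a vendor schema.
    Returns (volume name, used percentage) pairs for flagged volumes.
    """
    flagged = []
    for vol in volumes:
        used_pct = 100.0 * vol["used_gb"] / vol["size_gb"]
        if used_pct >= threshold_pct:
            flagged.append((vol["name"], round(used_pct, 1)))
    return flagged


# Example: only vol2 crosses the default 85% threshold.
inventory = [
    {"name": "vol1", "size_gb": 1000, "used_gb": 400},
    {"name": "vol2", "size_gb": 500, "used_gb": 470},
]
print(volumes_at_risk(inventory))  # → [('vol2', 94.0)]
```

In a real deployment, the output of a check like this would be pushed into an ITSM tool such as ServiceNow as an incident or change request, which is the integration the article recommends.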