Deploying Veritas NetBackup in Amazon Web Services – Five Lessons Learned


We recently worked with a well-known health services company that was struggling to maintain its aging legacy hardware. The team wanted to move much of what ran in its primary datacenter into a cloud infrastructure.

After conducting its own research and testing two pilot projects, the company eventually chose Amazon Web Services (AWS).

We joined the team to help deploy Veritas NetBackup in AWS. Here’s what we learned from the experience:

  1. It’s worth the effort to deploy proper security from the start. Rather than reacting to security issues as they arise, the team built access controls in from day one: access was divided across the deployment so that no single person had administrative rights to perform every possible task in the AWS console. In this model, a single compromised account can’t destroy all the data. The backup environment lives behind a firewall, and backup admins don’t have direct access to the application or database servers. A minimal sketch of this kind of scoped access appears after this list.
  2. There are many ways to deploy NetBackup inside the AWS cloud. According to Veritas, more than one customer has deployed NetBackup inside AWS and was pleased with the results—until receiving the first monthly bill. To avoid that scenario, start with the sizing recommendations that Veritas has published based on its own engineering research. We used an EC2 master server and an EC2 media server inside AWS with Amazon Simple Storage Service (S3) as the storage target. We also monitored the EC2 performance statistics closely for the first month, making size adjustments as necessary. This deployment strategy keeps all backup traffic inside AWS, which avoids bandwidth charges and isolates the environment from the on-site master server. The health services company could connect the two master servers later as part of Auto Image Replication (AIR), which would allow it to send backup data into or out of the cloud as necessary. However, that design involves extra bandwidth costs.
  3. Because the process resembles provisioning virtual machines in a modern ESX infrastructure, it’s easy to create the actual Amazon Elastic Compute Cloud (EC2) instances. Deploying NetBackup EC2 instances is as simple as selecting the amount of CPU and RAM and the size and type of disks the workload needs, then attaching the EC2 system to the proper Virtual Private Cloud (VPC) network. If you go with tight security, you need to open firewall ports for NetBackup to communicate with each client, master server, media server, and the S3 storage target. We monitored performance statistics on each EC2 instance and made size adjustments to eliminate unused resources while maintaining high levels of performance. A provisioning and monitoring sketch follows this list.
  4. Backups inside AWS should follow the 3-2-1 rule. If you don’t have 3 copies of your data on 2 different media, with 1 copy off site, then you aren’t covering all your disaster scenarios. The health services company is very close to this standard inside the AWS infrastructure, with 3 copies of its data: one spinning in production, one backed up inside the NetBackup deduplication pool, and one written out to S3. The team can also make snapshots of the EC2 machines, but these are point-in-time checkpoints of the data, and rolling back is very disruptive (a short snapshot sketch appears below).
  5. Adding Amazon Glacier may not make financial sense. Amazon Glacier, the cloud storage service for data archiving and long-term backup, involves its own fees. For that reason, Glacier didn’t make financial sense at the project outset. But once the health services company has written approximately 10TB of long-term retention data into S3, Glacier may start providing cost savings. With Glacier in place, duplicating the long-term backups out of S3 will move the company’s data into this cheaper tier and reduce the monthly cost of S3. A lifecycle-rule sketch appears at the end of this list.
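
To make the divided-access model in lesson 1 concrete, here is a minimal boto3 sketch of a scoped policy for a hypothetical backup-admins group. The group name, policy name, bucket, and action list are placeholders we chose for illustration, not the customer’s actual configuration; your own separation of duties will dictate the real boundaries.

```python
"""Sketch: scoped IAM access for backup admins (all names are placeholders)."""
import json
import boto3

iam = boto3.client("iam")

# Policy that lets backup admins see the backup EC2 instances and read/write
# the backup bucket, but grants no IAM, application, or database privileges.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Read-only visibility into the backup EC2 instances.
            "Effect": "Allow",
            "Action": ["ec2:DescribeInstances", "ec2:DescribeVolumes"],
            "Resource": "*",
        },
        {   # Read/write only on the backup target bucket.
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
            "Resource": [
                "arn:aws:s3:::example-netbackup-target",
                "arn:aws:s3:::example-netbackup-target/*",
            ],
        },
    ],
}

created = iam.create_policy(
    PolicyName="netbackup-admin-scope",
    PolicyDocument=json.dumps(policy),
)
iam.create_group(GroupName="backup-admins")
iam.attach_group_policy(
    GroupName="backup-admins",
    PolicyArn=created["Policy"]["Arn"],
)
```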
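
Lessons 2 and 3 come down to provisioning the EC2 instances with only the ports NetBackup needs, then letting real utilization data drive resizing. The sketch below shows one way to do that with boto3. The AMI, VPC, subnet, CIDR, and instance type are placeholders, and the port list (1556 and 13724 are commonly cited NetBackup ports) should be confirmed against the Veritas network-ports documentation for your version.

```python
"""Sketch: launch a NetBackup media server EC2 instance, then check CPU usage."""
from datetime import datetime, timedelta
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

# Security group opened only for NetBackup traffic inside the VPC.
sg = ec2.create_security_group(
    GroupName="netbackup-media",
    Description="NetBackup master/media/client communication",
    VpcId="vpc-0123456789abcdef0",                # placeholder VPC
)
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": port,
            "ToPort": port,
            "IpRanges": [{"CidrIp": "10.0.0.0/16"}],  # VPC-internal only
        }
        for port in (1556, 13724)                 # confirm against Veritas docs
    ],
)

# Start from the published sizing guidance; CloudWatch data drives resizing.
instance = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",              # placeholder AMI
    InstanceType="m5.xlarge",                     # placeholder size
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",          # placeholder subnet
    SecurityGroupIds=[sg["GroupId"]],
)["Instances"][0]

# After the first weeks in production, review average CPU before resizing.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": instance["InstanceId"]}],
    StartTime=datetime.utcnow() - timedelta(days=14),
    EndTime=datetime.utcnow(),
    Period=86400,
    Statistics=["Average"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 1), "% CPU")
```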
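
For lesson 4, EC2 snapshots are a useful supplement to, not a substitute for, the three NetBackup copies. A minimal sketch, assuming a placeholder volume ID:

```python
"""Sketch: a point-in-time EBS snapshot (volume ID is a placeholder)."""
import boto3

ec2 = boto3.client("ec2")

# A checkpoint of the media server data volume; restoring it rolls the whole
# volume back to this moment, which is why it is disruptive.
snapshot = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",
    Description="Pre-change checkpoint of the media server data volume",
    TagSpecifications=[{
        "ResourceType": "snapshot",
        "Tags": [{"Key": "purpose", "Value": "checkpoint"}],
    }],
)
print(snapshot["SnapshotId"], snapshot["State"])
```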
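
And for lesson 5, the eventual move of long-term retention data into Glacier can be expressed as an S3 lifecycle rule. The bucket name, prefix, and 90-day threshold below are placeholders; whether the transition actually saves money depends on how much data you retain and how often you expect to recall it.

```python
"""Sketch: an S3 lifecycle rule that tiers long-term retention data to Glacier."""
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-netbackup-target",            # placeholder bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-long-term-retention-to-glacier",
            "Filter": {"Prefix": "long-term/"},   # placeholder prefix
            "Status": "Enabled",
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
        }]
    },
)
```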

Overall, we’ve found AWS surprisingly easy to work with and transparent about costs. But to accurately estimate what a cloud deployment will cost, you need to do your homework up front.

Free eBook

Understanding the Cloud—What You Need to Know Before Diving In

Before choosing a public cloud provider, learn the top considerations and fundamental concepts of cloud computing.

Learn More


About the Author

Jon Bousselot, Senior Consulting Engineer

Jon Bousselot has broad knowledge spanning diverse aspects of information technology architecture. With advanced engineering certifications from leading manufacturers and over 23 years of experience, Jon quickly grasps the big picture on large-scale initiatives and keeps them aligned with project goals.