When it comes to deduplicating your data, a smarter approach is to dedupe at the source. Why wait until all that data reaches storage before deduplicating it? In this video, see how we use Mini Coopers as an analogy for data storage. Symantec Backup Exec and NetBackup with Dedupe Everywhere eliminate the need for expensive “dedupe” storage by performing dedupe as part of the backup process. Don’t send large volumes of data across your network before deduplication occurs. Dedupe Everywhere: on the client, on the media server, on appliances, and for virtual machines.
Unable to connect to the OpenStorage device. Ensure that the network is properly configured between the device and the media server.
Backup Server: Symantec Backup Exec 2010 R3
OpenStorage Device: DataDomain DD670 running DDOS build 226726
Today I was configuring DD Boost for Backup Exec 2010 R3 using a replicated pair of DD670s. The DD670s were pre-configured with DDOS version 4.9, which I upgraded to build 226726. I set up the 670 with a standard configuration and then configured a storage unit.
How Backup Exec Deduplication Works:
Deduplication works by dividing data into 128K segments and then storing the segments in a deduplication storage folder, along with a database that tracks the segments. Data is not stored again when a backup encounters a segment that is already stored in the deduplication storage folder. So, if you back up the same unchanged file over and over again, it is stored only one time in the deduplication storage folder.
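To make that mechanism concrete, here is a minimal Python sketch of fixed-size segment deduplication. It assumes the 128K segment size described above and uses an in-memory dict where the real product uses a deduplication storage folder plus a tracking database; the backup and restore functions are illustrative names, not Backup Exec APIs.

    import hashlib

    SEGMENT_SIZE = 128 * 1024  # 128K segments, per the description above

    segment_store = {}  # segment hash -> segment bytes (stands in for the storage folder)

    def backup(data):
        """Split data into 128K segments; store each unique segment only once.
        Returns the list of segment hashes needed to rebuild the data."""
        recipe = []
        for offset in range(0, len(data), SEGMENT_SIZE):
            segment = data[offset:offset + SEGMENT_SIZE]
            digest = hashlib.sha256(segment).hexdigest()
            if digest not in segment_store:   # new segment: store it once
                segment_store[digest] = segment
            recipe.append(digest)             # known segment: just reference it
        return recipe

    def restore(recipe):
        """Rebuild the original data from the stored segments."""
        return b"".join(segment_store[d] for d in recipe)

    # Backing up the same unchanged file twice stores its segments only once.
    file_data = b"A" * (3 * SEGMENT_SIZE)
    first, second = backup(file_data), backup(file_data)
    assert restore(second) == file_data
    print(len(segment_store), "unique segment(s) stored across two backups")

Running this prints a single stored segment even though the file was backed up twice: the second backup contributes only references, which is exactly the space saving deduplication is after.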
Where the Backup Exec Deduplication Option Works Best
Deduplication only happens when the Deduplication Option detects blocks of data that are in fact the same. Operating system files deduplicate well. They are the same across multiple systems and do not change often.
Deduplication works well in the following scenarios:
• With Windows and Linux file system data
• Where the same file is backed up multiple times
• Where the percentage of data that changes is small
Where Other Backup Exec Options Work Best
Deduplication does not work well if data changes frequently or if the Deduplication Option cannot detect the duplicated blocks of data. For example, when a small amount of new data is inserted at the beginning of a large file (such as a VMDK), all of the following blocks of data shift so that none of them match the previously stored segments. As a result, the file is not deduplicated.
This segment shift works against the Deduplication Option in cases where a non-file system backup is sent to the deduplication storage folder. These backups appear as one very large stream to the deduplication storage folder. Because of this, adding data early in the data stream causes the rest of the data stream to deduplicate poorly, if at all (Example: Exchange Database maintenance).
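The segment-shift effect is easy to demonstrate. This hypothetical snippet hashes the fixed 128K segments of a random stream, inserts a single byte at the front, and counts how many segment hashes still match; with fixed-boundary segmentation the answer is essentially zero.

    import hashlib, os

    SEGMENT_SIZE = 128 * 1024

    def segment_hashes(data):
        """Hashes of each fixed 128K segment of the data stream."""
        return {hashlib.sha256(data[i:i + SEGMENT_SIZE]).hexdigest()
                for i in range(0, len(data), SEGMENT_SIZE)}

    stream = os.urandom(8 * SEGMENT_SIZE)  # a 1 MB stand-in for a large backup stream
    shifted = b"\x00" + stream             # one byte inserted at the front

    shared = segment_hashes(stream) & segment_hashes(shifted)
    print("segments shared after a one-byte insert:", len(shared))  # almost certainly 0

Every segment boundary moves by one byte, so every segment hashes to a new value and nothing in the store is reused.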
The good news is that in these cases, some Backup Exec agents can avoid backing up duplicate data with traditional differential and incremental backup techniques. For example, when backing up VMware or Hyper-V virtual machines, you will achieve significantly better deduplication rates by installing the Backup Exec Agent for Windows Systems in each virtual machine and backing those machines up as though they were physical machines. Doing so allows the Deduplication Option to read each of the files and folders within the virtual machine and deduplicate those individual files. (NOTE: The Agent for VMware Virtual Machines and the Agent for Microsoft Hyper-V licenses allow for unlimited use of the Agent for Windows Systems within the same host machine.)
Expectations for the Deduplication Option
Deduplication is data-dependent. That is, the amount of deduplication that you are going to get out of a particular data set depends on what is in the data set. Data that is all unique is not going to benefit from deduplication. Data that contains many copies of the same data will benefit from deduplication.
If there is a terabyte of source data that doesn’t have any duplicate information in it, the deduplication storage folder is going to need a terabyte of space to store it.
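As a hypothetical worked example of the other extreme: suppose you keep ten full backups of a 1 TB file server and roughly 2% of the data changes between backups. The first backup stores about 1 TB of unique segments, and each of the nine later backups adds only its changed ~20 GB, so the total stored is about 1 TB + 9 × 20 GB ≈ 1.18 TB instead of 10 TB, a deduplication ratio of roughly 8.5:1. Real ratios vary with the data, but the arithmetic works the same way.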
A deduplication storage folder has significant memory and disk space requirements. Make sure to review the requirements for the Deduplication Option before implementing it. While the option may initially work on a system that does not meet these requirements, as time goes by and the deduplication storage folder fills up, a lack of memory and disk space will cause problems.
A deduplication storage folder is significantly more complex than a backup-to-disk folder. Detecting duplicate data, tracking it in a database, and managing the interconnected links in the deduplication folder all add up to significant memory and CPU usage. Memory, processing, and time are traded for reduced storage space requirements. This trade-off needs to be considered when choosing a deduplication storage folder over a backup-to-disk folder.
Backup Exec 2010 R2 is the second release of Backup Exec 2010 and is generally available to all users starting today, August 2nd, 2010.
This release features improvements in usability, licensing and renewal management, deduplication, virtualization, and extends platform and application support. Here are some of the most important features that the release introduces…
• Enhanced installation and backup wizards reduce the time and complexity of setting up your backups.
• A backup recommendation tool identifies potential gaps in your backup strategy and provides recommendations on the agent(s) required to ensure complete data protection.
• Integrated RSS and Renewal Assistant help keep you informed and current.
• NEW! Support for SharePoint 2010, Exchange 2010 SP1, Microsoft SQL Server 2008 R2, Enterprise Vault 9, Mac OS X 10.6, NDMP NetApp ONTAP 8.0, EMC DART 6.0, and OST support for Data Domain DDOS 4.8 with Boost technology.
And now for some good news: customers who are current on maintenance or support contracts for previous versions of their licensed Backup Exec software can upgrade to the appropriate Backup Exec 2010 R2 licenses at no additional charge.
This is my first blog post, just breaking this thing in! You will see more Backup, Recovery, and Archiving posts as I get rolling!