Offsite Backups

I met with a prospect yesterday to discuss their IT systems and needs. They back up to a NAS device on site, with no offsite backup except quarterly. This is a fairly significant operation with a number of locations, and all the data flows back to this headquarters office. So one small fire, water damage, theft, vandalism, broken pipe, tornado, hurricane, etc., and this organization loses all of its data. It amazes me that folks can be so blasé about their operational data. There are cost-effective approaches to get the data backed up so it can at least be taken offsite!

There are also some very nice solutions that include fast onsite hard drive to hard drive backup combined with offsite backup and pre-configured server recovery from the backup device. We represent both Zenith and Barracuda solutions in this space.

The Barracuda Backup Service makes three backup copies of an organization’s primary data: one local backup and two offsite backups to geographically separate data centers.

How to reduce IT costs

An enterprise VAR survey is quoted in the July 2009 INFOSTORE magazine issue on the “biggest opportunities” for customers to “reduce IT costs”. By far the biggest option was “virtualization”, with 49% of respondents mentioning it. The second choice was a surprising one – “data deduplication” – with 18% of respondents listing it. The #3 choice – way down at 4% – was the not very innovative “delay purchases”!
Data deduplication, if you haven’t heard of it, is an innovative way to reduce storage requirements. At a simple level, if you store a 15MB email attachment on your network, there may be 10 or many more copies of that attachment in various mailboxes, all taking up storage space. Data deduplication means retaining just one copy and replacing the other copies with pointers to it. This concept can be carried down to the data block or bit level: an algorithm assigns a hash number to each string of data, stores one copy of the data, and keeps the indexed hash numbers. In this way, your data storage requirement can be greatly reduced. So far, the main application for data deduplication has been in backup software. Note that there are risks – as with any data compression method – so care should be taken in selecting tools to do this job. Big firms with huge data storage requirements are obviously the first targets for the technology.
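The hash-and-index idea can be sketched in a few lines. This is a minimal illustration, not how any particular backup product works: it assumes fixed-size blocks (real products often use variable-size chunking) and uses SHA-256 for the hash.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for illustration

def deduplicate(data: bytes):
    """Split data into blocks; store each unique block once, plus an ordered hash index."""
    store = {}   # hash -> block bytes, each unique block stored exactly once
    index = []   # ordered list of hashes, enough to rebuild the original data
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        h = hashlib.sha256(block).hexdigest()
        store.setdefault(h, block)   # keep only the first copy of a repeated block
        index.append(h)              # pointer stands in for the duplicate
    return store, index

def reconstruct(store, index):
    """Rebuild the original data by following the hash pointers."""
    return b"".join(store[h] for h in index)

# Repetitive data dedupes well: 100 identical blocks are stored only once.
data = b"A" * BLOCK_SIZE * 100
store, index = deduplicate(data)
print(len(store), len(index))  # 1 unique block, 100 index entries
assert reconstruct(store, index) == data
```

The “risk” mentioned above shows up here too: if two different blocks ever produced the same hash (a collision), one would silently replace the other, which is why the choice of hash function and tooling matters.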
Virtualization – choice number 1 in this survey – is a money saver even for, and perhaps especially for, firms that are quite small. I say especially for small firms because you can get the entry-level versions of VMware or Microsoft’s Hyper-V at no cost. If you have only one or two servers, virtualization is of no real utility, but when a special application, a separate Exchange server, etc. comes along beyond that, virtualization can save costs and add powerful disaster recovery options. Of course, the savings really grow as you get into more and more servers.