Both Sides Now
This article first appeared on the Idealware Blog in February of 2009.
Say you sign up for some great Web 2.0 service that allows you to bookmark web sites, annotate them, categorize them and share them. And, over a period of two or three years, you amass about 1500 links on the site with great details, cross-referencing — about a thesis paper’s worth of work. Then, one day, you log on to find the web site unavailable. News trickles out that they had a server crash. Finally, a painfully honest blog post by the site’s founder makes clear that the server crashed, the data was lost, and there were no backups. So much for your thesis, huh? Is the lesson, then, that the cloud is no place to store your work?
Well, consider this. Say you start up a Web 2.0 business that allows people to bookmark, share, categorize and annotate links on your site. And, over the years, you amass thousands of users, some solid funding, advertising revenue — things are great. Then, one day, the server crashes. You’re a talented programmer and designer, but system administration just wasn’t your strong suit. So you write a painful blog entry, letting your users know the extent of the disaster, and that the lesson you’ve learned is that you should have put your servers in the cloud.
My recent posts have advocated cloud computing, be it using web-based services like Gmail or looking for infrastructure outsourcers who will provide you with virtualized desktops. And I’ve gotten some healthily skeptical comments, as cloud computing is new and not without its risks, as made plain by the true story of the Magnolia bookmarking application, which recently went down in flames as described above. The lessons I walk away with from Magnolia’s experience are:
- You can run your own servers or outsource them, but you need assurances that they are properly maintained, backed up and supported. Cloud computing can be far more secure and affordable than local servers. But “the cloud”, in this case, should be a company with established technical resources, not some three-person operation in a small office. Don’t be shy about requesting staffing information, resumes, and details about any potential off-site vendor’s infrastructure.
- You need local backups, no matter where your actual infrastructure lives. If you use Salesforce or Google, export your data nightly to a local data store in a usable format (see the sketch after this list). Salesforce lets you export to Excel; Google supports numerous formats. Gmail now supports an Offline mode that stores your mail on the computer you access it from. If you go with a vendor who provides virtual desktop access (as I recommend here), get regular snapshots of the virtual machines. If this isn’t an over-the-air transfer, make sure that your vendors will provide DVDs of your data or another suitable medium.
- Don’t sign any contract that doesn’t give you full control over how you can access and manipulate your data, again, regardless of where that data resides. A lot of vendors try to protect themselves by adding contract language prohibiting mass updates and user access, even on locally-installed applications. But their need to simplify support should not come at the expense of your complete control over how you use your information.
- Focus on the data. Don’t bend on these requirements: your data is fully accessible, it’s robustly backed up, and, in the case of any disaster, it’s recoverable.
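
To make the nightly-export advice above a little more concrete, here’s a minimal sketch of what an automated local backup might look like. The service URL, API token, and JSON response format are hypothetical placeholders, not any particular vendor’s API; swap in whatever export mechanism your vendor actually documents, and schedule the script with cron or Windows Task Scheduler so it runs every night.

```python
#!/usr/bin/env python
"""Minimal nightly-export sketch: pull records from a hosted service and
keep a dated local copy. EXPORT_URL and API_TOKEN are hypothetical
placeholders -- use whatever export API or report URL your vendor provides."""

import csv
import datetime
import pathlib

import requests  # third-party HTTP library: pip install requests

EXPORT_URL = "https://example-vendor.com/api/export"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"                          # hypothetical credential
BACKUP_DIR = pathlib.Path("backups")


def nightly_export():
    """Download the current data set and write it to a dated CSV file."""
    response = requests.get(
        EXPORT_URL,
        headers={"Authorization": "Bearer " + API_TOKEN},
        timeout=60,
    )
    response.raise_for_status()  # fail loudly if the export didn't work
    records = response.json()    # assumes the service returns a list of dicts
    if not records:
        raise SystemExit("Export returned no records; nothing written.")

    BACKUP_DIR.mkdir(exist_ok=True)
    outfile = BACKUP_DIR / "export-{}.csv".format(datetime.date.today())
    with outfile.open("w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=sorted(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
    return outfile


if __name__ == "__main__":
    print("Wrote", nightly_export())
```

The point isn’t this particular script; it’s the habit: a dated copy of your data lands every night on a machine you control, in a format you can open without the vendor’s help.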
Technology is a set of tools used to manage your critical information. Where that technology is housed is more a question of features and cost than anything else. The most convenient and affordable place for your data to reside might well be in the cloud, but make sure that it’s the type of cloud that your data won’t fall through.