Tag Archives: cloud

Meet The Idealware Bloggers Part 3: Peter Campbell

This interview was conducted by Heather Gardner-Madras and originally published on the Idealware Blog in May of 2009.

The third interview of the series is with Peter Campbell, and I had a good time putting a face to the Twitter conversations we’ve had over the past year, as well as finding out more about how he came to write for the Idealware blog.

Peter Campbell

On Connecting Nonprofits & Technology
Peter’s decision to combine technology with nonprofit work was very deliberate. Well into a career as an IT director for a law firm in San Francisco, he had something of an epiphany and wanted to do something more meaningful in the social services sector. It took him nine months to find just the right job, and he landed at Goodwill. In both positions he was able to take advantage of good timing and the right executive support to create his own vision and bring real change to the organizations. At Goodwill Industries, Peter developed retail management software and introduced e-commerce. Now with Earthjustice, he is also sharing his experience with the broader community.

On Blogging
Although Peter always wanted to incorporate writing into his work, and wrote a good bit, the advent of blogs didn’t provide much motivation for him, because he wanted to be sure he had something worthwhile to say. A firm believer in blogging about what you know, he was intrigued by the opportunity to blog at Idealware, since the topics and style were aligned with his knowledge and experience. So while the previous three years of blogging had yielded only about 50 entries, this was an opportunity to get on a roll, and if you have been following this blog, you know that it has really paid off and provided a lot of great resources already.

The Magic Wand Question
One of the questions I asked in each interview was this: If you had a magic wand that could transform one aspect of nonprofit technology in an instant, what would it be and why?

Peter’s answer is simple and echoes a common thread in responses to this question: Change the way nonprofit management understands technology – help them realize the value it offers, the resources needed to get the most out of it, and how to use it.

The Next 5 Years
In response to a question about the most exciting trends in nonprofit technology over the next five years, Peter felt there are many things to be excited about right now.

He feels that transformations in technology are cropping up quickly, and nonprofits have a real opportunity to be at the forefront of these changes. The data revolution and the rise of cloud computing will liberate nonprofits, turning the things we struggle with now into affordable solutions. Virtualization, as well, will provide new freedom and efficiency. According to Peter, these trends will work together to change the way we manage and invest in technology. In his words: right now it’s still geeky and complex, but it will get easier.

Personal snapshots
First thing you launch on your computer when you boot/in the morning?
Twitter client, then Firefox with Gmail, Google Reader, and two blogs open in tabs.

Is there a tech term or acronym that makes you giggle and why?
Not really, but there are some that infuriate me. I am a fan of BPM (Business Process Management), though, because it describes what you should do: manage your processes, and realize that tech is the structure to do it with, not the brain.

Favorite non-technology related thing or best non-techy skill?
Besides technology, I hope my best skill is my writing.

Which do you want first – Replicator, holodeck, transporter or warp drive?
Transporter is the great one, but I don’t want to be the beta tester.

See previous posts to learn more about Steve Backman and Laura Quinn.


SaaS and Security

This post was originally published on the Idealware Blog in May of 2009.

My esteemed colleague Michelle Murrain lobbed the first volley in our debate over whether ’tis safer to host all of your data at home or to trust a third party with it. The debate focuses on Software as a Service (SaaS) as a computing option for small to mid-sized nonprofits with little internal IT expertise. This would be a lot more fun if Michelle were dead set against the SaaS concept and I were telling you to damn the torpedoes and go full speed ahead with it. But we’re all about rational analysis here at Idealware, so, while I’m a SaaS advocate and Michelle urges caution, there’s plenty of give and take on both sides.

Michelle makes a lot of sound points, focusing on the very apt one that a lack of organizational technology expertise will be just as risky a thing in an outsourced arrangement as it is in-house. But I only partially agree.

  • Security: Certainly, bad security procedures are bad security procedures, and that risk exists in both environments. But beyond the things that could be addressed by IT-informed policies, there are also security precautions that require money to implement and staff to support, like encryption and firewalls. I reject the argument that the data is safer on an unsecured, internal network than it is in a properly secured, PCI-compliant, hosted environment. You’re not just paying the SaaS provider to manage the servers that you manage today; you’re paying them to do a more thorough and compliant job of it.
  • Backups: Many tiny nonprofits don’t have reliable backup in place; a suitable SaaS provider will have that covered. While you will also want them to provide local backups (either via scheduled download or regular shipment of DVDs), even without that, it’s conceivable that the hosted situation will provide you with better redundancy than your own efforts.
  • Data Access: Finally, data access is key, but I’ve seen many cases where vendor licensing restricts users from working with their own data on a locally installed server. Being able to access your data, report on it, back it up, and, if you choose, globally update it is the ground floor that you negotiate to for any data management system, be it hosted or not. To counter Michelle, resource-strapped orgs might be better off with a hosted system that comes with data management services than an internal one that requires advanced SQL training to work with.

Where we might really not see eye to eye is in our perception of how ‘at risk’ these small nonprofits are. I look at things like increasing governmental and industry regulation of internal security around credit cards and donor information as a time bomb for many small orgs, who might soon find themselves facing exorbitant fines or criminal charges for being your typical nonprofit: managing their infrastructure on a shoestring and, by necessity, skimping on some of the best practices. It’s simple – the more we invest in administration, the worse we look in our GuideStar ratings. In that scenario, outsourcing this expertise is a more affordable and reliable option than trying to staff for it, or, worse, hoping we don’t get caught.

But one point of Michelle’s that I absolutely agree with is that IT-starved nonprofits lack the internal expertise to properly assess hosting environments. In any outsourcing arrangement, the vendors have to be thoroughly vetted, with complete assurances about your access to data, their ability to protect it, and their plans for your data if their business goes under. Just as you wouldn’t delegate your credit card processing needs to some kid in a basement, you can’t trust your critical systems to some startup with no assurance of next year’s funding. So this is where you make the right investments, avail yourself of the type of information that Idealware provides, and hire a consultant.

To me, there are two types of risk: The type you take, and the type you foster by assuming that your current practices will suffice in an ever-changing world (more on this next week). Make no mistake, SaaS is a risky enterprise. But managing your own technology without tech-savvy staff on hand is something worse than taking a risk – it’s setting yourself up for disaster. While there are numerous ways to mitigate that, none of them are dollar or risk free, and SaaS could prove to be a real bang for your buck alternative, in the right circumstances.

Both Sides Now

This article first appeared on the Idealware Blog in February of 2009.

Say you sign up for some great Web 2.0 service that allows you to bookmark web sites, annotate them, categorize them and share them. And, over a period of two or three years, you amass about 1,500 links on the site, with great details and cross-references — about a thesis paper’s worth of work. Then, one day, you log on to find the web site unavailable. News trickles out that they had a server crash. Finally, a painfully honest blog post by the site’s founder makes clear that the server crashed, the data was lost, and there were no backups. So much for your thesis, huh? Is the lesson, then, that the cloud is no place to store your work?

Well, consider this. Say you start up a Web 2.0 business that allows people to bookmark, share, categorize and annotate links on your site. And, over the years, you amass thousands of users, some solid funding, advertising revenue — things are great. Then, one day, the server crashes. You’re a talented programmer and designer, but system administration just wasn’t your strong suit. So you write a painful blog entry, letting your users know the extent of the disaster, and that the lesson you’ve learned is that you should have put your servers in the cloud.

My recent posts have advocated cloud computing, be it using web-based services like Gmail, or looking for infrastructure outsourcers who will provide you with virtualized desktops. And I’ve gotten some healthily skeptical comments, as cloud computing is new, and not without its risks, as made plain by the true story of the Magnolia bookmarking application, which recently went down in flames as described above. The lessons that I walk away with from Magnolia’s experience are:

  • You can run your own servers or outsource them, but you need assurances that they are properly maintained, backed up and supported. Cloud computing can be far more secure and affordable than local servers. But “the cloud”, in this case, should be a company with established technical resources, not some three person operation in a small office. Don’t be shy about requesting staffing information, resumes, and details about any potential off-site vendor’s infrastructure.
  • You need local backups, no matter where your actual infrastructure lives. If you use Salesforce or Google, export your data nightly to a local data store in a usable format. Salesforce lets you export to Excel; Google supports numerous formats. Gmail now supports an Offline mode that stores your mail on the computer you access it from. If you go with a vendor who provides virtual desktop access (as I recommend here), get regular snapshots of the virtual machines. If this isn’t an over-the-air transfer, make sure that your vendors will provide DVDs of your data or other suitable media.
  • Don’t sign any contract that doesn’t give you full control over how you can access and manipulate your data – again, regardless of where that data resides. A lot of vendors try to protect themselves by adding contract language prohibiting mass updates and user access, even on locally installed applications. But their need to simplify support should not come at the expense of your complete control over how you use your information.
  • Focus on the data. Don’t bend on these requirements: Your data is fully accessible; It’s robustly backed up; and, in the case of any disaster, it’s recoverable.
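To make the local-backup point concrete, here is a minimal sketch of the kind of freshness check an org might schedule alongside its nightly export. The path and the 36-hour window are assumptions for illustration, not recommendations:

```python
import os
import time

# Hypothetical values -- adjust to wherever your nightly export lands.
EXPORT_PATH = "backups/crm-export.csv"
MAX_AGE_HOURS = 36  # a daily schedule, plus some slack


def is_stale(mtime, now, max_age_hours):
    """True if the backup file is older than the allowed window."""
    return (now - mtime) > max_age_hours * 3600


def check_backup(path=EXPORT_PATH):
    """Report on the local export: missing, stale, or OK."""
    if not os.path.exists(path):
        return "MISSING: no local export found"
    if is_stale(os.path.getmtime(path), time.time(), MAX_AGE_HOURS):
        return "STALE: export is older than %d hours" % MAX_AGE_HOURS
    return "OK"
```

A check like this is only useful if someone sees its output, so in practice it would email or otherwise alert staff rather than just return a string.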

Technology is a set of tools used to manage your critical information. Where that technology is housed is more of a feature set and financial choice than anything else. The most convenient and affordable place for your data to reside might well be in the cloud, but make sure that it’s the type of cloud that your data won’t fall through.

The Sky is Calling

This post originally appeared on the Idealware Blog in February of 2009.

My big post contrasting full blown Microsoft Exchange Server with cloud-based Gmail drew a couple of comments from friends in Seattle. Jon Stahl of One/Northwest pointed out, helpfully, that MS sells its Small Business Server product to companies with a maximum of 50 employees, which greatly simplifies and reduces the cost of Exchange. After that, Patrick Shaw of NPower Seattle took it a step further, pointing out that MS Small Business Server, with a support arrangement from a great company like NPower (the “great” is my addition – I’m a big fan), can cost as little as $4000 a year and provide Windows Server, email, backup and other functions, simplifying a small office’s technology and outsourcing the support. This goes a long way towards making the chaos I described affordable and attainable for cash- and resource-strapped orgs.

What I assume NPower knows, though, and hope that other nonprofit technical support providers are aware of, is that this is an outdated approach. Nonprofits should be looking to simplify technology maintenance and reduce cost, and the cloud is a more effective platform for that. As ReadWriteWeb points out, most small businesses — and this can safely be assumed to include nonprofits — are completely unaware of the benefits of cloud computing and virtualization. If your support arrangement is for dedicated, outsourced management of technology that is housed at your offices, then you still have to purchase that hardware and pay someone to set it up. The benefits of virtualization and fast, ubiquitous Internet access offer a new model that is far more flexible and affordable.

One example of a company that gets this is MyGenii. They offer virtualized desktops to nonprofits and other small businesses. As I came close to explaining in my Lean, Green, Virtualized Machine post, virtualization is technology that allows you to, basically, run many computers on one computer. The environmental and financial benefits of doing what you used to do on multiple systems all on one system are obvious, but there are also huge gains in manageability. When a PC is a file that can be copied and modified, building new and customized PCs becomes a trivial task. Take that one step further – store that virtual PC on someone else’s property, where you, as a user, can load it up and run it from your home PC, laptop, or (possibly) your smartphone – and you now have flexible, accessible computing without the servers to support.

For the tech support service, the model is either to run large servers with virtualization software (there are many powerful commercial and open source options available) or to use an outsourced computing platform like Amazon’s EC2 service. In addition to your servers, they also host your desktop operating systems. Running multiple servers and desktops on single physical machines is far more economical; it better utilizes the available server power, reducing electricity costs and helping the environment; and backups and maintenance are simplified. The cost savings of this approach should benefit both the provider and the client.

In your office, you still need networked PCs with internet access. But all you need on those computers is a basic operating system that can boot up and connect to the hosted, virtualized desktop. Once connected, that desktop will recognize your printers and USB devices. If you make changes, such as changing your desktop wallpaper or adding an Outlook plugin, those changes will be retained. The user experience is pretty standard. But here’s a key benefit — if you want to work from home, or a hotel, or a cafe, then you connect to the exact same desktop as the one at work. It’s like carrying your computer everywhere you go, only without the carrying part required.

So, it’s great that there are mission focused providers out there who will affordably support our servers. But they could be even more affordable, and more effective, as cloud providers, freeing us from having to own and manage any servers in the first place.

Colossus vs. Cloud – an Email System Showdown

This post was originally published on the Idealware Blog in January of 2009.

If your nonprofit has 40 or more people on staff, it’s a likely bet that you use Microsoft Exchange as your email server. There are, of course, many nonprofits that use the email services that come with their web hosting, and there are some using legacy products like Novell’s GroupWise or Lotus Notes/Domino. But the market share for email and groupware has gone to Microsoft, and, at this point, the only compelling up-and-coming competition comes from Google.

There are reasons why Microsoft has dominated the market. Exchange is a mature and powerful product that does absolutely everything an email system has to do, and offers powerful calendaring, contact management and information-sharing features on top of it. A quick comparison to Google’s Gmail offering might look a bit like “Bambi vs. Godzilla”. And, as Michelle pointed out the other day, Gmail might be a risky proposition, despite being more affordable, because it puts your entire mail store “in the cloud”. But Gmail’s approach is so radically different from Microsoft’s that I think it deserves a more detailed pro/con comparison.

Before we start, it’s important to acknowledge that the major difference is the hosted/cloud versus local installation, and there’s a middle ground – services that host Exchange for you; Microsoft even has its own cloud service. If you are evaluating email platforms and including Gmail and Exchange, hosted Exchange should be weighed as an additional option. But my goal here is to contrast the new versus the traditional, and traditional Exchange installations are in your server room, not someone else’s.

Server Platform

Installing Exchange is not a simple task. Smaller organizations can get away with cheaper hardware, but the instructions say that you’ll need a large server for mail storage; a secondary server for web and internet functions; and, most likely, a third server to house your third-party anti-spam and anti-virus solutions. Plus, Exchange won’t work in a Linux or Novell network – there has to be an additional server running Microsoft’s Active Directory in place before you can even install it. It can be a very stable product if you get the installation right, but getting it right means doing a lot of prep and research, because the slim documents that come in the box don’t prepare you for the complexity. Once you have it running, you have to perform regular maintenance and keep a close watch – along with mailbox limits – to ensure that the message bases don’t fill up or become corrupted.

Gmail, on the other hand, is only available as a hosted solution. Setup is a matter of mapping your domain to Google’s services (which can be tricky, but is child’s play compared to Exchange) and adding your users.

Win – Gmail. It saves you a lot of expense once you factor in the required IT time and expertise along with the hardware and software costs for multiple servers.
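For the curious, the domain-mapping step mostly amounts to pointing your domain’s MX (mail exchanger) records at Google’s mail hosts; mail is routed to the reachable host with the lowest preference number. A small illustrative sketch (the record values are modeled on Google’s published setup; treat them as an example, not a setup guide):

```python
# Each MX record pairs a preference number with a mail host; mail is
# delivered to the reachable host with the lowest preference number.
# Values modeled on Google's published setup -- illustrative only.
MX_RECORDS = [
    (10, "aspmx2.googlemail.com"),
    (5, "alt1.aspmx.l.google.com"),
    (1, "aspmx.l.google.com"),
]


def primary_mx(records):
    """Return the mail host with the lowest preference number."""
    return min(records)[1]
```

The higher-preference entries exist for redundancy: if the primary host is unreachable, sending servers fall back to the next-lowest number.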

EMail Clients

Outlook has its weaknesses – slow and obtuse search, poor spam handling, and a tendency toward unexplained crashes and slowdowns on a regular basis. But, as a traditional mail client, it has a feast of features. There isn’t much that you can’t do with it. One of the most compelling reasons to stick with Outlook is its extensibility. Via add-ons and integrations, Outlook can serve as a portal to applications, databases, web sites and communications. In a business environment, you might be sacrificing some key functionality without it, much as you often have to use Internet Explorer in order to access business-focused web sites.

But where Outlook is a very hefty application, with tons of features and settings buried in its cavernous array of menus and dialog boxes, Gmail is deceptively uncluttered. The truth is that the web-based Gmail client can do a lot of sophisticated tricks, including a few that Outlook can’t — like allowing you to decide mid-message that you’d rather “Reply to All” — and some that you can only do in Outlook by enabling obscure features and clicking around a lot, like threading conversations and applying multiple “tags” to a single message. Gmail is the first mail client to burst out of the file-cabinet metaphor. Once you get used to this, it’s liberating. Messages don’t get archived to drawers; they get tagged with one or more labels. You can add stars to the important ones. It’s not that you can’t emulate this workflow in Outlook; it’s that it’s fast and smooth in Gmail, and supported by a very intelligent and blazingly fast search function. Of course, if that doesn’t float your boat, you can always use Outlook – or any other standard POP3 or IMAP client – to access Gmail.

Win – Gmail. It’s more innovative and flexible, and I didn’t even dig deep.

Availability

Exchange, of course, is not subject to the vagaries of internet availability when you’re at the office. Mind you, much of the mail that you’re waiting to receive is. And Outlook – if you run in “cached mode” – has had offline access down for ages; Gmail just started experimenting with that this week. If you’re not in the office, Exchange supports a variety of ways to get to the mail. Outlook Web Access (OWA) is a sophisticated web-based client that, with Exchange 2007 and IE as the browser, almost replicates the desktop Outlook experience. OMA is a mobile-friendly web interface. And ActiveSync, which is supported on many phones (including the iPhone), is the most powerful, stable and feature-rich synchronization platform available. Exchange can do POP and IMAP as well, and also supports a VPN-like mode called Outlook Anywhere (RPC over HTTPS).

Gmail only supports web, POP and IMAP access. There’s a mobile Gmail app which is available on more phones than ActiveSync is, but it isn’t as robust or full-featured as Microsoft’s offering.

So, oddly, the Win for remote access goes to Microsoft over Google, because Microsoft’s offerings are plentiful and mature.

Business Continuity

So, not to belabor this: Exchange is well supported by many powerful backup products. In cached mode, it mirrors your server mailbox to your desktop, which adds redundancy.

Gmail is in the cloud, so backup isn’t quite as straightforward. Offline mode does some synchronization, like Exchange’s cached mode, but it’s not complete or, at this point, configurable. Prudent Gmail users will, even if they don’t read mail in it, set up a POP email program to regularly download their mail in order to have a local copy.
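That download-via-POP advice can be sketched as a small script skeleton. The helper below decides which messages are new given the IDs already archived; the actual poplib calls are left as comments, since they need real credentials (the host name and addresses shown are placeholders):

```python
def new_uidls(server_uidls, archived_uidls):
    """Return the server-side message IDs not yet downloaded locally."""
    archived = set(archived_uidls)
    return [uid for uid in server_uidls if uid not in archived]


# The download loop itself would use the standard library's poplib,
# roughly like this (commented out -- it needs real credentials):
#
# import poplib
# conn = poplib.POP3_SSL("pop.gmail.com")
# conn.user("you@example.org")
# conn.pass_("your-password")
# uidls = [line.split()[1] for line in conn.uidl()[1]]
# for uid in new_uidls(uidls, already_saved):
#     ...retrieve the message and append it to a local mailbox file...
```

Tracking unique IDs (UIDLs) rather than message numbers matters here: message numbers shift as mail is deleted, while UIDLs stay stable between sessions.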

Win – Microsoft

Microsoft also wins the security comparison – Google can, and has, cut off users’ email accounts. There seem to have been good reasons, such as chasing out hackers who had commandeered accounts. But keeping your email on your backed-up server behind your firewall will always be more secure than the cloud.

But I’d hedge that award with the consideration that Exchange’s complexity is a risk in itself. It’s all well and safe if it is running optimally and it’s being backed up. But most nonprofits are strapped when it comes to the staffing and cost to support this kind of solution. If you can’t provide the proper care and feeding that a system like Exchange requires, you might well be at more risk with an in-house solution. The competence of a vendor like Google managing your servers is a plus.

Finally, cost: Gmail wins hands down. The supported Google Apps platform is free for nonprofits. Microsoft offers us deep discounts with their charity pricing, but Dell and HP don’t match them on the hardware, and certified Microsoft administrators come in at a $60-120K annual salary range.

So, in terms of ease of management and cost, Gmail easily wins. There are some big trade-offs between Microsoft’s kitchen-sink approach to features and Google’s intelligent, progressive functionality. In well-resourced environments, Microsoft is the secure choice, but in tightly resourced ones – like nonprofits – Gmail is a stable and supported option. The warnings about trusting Google — or any other Software as a Service vendor — are prudent, but there are a lot of factors to weigh. It’s going to come down to a lot of give and take, with considerations particular to your environment, to determine what the effective choice is. In a lot of cases, the cloud will weigh heavier on the scale than the colossus.

The Lean, Green, Virtualized Machine

This post was originally published on the Idealware Blog in November of 2008.

I normally try to avoid being preachy, but this is too good a bandwagon to stay off of. If you make decisions about technology – at your organization, as a board member, or in your home – then you should decide to green your IT. This is a socially beneficial action with all sorts of side benefits, such as cost savings and further efficiencies. And it’s not so much a new project to take on as a set of guidelines and practices to apply to your current plan. Even if my day job weren’t at an organization dedicated to defending our planet, I’d still be writing this post, I’m certain.

I’ve heard a few reports that server rooms can account for 50% or more of a company’s entire energy use; PC Magazine puts them at 30-40% on average. If you work for an organization of 50 people or more, then you should look at this metric: how many servers did you have in 2000, and how many do you have now? If the number hasn’t at least doubled, then you’re the exception to a very bloated rule. We used to pile multiple applications and services onto each server, but the model for the last decade or so has been one server per database, application, or function. This has resulted in a boom in power usage and inefficiency. Another metric, quoted to me by IDC, the IT research group, is that, on average, we use 10% of any given server’s processing power. So the server sits there humming 24/7, putting out carbon and ticking up our power bills.
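A back-of-the-envelope illustration of what that 10% utilization figure implies for consolidation (the server count and headroom target are made-up, illustrative numbers):

```python
import math

SERVERS = 20            # physical servers, one application each (made up)
AVG_UTILIZATION = 0.10  # the IDC figure cited above
HEADROOM = 0.70         # target peak load per consolidated host (made up)

# 20 servers at 10% load is the equivalent of 2 fully busy machines;
# capping each host at 70% load, ceil(2 / 0.7) = 3 hosts carry it all.
total_load = SERVERS * AVG_UTILIZATION
hosts_needed = math.ceil(total_load / HEADROOM)
print("hosts needed:", hosts_needed)
```

Even with generous headroom, twenty lightly loaded machines collapse to a handful of hosts, which is where the power and hardware savings come from.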

So what is Green IT? A bunch of things, some very geeky, some common sense. As you plan for your technology upgrades, here are some things that you can consider:

1. Energy-Saving Systems. Dell, HP and the major vendors all sell systems with energy-saving architecture. Sometimes they cost a little more, but that cost should be offset by savings on the power bills. Look for free software and other programs that will help users manage and automate the power output of their stations.

2. Hosted Applications. When it makes sense, let someone else host your software. The scale of their operation will ensure that the resources supporting your application are far more refined than a dedicated server in your building.

3. Green Hosting. Don’t settle for just any host – if you have a hosting service for your web site, ask them whether they employ solar power or other alternative technologies to keep their servers powered. Check out some of the green hosting services referenced here at Idealware.

4. Server Virtualization. And if, like me, you have a room packed with servers, virtualize. Virtualization is a geeky concept, but it’s one that you should understand. Computer operating system software, such as Windows and Linux, is designed to speak to a computer’s hardware and translate the high-level activities we perform to machine code that the computer’s processor can understand. When you install Windows or Linux, the installation process identifies the particular hardware on your system–the type of processor, brand of graphics card, number of USB ports–and configures the operating system to work with your particular devices.

Virtualization is technology that sits in the middle, providing a generic hardware interface for the operating system to speak with. Why? Because, once the operating system is speaking to something generic, it no longer cares what hardware it’s actually installed on. So you can install your Windows 2003 server on one system. Then, if a component fails, you can copy that server to another system, even if it’s radically different – say, a Mac – and it will still boot up and run. More to the point, you can boot up multiple virtual servers on one actual computer (assuming it has sufficient RAM and processing power).

A virtual server is, basically, a file: one large file that the computer opens up and runs. While it’s running, you can install programs, create documents, change your wallpaper and tweak your settings. When you shut down the server, it will retain all of your changes in the file. You can back that file up. You can copy it to another server and run it there while you upgrade components on its home server, so that your users don’t lose access during the upgrade. And you can perform the upgrade at 1:00 in the afternoon, instead of 1:00 in the morning.
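Since a powered-off virtual server is just a file, backing one up can be as simple as copying the file and verifying the copy. A minimal sketch (the image path would be whatever your virtualization software produces):

```python
import hashlib
import os
import shutil


def _sha256(path):
    """Checksum a file in chunks so large images don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def backup_vm_image(src, dest_dir):
    """Copy a powered-off VM image file and verify the copy by checksum."""
    dest = os.path.join(dest_dir, os.path.basename(src))
    shutil.copy2(src, dest)  # copy2 preserves timestamps along with contents
    if _sha256(src) != _sha256(dest):
        raise IOError("backup verification failed for %s" % dest)
    return dest
```

The one caveat worth a comment: copy the image while the virtual machine is shut down (or use your virtualization software’s snapshot feature), or the copy may capture an inconsistent, mid-write state.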

So, this isn’t just cool. This is revolutionary. Need a new server to test an application? Well, don’t buy a new machine. Throw a virtualized server on an existing machine.

Don’t want to mess with installing Windows server again? Keep a virtualized, bare bones server file (VM) around and use it as a template.

Don’t want to install it in the first place? Google “Windows Server VM”. There are pre-configured virtual machines available for download for every major operating system.

Want to dramatically reduce the number of computers in your server room, thereby maximizing the power usage of the remaining systems? Develop a virtualization strategy as part of your technology plan.

This is just the surface of the benefits of virtualization. There are some concerns and gotchas, too, that need to be considered, and I’ll be blogging more about it.

But the short story is that we have great tools and opportunities to make our systems more supportive of our environment, curbing the global warming crisis one server room at a time. Unlike a lot of these propositions, this one comes with cost reductions and efficiencies built-in. It’s an opportunity to, once in place, lighten your workload, strengthen your backup strategy, reduce your expenses on hardware and energy, and, well — save the world.

Better Organization Through Document Management Systems

This article was originally published at Idealware in January of 2007.

Is your organization drowning in a virtual sea of documents? Document management systems can provide invaluable document searching, versioning, comparison, and collaboration features. Peter Campbell explains.

For many of us, logging on to a network or the Internet can be like charting the ocean with a rowboat. There may be a sea of information at our fingertips, but if we lack the proper vessel to navigate it, finding what we need — even within our own organization’s information system — can be a significant challenge.

Organizations today are floating in a virtual sea of documents. Once upon a time, this ocean was limited to computer files and printed documents, but these days we must also keep track of the information we email, broadcast, publish online, collaborate on, compare, and present — as well as the related content that others send us. Regulatory measures like the Sarbanes-Oxley Act and the Health Insurance Portability and Accountability Act (HIPAA) have created a further burden on organizations to produce more documents and track them more methodically.

Taken as a whole, this flood of created and related content acts as our nonprofit’s knowledge base. Yet when we simply create and collect documents, we miss the opportunity to take advantage of this knowledge. Not only do these documents contain information we can reuse, we can also study them to understand past organizational decisions and parse them to produce metrics on organizational goals and efficiencies.

Just as effective document management has become an increasing priority for large companies, it has also become more important — and viable — at smaller nonprofits. And while free tools like Google Desktop or Windows Desktop Search can help increase your document-management efficiency, more sophisticated and secure document-management tools — called Document Management Systems (DMSs) — are likely within your reach. Document management systems offer integrated features to support Google-esque searching, document versioning, comparison, and collaboration. What’s more, when you save a document to a DMS, you record summary information about your document to a database. That database can then be used to analyze your work in order to improve your organization’s efficiency and effectiveness.

Basic Document Management

One way to increase the overall efficiency of your document management is simply to use your existing file-system tools in an agreed-upon, standardized fashion. For instance, naming a document “Jones Fax 05-13-08.doc” instead of “Jones.doc” is a rudimentary form of document management. By including the document type (or other descriptive data), your document will be easier to locate when you’re looking for the fax you sent to Jones on May 13, as opposed to other “Jones” correspondence. Arranging documents on a computer or file server in standard subfolders, with meaningful names and topics, can also be useful when managing documents.
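A convention like this is easy to automate. As a minimal sketch (the helper name and the exact convention are hypothetical, not taken from any particular DMS), a small script could assemble standardized filenames from the descriptive data:

```python
from datetime import date

def standard_filename(contact: str, doc_type: str, sent: date, ext: str = "doc") -> str:
    """Assemble a filename following a hypothetical 'Contact Type MM-DD-YY' convention."""
    return f"{contact} {doc_type} {sent.strftime('%m-%d-%y')}.{ext}"

# The fax to Jones from the example above:
print(standard_filename("Jones", "Fax", date(2008, 5, 13)))  # Jones Fax 05-13-08.doc
```

Because every name is built the same way, even a plain file-system search for “Jones Fax” will narrow things down quickly.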

For small organizations with a manageable level of document output, these basic document-storing techniques may suffice, especially if all document editors understand the conventions and stick by them. But this kind of process can be difficult to impose and enforce effectively, especially if your organization juggles thousands of documents. If you find that conventions alone aren’t working, you may wish to turn to a Document Management System.

One huge advantage of this system is that it names and stores your documents using a standardized, organization-wide convention, something that can be difficult to maintain otherwise, especially given a typical nonprofit’s turnover rate and dependence on volunteers. What’s more, a DMS will track not just the date the file was last modified (as Windows does), but also the date the document was originally created — which is often more useful in finding a particular document.

In fact, a DMS’s “File > Open” dialogue box can locate files based on any of the information saved about a document. A DMS can narrow a search by date range, locate documents created by particular authors, or browse through recently modified documents, sparing you the necessity of clicking through multiple folders to find what you’re looking for. It will also allow you to search the content of documents using a variety of methods, including the Boolean system (e.g. “includes Word A OR Word B but NOT Word C”) and proximity criteria (e.g., “Word A and word B within n words of each other”). Just as Google has become the quickest way to pull Web-page “needles” out of a gigantic Internet haystack, a solid DMS allows you to quickly find what you’re looking for on your own network.
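To make the two search styles concrete, here is a rough sketch in plain Python (illustrative only, not any vendor’s actual API) of Boolean and proximity matching over a document’s text:

```python
def boolean_match(text: str, includes_any=(), excludes=()) -> bool:
    """True if the text contains any 'include' word and none of the 'exclude' words."""
    words = set(text.lower().split())
    has_include = any(w.lower() in words for w in includes_any)
    has_exclude = any(w.lower() in words for w in excludes)
    return has_include and not has_exclude

def proximity_match(text: str, a: str, b: str, n: int) -> bool:
    """True if words a and b occur within n words of each other."""
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if w == a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == b.lower()]
    return any(abs(i - j) <= n for i in pos_a for j in pos_b)
```

A real DMS applies logic like this across an index of every profile field and the full text of every stored document, rather than one string at a time.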

A good DMS also allows the document creator to define which co-workers can read, edit, or delete his or her work via the document profile. On most networks, this type of document protection is handled by network access rules, and making exceptions to them requires a call to the help desk for assistance. Other features common to DMSs include:

  • Document check-in and check-out.

    If you try to open a file that someone else is already editing, a network operating system, like Windows Server 2003, will alert you that the file is in use and offer you the option to make a copy. A DMS will tell you more: who is editing the document, what time she checked it out, and the information she provided about the purpose of her revision and when she plans to be done with the document.

  • Document comparison.

    A DMS not only supports Word’s track-changes and document-merging features, but allows you to compare your edited document to an unedited version, highlighting the differences between the two within the DMS. This is a great feature when your collaborator has neglected to track his or her changes, particularly because it allows you to view the updates without actually adding the revision data to your original files, making them less susceptible to document corruption.

  • Web publishing.

    Most DMSs provide content-management features for intranets and even public Web sites. Often, you can define that specific types of documents should be automatically published to your intranet as soon as they’re saved to the DMS. (Note, however, that if your core need is to publish documents on a Web site, rather than track versions or support check-ins and check-outs, a dedicated Content Management System [CMS] will likely be a better fit than a DMS.)

  • Workflow automation.

    A DMS can incorporate approvals and routing rules to define who should see the document and in what order. This allows the system to support not only the creation and retrieval of documents, but also the editing and handoff process. For example, when multiple authors need to work on a single document, the DMS can route the file from one to the next in a pre-defined order.

  • Email Integration.

    Most DMSs integrate with Microsoft Outlook, Lotus Notes, and other email platforms, allowing you not only to view your document folders from within your email client, but also to save emails to your DMS. If, for example, you send out a document for review, you can associate any feedback and comments you receive via email with that document, which you can retrieve whenever you search for your original file.

  • Document Recovery.

    DMSs typically provide strong support for document backup, archiving, and disaster recovery, working in conjunction with your other backup systems to safeguard your work.
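The check-in/check-out behavior described above can be sketched in a few lines (a simplified, hypothetical model; a real DMS persists these records in its database):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CheckOut:
    user: str
    purpose: str
    when: datetime

class ManagedDocument:
    """A document that tracks who has it checked out, and why."""
    def __init__(self, name: str):
        self.name = name
        self.checkout: Optional[CheckOut] = None

    def check_out(self, user: str, purpose: str) -> None:
        if self.checkout is not None:
            c = self.checkout
            raise RuntimeError(f"{self.name} is checked out by {c.user} ({c.purpose})")
        self.checkout = CheckOut(user, purpose, datetime.now())

    def check_in(self) -> None:
        self.checkout = None
```

Where a plain file server can only report that a file is “in use,” the checkout record lets the system say who has the document, since when, and for what purpose.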

Three Types of Document Management Systems

If you decide that your organization would benefit from a DMS, there are a variety of choices and prices available. In general, we can break up DMSs into three types:

  • Photocopier- and Scanner-Bundled Systems

    Affordable DMS systems are often resold along with photocopiers and scanners. While primarily intended as image- and PDF-management systems, these DMSs integrate with the hardware but can also manage files created on the network. Bundled systems may not include the very high-end features offered by enterprise-level DMSs, but will offer the basics and usually come with very competitive, tiered pricing. A popular software package is offered by Laserfiche.

  • Enterprise-Level Systems

    These robust, sophisticated systems usually require a strong database back end such as Microsoft SQL Server or Oracle and tend to be expensive. Enterprise-level systems include the advanced features listed above, and some are even tailored to particular industries, such as legal or accounting firms. Examples of powerful enterprise systems include Open Text eDocs, Interwoven WorkSite, and EMC’s Documentum.

  • Microsoft Office SharePoint (MOSS 2007)

    Microsoft SharePoint is an interesting and fairly unique offering in the DMS area. While it’s best known as a corporate intranet platform, the 2007 version of the package provides building blocks for content, document, and knowledge management, with tight integration with Microsoft Office documents, sophisticated workflow and routing features, and extensive document- and people-searching capabilities. It is a powerful tool and — typically — an expensive one, but because it is available to qualifying nonprofits for a low administrative fee through TechSoup (which offers both SharePoint Standard Edition and Enterprise Edition), it is also a far more affordable option for nonprofits than similar DMS products on the market. One caveat: SharePoint, unlike the other systems mentioned above, stores documents in a database rather than in your file system, which can make the documents more susceptible to corruption. (Note: SharePoint Server is a distinct product that should not be confused with Windows SharePoint Services, which comes bundled with Windows Server 2003.)

The Future of Document Management

The most significant changes in document management over the last decade have been the migration of most major DMS systems from desktop to browser-based applications, as well as their ever-increasing efficiency and search functionality. The growing popularity of Software as a Service (SaaS), tagging, and RSS tools are likely to impact the DMS space as well.

Software as a Service

SaaS platforms like Google Apps and Salesforce.com store documents online, on hosted servers, as opposed to on traditional internal file servers. Google Apps doesn’t currently offer the detailed document profile options standard DMSs do, but it will be interesting to see how that platform evolves.

Another SaaS product, Salesforce, has been active in the document management space. Salesforce’s constituent relationship management (CRM) platform currently allows organizations to simply upload documents for a constituent. Salesforce has recently purchased a commercial DMS called Koral, however, and is in the process of incorporating it into its platform, an enhancement that will help tie documents to the other aspects of constituent relationships.

Tagging

A startup called Wonderfile has introduced an online DMS that makes heavy use of tagging to identify and describe documents. Using this software, you would move your documents to the Wonderfile servers and manage them online with Del.icio.us-style tagging and browsing. Wonderfile is a creative solution, but it has a drawback: storing and sharing your documents online is more valuable when you can edit and collaborate on them as well. As full-fledged, Web-based document creation and editing platforms, Google Apps and its peers are a better alternative, despite their lack of tagging functionality.
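Tag-based browsing of this kind is straightforward to model. As an illustrative sketch (not Wonderfile’s actual implementation), an index maps each tag to the set of documents carrying it, and a multi-tag search simply intersects those sets:

```python
from collections import defaultdict

class TagIndex:
    """Del.icio.us-style tagging: many tags per document, intersection search."""
    def __init__(self):
        self._docs_by_tag = defaultdict(set)

    def tag(self, doc: str, *tags: str) -> None:
        for t in tags:
            self._docs_by_tag[t.lower()].add(doc)

    def find(self, *tags: str) -> set:
        """Return the documents carrying every one of the given tags."""
        sets = [self._docs_by_tag[t.lower()] for t in tags]
        return set.intersection(*sets) if sets else set()
```

Unlike a folder hierarchy, which forces each document into exactly one place, a document here can carry as many tags as are useful and turn up in every relevant search.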

Microsoft has also been quietly adding tagging capability to its file-browsing utility, Windows Explorer, allowing you to add keywords to your documents that show up as columns you can sort and filter by. This works in both Windows XP and Vista.

RSS

While none of the existing DMSs are currently doing much with RSS — an online syndication technique that could allow users to “subscribe” to changes to documents or new content via a Web browser — Salesforce plans to integrate RSS functionality with its new Koral system. This type of syndication could be a useful feature, allowing groups of people to track document revisions, communicate about modifications, or monitor additions to folders.

Finding What You’re Looking For

Is it time for your organization to trade in that rowboat for a battle cruiser? With an ever-expanding pool of documents and resources, nonprofits need richer and more sophisticated ways to find information than standard filenames and folders. If your organization struggles to keep track of important documents and information, a DMS can help you move beyond the traditional “file-and-save” method to an organizational system that allows you to sort by topics and projects using a variety of techniques and criteria.

But we should all hope that even better navigational systems are coming down the road. Having seen the creative advances in information management provided by Web 2.0 features like tagging and syndication, it’s easy to envision how these features, which work well with photos, bookmarks, and blog entries, could be extended to documents as well.

 

Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

Thanks to TechSoup for their financial support of this article. Tim Johnson, Laura Quinn of Idealware, and Peter Crosby of alltogethernow also contributed to this article.