Category Archives: Articles

Articles republished from Idealware, NTEN, Techsoup and other places.

How to Measure the Value of an IT Investment

This article was originally published by Techsoup on July 8th, 2016

Some say life’s a gamble. But gambling can be very random, as in the rolling of a die, or very scientific, as in the calculation of odds and percentages. Investing in technology should not be a gamble, inasmuch as you can predict what it will do for you. In standard business lingo, we call this prediction “return on investment,” or “ROI.” And whether you calculate it with all the vigor of two college students on a weekend trip to Reno, or with the rigor of a scientist who deeply understands the odds, matters. In this article, we’ll discuss the many factors that go into a fully informed determination of the ROI for a technology project.

What Is ROI?

The simplest definition of ROI is that, for any project or purchase, it’s the amount saved or realized minus the cost to invest. If we spend $75 for a new fundraising widget for our website, and we make $125 in donations from it, then our return on investment is $50.

Maybe.

Or maybe not, because we invested in web developer time to deploy the widget to our website and staff time to process the donations. Plus, we spent a portion of each donation on credit card processing fees, right?
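
To make that concrete, here’s a minimal sketch of the fuller calculation in Python. The figures for developer time, staff time, and the processing rate are made-up assumptions, there to illustrate the point rather than to reflect real costs:

```python
# All figures below are assumptions for illustration.
donations = 125.00          # gross donations raised by the widget
widget_cost = 75.00         # purchase price of the widget
developer_time = 30.00      # assumed: web developer time to deploy it
staff_time = 15.00          # assumed: staff time to process the donations
processing_rate = 0.029     # assumed: 2.9% credit card processing fee

total_cost = widget_cost + developer_time + staff_time + donations * processing_rate
roi = donations - total_cost
print(f"Return on investment: ${roi:,.2f}")  # barely positive: a far cry from $50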

Not Strictly a Financial Formula

So ROI is not a strictly financial formula. Actual ROI is based on many factors, including hard-to-quantify things such as organizational culture, training, and readiness for adoption. The benefits of a major tech investment are proportional to the readiness of your particular organization.

Let’s try another example. We’ll spend $2,000 to upgrade to a new version of our fundraising system. It boasts better reporting and data visualizations, which, per the salesperson, will allow us to increase our donations by 10 percent. We think we’ll make $10,000 a year in additional donations, and expect the upgrade to benefit us for two years. So the strictly financial return is $18,000 ($20,000 new revenue – $2,000 upgrade cost).

But that 10 percent increase isn’t based solely on having the new features available in the product; it’s based on using the new features strategically, which your staff might not know how to do. It assumes that the software will be configured correctly, which in turn assumes you are fully cognizant of your needs and processes related to the information that the system will manage. And it assumes a staffing level that might be larger than you can afford.

It Doesn’t Start with Dollars

The concept here is pretty simple: it is easier to bake a cake from a recipe if you buy the ingredients beforehand. You also need to have all of the required mixing implements and receptacles, clear the necessary counter space, and know how to turn on the oven.

Similarly, successfully calculating the return on an investment requires having a complete picture of what you will be investing in.

Ask Yourself These Four Questions

  1. Do I understand what improvement this investment will result in and/or the problems it will solve? Core to measuring the return on the investment is knowing what it is that you have to measure. That will be some quantifiable amount of anticipated revenue, productivity gain, staffing reduction, or increase in clients served. You should know what those metrics are at the start of a project.
  2. Have I thoroughly considered the staffing changes that this investment might enable or require? For any large investment, like a new fundraising database or constituent management system, or a new, complex initiative, you want to know upfront how your day-to-day operations will be impacted. A new system might automate laborious processes, allowing you to repurpose staff. Or it might well require additional staffing in order to maximize the return. Those costs or savings are a key factor in the ROI.
  3. Do I have the necessary buy-in from the board, executives, and staff that will result in a successful implementation? Key to any large project’s success is having the support of the key decision makers. If you’re in middle management, and your initiative is not well understood and appreciated by those in charge, then there’s a significant chance that the project will fail. As right as you might be that your organization would benefit, again, the return on investment requires that the organization is invested.
  4. Have I identified any required training and ensured that we have the resources to provide it and the time to take it? So much of the value in a new system is derived from people knowing how to use it. In resource-strapped nonprofits, training time is often seen as frivolous or less important than whatever the crisis du jour might be. Don’t let that happen, because what you get out of a system is all contingent on being able to use it well and strategically. Without training, people will tend to try to emulate what they did before the new system was in place, and that will more likely reduce your return than produce it.

Tools and Tactics

There are some techniques for calculating ROI. As noted above, you should start with metrics that identify your current conditions and can be tracked after implementation. These might be dollars received, hours spent doing tasks, or number of employees dedicated to a process. Consider this your baseline. From there, you can forecast a scenario based on the advantages that you anticipate having upon completion of the project.

For example, if your current fundraising system can’t track multiple donors at the same address, then you’re probably expending time and effort to track such things in creative ways. A system that properly supports “householding” will eliminate the workarounds that you’ve created to maintain that data. You can estimate the time saved.
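
Here’s a back-of-the-envelope sketch of that estimate; every figure in it is an assumption, there to illustrate the method rather than to predict your savings:

```python
hours_per_week = 3           # assumed: staff time spent on address workarounds
hourly_cost = 35.00          # assumed: fully loaded staff cost per hour
weeks_per_year = 50
annual_savings = hours_per_week * hourly_cost * weeks_per_year
print(f"Estimated annual time savings: ${annual_savings:,.2f}")  # $5,250.00
```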

Once compiled, these before-and-after numbers will help you quantify the anticipated return, as well as guide the implementation. That’s because the forecast is a set (or subset) of your goals.

  • Be sure to track both short- and long-term impacts. One basic calculation is a 5- or 10-year financial analysis. It’s not uncommon to have increased implementation costs in the first year, so tracking the annual cost fluctuations over the expected life of the investment will give you a better picture of its value.
  • For example, say you decide to invest in a donor tracking system, replacing a laborious task of tracking donations in Excel. Your current annual fundraising is about $1 million. You have reasonably estimated that the new system will net you an additional $50,000 a year, after a two-year ramp-up phase with the system. It’ll achieve that via cost savings due to efficiencies realized and increased revenue based on superior fundraising tools. Here’s what a 10-year analysis might look like:
[Example spreadsheet showing a ten-year analysis of costs, revenue, and net revenue]
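
Here’s a hedged sketch of how an analysis like the one in that spreadsheet can be computed. Only the $50,000 steady-state gain and the two-year ramp-up come from the scenario above; the implementation, maintenance, and ramp-up figures are assumptions for illustration:

```python
# All cost and ramp-up figures are assumptions, not the article's actual numbers.
implementation_cost = 25_000          # assumed: year-one licensing and setup
annual_maintenance = 5_000            # assumed: hosting, support, upgrades
ramp_up_gain = {1: 0, 2: 25_000}      # assumed: partial benefit during ramp-up
steady_state_gain = 50_000            # from the scenario: extra revenue per year

cumulative_net = 0
for year in range(1, 11):
    cost = implementation_cost if year == 1 else annual_maintenance
    gain = ramp_up_gain.get(year, steady_state_gain)
    cumulative_net += gain - cost
    print(f"Year {year:2}: net {gain - cost:>8,}  cumulative {cumulative_net:>9,}")
```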

Other things might impact revenue as well, such as improved marketing, so we’re only tracking anticipated revenue associated with this investment.

Finally, don’t work in isolation. Talk with peers who have done similar projects. Find out what worked for them and what didn’t, and what successes they were able to measure. Much of this forecasting is based on speculation, and your job is to fact-check that speculation and get it as close to reality as you can.

Checking Your Work

As noted above, you should start with metrics that identify your current conditions and can be tracked after implementation. These metrics could be dollars received, hours spent doing tasks, or number of employees dedicated to a process. Checking your work may seem unnecessary, as the dollars have already been spent, but tracking your progress is the best way to improve on calculating ROI on subsequent investments. You can learn a lot, not only about the particular project, but about your organizational effectiveness as a whole.

The Secret to Calculating ROI

This is the secret: it’s not the return on the dollars spent. It’s the improvements in your organizational capacity and efficiency that can be made if you develop a culture that can predict which investments are worthwhile.

Creating A Tech-Savvy Nonprofit Culture

This article was originally published in NTEN Change Magazine in June of 2015.

What kind of challenge does your organization have supporting technology? Below are several scenarios to choose from:

  • Little or no tech staff or tech leadership: we buy inexpensive computers and software and rely on consultants to set them up.
  • Our IT support is outsourced: there is no technology plan or any staff training.
  • We have a tech on staff who does their best to keep things running: no staff training, no technology planning.
  • We have a tech on staff and an IT Director, but no technology plan: IT is swamped and not very helpful.
  • We have staff and IT leadership, but strategic plans are often trumped by budget crises. Training is minimal.
  • We have IT staff, leadership, a budget, and a technology plan, but executive support is minimal. IT projects succeed or fail based on the willingness of departmental managers to work with IT.

What do all of these scenarios have in common? A lack of a functional technology plan, little or no staff training, and/or no shared accountability for technology in the organization. While the technical skills required to perform a job successfully are likely listed in job descriptions, the successful integration of technology literacy into organizational culture requires much more than that. Here are some key enabling steps:

Technology Planning: If you have a technology plan, it might not do more than identify the key software and hardware projects planned. Technology planning is about much more than what you want to do. A thorough plan addresses the “who,” the “why,” and the “how” you’re going to do things:

  • A mission statement for the technology plan that ties directly to your organizational mission. For a workforce development agency, the tech mission might be to “deploy technology that streamlines the processes involved in training, tracking, and placing clients while strategically supporting administration, development, and communications”.
  • A RACI matrix outlining who supports what technology. This isn’t just a list of IT staff duties, but a roadmap of where expertise lies throughout the organization and how staff are expected to share it.
  • A “Where we are” assessment that points out the strengths, weaknesses, threats, and opportunities in your current technology environment.
  • A “Where we need to go” section that outlines your three-to-five-year technology vision. This section should be focused on what the technology is intended to accomplish, as opposed to which particular applications you plan to buy. For example, “moving to internal social media for intra-organization communication and knowledge management” is more informative than “purchase Yammer.”
  • Finally, a more technical outline of what you plan to deploy and when, with a big disclaimer saying that this plan will change as needs are reassessed and opportunities arise.

Training: Training staff is critical to recouping your investments in technology. If you do a big implementation of a CRM or ERP system, then you want your staff to make full use of that technology. If you’re large enough to warrant it (50+ staff), hire an in-house trainer, who also plays a key role in implementing new systems. This investment will offset significant productivity losses.

Smaller orgs can make use of online resources like Khan Academy and Lynda.com, as well as the consultants and vendors who install new systems. And technology training should be part of the onboarding process for new hires, even if the trainers are just knowledgeable staff.

In resource-strapped environments, training can be a hard sell. Everybody likes the idea, but nobody wants to prioritize it. It’s up to the CEO and management to lead by policy and example – promote the training, show up at the training, and set the expectation that training is a valued use of staff time.

Organizational Buy-in: Don’t make critical technology decisions in a vacuum. When evaluating new software, invite everyone to the demos and include staff in every step of the decision-making process, from surveying them on their needs before you start defining your requirements to including staff who will be using the systems in the evaluation group. When staff have input into the decision, they are naturally more open to, and accountable for, healthy use of the system.

Executive Sponsorship: With technology clearly prioritized and planned for, the last barrier is technophobia, and that’s more widespread than the common cold in nonprofits. Truly changing the culture means changing deep-rooted attitudes. This type of change has to start at the top and be modeled by the executives.

True story: At Salesforce.com, every new employee is shown the “Chatter” messaging tool and told to set up a profile. If a new user neglects to upload a photo, they will shortly find a comment in their Chatter feed from Marc Benioff, the CEO of Salesforce, saying, simply, “Nice Photo”. That’s the CEO’s way of letting new staff know that use of Chatter is expected, and that the CEO uses it, too.

Play! One more thing will contribute to a tech-savvy culture: permission to play. We want to let staff try out new web tools and applications that will assist them. The ones that are useful can be reviewed and formally adopted. But locking users down and tightly controlling resources – a common default for techies, who can trend toward the control-freakish side – will do nothing to help establish an open-minded, tech-friendly atmosphere.

Overcoming Tech Aversion: We all know, now, that technology is not an optional investment. It’s infrastructure, supplementing and/or taking the place of fax machines, printers, photocopiers, telephones, and in more and more cases, physical offices. At most nonprofits, there isn’t an employee in the organization who doesn’t use office technology.

But there are still many nonprofits that operate with a pointed aversion to technology. Many executives aren’t comfortable with tech. They don’t trust it, and they don’t trust the people who know what to do with it. A whole lot depends on getting tech right, so enabling the office technologist – be it the IT Director or the accidental techie – is kind of like giving your teenager the keys to the car. You know that you have to trust them, but you can’t predict what they’re going to do.

Building that trust is simply a matter of getting more comfortable with technology. It doesn’t mean that management and staff all have to become hardcore techies. They just have to understand what technology is supposed to do for them and embrace its use. How do you build that comfort?

  • Have a trusted consulting firm do a technology audit.
  • Visit tech-savvy peers and see how they use technology.
  • Go to an NTEN conference.
  • Buy an iPad!

Building a tech-savvy culture is about making everyone more engaged, accountable, and comfortable with the tools that we use to accomplish our missions. Don’t let your organization be hamstrung by a resistance to the things that can propel you forward.

What Is Nonprofit Technology – The Director’s Cut

This article was originally published on the NTEN Blog on March 10th, 2015, where it was edited for length. As with any director’s cut, their version might be better than this one! But this is how it was originally composed.

For the past 14 years, I’ve been working for 501(c)(3) corporations, commonly referred to as nonprofits.  I’ve also become active in what we call the “nptech” community — “nptech” being shorthand for “nonprofit technology”.  But nonprofits, which comprise about 10% of all US businesses, have wildly diverse business models.  To suggest that there is a particular type of technology for nonprofits is akin to saying that all of the businesses in downtown Manhattan have similar technology needs. So what is nonprofit technology?  Less of a platform and more of a philosophy.

Snowflakes? No flakes.

It’s often said that each nonprofit is unique, like a snowflake, with specific needs and modes of operation.  Let’s just remember that, as unique as a snowflake is, if you lay about a million of them next to each other on a field, you cannot tell them apart.

Nonprofits do not use any technology that is 100% unique to the nonprofit sector.  Fundraising systems operate exactly like sales systems, with leads, opportunities, campaigns and sales/donations. Similarly, advocacy applications are akin to marketing software. What nonprofits call Constituent Relationship Management systems are called Customer Relationship Management systems everywhere else.  I want to make it clear that the technology used by nonprofits is analogous enough to what for-profits use as to be nearly indistinguishable.

Also, small businesses, big businesses, most businesses operate under tight margins.  They keep overhead to a minimum.  They make decisions based on a scarcity of funding.   Nonprofits are not unique in their lack of sizable technology budgets.

No Margin for Investment.

The most significant difference between a nonprofit and a for-profit, from a business perspective, is this:

A for-profit holds to tight budgets in order to maximize profit. A nonprofit holds to tight budgets in order to remain funded.

Of course, for-profits can go under by getting their overhead ratio wrong.  But where they have room to move, they can, say, invest 30% in overhead one year in order to boot up a long-term, profitable strategy.  They can make the case to their board. Their customers will likely not even know how much they spent on technology, marketing, or extra staff.

If a nonprofit decides to boost the overhead rate by 30% for a year in order to boot up a long-term, mission-effective strategy, then Guidestar, Charity Navigator, the Better Business Bureau and their own website will, basically, tell their donors that they’re a bad investment, and the drop in donations might well sink them.  501(c)(3)’s are required to publish their financial statements for public review annually, and this is the data on which they are primarily assessed.  The effectiveness of their strategies is also harder for a nonprofit to quantify than it is for a retailer or manufacturer.

Customers don’t care how Target and Walmart run their businesses; they care that they can buy anti-bacterial wipes at competitive prices. Constituents care deeply about how much of their donation is going to the people or cause that a nonprofit serves, as opposed to the operating expense of the nonprofit.

All businesses want to minimize expenses and increase profitability (even nonprofits!). But nonprofits must minimize those expenses; they have no strategic breathing room when it comes to funding operations.

Management is not the priority; fundraising is.

So, for a nonprofit, a CEO’s primary job is to maintain the funding.  In many cases, this means that the qualifications of a nonprofit CEO have a lot to do with their networking and fundraising skills.  Many nonprofits are run by people who don’t have extensive training or experience in business management.

Nonprofit IT Staff aren’t your typical techies

Nonprofits have lower IT budget and staff ratios than a typical for-profit. The average nonprofit IT budget is 1% to 2% of the entire budget (per the NTEN Staffing Survey); the for-profit average is 2% to 3% (per Gartner). IT salaries are consistently below market rate, and they vary wildly, with some nonprofits paying far below market and others at market. A common scenario at a nonprofit is for the technical staff to include, if not be totally made up of, “accidental techies”: people who were hired for clerical or administrative work, had a knack for tech, and became the de facto tech person, sometimes also getting a title that reflects that. This is more common in smaller organizations, but it can happen anywhere that the administrative staffing is a small percentage of the overall staff and/or the CEO doesn’t know to hire IT professionals.

Is that a bad thing? Yes and no.  Accidental techies are often the people who have good, strategic notions about how technology can be applied to achieve objectives.  They tend to be smart, autonomous, good learners and teachers.  But they are more likely to be reactive and opportunistic in their approach to their work, and IT also benefits from planning and consistency.  Truthfully, you need both styles on a healthy IT team.

So what is “Nonprofit Technology”?

It’s both a class of software and an approach to technology deployment.

Nonprofit technology includes fundraising, advocacy, grants management and other applications that support the primary technology needs, such as donor management and promotion of causes. In some cases, the same systems that salespeople and marketers use can suffice, as evidenced by the popularity of Salesforce in the nonprofit space. But the nonprofit sector has its own terminology around revenue processes, so, if commercial software is used, it’s modified to address that.  In the Salesforce case, a nonprofit will either use the Nonprofit Starter Pack, which “skins” Salesforce to feel more like a fundraising system, or purchase an actual fundraising application developed for the platform, such as Roundcause or Blackbaud’s Luminate.  Idealware, a nonprofit dedicated to helping nonprofits make good software choices, publishes a booklet listing the types of software that nonprofits use.

Outside of those specialty applications, nonprofits use fairly standard stuff from Microsoft, Adobe, Google and other big companies. Many of these companies offer charity pricing, and further discounts are available to 501(c)(3)’s through Techsoup, a company that provides a transaction layer to vendors who want to donate software to charities. A seasoned IT staffer knows how to cut through the front line salespeople and find the person at a company that might make a donation or discount software or hardware.

But purchasing software is actually the easiest part.  Deploying it is the challenge: with little IT staff and less time to focus on how systems should be implemented, technology rollouts are often done on the fly.  Where a for-profit might invest significant time up front analyzing the business processes that the system will address, evaluating products, and training staff, these steps are a hard sell in an understaffed environment where people always have at least five other things to prioritize.

Taking the NPTech Challenge

So if you are thinking of working at a nonprofit as an IT implementer (System Manager, IT Director, CIO), take heart: the work is rewarding, because the motivations are broader than just bringing home a paycheck.  The people are nice, and most nonprofits recognize that, if they’re going to pay poorly, they should let people have their nights and weekends. There are opportunities to learn and be creative. The constrained environment rewards inventive solutions. If you’re a tech strategist, you can try things that a more risk-averse for-profit wouldn’t, as long as the risk you’re taking isn’t too costly. For example, I built a retail reporting data warehouse at a Goodwill in 2003, saving us about $100,000 on what it would have cost to buy a good reporting system.  I also pitched a business plan and started up ecommerce there, and I don’t have a college degree. If money isn’t your motivation, but accomplishing things that make a difference in people’s lives excites you, this is a fertile environment.

That said, if you don’t like to talk to people, and you don’t think that marketing should be part of your job, think twice.  Successful technology implementations at nonprofits are done by people who know how to communicate. The soft skills matter even more than the tech skills, because you will likely be reporting to people who don’t understand what tech does.  If you can’t justify your projects in terms that they’ll understand, they won’t consider funding them.

You should be as good at the big picture as you are at the small details.  NPTech is all about fixing the broken routers while you configure the CRM and interpret the Google Analytics data. You have to be good at juggling a lot of diverse tasks and projects, and conversant in multiple technologies.

Creativity trumps discipline. If you expect to strictly follow best-practice ITIL policies and governance, be afraid. Strict adherence to for-profit standards requires staffing and budget that you aren’t likely to have.  Good technology governance at nonprofits is a matter of setting priorities and making strategic compromises.

Collaboration and delegation are key. Nonprofits have a lot of cross-department functionality.  If you are all about IT controlling the systems, you’re going to have more work on your plate than you can handle and a frustrated user-base to work with.  Letting those who can do tech do tech — whether or not they have the credentials or report to you — is a key strategy towards getting it done.

NPTech is not just a job, it’s a community.

If some of what I’ve described above sounds discouraging, then know that the challenges are shared by a committed group of tech practitioners that is welcoming and relatively free of ego.  Nobody has to take on the battle of improving nonprofit technology alone.  Search the #nptech hashtag on Google or Twitter and you’ll find groups, blogs and individuals who see this challenge as a shared one for our sector.  Make the time to check out an NTEN 501 Tech Club meeting in your city or, better yet, attend their annual conference. Read all of the articles at Idealware.  Join the forums at Techsoup.  If this work is for you, then you are one of many who support each other, and each other’s organizations’ missions, and we’ll help you along the way.

Architecting Healthy Data Management Systems

This article was originally published in the NTEN eBook “Collected Voices: Data-Informed Nonprofits” in January of 2014.

Introduction

The reasons why we want to make data-driven decisions are clear.  The challenge, in our cash-strapped, resource-shy environments, is to install, configure and manage the systems that will allow us to easily and efficiently analyze, report on and visualize the data.  This article will offer some insight into how that can be done, while being ever mindful that the money and time to invest are hard to come by.  But we’ll also point out where those investments can pay off in more ways than just the critical one: the ability to justify our mission-effectiveness.

Right off the bat, acknowledge that it might be a long-term project to get there.  But acknowledge as well that you are already collecting all sorts of data, and there is a lot more data available that can put your work in context.  The challenge is to implement new systems without wasting earlier investments, and to funnel data to a central repository for reporting, as opposed to re-entering it all into a redundant system.  Done correctly, this project should result in greater efficiency once it’s completed.

Consider these goals:

  • An integrated data management and reporting system that can easily output metrics in the formats that constituents and funders desire;
  • A streamlined process for managing data that increases the validity of the data entered while reducing the amount of data entry; and
  • A broader, shared understanding of the effectiveness of our strategic plans.

Here are the steps you can take to accomplish these goals.

Taking Inventory

The first step in building the system involves ferreting out all of the systems that you store data in today.  These will likely be applications, like case or client management systems, finance databases, human resources systems and constituent relationship management (CRM) systems.  It will also include Access databases, Excel spreadsheets, Word documents, email, and, of course, paper.  In most organizations (and this isn’t limited to nonprofits), data isn’t centrally managed.  It’s stored by application and/or department, and by individuals.

The challenge is to identify the data that you need to report on, wherever it might be hidden, and catalogue it. Write down what it is, where it is, what format it is in, and who maintains it.  Catalogue your information security: what content is subject to limited availability within the company (e.g., HR data and HIPAA-related information)? What can be seen organization-wide? What can be seen by the public?

Traditionally, companies have defaulted to securing data by department. While this offers a high level of security, it can stifle collaboration and result in data sprawl, as copies of secured documents are printed and emailed to those who need to see the information but don’t have access. Consider a data strategy that keeps most things public (within the organization) and only secures documents when there is a clear reason to do so.

You’ll likely find a fair amount of redundant data.  This, in particular, should be catalogued.  For example, say that you work at a social services organization.  When a new client comes on, they’re entered into the case management system, the CRM, a learning management system, and a security system database, because you’ve given them some kind of access card. Key to our data management strategy is to identify redundant data entry and remove it.  We should be able to enter this client information once and have it automatically replicated in the other systems.
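
A small sketch of what “enter once, replicate everywhere” can look like in code follows. The “systems” here are stand-in stubs, not real products; in practice, each would be reached through its own API or a connector on an integration platform:

```python
class System:
    """A stand-in for one of the systems a new client must be entered into."""
    def __init__(self, name):
        self.name = name
        self.records = []

    def create(self, client):
        self.records.append(client)   # stand-in for a real API call
        print(f"{self.name}: added {client['name']}")

downstream = [System("case management"), System("CRM"),
              System("learning management"), System("security badge database")]

def add_client(client):
    """Enter the client once; replicate the record to every downstream system."""
    for system in downstream:
        system.create(client)

add_client({"name": "Jane Doe", "intake_date": "2014-01-15"})
```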

Systems Integration

Chances are, of course, that all of your data is not in one system, and the systems that you do have (finance, CRM, etc.) don’t easily integrate with each other.  The first question to ask is: how are we going to get all of our systems to share with each other? One approach, of course, is to replace all of your separate databases with one database.  Fortune 500 companies use products from Oracle and SAP to do this, systems that incorporate finance, HR, CRM and inventory management.  Chances are that these will not work at your nonprofit; the software is expensive, and so are the developers who know how to customize it.  More affordable options exist from companies like Microsoft, Salesforce, NetSuite and IBM, at special pricing for 501(c)(3)’s.

Data Platforms

A data platform is one of these systems that stores your data in a single database but offers multiple ways of working with the data.  For example, a NetSuite platform can handle your finance, HR, CRM/donor management and e-commerce without maintaining separate data stores, allowing you to report on combined metrics on things like fundraiser effectiveness (donor management and HR) and mail vs. online donations (e-commerce and donor management).  Microsoft’s solution will incorporate separate products, such as SharePoint, Dynamics CRM, and the Dynamics ERP applications (HR, finance).  Solutions like Salesforce and NetSuite are cloud only, whereas Microsoft and IBM can be installed locally or run from the cloud.

Getting from here to there

Of course, replacing all of your key systems overnight is neither a likely option nor an advisable one.  Change like this has to be implemented over a period of time, possibly spanning years (for larger organizations where the system changes will be costly and complex). As part of the earlier system evaluation, you’ll want to factor in the state of each system.  Are some approaching obsolescence?  Are some not meeting your needs? Prioritize based on the natural life of the existing systems and the particular business requirements. Replacing major data systems can be difficult and complex — the point isn’t to gloss over this.  You need to have a strong plan that factors in budget, resources, and change management.  Replacing too many systems too quickly can overwhelm both the staff implementing the change and the users of the systems being changed.  If you don’t have executive-level IT staff on board, working with consultants to accomplish this is highly recommended.

Business Process Mapping

The success of the conversion is less dependent on the platform you choose than on the way you configure it.  Systems optimize and streamline data management; they don’t manage the data for you.  In order to ensure that this investment is realized, a prerequisite investment is one in understanding how you currently work with data and optimizing those processes for the new platform.

To do this, take a look at the key reports and types of information in the list that you compiled and draw the process that produces each piece, whether it’s a report, a chart, a list of addresses or a board report.  Drawing processes, aka business process mapping, is best done with a flowcharting tool, such as Microsoft Visio.  A simple process map will look like this:

[Example business process map]

In particular, look at the processes that are being done on paper, in Word, or in Excel that would benefit from being in a database.  Aggregating information from individual documents is laborious; the goal is to store data in the data platform and make it available for combined reporting.  If today’s process involves cataloguing data in a word processing table or a spreadsheet, then you will want to identify a data platform table that will store that information in the future.
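
As a minimal sketch of that migration, assuming the data tracked in a spreadsheet today can be exported as a CSV file, a few lines of Python can move it into a central database table where it becomes available for combined reporting. The file and column names here are illustrative:

```python
import csv
import sqlite3

conn = sqlite3.connect("data_platform.db")
conn.execute("""CREATE TABLE IF NOT EXISTS donations
                (donor TEXT, amount REAL, received DATE)""")

# Load the exported spreadsheet into the central table.
with open("donations.csv", newline="") as f:
    rows = [(r["donor"], float(r["amount"]), r["received"])
            for r in csv.DictReader(f)]
conn.executemany("INSERT INTO donations VALUES (?, ?, ?)", rows)
conn.commit()

# Aggregation that was laborious in the spreadsheet is now a single query.
for donor, total in conn.execute(
        "SELECT donor, SUM(amount) FROM donations GROUP BY donor"):
    print(donor, total)
```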

Design Considerations

Once you have catalogued your data stores and the processes in place to interact with the data, and you’ve identified the key relationships between sets of data and improved processes that reduce redundancy, improve data integrity and automate repetitive tasks, you can begin designing the data platform.  This is likely best done with consulting help from vendors who have both expertise in the platform and knowledge of your business objectives and practices.

As much as possible, try to use the built-in functionality of the platform, as opposed to custom programming.  A solid CRM like Salesforce or MS CRM will let you create custom objects that map to your data and then allow you to input, manage, and report on the data that is stored in them without resorting to actual programming in Java or .NET languages.  Once you start developing new interfaces and adding functionality that isn’t native to the platform, things become more difficult to support.  Custom training is required; developers have to be able to fully document what they’ve done, or swear that they’ll never quit, be laid off, or get hit by a bus. And you have to be sure that the data platform vendor won’t release updates that break the home-grown components.

Conclusion

The end game is to have one place where all staff working with your information can sign on and work with the data, without worrying about which version is current or where everything might have been stored.  Ideally, it will be a cloud platform that allows secure access from any internet-accessible location, with mobile apps as well as browser-based access.  Further considerations might include restricted access for key constituents and integration with document management systems and business intelligence tools. But key to the effort is a systematic approach that includes a deep investment in taking stock of your needs and understanding what the system will do for you before the first keypress or mouse click occurs, and patience, so that you get it all and get it right.  It’s not an impossible dream.

 

Using RSS Tools to Feed Your Information Needs

This article was originally published at Idealware in March of 2009.

The Internet gives you access to a virtual smorgasbord of information. From the consequential to the trivial, the astonishing to the mundane, it’s all within your reach. This means you can keep up with the headlines, policies, trends, and tools that interest your nonprofit, and keep informed about what people are saying about your organization online. But the sheer volume of information can pose challenges, too: namely, how do you separate the useful data from all the rest? One way is to use RSS, which brings the information you want to you.

Many of the Web sites that interest you are syndicated. With RSS, or Really Simple Syndication, you subscribe to them, and when they’re updated, the content is delivered to you — much like a daily newspaper, except you choose the content. On the Web, you can not only get most of what the newspapers offer, but also additional, vital information that informs your organizational and mission-related strategies. You subscribe only to the articles and features that you want to read. It’s absolutely free, and the only difficult part is deciding what to do with all the time you used to spend surfing.

Since TechSoup first published RSS for Nonprofits, there has been an explosion of tools that support RSS use. There are now almost as many ways to view RSS data as there are types of information to manage. Effective use of RSS means determining how you want your information served. What kind of consumer are you? What type of tool will help you manage your information most efficiently, day in and day out? Read on to learn more.

What’s on the Menu?

You probably already check a set of information sources regularly. The first step in considering your RSS needs is to take stock of what you are already reading, and what additional sources you’d like to follow. Some of that information may already be in your browser’s lists of Bookmarks or Favorites, but consider seeking out recommendations from trusted industry sources, friends, and co-workers as well. As you review the Web sites that you’ve identified as important, check them to make sure you can subscribe to them using RSS. You can find this out by looking for “subscribe” options on the Web page itself, or for an orange or blue feed icon resembling a radio signal in the right side of your Web browser’s address bar.
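
For the curious, that feed icon usually reflects a <link> tag that syndicated pages place in their HTML. Here’s a small Python sketch of that same check done programmatically; the URL is a placeholder:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class FeedFinder(HTMLParser):
    """Collect the RSS/Atom feed URLs a page advertises in its <head>."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("type") in (
                "application/rss+xml", "application/atom+xml"):
            self.feeds.append(a.get("href"))

finder = FeedFinder()
finder.feed(urlopen("http://example.org/").read().decode("utf-8", "ignore"))
print(finder.feeds)  # any advertised feed URLs, or [] if the site isn't syndicated
```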

Consider the whole range of information that people are providing in this format. Some examples are:

  • News feeds, from traditional news sources or other nonprofits.
  • Blogs, particularly those that might mention or inform your mission.
  • Updates from social networking sites like Facebook or MySpace (for instance, through FriendFeed).
  • Podcasts and videos.
  • Updates from your own software applications, such as notifications of edits on documents from a document management system, or interactions with a donor from your CRM. (Newer applications support this.)
  • Information from technical support forums and discussion boards.
  • All sorts of regularly updated data, such as U.S. Census information, job listings, classified ads, or even TV listings and comic strips.

 

You can get a good idea of what’s out there and what’s popular by browsing the recommendations at Yahoo! Directory or iGoogle, while a tool like PostRank can help you analyze feeds and determine which are valuable.

RSS also shines as a tool for monitoring your organization and your cause on the Web. For instance, Google Alerts lets you subscribe, for free, to RSS updates that notify you when a particular word or phrase is used on the Web. (To learn more about “listening” to what others are saying about your organization online, see We Are Media’s wiki article on online listening.)

How Hungry Are You?

Dining options abound: you can order take-out, or go out to eat; you can snack on the go, or take all your meals at home; you can pick at your food, or savor each bite. Your options for RSS reading are equally diverse, and you’ll want to think carefully about your own priorities. Before choosing the tool or tools that suit you, ask some questions about the information you plan to track.

  • How much information is it? Do you follow a few blogs that are updated weekly? Or news feeds, like the New York Times or Huffington Post, which are updated 50 to 200 times a day?
  • How intently do you need to monitor this information? Do you generally want to pore over every word of this information, or just scan for the tidbits that are relevant to you? Is it a problem if you miss some items?
  • Are you generally Web-enabled? Can you use a tool over the Internet, as opposed to one installed on your desktop?
  • Do you jump from one computer to another? Do your feeds need to be synchronized so you can access them from multiple locations?
  • Is this information disposable, or will it need to be archived? Do you read articles, perhaps email the link to a colleague, and then forget about it? Or do you want to archive items of particular interest so you can find them in the future?
  • Will you refer a lot of this information to co-workers or constituents? Would you like to be able to forward items via email, or publish favorites to a Web page?
  • Do you need mobile access to the information? Will you want to be able to see all your feeds from a smartphone, on the run?

Enjoying the Meal

Once you have a solid understanding of your information needs, it’s time to consider the type of tool that you want to use to gather your information. First, let’s look at the terminology:

  • An Article (or Item) is a bit of information, such as a news story, blog entry, job listing or podcast.
  • A Feed is a collection of articles from a single source (such as a blog or Web site).
  • An Aggregated Feed is a collection of articles from numerous feeds displayed together in one folder.
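
To see those terms in practice, here’s a minimal Python sketch using the third-party feedparser library (not part of the standard library; installed separately). The feed URLs are placeholders:

```python
import feedparser

feed_urls = ["http://example.org/blog/rss",      # one Feed per source
             "http://example.org/news/atom"]

aggregated = []                                  # the "aggregated feed"
for url in feed_urls:
    feed = feedparser.parse(url)                 # one Feed...
    for entry in feed.entries:                   # ...and its Articles/Items
        aggregated.append((entry.get("published", ""), entry.get("title", "")))

# Show all articles from all sources together, newest first.
for published, title in sorted(aggregated, reverse=True):
    print(published, title)
```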

So, what RSS options are available?

Tickers

Like the “crawl” at the bottom of CNN or MSNBC television broadcasts, RSS tickers show an automatically scrolling display of the titles of articles from your RSS feeds. Tickers can be a useful way to casually view news and updates. They’re a poor choice for items that you don’t want to miss, though, as key updates might move past when you’re not paying attention.

Snackr. For a very TV-news-like experience, try Snackr, an Adobe Air application. You can load up a variety of feeds which scroll in an aggregated stream across your desktop while you work.

Gmail users can use the email browser’s Web Clips feature to create a rotating display of RSS headlines above their inbox and messages. Because Gmail is Web-based, your headlines will be available from any computer.

Web Browsers

Your current Web browser — such as Internet Explorer (IE) or Firefox — can likely act as a simple RSS reader, with varying functionality depending on the browser and browser version. Browsers can either display feeds using their built-in viewers, or associate Web pages in RSS format with an installed RSS Feed Reader (much as files ending in “.doc” are associated with Microsoft Word). Even without an installed feed reader, clicking on the link to an RSS feed will typically display the articles in a readable fashion, formatting the items attractively and adding links and search options that assist in article navigation. This works in most modern browsers (IE7 and up, Firefox 2 and up, Safari and Opera). If your browser doesn’t understand feeds, then they will display as hard-to-read, XML-formatted code.

Firefox also supports plug-ins like Wizz RSS News Reader and Sage, which integrate with the browser’s bookmarks so that you can read feeds one at a time by browsing recent entries from the bookmark menu.

Portals

Portals, like iGoogle, My Yahoo!, and Netvibes, are Web sites that provide quick access to search, email, calendars, stocks, RSS feeds, and more. The information is usually presented in a set of boxes on the page, with one box per piece of information. While each RSS feed is typically displayed in a separate box, you can show as many feeds as you like on a single page. This is a step up from a ticker or standard Web browser interface, where you can only see one feed at a time.

Email Browsers

As many of us spend a lot of time dealing with email, your email browser can be a convenient place to read your RSS feeds. Depending on what email browser you use, RSS feeds can often be integrated as additional folders. Each RSS feed that you subscribe to appears as a separate email folder, and each article as a message. You can’t, of course, reply to RSS articles — but you can forward and quote them, or arrange them in subfolders by topic.

If you use Microsoft Outlook or Outlook Express, the very latest versions (Vista’s Windows Mail and Outlook 2007) have built-in feed reading features. (Earlier versions of Outlook can support this through powerful, free add-ons, such as RSS Popper and Attensa.)

Mozilla’s Thunderbird email application and Yahoo! Mail also allow you to subscribe to RSS feeds. Gmail doesn’t, however, as Google assumes that you’ll use the powerful Google Reader application (discussed below) to manage your feeds.

RSS Feed Readers

A key advantage of full-featured feed readers is that you can tag and archive important information for quick retrieval. The best ones let you easily filter out items you have already read, mark the articles that are important to you so that you can easily return to them later (kind of like TiVo for the Web), and easily change your view between individual feeds and collections of feeds.

In practice, feed readers make it easy to quickly scan many different sources of information and pick out the items that are worth reading. This is a much more efficient way to process new information on the Web than visiting sites individually, or even subscribing to them with a tool that doesn’t support aggregation, like a Web browser or portal.

Feed Readers come in two primary flavors, offline and online. Offline feed readers are Windows, Mac, or Linux applications that collect articles from your feeds when you’re online, store them on your computer, and allow you to read them at any time. Online feed readers are Web sites that store articles on the Internet, along with your history and preferences. The primary difference between an online and an offline reader is the state of synchronization. An online reader will keep track of what you’ve read, no matter what computer or device that you access it from, whereas an offline reader will only update your status on the machine that it’s installed on.

Offline feed readers, such as FeedDemon (for PCs) and Vienna (for Macs), allow you to subscribe to as many feeds as you like and keep them updated, organized and manageable. During installation, they will register as the default application for RSS links in your browser, so that subscribing to new sites is as easy as clicking on an RSS icon on a Web page and confirming that you want to subscribe to it.

Online feed readers, such as Google Reader or NewsGator, offer most of the same benefits as desktop readers. While offline readers collect articles at regular intervals and copy them to your PC, online readers store all of the feeds at their Web site, and you access them with any Web browser. This means that feeds are often updated more frequently, and you can access your account — with all your RSS feeds, markings, and settings intact — from any computer. You could be home, at the office, on a smartphone, or in an Internet cafe. The products mentioned even emulate offline use. NewsGator can be synchronized with its companion offline reader FeedDemon, and Google Reader has an offline mode supported by Google Gears.

Online Readers also provide a social aspect to feed reading. Both Google Reader and NewsGator allow you to mark and republish items that you want to share with others. NewsGator does this by letting you create your own feeds to share, while Google Reader lets you subscribe to other Google Reader users’ shared items. Google Reader also lets you tag Web pages that you find outside of Google Reader and save them to your personal and shared lists. If your team members don’t do RSS, Google has that covered as well — your shared items can also be published to a standalone Web page that others can visit. You can, of course, email articles from an offline reader, but any more sophisticated sharing will require an online reader.

For many of us, mining data on the Web isn’t a personal pursuit — we’re looking to share our research with co-workers and colleagues. This ability to not only do your own research, but share valuable content with others, ultimately results in a more refined RSS experience, as members of a given community stake their own areas of expertise and share highlights with each other.

Online readers are less intuitive than offline ones, however, when it comes to subscribing to new feeds. While an offline reader can automatically add a feed when you click on it, online readers will require you to take another step or two (for instance, clicking an “Add” button in your browser’s toolbar). You’re also likely to have a more difficult time connecting to a secure feed, like a list of incoming donations from your donor database, with an online reader than you would with an offline one.

The online feed readers are moving beyond the question of “How do I manage all of my information?” to “How do I share items of value with my network?”, allowing us to not only get a handle on important news, views, and information, but to act as conduits for the valuable stuff. This adds a dimension we could call “information crowd-sourcing,” where discerning what’s important and relevant to us within the daily buffet of online information becomes a community activity.


In Summary

RSS isn’t just another Internet trend — it’s a way to conquer overload without sacrificing the information. It’s an answer to the problem that the Web created: If there’s so much information out there, how do you separate the wheat from the chaff? RSS is a straightforward solution: Pick your format, sit back, and let the information feast come to you.


Thanks to TechSoup for their financial support of this article. Marshall Kirkpatrick of ReadWriteWeb, Laura Quinn of Idealware, Thomas Taylor of the Greater Philadelphia Cultural Alliance, and Marnie Webb of TechSoup Global also contributed to this article.


Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

The Perfect Fit: A Guide To Evaluating And Purchasing Major Software Systems

This article was originally published at Idealware in September of 2008.

A major software package shouldn’t be chosen lightly. In this detailed guide, Peter Campbell walks through how to find software options, evaluate them, make a good decision, and then purchase the system in a way that protects you.

A smart shopper evaluates the item they want to purchase before putting money down. You wouldn’t shop for shoes without checking the size and taking a stroll up and down the aisle in order to make sure they fit, would you? So what’s the equivalent process for trying a software package on for size? How can you make sure your substantial software purchase won’t leave you sore and blistered after the cash has been exchanged?

That’s the goal of this article—to provide some guidance for properly evaluating major software investments. We’ll walk through how to find potential software options, gather the detailed information you need to evaluate them, make a solid decision and purchase a package in a way that protects you if it doesn’t do what you hoped it would for you.

Is it A Major Software System?

The evaluation process described here is detailed, so it’s probably not cost effective to apply it to every software tool and utility you purchase. How do you know if the package you’re considering is major enough to qualify? Major systems have a dramatic impact on your ability to operate and achieve your mission—they aren’t measured by budget, they’re measured by impact.

To help identify a major purchase, ask yourself:

  • Will the application be used by a significant percentage of your staff?
  • Will multiple departments or organizational units be using it?
  • Will this software integrate with other data systems?
  • If this software becomes unstable or unusable once deployed, will it have significant impact on your nonprofit’s ability to operate?

Giving significant attention to these types of major purchases is likely to save your organization time in the long run.

 

Taking Preliminary Measurements

Prior to even looking at available software options, make sure you thoroughly define your needs and what the application you select should be able to do for you. Nonprofits are process-driven. They receive, acknowledge, deposit and track donations; they identify, serve and record transactions with clients; and they recruit, hire and manage employees. Technology facilitates the way your organization manages these processes. A successful software installation will make this work easier, more streamlined and more effective. But a new system that doesn’t take your processes and needs into account will only make running your organization more difficult.

So it’s critical that, before you begin looking for that donor database or client-tracking system, you clearly understand the processes that need to be supported and the software features critical to support that work.

This is an important and complex area that could easily be an article—or a book—in its own right. We could also write numerous articles that delve into project management, getting company buy-in and change management—all critical factors in organizational readiness. However, for the purposes of this article, we’re focusing on the process of evaluating and purchasing software once you’ve already identified your needs and prepped the organization for the project.

Finding the Available Options

Once you know what you need and why you need it, the next step is to identify the pool of applications that might fit. An expert consultant can be a huge help. A consultant who knows the market and is familiar with how the systems are working for other nonprofits can save you research time, and can direct you to systems more likely to meet your true needs. While a consultant can be more expensive than going it alone, money spent up front on the selection and planning phases is almost always recouped through lower costs and greater efficiency down the road.

If a consultant isn’t warranted, take advantage of the resources available to the nonprofit community, such as Idealware, Social Source Commons, Techsoup’s forums, or NTEN’s surveys. Ask your peers what they’re using, how they like it, and why. Ideally, you want to identify no fewer than three, and probably no more than eight, suitable products to evaluate.

Considering an RFP

With your list of possible software candidates in hand, the next step is to find out more about how those packages meet your needs. This is traditionally done through a Request for Proposal (RFP), a document that describes your environment and asks for the information you need to know about the products you’re evaluating.

Well-written RFPs can be extremely valuable for understanding the objective aspects of large software purchases. For example, if you are looking for a Web site content management system (CMS), questions such as “does the blogging feature support trackbacks?” or “Can the CMS display individualized content based on cookie or user authentication?” are good ones for an RFP.

What you want from the RFP is information you can track with checkboxes. For example, “It can/can’t do this,” “It can/can’t export to these formats: XML, SQL, CSV, PDF,” or “They can program in PHP and Ruby, but not Java or Cold Fusion.” Questions that encourage vendors to answer unambiguously, with answers that can be compared in a simple matrix, will be useful for assessing and documenting the system capabilities.
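
Here’s an illustrative sketch of such a comparison matrix; the vendors, features, and answers below are made up:

```python
features = ["exports CSV", "exports XML", "supports trackbacks", "PHP support"]
responses = {
    "Vendor A": {"exports CSV": True, "exports XML": True,
                 "supports trackbacks": False, "PHP support": True},
    "Vendor B": {"exports CSV": True, "exports XML": False,
                 "supports trackbacks": True, "PHP support": True},
}

# Print a simple yes/no grid, one row per feature, one column per vendor.
print(f"{'Feature':<22}" + "".join(f"{vendor:>10}" for vendor in responses))
for feature in features:
    row = f"{feature:<22}"
    for vendor in responses:
        row += f"{'yes' if responses[vendor][feature] else 'no':>10}"
    print(row)
```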

An RFP can’t address all the concerns you’re likely to have. Subjective questions like “How user-friendly is your system?” or “Please describe your support” are unlikely to be answered meaningfully through an RFP process.

Certainly, you can arrange for demonstrations, and use that opportunity to ask your questions without going through an RFP process. But while the formality of an RFP might seem unnecessary, there are some key reasons for getting your critical questions answered in writing:

  • You can objectively assess the responses and rule out clearly unsuitable applications early, saving time later in the process.
  • A more casual phone or demo approach might result in different questions being asked of, and answered by, different vendors. An RFP process puts all of the applications and vendors on a level playing field for assessment.
  • The RFP responses of the vendor you select are routinely attached to the signed contract. An all-too-common scenario is that the vendor answers all of your questions with “yes, yes, yes,” but the answers change once you start to implement the software. If you don’t have written assurances that the software will do what you require, you won’t have solid legal footing to void the contract.

Structuring Your RFP

An RFP works well as a four-section document. Below, we walk through each of those sections.

Introduction

The introduction provides a summary of your organization, its mission, and the purpose of the RFP.

Background

The background section provides context the vendor will need to understand your situation. Consider including a description of your organization—for instance, number of locations, number of staff and organizational structure, the processes the system should support, and such technology infrastructure as network operating system(s) and other core software packages. Include any upcoming projects that might be relevant.

Questionnaire

The questionnaire is the critical piece of the document—you want to be sure you ask all of the questions that you need answered. In preparing these questions, it’s best to envision what the vendor responses might look like. What will have to be in those responses for you to properly assess them? Consider asking about:

  • Functionality. In order to get answers you’ll be able to compare, ask your questions at a granular level. Does a CRM support householding? Does a donor database have a method for storing soft credits? Can multiple users maintain and view records of donor interactions? Can alerts or notifications be programmed in response to particular events? Use the results of your business requirements work to focus in on the functions that are critical to you and your more unusual needs.
  • Technology specifics. Make sure the software will integrate properly with other applications, that the reporting is robust and customizable by end users, and that the platform is well-supported. Ask which formats data can be exported to and imported from, how many tables can be queried simultaneously and what type of support is available—both from the vendor and third parties. Ask for a data dictionary, which a technical staffer or consultant can review, because a poorly designed database will complicate reporting and integration. And ask for a product roadmap. If the next version is going to be a complete rewrite of the application, you might want to rule out the current version for consideration.
  • Company information. Think through what you’ll want to know about the company itself. How big is it? Do they have an office near you? How long have they been in business? Are they public or private? Can they provide some documentation of financial viability? Who are the staff members that would be assigned to your project? References from similar clients with similar-scope projects can also be very useful. For more information on this area, see Idealware’s article Vendors as Allies: How to Evaluate Viability, Service, and Commitment.
  • Pricing and availability. What are their hourly rates, broken down by role, if applicable? What are their payment terms? What is their total estimate for the project as described? How do they handle changes in project scope that might arise during implementation? What are their incidental rates and policies (travel, meals)? Do they discount their services or software costs for 501(c)(3)s? How long do they estimate this project will take? When are they available to start?


While it’s important to be thorough, don’t ask a lot of questions you don’t plan to actually use to evaluate the systems. Asking questions “just in case” increases the amount of information you’ll need to sift through later, and increases the possibility that vendors might decide your RFP isn’t worth the time to respond to.

Instructions

Close with a deadline and details about how to submit replies. For a sizeable RFP, allow a minimum of four to six weeks for a response. Remember that this isn’t a confrontational process—a good vendor will appreciate and want to work with a client that has thought things out this well, and the questionnaire is also an opportunity for them to understand the project up front and determine their suitability for it. Respect their schedules and give them ample time to provide a detailed response.

Include an indication as to how additional questions will be handled. In general, if one vendor asks for clarification or details, your answers should be shared with all of the RFP participants. You want to keep things on a level playing field, and not give one vendor an advantage over the rest. You might do this via a group Q&A, with all the vendors invited to participate in a meeting or conference call after the RFP has been sent to them but well before they are due to respond. With all vendors asking their questions in the same room, you keep them all equally informed. Alternatively, you can specify a deadline by which written questions must be submitted. All participants would then receive the questions and answers.

Evaluating the Answers

Once you receive RFP responses, you’ll need to winnow down your list to determine which packages you’d like to demo.

If you asked straightforward, granular questions, you’ll now reap the benefit: you can set up a comparative matrix. Create a table or spreadsheet with columns for each vendor and rows for each question, summarizing the responses as much as possible in order to have a readable chart. You might add columns that weight the responses, both on the suitability of the vendor’s response (e.g. 1, unacceptable; 2, fair; 3, excellent) and/or on the importance of the question (for instance, some features are going to be much more important to you than others).

Going through the features and technology sections, you’ll see the strong and weak points of the applications. In determining which fit your needs, there will likely be some trade-offs—perhaps one application has a stronger model for handling soft credits, but another has more flexible reporting. It’s unlikely that any will jump out as the perfect application, but you’ll be able to determine which are generally suitable, and which aren’t.

For example, if you’re looking for software to manage your e-commerce activities, inventory management might be a critical function for you. If a submitted software package lacks that feature, then you’ll need to eliminate it.  As long as you understand your own critical needs, the RFP responses will identify unsuitable candidates.

You might rule out a vendor or two based on what the RFP response tells you about their availability or company stability. Take care, though, in eliminating vendors based on their RFP pricing information: project estimates can be very subjective. Before determining that a vendor is too pricey based on their estimate, dig deeper—other vendors might be underestimating the actual cost. If you feel you have a solid grasp on the project timeline, use the hourly rates as a more significant measurement.

The RFP responses will tell you a lot about the vendors. You’re asking questions that are important to your ability to operate. Their ability to read, comprehend and reasonably reply to those questions will offer a strong indication as to how important your business is to them, and whether they’ll consider your needs as the software is implemented and into the future. If they respond (as many will) to your critical questions with incomplete answers, or with stacks of pre-printed literature—saying, in effect, “the answers are in here”—then they’re telling you they won’t take a lot of time to address your concerns.

Keep in mind, though, that a weak sales representative might not mean a weak vendor, particularly if they’re representing a product that comes recommended or looks particularly suitable on all other fronts. It’s acceptable to reject the response and ask the vendor to resubmit if you really feel they have done you, and themselves, a disservice—but temper this with the knowledge that they blew it the first time.

Trying It All On for Size

At this point the process will hopefully have narrowed the field of potential applications down to three to five options. The next step is to schedule software demos. A well-written RFP will offer important, factual and comprehensive details about the application that might otherwise be missed, either by too narrow a demo or by one the vendor orchestrates to highlight product strengths and gloss over weaknesses. But the demos serve many additional purposes:

  • Evaluating look and feel. As good as the specs might look, you’ll know quickly in a demo if an application is really unusable. For instance, an application might technically have that great Zip code lookup feature you asked about in the RFP, but it may be implemented in a way that makes it a pain to use. Prior to the demo, try to present the vendors with a script of the functions you want to see. It can also be useful to provide them with sample data, if they are willing—evaluating a program with data similar to your own data will be less distracting. Be careful not to provide them with actual data that might compromise your—or your constituents’—privacy and security. The goal is to provide a level and familiar experience that unifies the demos and puts you in the driver’s seat, not the vendor.
  • Cross training. The demo is another opportunity for the vendor to educate you regarding the operating assumptions of the software, and for you to provide them with more insight into your needs. A generic donor management system is likely to make very good assumptions about how you track individuals, offer powerful tools for segmentation and include good canned reports, because the donor-courting processes are very similar. But in less standardized areas—or if you have more unusual needs—the model used by the software application can differ dramatically from your internal process, making it difficult for your organization to use. Use the demo to learn how the software will address your own process and less conventional needs.
  • Internal training. Even more valuable is the opportunity to use the demos to show internal staff what they’ll be able to do with the software. Demos are such a good opportunity to get staff thinking about the application of technology that you should pack the room with as many people as you can. Get a good mix of key decision-makers and application end-users—the people who design and perform the business processes the software facilitates. The people who will actually use the software are the ones who can really tell if the package will work for them.

Making the Decision

With luck, your vendor selection process will now be complete, with one package clearly identified as the best option. If key constituents are torn between two options or unimpressed with the lot, senior decision-makers might have to make the call. Be careful, however, not to alienate a group of people whose commitment and enthusiasm for the project might be needed.

If none of the applications you evaluated completely meets your needs, but one comes close, you might consider customizations or software modifications to address the missing areas. Note that any alterations of the basic software package will likely be costly, will not be covered in the packaged documentation and help files, and might break if and when you upgrade the software. Be very sure there isn’t an alternate, built-in way to accomplish your goal. If the modification is justified, make sure it’s done in such a way that it won’t be too difficult to support as the software continues to be developed.

Before making a final decision, you should always check vendor references, but take them with a healthy grain of salt. An organization’s satisfaction with software depends not only on how well it meets their needs, but how familiar they are with their options—there are a lot of people who are happy using difficult, labor-heavy, limited applications simply because they don’t know there are better alternatives.

If you still have a tie after RFPs, demos and reference checks, the best next step is to conduct on-site visits with an existing customer for each software package. As with demos, bring a representative group of management, technical staff and users. Assuming the reference can afford the time to speak with you, the visit will highlight how the software meets their needs, and will give you a good, real world look at its strengths and weaknesses. You’ll also likely walk away with new ideas as to how you might use it.

Signing on the Dotted Line

You’ve selected an application. Congratulations! You might be tired, but you aren’t finished yet. You still need to work with the vendor to define the scope of the engagement, and an agreement that will cover you in case of problems. A good contract clearly articulates and codifies everything that has been discussed to date into a legally binding agreement. If, down the road, the vendor isn’t living up to their promises, or the software can’t do what you were told it would do, then this is your recourse for getting out of an expensive project.

Contract negotiations can take time. It’s far more dangerous to sign a bad contract in the interest of expediency, though, than it is to delay a project while you ensure that both parties—you and the vendor—completely understand each other’s requirements. Don’t start planning the project until the papers have been signed.

A software contract should include a number of parts, including the actual agreement, the license, the scope of work and the RFP.

The Agreement

This is the legal document itself, with all of the mumbo jumbo about force majeure and indemnity. The key things to look for here are:

  • Equal terms and penalties. Are terms and penalties equally assessed? Vendors will write all sorts of terms into contracts that outline what you will do or pay if you don’t live up to your end of the agreement. But they’ll often leave out any equivalent controls on their behavior. You should find every “if this happens, customer will do this” clause and make sure the conditions are acceptable, and that there are complementary terms specified for the vendor’s actions.
  • Reasonable cancellation penalties. If there are penalties defined for canceling a consulting or integration contract, these should not be exorbitant. It’s reasonable for the vendor to impose a limited penalty to cover expenses incurred in anticipation of scheduled work, such as airfare purchased or materials procured. But unless this is a fixed cost agreement, which is highly unusual, don’t let them impose penalties for work they don’t have to do—for example, for a large percentage of the estimated project cost.
  • Agreement under the laws of a sensible state. If the vendor is in California, and you’re in California, then the agreement should be covered by California laws rather than some random other state. In particular, Virginia’s laws highly favor software companies and vendors. In most cases, you want the jurisdiction to be where you live, or at least where the vendor’s headquarters actually are.

The Software License

The license specifies the allowed uses of the software you’re purchasing. This, too, can contain some unacceptable conditions.

  • Use of your data. A software license should not restrict your rights to access or work with your data in any way you see fit. The license agreement will likely contain conditions under which the software warranty would be voided. It’s perfectly acceptable for a commercial software vendor to bar re-engineering their product, but it’s not acceptable for them to void the warranty if you are only modifying the data contained within the system. So conditions that bar the exporting, importing, archiving or mass updating of data should be challenged. If the system is hosted, the vendor should provide full access to your data, and the license should include language providing that the client shall have reasonable access for using, copying and backing up all customer information in the database. There should be no language in the contract implying that the vendor owns your data, or that they can use it for any additional purposes.
  • Responsibility avoidance. Software warranties should not include blanket “software provider is not responsible if nothing works” statements. This shouldn’t need to be said, but, sadly, there are often warranty sections in license agreements that say just that.
  • Back doors. The license should not allow for any post-sale reversals of licensing, such as language stating that the contract will be void if the customer uses the software in perfectly reasonable ways the vendor didn’t anticipate. For instance, if you want to use the CRM functions of your donor database to track contacts that aren’t potential donors, you shouldn’t sign a contract limiting use of the software to “fundraising purposes.” Also, there should not be any “back doors” programmed into the application that the vendor can maintain for purposes of disabling the software.

The Scope of Work

The Scope of Work (SOW) describes exactly what the project will consist of. It’s an agreement between the vendor and the customer as to what will happen, when, and how long it will take. Good scopes include estimates of hours and costs by task and/or stage of the project. The scope should be attached as a governing exhibit to the contract. Usually, this is negotiated prior to receiving the actual contract. By having it attached to the contract, the vendor is now legally obligated to, basically, do what they said they would do.

The RFP

Like the Scope of Work, the RFP should also be attached as a governing document that assures that the software does what the vendor claimed it would.

In Conclusion

For big-ticket purchases, it’s well worth having an attorney review or assist in negotiations. Keep in mind that the goal is to end up with a contract that equally defends the rights of both parties. True success, of course, is a solid contract that is never revisited after signing. Litigation doesn’t serve anyone’s interest.

Bringing It Home

There’s a lot of talk and plenty of examples of technology jumpstarting an organization’s effectiveness. But if someone were to do the tally, there would probably be more stories of the reverse. All too often, organizations make decisions about their software based on uninformed recommendations or quick evaluations of the prospective solutions. Decisions are often based more on expediency than educated selection.

Rushing a major investment can be a critical error. Learn about the available options, thoroughly assess their suitability to your needs and prepare your staff to make the most of them. Then, sign a contract that protects you if, after all else is done, the application and/or vendor fails to live up to the promises. Finding the right application and setting it up to support, not inhibit, your workflow is a matter of finding something that really fits. You can’t do that with your eyes closed.


For More Information

Vendors as Allies: How to Evaluate Viability, Service, and Commitment
An Idealware article on how to think specifically about the less-concrete aspects of software selection.

How To Find Data-Exchange-Friendly Software
An overview of how to ensure you’re going to be able to get data in and out of a software package. (For much more detailed considerations, see our Framework to Evaluate Data Exchange Features.)


Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

Robert Weiner and Steve Heye also contributed to this article.


XML, API, CSV, SOAP! Understanding The Alphabet Soup Of Data Exchange

This article was originally published at Idealware in October of 2007.

Let’s say you have two different software packages, and you’d like them to be able to share data. What would be involved? Can you link them so they exchange data automatically? And what do all those acronyms mean? Peter Campbell explains.

There has been a lot of talk lately about data integration, Application Programming Interfaces (APIs), and how important these are to non-profits. Much of this talk has focused on the major non-profit software packages from companies like Blackbaud, Salesforce.com, Convio, and Kintera. But what is it really about, and what does it mean to the typical org that has a donor database, a web site, and standard business applications for Finance, Human Resources and payroll? In this article, we’ll bypass all of the acronyms for a while and then put the most important ones into perspective.

The Situation

Nonprofits have technology systems, and they live and die by their ability to manage the data in those systems to effectively serve their missions. Unfortunately, however, nonprofits have a history of adopting technology without a plan for how different applications will share data. This isn’t unique to the nonprofit sector: throughout the business world, data integration is often underappreciated.
Here’s a simple example: Your mid-sized NPO has five fundraising staff people who together bring in $3,000,000 in donations every year. How much more would you bring in with six fundraising people? How much less with four? If you could tie your staffing cost data to hours worked and donations received, you would have a payroll-to-revenue metric that could inform critical staffing decisions. But if the payroll data is in an entirely different database from the revenue data, they can’t be easily compared.
Similarly, donations are often tracked in both a donor database and a financial system. If you’ve ever had to explain to the board why the two systems show different dollar amounts (perhaps because finance operates on a cash basis while fund development works on accrual), you can see the value in having systems that can reconcile these differences.

How can you solve these data integration challenges? Short of buying a system that tracks every piece of data you may ever need, data exchange is the only option. This process of communicating data from one system to another could be done by a straightforward manual method, like asking a staff member to export data from one system and import it into another. Alternatively, automatic data transfers can save on staff time and prevent trouble down the road – and they don’t have to be as complex as you might think.
What does it take to make a data exchange work? What is possible with your software applications? This article explains what you’ll need to consider.


Components of Data Exchange

Let’s get down to the nitty-gritty. You have two applications, and you’d like to integrate them to share data in some way: to pull data from one into another, or to exchange data in both directions. What has to happen? You’ll need four key components:

  • An Initiating Action. Things don’t happen without a reason, particularly in the world of programming. Some kind of triggering action is needed to start the data interchange process. For an automatic data exchange, this is likely to be either a timed process such as a scheduler kicking off a program at 2AM every night, or a user action – for instance, a visitor clicking the Submit button on your website form.
  • A Data Format. The data to be transferred needs to be stored and transferred in some kind of logical data format – for instance, a comma-delimited text file – that both systems can understand.
  • A Data Transfer Mechanism. If both applications reside on your own network, then a transfer is likely to be straightforward – perhaps you can just write a file to a location where another application can read it. But if one or both applications live offsite, you might need to develop a process that transfers the data over the internet.
  • A Transformation and Validation Process. Data rarely comes out of one system in exactly the shape another expects; it often needs to be reformatted, checked for errors, and screened for duplicates before it is loaded.

Let’s look at each of these components in more detail.


Initiating Action

An initiating action is what starts things rolling in the data exchange process. In most cases, it would take one of three forms:

  • Human Kickoff. If you’re manually exporting and importing files, or need to run a process on a schedule that’s hard to determine in advance, regular old human intervention can start the process. An administrator might download a file, run a command line program, or click a button in an admin interface.
  • Scheduler. Many data exchanges rely on a schedule – checking for new information every day, every hour, every five minutes, or some other period. These kinds of exchanges are initiated by a scheduler application. More complex applications might have a scheduling application built in, or might integrate with the Windows Scheduler or Unix/Linux cron (see the example after this list).
  • End User Action. If you want two applications to be constantly in synch, you’ll need to try to catch updates as they happen. Typically, this is done by initiating a data exchange based on some end user action, such as a visitor clicking the Submit button on an online donation form.
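
For example, on a Unix or Linux server, a single crontab entry can act as the scheduler. The script name here is hypothetical, standing in for whatever export or synchronization program your applications provide:

# Run a (hypothetical) export-and-import script at 2 AM every night
0 2 * * * /usr/local/bin/nightly_donor_sync.sh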


Data Formats

In order to transfer data from one system to another, the systems need to have a common understanding of how the data will be formatted. In the old days, things were pretty simple: you could store data in fixed format text files, or as bits of information with standard delimiting characters, commonly called CSV for “Comma Separated Values”. Today, we have a more dynamic format called XML (eXtensible Markup Language).
An example fixed format file could be made up of three lines, each 24 characters long:

Name (20)            Gender (1)  Age (3)
Susan                f           25
Mark                 m           37


A program receiving this data would have to be told the lengths and data types of each field, and programmed to receive data in that exact format.

The same data, stored as comma-separated values, would look like this:

"Susan","f",25
"Mark","m",37

CSV is easier to work with than fixed formats, because the receiving system doesn’t have to be as explicitly informed about the incoming data. CSV is almost universally supported by applications, but it poses challenges as well. What if your data has quotes and commas in it already? And as with fixed formats, the receiving system will still need to be programmed (or “mapped”) to know what type of data it’s receiving.
CSV is the de facto data format standard for one-time exports and data migration projects. However, automating CSV transfers requires additional programming – batch files or scripts that will work with a scheduling function. Newer standards, like XML, are web-based and work in browsers, allowing for a more dynamic relationship with the data sets and less external programming.
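
Most modern programming languages ship with CSV libraries that handle the quoting problem by escaping embedded quotes and commas. As a minimal sketch, here is Python’s built-in csv module writing and reading back a record that contains both (the names are invented):

import csv
import io

# One value contains an embedded quote and comma, invented for illustration
rows = [['O"Connor, Susan', "f", "25"], ["Mark", "m", "37"]]

buffer = io.StringIO()
csv.writer(buffer).writerows(rows)  # the writer escapes quotes and commas
print(buffer.getvalue())

buffer.seek(0)
for row in csv.reader(buffer):  # the reader restores the original values
    print(row)
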
The XML format is known as a “self-describing” format, which makes it a bit harder to look at but far easier to work with. The information about the data, such as field names and types, is encoded with the data, so a receiving system that “speaks” XML can dynamically receive it. A simple XML file looks like this:

<PEOPLE>
  <PERSON>
    <NAME>Susan</NAME>
    <GENDER>f</GENDER>
    <AGE>25</AGE>
  </PERSON>
  <PERSON>
    <NAME>Mark</NAME>
    <GENDER>m</GENDER>
    <AGE>37</AGE>
  </PERSON>
</PEOPLE>

An XML-friendly system can use the information in the file itself to dynamically map the data to its own database, making the process of getting a data set from one application to another far less laborious than with a CSV or fixed-width file. XML is the de facto standard for transferring data over the internet.
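
As a minimal sketch of what “self-describing” buys you in practice, here is Python’s built-in XML parser reading the sample file above without being told the field layout in advance:

import xml.etree.ElementTree as ET

data = """<PEOPLE>
  <PERSON><NAME>Susan</NAME><GENDER>f</GENDER><AGE>25</AGE></PERSON>
  <PERSON><NAME>Mark</NAME><GENDER>m</GENDER><AGE>37</AGE></PERSON>
</PEOPLE>"""

root = ET.fromstring(data)
for person in root.findall("PERSON"):
    # The field names come from the tags themselves, not external documentation
    print(person.findtext("NAME"), person.findtext("GENDER"), person.findtext("AGE"))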

Data Transfer Mechanisms

As we’ve talked about, an initiating action can spur an application to create a formatted data set. However, getting that data set from one application to another requires some additional work.
If both of your applications are sitting on the same network, then this work is likely pretty minimal. One application’s export file can easily be seen and uploaded by another, or you might even be able to establish a database connection directly from one application to another. However, what if the applications are in different locations? Or if one or both are hosted by outside vendors? This is where things get interesting.
There are multiple ways to exchange data over the web. Many of them are specific to the type of web server (Apache vs. Microsoft’s IIS) or operating system (Unix vs Linux vs Microsoft) you’re using. However, two standards – called “web services” – have emerged as by far the most common methods for simple transfers: SOAP (Simple Object Access Protocol) and REST (Representational State Transfer).
Both SOAP and REST transfer data via the standard transfer protocol mechanism of the web: HTTP. To explain the difference between REST and SOAP, we’ll take a brief detour and look at HTTP itself.
HTTP is a very simple minded thing. It allows you to send data from one place to another and, optionally, receive data back. Most of it is done via the familiar Uniform Resource Identifier (URI) that is typed into the address bar of a web browser, or encoded in a link on a web page, with a format similar to:

http://www.somewhere.com?parameter1=something&parameter2=somethingelse

There are two methods built into HTTP for exchanging data: GET and POST.

  • GET exchanges data strictly through the parameters to the URL, which are always in “this equals that” pairs. It is a one-way communication method – once the information is sent to the receiving page, the web server doesn’t retain the parameter data or do anything else with it.
  • POST stores the transferred information in a packet that is sent along with the URI – you don’t see the information attached to the URI in the address bar. POST values can be altered by the receiving page and returned. In almost any situation where you’re creating an account on a web page or doing a shopping transaction, POST is used.

The advantage to GET is that it’s very simple and easy to share. The advantages to POST are that it is more flexible and more secure. You can put a GET URI in any link, on or offline, while a POST transfer has to be initiated via an HTML Form.
In web services terms, REST essentially formalizes the GET approach: you request a URL, with parameters attached, and receive structured data – usually XML – in return. SOAP wraps both the request and the response in a formal XML envelope, typically sent via POST, which makes it more powerful but also more complex to program. Add to the mix that Microsoft was one of the principal developers of the SOAP specification, and most Microsoft applications require that you use SOAP to transfer data. REST might be more appealing if you only need to do a simple data exchange, but if you’re working with Microsoft servers or applications, it is likely not an option.
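
As a minimal sketch of a REST-style exchange, assuming a hypothetical service URL that returns XML, here is a request and parse using only Python’s standard library:

import xml.etree.ElementTree as ET
from urllib.request import urlopen

# Hypothetical endpoint; a real service documents its own URL and parameters
url = "http://www.somewhere.com/stats?feed=mynewsletter&format=xml"

with urlopen(url) as response:  # a plain HTTP GET request
    root = ET.fromstring(response.read())

print(root.tag)  # the parsed XML is now available to your program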

Transformation and Validation Processes

While this article is focused on the mechanics of extracting and moving data, it’s important not to lose sight of the fact that data often needs a lot of work before it should be loaded into another system. Automated data exchange processes need to be designed with extreme care, as it’s quite possible to trash an entire application by corrupting data, introducing errors, or flooding the system with duplicates.
In order to get the data ready for upload, use transformation and validation processes. These processes could be kicked off either before or after the data transfer, or multiple processes could even take place at different points in time. An automated process could be written in almost any programming language, depending on the requirements of your target applications and your technical environment. A brief sketch of such a process follows the list below.

Common transformation and validation tasks include:

  • Converting file formats. Often, one application will export a data file with a particular layout of columns and field names, while the destination application will demand another.
  • Preventing duplicates. Before loading in a new record, it’s important to ensure that it doesn’t already exist in the destination application.
  • Backup and logging. It’s likely a good idea to kick off a backup of your destination database before importing the data, or at least to log what you’ve changed.
  • User interface. For complex processes, it can be very useful to provide an administrative interface that allows someone to review what data will change and resolve errors prior to the import.
  • Additional Data Mining. If you’re writing a process that analyzes data, adding routines that flag unusual occurrences for review can be very useful. Or if you’re uploading donation data that also has to go to Finance, why not concurrently save that into a CSV file that Finance can import into their system? There are plenty of organizational efficiencies that can be folded into this process.
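
As a minimal sketch of the duplicate-prevention and format-conversion steps, assuming hypothetical file names and column layouts, a small Python process might sit between the export and the import:

import csv

# Hypothetical files and column names, invented for illustration
with open("donor_db_export.csv", newline="") as f:
    existing = {row["Email"].strip().lower() for row in csv.DictReader(f)}

clean_rows = []
with open("web_signups.csv", newline="") as f:
    for row in csv.DictReader(f):
        email = row["email_address"].strip().lower()
        if email in existing:
            continue  # prevent duplicates
        # Convert to the destination system's column names
        clean_rows.append({"Email": email,
                           "First Name": row["first_name"],
                           "Last Name": row["last_name"]})

with open("ready_to_import.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Email", "First Name", "Last Name"])
    writer.writeheader()
    writer.writerows(clean_rows)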

As described in the API section below, a sophisticated application may provide considerable functionality that will help in these processes.

Application Programming Interfaces (APIs)

What about APIs? How do they fit in? We’re hundreds of words into this article without even a mention of them – how can that be? Well, APIs are a fuzzy concept that might encompass all the aspects of data exchange we just discussed, or some of them, or none of them at all. Clear as mud, right?
An API is exactly what it says – an interface, or set of instructions, for interacting with an application via a programming language.
Originally, APIs were built so that third party developers could create integrating functions more easily. For instance, a phone system vendor might write specific functions into their operating system so that a programmer for a voice mail company could easily import, extract, and otherwise work with the phone system data. This would usually be written in the same programming logic as the operating system, and the assumption was that the third party programmer knew that language. Operating systems like Unix and Windows have long had APIs, allowing third parties to develop hardware drivers and business applications that use OS functions, such as Windows’ file/open dialog boxes.


APIs are written to support one or more programming languages – such as PHP or Java – and require a programmer skilled in one of these languages. An API is also likely to be geared around specific data format and transfer standards – for instance, it may only accept data in a particular XML format, and only via a SOAP interface. In most cases, you’ll be limited to working with the supported standards for that API.


Choose Your Own Data Exchange Adventure

The type of data exchange that makes sense and how complex it will be varies widely. A number of factors come into play: the applications you would like to integrate, the available tools, the location of the data, and the platform (i.e., Windows, Linux, web) you’re using. For instance:

  • Stripped all the way down to the basics, manual data exchange is always an option. In this case, an administrator (a Human Kickoff initiating action) might export a file to CSV, save it to the network, perform some manual transformations to put it into the appropriate file format, and upload it into a different system.
  • For two applications on the same network, the process might not be too much more complex. In this case, a Scheduler initiating action might prompt one application to export a set of data as a CSV file and save it to a network drive. A transformation program might then manipulate the file and tell the destination application to upload the new data.
  • Many web-based tools offer simple ways to extract data. For instance, to get your blog’s statistics from the popular tracking service FeedBurner, you could use a scheduled initiating action to simply request a FeedBurner page via HTTP, which would then provide you the statistics on an XML page. Your program could then parse and transform the data in order to load it into your own reporting application or show it on your own website. Many public applications, such as Google Maps, offer similarly easy functionality to allow you to interact with them, leading to the popularity of mashups: applications that pull data (generally via APIs) from two or more websites.
  • If you are using a website Content Management System which is separate from your main constituent management system, you may find yourself with two silos containing constituent data – members who enrolled on your web site and donors tracked in a donor database. In this circumstance, you might set up a process that kicks off whenever someone submits the Become a Member form. This process could write the data for the new member into an XML file, transfer that file to your server, and there kick off a new process that imports the new members while checking for duplicates.

Finding Data-Exchange-Friendly Software

As is likely clear by now, the methods you can use to exchange data depend enormously on the software packages that you choose. The average inclination when evaluating software is to look for the features that you require. That’s an important step in the process, but it’s only half of the evaluation. It’s also critical to determine how you can – or if you can – access the data. Buying into systems that overcomplicate or restrict this access will limit your ability to manage your business.
Repeat this mantra: I will not pay a vendor to lock me out of my own data. Sadly, this is what a lot of data management systems do, either by maintaining poor reporting and exporting interfaces; or by including license clauses that void the contract if you try to interact with your data in unapproved ways (including leaving the vendor).
To avoid lock-in and ensure the greatest amount of flexibility when looking to buy any new application – particularly the ones that store your data off-site and give you web-based access to it – ask the following questions:

  • Can I do mass imports and updates on my data? If the vendor doesn’t allow you to add to or update the system in bulk with data from other systems, or their warranty prohibits mass updates, then you will have difficulty smoothly integrating data into this system.
  • Can I take a report or export file, make a simple change to it, and save my changes? The majority of customized formats are small variations on the standard formats that come with a system. But it’s shocking how many web-based platforms don’t allow you to save your modifications.
  • Can I create the complex data views that are useful to me? Most modern donor, client/case management and other databases are relational. They store data in separate tables. That’s good – it allows these systems to be powerful and dynamic. But it complicates the process of extracting data and creating customized reports. A donor’s name, address, and amount that they have donated might be stored in three different, but related tables. If that’s the case, and your reporting or export interface doesn’t allow you to report on multiple tables in one report, then you won’t be able to do a report that extracts names and addresses of all donors who contributed a certain amount or more. You don’t want to come up with a need for information and find that, although you’ve input all the data, you can’t get it out of the system in a useful fashion.
  • Does the vendor provide a data dictionary? A data dictionary is a chart identifying exactly how the database is laid out. If you don’t have this, and you don’t have ways of mapping the database, you will again be very limited in reporting on and extracting data from the application.
  • What data formats can I export data to? As discussed, there are a number of formats that data can be stored in. You want a variety of options for industry standard formats.
  • Can I connect to the database itself? Particularly if the application is installed on your own local network, you might be able to access the database directly. The ability to establish an ODBC connection to the data, for instance, can provide a comparatively easy way to extract or update data (a brief sketch follows this list). Consider, however, what will happen to your interface if the vendor upgrades the database structure.
  • Can I initiate data exports without human intervention? Check to see if there are ways to schedule exports, using built-in scheduling features or by saving queries that can be run by the Windows Scheduler (or something similar). If you want to integrate data in real time, determine what user actions you can use to kick off a process. Don’t allow a vendor to lock you out of the database administrator functions for a system installed on your own network.
  • Is there an API? APIs can save a lot of time if you’re building a complex data exchange. For some systems, it may be the only way to get data in or out without human intervention. Don’t assume any API is a good API, however – make sure it has the functions that will be useful to you.
  • Is there a data exchange ecosystem? Are there consultants who have experience working with the software? Does the software support third party packages that specialize in extracting data from one system, transforming it, and loading it into another? Is there an active community developing add-ons and extensions to the application that might serve some of your needs?
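
To illustrate the direct-connection and multiple-table questions together, here is a minimal sketch using the third-party pyodbc library, assuming an ODBC data source named DonorDB and invented table and column names:

import pyodbc

conn = pyodbc.connect("DSN=DonorDB")  # "DonorDB" is a hypothetical data source
cursor = conn.cursor()

# Names, addresses, and gifts live in three related tables; this query
# pulls donors who have contributed $500 or more in total.
cursor.execute("""
    SELECT d.first_name, d.last_name, a.city, SUM(g.amount) AS total
    FROM donors d
    JOIN addresses a ON a.donor_id = d.donor_id
    JOIN gifts g ON g.donor_id = d.donor_id
    GROUP BY d.first_name, d.last_name, a.city
    HAVING SUM(g.amount) >= 500
""")

for row in cursor.fetchall():
    print(row.first_name, row.last_name, row.city, row.total)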

Back to Reality

So, again, what does all of this really mean to a nonprofit organization? From a historical perspective, it means that despite the preponderance of acronyms and the lingering frustrations of some companies limiting their options, integration has gotten easier and better. If you picked up this article thinking that integrating and migrating data between applications and web sites is extremely complex, well, it isn’t, necessarily – it’s sometimes as simple as typing a line in your browser’s address bar. But it all depends on the complexity of the data that you’re working with, and the tools that your software application gives you to manage that data.


For More Information

An Introduction to Integrating Constituent Data: Three Basic Approaches
A higher level, less technical look at data integration options

The SOAP/XML-RPC/REST Saga
A blog article articulating the differences – from a more technical perspective – between REST and SOAP.

Mashup Tools for Consumers
New York Times article on the Mashup phenomenon

W3 Hardcore Data Standard Definition
W3C, the standards body for the internet. The hardcore references for HTTP, XML, SOAP, REST and other things mentioned here.

Web API List
Techmagazine’s recent article linking to literally hundreds of applications that have popular Web APIs

Peter Campbell is currently the Director of Information Technology at Earthjustice, a non-profit law firm dedicated to defending the earth. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo & Marin Counties, Inc. Peter has been managing technology for non-profits and law firms for over 20 years, and has a broad knowledge of systems, email and the web. In 2003, he won a “Top Technology Innovator” award from InfoWorld for developing a retail reporting system for Goodwill thrift. Peter’s focus is on advancing communication, collaboration and efficiency through creative use of the web and other technology platforms. In addition to his work at SF Goodwill, Peter maintains a number of personal and non-profit web sites; blogs on NPTech tools and strategies at http://techcafeteria.com; is active in the non-profit community as a member of NTEN; and spends as much quality time as possible with his wife, Linda, and eight-year-old son, Ethan.

Steve Anderson of ONE/Northwest, Steven Backman of Design Database Associates, Paul Hagen of Hagen20/20, Brett Meyer of NTEN, and Laura Quinn of Idealware also contributed to this article.

Better Organization Through Document Management Systems

This article was originally published at Idealware in January of 2007.

Is your organization drowning in a virtual sea of documents? Document management systems can provide invaluable document searching, versioning, comparison, and collaboration features. Peter Campbell explains.

For many of us, logging on to a network or the Internet can be like charting the ocean with a rowboat. There may be a sea of information at our fingertips, but if we lack the proper vessel to navigate it, finding what we need — even within our own organization’s information system — can be a significant challenge.

Organizations today are floating in a virtual sea of documents. Once upon a time, this ocean was limited to computer files and printed documents, but these days we must also keep track of the information we email, broadcast, publish online, collaborate on, compare, and present — as well as the related content that others send us. Regulatory measures like the Sarbanes-Oxley Act and the Health Insurance Portability and Accountability Act (HIPAA) have created a further burden on organizations to produce more documents and track them more methodically.

Taken as a whole, this flood of created and related content acts as our nonprofit’s knowledge base. Yet when we simply create and collect documents, we miss the opportunity to take advantage of this knowledge. Not only do these documents contain information we can reuse, we can also study them to understand past organizational decisions and parse them to produce metrics on organizational goals and efficiencies.

Just as effective document management has become an increasing priority for large companies, it has also become more important — and viable — at smaller nonprofits. And while free tools like Google Desktop or Windows Desktop Search can help increase your document-management efficiency, more sophisticated and secure document-management tools — called Document Management Systems (DMSs) — are likely within your reach. Document management systems offer integrated features to support Google-esque searching, document versioning, comparison, and collaboration. What’s more, when you save a document to a DMS, you record summary information about your document to a database. That database can then be used to analyze your work in order to improve your organization’s efficiency and effectiveness.

Basic Document Management

One way to increase the overall efficiency of your document management is simply to use your existing file-system tools in an agreed upon, standardized fashion. For instance, naming a document “Jones Fax 05-13-08.doc” instead of “Jones.doc” is a rudimentary form of document management. By including the document type (or other descriptive data) your document will be easier to locate when you’re looking for the fax that you sent to Jones on May 13, as opposed to other erstwhile “Jones” correspondence. Arranging documents on a computer or file server in standard subfolders, with meaningful names and topics, can also be useful when managing documents.

For small organizations with a manageable level of document output, these basic document-storing techniques may suffice, especially if all document editors understand the conventions and stick by them. But this kind of process can be difficult to impose and enforce effectively, especially if your organization juggles thousands of documents. If you find that conventions alone aren’t working, you may wish to turn to a Document Management System.

One huge advantage of this system is that it names and stores your documents using a standardized, organization-wide convention, something that can be difficult to maintain otherwise, especially given a typical nonprofit’s turnover rate and dependence on volunteers. What’s more, a DMS will track not just the date the file was last modified (as Windows does), but also the date the document was originally created — which is often more useful in finding a particular document.

In fact, a DMS’s “File > Open” dialogue box can locate files based on any of the information saved about a document. A DMS can narrow a search by date range, locate documents created by particular authors, or browse through recently modified documents, sparing you the necessity of clicking through multiple folders to find what you’re looking for. It will also allow you to search the content of documents using a variety of methods, including the Boolean system (e.g. “includes Word A OR Word B but NOT Word C”) and proximity criteria (e.g., “Word A and word B within n words of each other”). Just as Google has become the quickest way to pull Web-page “needles” out of a gigantic Internet haystack, a solid DMS allows you to quickly find what you’re looking for on your own network.

A good DMS also allows the document creator to define which co-workers can read, edit, or delete his or her work via the document profile. On most networks, this type of document protection is handled by network access rules, and making exceptions to them requires a call to the help desk for assistance.

Beyond search and security, a DMS typically offers a number of additional features:

  • Document check-in and check-out.

    If you try to open a file that someone else is already editing, a network operating system, like Windows Server 2003, will alert you that the file is in use and offer you the option to make a copy. A DMS will tell you more: who is editing the document, what time she checked it out, and the information she provided about the purpose of her revision and when she plans to be done with the document.

  • Document comparison.

    A DMS not only supports Word’s track-changes and document-merging features, but allows you to compare your edited document to an unedited version, highlighting the differences between the two within the DMS. This is a great feature when your collaborator has neglected to track his or her changes, particularly because it allows you to view the updates without actually adding the revision data to your original files, making them less susceptible to document corruption.

  • Web publishing.

    Most DMSs provide content-management features for intranets and even public Web sites. Often, you can define that specific types of documents should be automatically published to your intranet as soon as they’re saved to the DMS. (Note, however, that if your core need is to publish documents on a Web site, rather than track versions or support check-ins and check-outs, a dedicated Content Management System [CMS] will likely be a better fit than a DMS.)

  • Workflow automation.

    A DMS can incorporate approvals and routing rules to define who should see the document and in what order. This allows the system to support not only the creation and retrieval of documents, but also the editing and handoff process. For example, when multiple authors need to work on a single document, the DMS can route the file from one to the next in a pre-defined order.

  • Email Integration.

    Most DMSs integrate with Microsoft Outlook, Lotus Notes, and other email platforms, allowing you not only to view your document folders from within your email client, but also to save emails to your DMS. If, for example, you send out a document for review, you can associate any feedback and comments you receive via email with that document, which you can retrieve whenever you search for your original file.

  • Document Recovery.

    DMSs typically provide strong support for document backup, archiving, and disaster recovery, working in conjunction with your other backup systems to safeguard your work.

Three Types of Document Management Systems

If you decide that your organization would benefit from a DMS, there are a variety of choices and prices available. In general, we can break up DMSs into three types:

  • Photocopier- and Scanner-Bundled Systems

    Affordable DMS systems are often resold along with photocopiers and scanners. While primarily intended as an image and PDF management system, these DMSs integrate with the hardware but can also manage files created on the network. Bundled systems may not include the very high-end features offered by enterprise-level DMSs, but will offer the basics and usually come with very competitive, tiered pricing. A popular software package is offered by Laserfiche.

  • Enterprise-Level Systems

    These robust, sophisticated systems usually require a strong database back end such as Microsoft SQL or Oracle and tend to be expensive. Enterprise-level systems include the advanced features listed above, and some are even tailored to particular industries, such as legal or accounting firms. Examples of powerful enterprise systems include Open Text eDocs, Interwoven WorkSite, and EMC’s Documentum.

  • Microsoft Office SharePoint (MOSS 2007)

    Microsoft SharePoint is an interesting and fairly unique offering in the DMS area. While it’s best known as a corporate intranet platform, the 2007 version of the package provides building blocks for content-, document-, and knowledge-management, with tight integration with Microsoft Office documents, sophisticated workflow and routing features, and extensive document and people-searching capabilities. It is a powerful tool and — typically — an expensive one, but because it is available to qualifying nonprofits for a low administrative fee through TechSoup (which offers both SharePoint Standard Edition and Enterprise Edition), it is also a far more affordable option for nonprofits than similar DMS products on the market. One caveat: SharePoint, unlike the other systems mentioned above, stores documents in a database rather than in your file system, which can make the documents more susceptible to corruption. (Note: SharePoint Server is a discrete product that should not be confused with Windows SharePoint Services, which comes bundled with Windows Server 2003.)

The Future of Document Management

The most significant changes in document management over the last decade have been the migration of most major DMS systems from desktop to browser-based applications, as well as their ever-increasing efficiency and search functionality. The growing popularity of Software as a Service (SaaS), tagging, and RSS tools are likely to impact the DMS space as well.

Software as a Service

SaaS platforms like Google Apps and Salesforce.com store documents online, on hosted servers, as opposed to on traditional internal file servers. Google Apps doesn’t currently offer the detailed document profile options standard DMSs do, but it will be interesting to see how that platform evolves.

Another SaaS product, Salesforce, has been active in the document management space. Salesforce’s constituent relationship management (CRM) platform currently allows organizations to simply upload documents for a constituent. Salesforce has recently purchased a commercial DMS called Koral, however, and is in the process of incorporating it into its platform, an enhancement that will help tie documents to the other aspects of constituent relationships.

Tagging

A startup called Wonderfile has introduced an online DMS that incorporates the heavy use of tagging to identify and describe documents. Using this software, you would move your documents to the Wonderfile servers and manage them online with Del.icio.us-style methods of tagging and browsing. A drawback to Wonderfile is that, although a creative solution, storing and sharing your documents online is more valuable when you can edit and collaborate on them as well. As full-fledged, Web-based document creation and editing platforms, Google Apps and its peers are a better alternative, despite their lack of tagging functionality.

Microsoft has also been quietly adding tagging capability to its file-browsing utility Windows Explorer, allowing you to add keywords to your documents that show up as columns you can sort and filter by. This works in both Windows XP and Vista.

RSS

While none of the existing DMSs are currently doing much with RSS — an online syndication technique that could allow users to “subscribe” to changes to documents or new content via a Web browser — Salesforce plans to integrate RSS functionality with its new Koral system. This type of syndication could be a useful feature, allowing groups of people to track document revisions, communicate about modifications, or monitor additions to folders.

Finding What You’re Looking For

Is it time for your organization to trade in that rowboat for a battle cruiser? With an ever-expanding pool of documents and resources, nonprofits need ways to find the information we need that are richer and more sophisticated than the standard filenames and folders. If your organization struggles to keep track of important documents and information, a DMS can help you move beyond the traditional “file-and-save” method to an organizational system that allows you to sort by topics and projects using a variety of techniques and criteria.

But we should all hope that even better navigational systems are coming down the road. Having seen the creative advances in information management provided by Web 2.0 features like tagging and syndication, it’s easy to envision how these features, which work well with photos, bookmarks, and blog entries, could be extended to documents as well.


Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

Thanks to TechSoup for their financial support of this article. Tim Johnson, Laura Quinn of Idealware, and Peter Crosby of alltogethernow also contributed to this article.