
Using RSS Tools to Feed Your Information Needs

This article was originally published at Idealware in March of 2009.

The Internet gives you access to a virtual smorgasbord of information. From the consequential to the trivial, the astonishing to the mundane, it’s all within your reach. This means you can keep up with the headlines, policies, trends, and tools that interest your nonprofit, and keep informed about what people are saying about your organization online. But the sheer volume of information can pose challenges, too: namely, how do you separate the useful data from all the rest? One way is to use RSS, which brings the information you want to you.

Many of the Web sites that interest you are syndicated. With RSS, or Really Simple Syndication, you subscribe to them, and when they’re updated, the content is delivered to you — much like a daily newspaper, except you choose the content. On the Web, you can not only get most of what the newspapers offer, but also additional, vital information that informs your organizational and mission-related strategies. You subscribe only to the articles and features that you want to read. It’s absolutely free, and the only difficult part is deciding what to do with all the time you used to spend surfing.

Since TechSoup first published RSS for Nonprofits, there has been an explosion of tools that support RSS use. There are now almost as many ways to view RSS data as there are types of information to manage. Effective use of RSS means determining how you want your information served. What kind of consumer are you? What type of tool will help you manage your information most efficiently, day in and day out? Read on to learn more.

What’s on the Menu?

You probably already check a set of information sources regularly. The first step in considering your RSS needs is to take stock of what you are already reading, and what additional sources you’d like to follow. Some of that information may already be in your browser’s lists of Bookmarks or Favorites, but consider seeking out recommendations from trusted industry sources, friends, and co-workers as well. As you review the Web sites that you’ve identified as important, check them to make sure you can subscribe to them using RSS. You can find this out by looking for “subscribe” options on the Web page itself, or for an orange or blue feed icon resembling a radio signal on the right side of your Web browser’s address bar.

Consider the whole range of information that people are providing in this format. Some examples are:

  • News feeds, from traditional news sources or other nonprofits.
  • Blogs, particularly those that might mention or inform your mission.
  • Updates from social networking sites like Facebook or MySpace (for instance, through FriendFeed).
  • Podcasts and videos.
  • Updates from your own software applications, such as notifications of edits on documents from a document management system, or interactions with a donor from your CRM. (Newer applications support this.)
  • Information from technical support forums and discussion boards.
  • All sorts of regularly updated data, such as U.S. Census information, job listings, classified ads, or even TV listings and comic strips.


You can get a good idea of what’s out there and what’s popular by browsing the recommendations at Yahoo! Directory or iGoogle, while a tool like PostRank can help you analyze feeds and determine which are valuable.

RSS also shines as a tool for monitoring your organization and your cause on the Web. For instance, Google Alerts lets you subscribe, for free, to RSS updates that notify you when a particular word or phrase is used on the Web. (To learn more about “listening” to what others are saying about your organization online, see We Are Media’s wiki article on online listening.)

How Hungry Are You?

Dining options abound: you can order take-out, or go out to eat; you can snack on the go, or take all your meals at home; you can pick at your food, or savor each bite. Your options for RSS reading are equally diverse, and you’ll want to think carefully about your own priorities. Before choosing the tool or tools that suit you, ask some questions about the information you plan to track.

  • How much information is it? Do you follow a few blogs that are updated weekly? Or news feeds, like the New York Times or Huffington Post, which are updated 50 to 200 times a day?
  • How intently do you need to monitor this information? Do you generally want to pore over every word of this information, or just scan for the tidbits that are relevant to you? Is it a problem if you miss some items?
  • Are you generally Web-enabled? Can you use a tool over the Internet, as opposed to one installed on your desktop?
  • Do you jump from one computer to another? Do your feeds need to be synchronized so you can access them from multiple locations?
  • Is this information disposable, or will it need to be archived? Do you read articles, perhaps email the link to a colleague, and then forget about it? Or do you want to archive items of particular interest so you can find them in the future?
  • Will you refer a lot of this information to co-workers or constituents? Would you like to be able to forward items via email, or publish favorites to a Web page?
  • Do you need mobile access to the information? Will you want to be able to see all your feeds from a smartphone, on the run?

Enjoying the Meal

Once you have a solid understanding of your information needs, it’s time to consider the type of tool that you want to use to gather your information. First, let’s look at the terminology:

  • An Article (or Item) is a bit of information, such as a news story, blog entry, job listing or podcast.
  • A Feed is a collection of articles from a single source (such as a blog or Web site).
  • An Aggregated Feed is a collection of articles from numerous feeds displayed together in one folder.
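
To make these terms concrete, here’s a minimal sketch of reading a feed’s articles in Python, using the third-party feedparser library (the feed URL is just a placeholder):

import feedparser  # third-party: pip install feedparser

# A feed is a collection of articles from one source.
feed = feedparser.parse("http://www.example.org/feed.xml")
print(feed.feed.title)

# Each entry is one article: a news story, blog post, job listing, etc.
for entry in feed.entries:
    print(entry.title, entry.link)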

So, what RSS options are available?

Tickers

Like the “crawl” at the bottom of CNN or MSNBC television broadcasts, RSS tickers show an automatically scrolling display of the titles of articles from your RSS feeds. Tickers can be a useful way to casually view news and updates. They’re a poor choice for items that you don’t want to miss, though, as key updates might move past when you’re not paying attention.

For a very TV-news-like experience, try Snackr, an Adobe AIR application. You can load up a variety of feeds which scroll in an aggregated stream across your desktop while you work.

Gmail users can use the email browser’s Web Clips feature to create a rotating display of RSS headlines above their inbox and messages. Because Gmail is Web-based, your headlines will be available from any computer.

Web Browsers

Your current Web browser — such as Internet Explorer (IE) or Firefox — can likely act as a simple RSS reader, with varying functionality depending on the browser and browser version. Browsers can either display feeds using their built-in viewers, or associate Web pages in RSS format with an installed RSS Feed Reader (much as files ending in “.doc” are associated with Microsoft Word). Even without an installed feed reader, clicking on the link to an RSS feed will typically display the articles in a readable fashion, formatting the items attractively and adding links and search options that assist in article navigation. This works in most modern browsers (IE7 and up, Firefox 2 and up, Safari and Opera). If your browser doesn’t understand feeds, then they will display as hard-to-read, XML-formatted code.

Firefox also supports plug-ins like Wizz RSS News Reader and Sage, which integrate with the browser’s bookmarks so that you can read feeds one at a time by browsing recent entries from the bookmark menu.

Portals

Portals, like iGoogle, My Yahoo!, and Netvibes, are Web sites that provide quick access to search, email, calendars, stocks, RSS feeds, and more. The information is usually presented in a set of boxes on the page, with one box per piece of information. While each RSS feed is typically displayed in a separate box, you can show as many feeds as you like on a single page. This is a step up from a ticker or standard Web browser interface, where you can only see one feed at a time.

Email Browsers

As many of us spend a lot of time dealing with email, your email browser can be a convenient place to read your RSS feeds. Depending on what email browser you use, RSS feeds can often be integrated as additional folders. Each RSS feed that you subscribe to appears as a separate email folder, and each article as a message. You can’t, of course, reply to RSS articles — but you can forward and quote them, or arrange them in subfolders by topic.

If you use Microsoft Outlook or Outlook Express, the very latest versions (Vista’s Windows Mail and Outlook 2007) have built-in feed reading features. (Earlier versions of Outlook can support this through powerful, free add-ons, such as RSS Popper and Attensa.)

Mozilla’s Thunderbird email application and Yahoo! Mail also allow you to subscribe to RSS feeds. Gmail doesn’t, however, as Google assumes that you’ll use the powerful Google Reader application (discussed below) to manage your feeds.

RSS Feed Readers

A key advantage of full-featured feed readers is that you can tag and archive important information for quick retrieval. The best ones let you easily filter out items you have already read, mark the articles that are important to you so that you can easily return to them later (kind of like TiVo for the Web), and easily change your view between individual feeds and collections of feeds.

In practice, feed readers make it easy to quickly scan many different sources of information and filter out the items that are worth reading. This is a much more efficient way to process new information on the Web than visiting sites individually, or even subscribing to them with a tool that doesn’t support aggregation, like a Web browser or portal.

Feed Readers come in two primary flavors, offline and online. Offline feed readers are Windows, Mac, or Linux applications that collect articles from your feeds when you’re online, store them on your computer, and allow you to read them at any time. Online feed readers are Web sites that store articles on the Internet, along with your history and preferences. The primary difference between an online and an offline reader is the state of synchronization. An online reader will keep track of what you’ve read, no matter what computer or device that you access it from, whereas an offline reader will only update your status on the machine that it’s installed on.

Offline feed readers, such as FeedDemon (for PCs) and Vienna (for Macs), allow you to subscribe to as many feeds as you like and keep them updated, organized and manageable. During installation, they will register as the default application for RSS links in your browser, so that subscribing to new sites is as easy as clicking on an RSS icon on a Web page and confirming that you want to subscribe to it.

Online feed readers, such as Google Reader or NewsGator, offer most of the same benefits as desktop readers. While offline readers collect articles at regular intervals and copy them to your PC, online readers store all of the feeds at their Web site, and you access them with any Web browser. This means that feeds are often updated more frequently, and you can access your account — with all your RSS feeds, markings, and settings intact — from any computer. You could be home, at the office, on a smartphone, or in an Internet cafe. The products mentioned even emulate offline use. NewsGator can be synchronized with its companion offline reader FeedDemon, and Google Reader has an offline mode supported by Google Gears.

Online Readers also provide a social aspect to feed reading. Both Google Reader and NewsGator allow you to mark and republish items that you want to share with others. NewsGator does this by letting you create your own feeds to share, while Google Reader lets you subscribe to other Google Reader users’ shared items. Google Reader also lets you tag Web pages that you find outside of Google Reader and save them to your personal and shared lists. If your team members don’t do RSS, Google has that covered as well — your shared items can also be published to a standalone Web page that others can visit. You can, of course, email articles from an offline reader, but any more sophisticated sharing will require an online reader.

For many of us, mining data on the Web isn’t a personal pursuit — we’re looking to share our research with co-workers and colleagues. This ability to not only do your own research, but share valuable content with others, ultimately results in a more refined RSS experience, as members of a given community stake their own areas of expertise and share highlights with each other.

Online readers are less intuitive than offline ones, however, for subscribing to new feeds. While an offline reader can automatically add a feed when you click on it, online readers require you to take another step or two (for instance, clicking an “Add” button in your browser’s toolbar). You’re also likely to have a more difficult time connecting to a secure feed, like a list of incoming donations from your donor database, with an online reader than you would with an offline one.

The online feed readers are moving beyond the question of “How do I manage all of my information?” to “How do I share items of value with my network?”, allowing us to not only get a handle on important news, views, and information, but to act as conduits for the valuable stuff. This adds a dimension we could call “information crowd-sourcing,” where discerning what’s important and relevant to us within the daily buffet of online information becomes a community activity.


In Summary

RSS isn’t just another Internet trend — it’s a way to conquer overload without sacrificing the information. It’s an answer to the problem that the Web created: If there’s so much information out there, how do you separate the wheat from the chaff? RSS is a straightforward solution: Pick your format, sit back, and let the information feast come to you.


Thanks to TechSoup for their financial support of this article. Marshall Kirkpatrick of ReadWriteWeb, Laura Quinn of Idealware, Thomas Taylor of the Greater Philadelphia Cultural Alliance, and Marnie Webb of TechSoup Global also contributed to this article.


Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

RSS Article is up

I spent a good chunk of December and January writing what I hope is a very complete guide to RSS (Really Simple Syndication) and how you (whomever you might be) can use it. The article takes on the ambitious goal of identifying the types of information available in RSS format, the types of programs that can be used to read RSS feeds, and the best ones for different types of use, from tickers to email add-ons to full-fledged RSS readers. I’m proud of this one – I think it’s a new approach to the topic that should be helpful for anyone who’s tired of hearing that they should be using RSS and, instead, would like to know why and how. Choose your portal, as it’s at Idealware and Techsoup.

The Perfect Fit: A Guide To Evaluating And Purchasing Major Software Systems

This article was originally published at Idealware in September of 2008.

A major software package shouldn’t be chosen lightly. In this detailed guide, Peter Campbell walks through how to find software options, evaluate them, make a good decision, and then purchase the system in a way that protects you.

A smart shopper evaluates the item they want to purchase before putting money down. You wouldn’t shop for shoes without checking the size and taking a stroll up and down the aisle in order to make sure they fit, would you? So what’s the equivalent process for trying a software package on for size? How can you make sure your substantial software purchase won’t leave you sore and blistered after the cash has been exchanged?

That’s the goal of this article—to provide some guidance for properly evaluating major software investments. We’ll walk through how to find potential software options, gather the detailed information you need to evaluate them, make a solid decision and purchase a package in a way that protects you if it doesn’t do what you hoped it would for you.

Is It a Major Software System?

The evaluation process described here is detailed, so it’s probably not cost effective to apply it to every software tool and utility you purchase. How do you know if the package you’re considering is major enough to qualify? Major systems have a dramatic impact on your ability to operate and achieve your mission—they aren’t measured by budget, they’re measured by impact.

To help identify a major purchase, ask yourself:

  • Will the application be used by a significant percentage of your staff?
  • Will multiple departments or organizational units be using it?
  • Will this software integrate with other data systems?
  • If this software becomes unstable or unusable once deployed, will it have significant impact on your nonprofit’s ability to operate?

Giving significant attention to these types of major purchases is likely to save your organization time in the long run.


Taking Preliminary Measurements

Prior to even looking at available software options, make sure you thoroughly define your needs and what the application you select should be able to do for you. Nonprofits are process-driven. They receive, acknowledge, deposit and track donations; they identify, serve and record transactions with clients; and they recruit, hire and manage employees. Technology facilitates the way your organization manages these processes. A successful software installation will make this work easier, more streamlined and more effective. But a new system that doesn’t take your processes and needs into account will only make running your organization more difficult.

So it’s critical that, before you begin looking for that donor database or client-tracking system, you clearly understand the processes that need to be supported and the software features critical to support that work.

This is an important and complex area that could easily be an article—or a book—in its own right. We could also write numerous articles that delve into project management, getting company buy-in and change management—all critical factors in organizational readiness. However, for the purposes of this article, we’re focusing on the process of evaluating and purchasing software once you’ve already identified your needs and prepped the organization for the project.

Finding the Available Options

Once you know what you need and why you need it, the next step is to identify the pool of applications that might fit. An expert consultant can be a huge help. A consultant who knows the market and is familiar with how the systems are working for other nonprofits can save you research time, and can direct you to systems more likely to meet your true needs. While a consultant can be more expensive than going it alone, money spent up front on the selection and planning phases is almost always recouped through lower costs and greater efficiency down the road.

If a consultant isn’t warranted, take advantage of the resources available to the nonprofit community, such as Idealware, Social Source Commons, Techsoup’s forums or NTEN’s surveys. Ask your peers what they’re using, how they like it and why. Ideally, you want to identify no fewer than three, and probably no more than eight, suitable products to evaluate.

Considering an RFP

With your list of possible software candidates in hand, the next step is to find out more about how those packages meet your needs. This is traditionally done through a Request for Proposal (RFP), a document that describes your environment and asks for the information you need to know about the products you’re evaluating.

Well-written RFPs can be extremely valuable for understanding the objective aspects of large software purchases. For example, if you are looking for a Web site content management system (CMS), questions such as “Does the blogging feature support trackbacks?” or “Can the CMS display individualized content based on cookie or user authentication?” are good ones for an RFP.

What you want from the RFP is information you can track with checkboxes. For example, “It can/can’t do this,” “It can/can’t export to these formats: XML, SQL, CSV, PDF,” or “They can program in PHP and Ruby, but not Java or Cold Fusion.” Questions that encourage vendors to answer unambiguously, with answers that can be compared in a simple matrix, will be useful for assessing and documenting the system capabilities.

An RFP can’t address all the concerns you’re likely to have. Subjective questions like “How user-friendly is your system?” or “Please describe your support” are unlikely to be answered meaningfully through an RFP process.

Certainly, you can arrange for demonstrations, and use that opportunity to ask your questions without going through an RFP process. But while the formality of an RFP might seem unnecessary, there are some key reasons for getting your critical questions answered in writing:

  • You can objectively assess the responses and only pursue the applications that aren’t clearly ruled out, saving some time later in the process.
  • A more casual phone or demo approach might result in different questions asked and answered by different vendors. An RFP process puts all of the applications and vendors on a level field for assessing.
  • The RFP responses of the vendor you select are routinely attached to the signed contract. An all-too-common scenario is that the vendor answers all of your questions with “yes, yes, yes,” but the answers change once you start to implement the software. If you don’t have the assurances that the software will do what you require in writing, you won’t have solid legal footing to void a contract.

Structuring Your RFP

RFPs work well as a four-section document. Below, we walk through each of those sections.

Introduction

The introduction provides a summary of your organization, your mission, and the purpose of the RFP.

Background

The background section provides context the vendor will need to understand your situation. Consider including a description of your organization—for instance, number of locations, number of staff and organizational structure, the processes the system should support, and such technology infrastructure as network operating system(s) and other core software packages. Include any upcoming projects that might be relevant.

Questionnaire

The questionnaire is the critical piece of the document—you want to be sure you ask all of the questions that you need answered. In preparing these questions, it’s best to envision what the vendor responses might look like. What will have to be in those responses for you to properly assess them? Consider asking about:

  • Functionality. In order to get answers you’ll be able to compare, ask your questions at a granular level. Does a CRM support householding? Does a donor database have a method for storing soft credits? Can multiple users maintain and view records of donor interactions? Can alerts or notifications be programmed in response to particular events? Use the results of your business requirements work to focus in on the functions that are critical to you and your more unusual needs.
  • Technology specifics. Make sure the software will integrate properly with other applications, that the reporting is robust and customizable by end users, and that the platform is well-supported. Ask which formats data can be exported to and imported from, how many tables can be queried simultaneously and what type of support is available—both from the vendor and third parties. Ask for a data dictionary, which a technical staffer or consultant can review, because a poorly designed database will complicate reporting and integration. And ask for a product roadmap. If the next version is going to be a complete rewrite of the application, you might want to rule out the current version for consideration.
  • Company information. Think through what you’ll want to know about the company itself. How big is it? Do they have an office near you? How long have they been in business? Are they public or private? Can they provide some documentation of financial viability? Who are the staff members that would be assigned to your project? References from similar clients with similar-scope projects can also be very useful. For more information on this area, see Idealware’s article Vendors as Allies: How to Evaluate Viability, Service, and Commitment.
  • Pricing and availability. What are their hourly rates, broken down by role, if applicable? What are their payment terms? What is their total estimate for the project as described? How do they handle changes in project scope that might arise during implementation? What are their incidental rates and policies (travel, meals)? Do they discount their services or software costs for 501(c)(3)s? How long do they estimate this project will take? When are they available to start?


While it’s important to be thorough, don’t ask a lot of questions you don’t plan to actually use to evaluate the systems. Asking questions “just in case” increases the amount of information you’ll need to sift through later, and increases the possibility that vendors might decide your RFP isn’t worth the time to respond to.

Instructions

Close with a deadline and details about how to submit replies. For a sizeable RFP, allow a minimum of four to six weeks for a response. Remember that this isn’t a confrontational process—a good vendor will appreciate and want to work with a client that has thought things out this well, and the questionnaire is also an opportunity for them to understand the project up front and determine their suitability for it. Respect their schedules and give them ample time to provide a detailed response.

Include an indication as to how additional questions will be handled. In general, if one vendor asks for clarification or details, your answers should be shared with all of the RFP participants. You want to keep things on a level playing field, and not give one vendor an advantage over the rest. You might do this via a group Q&A, with all the vendors invited to participate in a meeting or conference call after the RFP has been sent to them but well before they are due to respond. With all vendors asking their questions in the same room, you keep them all equally informed. Alternatively, you can specify a deadline by which written questions must be submitted. All participants would then receive the questions and answers.

Evaluating the Answers

Once you receive RFP responses, you’ll need to winnow down your list to determine which packages you’d like to demo.

If you asked straightforward, granular questions, you’ll now reap the benefit: you can set up a comparative matrix. Create a table or spreadsheet with columns for each vendor and rows for each question, summarizing the responses as much as possible in order to have a readable chart. You might add columns that weight the responses, both on the suitability of the vendor’s response (e.g. 1, unacceptable; 2, fair; 3, excellent) and/or on the importance of the question (for instance, some features are going to be much more important to you than others).
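
To illustrate, here’s a hypothetical sketch (in Python) of the weighted-scoring arithmetic described above; the vendors, questions, scores, and weights are all invented for the example:

# Importance weight for each question (3 = critical, 1 = nice to have).
weights = {"soft credits": 3, "flexible reporting": 2, "PHP support": 1}

# Suitability of each vendor's response (1 = unacceptable, 2 = fair, 3 = excellent).
responses = {
    "Vendor A": {"soft credits": 3, "flexible reporting": 1, "PHP support": 3},
    "Vendor B": {"soft credits": 2, "flexible reporting": 3, "PHP support": 2},
}

for vendor, scores in responses.items():
    total = sum(weights[q] * scores[q] for q in weights)
    print(vendor, total)  # a higher weighted total suggests a better overall fit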

Going through the features and technology sections, you’ll see the strong and weak points of the applications. In determining which fit your needs, there will likely be some trade-offs—perhaps one application has a stronger model for handling soft credits, but another has more flexible reporting. It’s unlikely that any will jump out as the perfect application, but you’ll be able to determine which are generally suitable, and which aren’t.

For example, if you’re looking for software to manage your e-commerce activities, inventory management might be a critical function for you. If a submitted software package lacks that feature, then you’ll need to eliminate it. As long as you understand your own critical needs, the RFP responses will identify unsuitable candidates.

You might rule out a vendor or two based on what the RFP response tells you about their availability or company stability. Take care, though, in eliminating vendors based on their RFP pricing information. RFP responses can be very subjective. Before determining that a vendor is too pricey based on their project estimate, dig deeper—other vendors might be underestimating the actual cost. If you feel you have a solid grasp on the project timeline, use the hourly rates as a more significant measurement.

The RFP responses will tell you a lot about the vendors. You’re asking questions that are important to your ability to operate. Their ability to read, comprehend and reasonably reply to those questions will offer a strong indication as to how important your business is to them, and whether they’ll consider your needs as the software is implemented and into the future. If they respond (as many will) to your critical questions with incomplete answers, or with stacks of pre-printed literature—saying, in effect, “the answers are in here”—then they’re telling you they won’t take a lot of time to address your concerns.

Keep in mind, though, that a weak sales representative might not mean a weak vendor, particularly if they’re representing a product that comes recommended or looks particularly suitable on all other fronts. It’s acceptable to reject the response and ask the vendor to resubmit if you really feel they have done you, and themselves, a disservice—but temper this with the knowledge that they blew it the first time.

Trying It All On for Size

At this point the process will hopefully have narrowed the field of potential applications down to three to five options. The next step is to schedule software demos. A well-written RFP will offer important, factual and comprehensive details about the application that might otherwise be missed, either by too narrow a demo or by one the vendor orchestrates to highlight product strengths and gloss over weaknesses. But the demos serve many additional purposes:

  • Evaluating look and feel. As good as the specs might look, you’ll know quickly in a demo if an application is really unusable. For instance, an application might technically have that great Zip code lookup feature you asked about in the RFP, but it may be implemented in a way that makes it a pain to use. Prior to the demo, try to present the vendors with a script of the functions you want to see. It can also be useful to provide them with sample data, if they are willing—evaluating a program with data similar to your own data will be less distracting. Be careful not to provide them with actual data that might compromise your—or your constituents’—privacy and security. The goal is to provide a level and familiar experience that unifies the demos and puts you in the driver’s seat, not the vendor.
  • Cross training. The demo is another opportunity for the vendor to educate you regarding the operating assumptions of the software, and for you to provide them with more insight into your needs. A generic donor management system is likely to make very good assumptions about how you track individuals, offer powerful tools for segmentation and include good canned reports, because the donor-courting processes are very similar. But in less standardized areas—or if you have more unusual needs—the model used by the software application can differ dramatically from your internal process, making it difficult for your organization to use. Use the demo to learn how the software will address your own process and less conventional needs.
  • Internal training. Even more valuable is the opportunity to use the demos to show internal staff what they’ll be able to do with the software. Demos are such a good opportunity to get staff thinking about the application of technology that you should pack the room with as many people as you can. Get a good mix of key decision-makers and application end-users—the people who design and perform the business processes the software facilitates. The people who will actually use the software are the ones who can really tell if the package will work for them.

Making the Decision

With luck, your vendor selection process will now be complete, with one package clearly identified as the best option. If key constituents are torn between two options or unimpressed with the lot, senior decision-makers might have to make the call. Be careful, however, not to alienate a group of people whose commitment and enthusiasm for the project might be needed.

If none of the applications you evaluated completely meets your needs, but one comes close, you might consider customizations or software modifications to address the missing areas. Note that any alterations of the basic software package will likely be costly, will not be covered in the packaged documentation and help files, and might break if and when you upgrade the software. Be very sure there isn’t an alternate, built-in way to accomplish your goal. If the modification is justified, make sure it’s done in such a way that it won’t be too difficult to support as the software is developed.

Before making a final decision, you should always check vendor references, but take them with a healthy grain of salt. An organization’s satisfaction with software depends not only on how well it meets their needs, but how familiar they are with their options—there are a lot of people who are happy using difficult, labor-heavy, limited applications simply because they don’t know there are better alternatives.

If you still have a tie after RFPs, demos and reference checks, the best next step is to conduct on-site visits with an existing customer for each software package. As with demos, bring a representative group of management, technical staff and users. Assuming the reference can afford the time to speak with you, the visit will highlight how the software meets their needs, and will give you a good, real world look at its strengths and weaknesses. You’ll also likely walk away with new ideas as to how you might use it.

Signing on the Dotted Line

You’ve selected an application. Congratulations! You might be tired, but you aren’t finished yet. You still need to work with the vendor to define the scope of the engagement and negotiate an agreement that will cover you in case of problems. A good contract clearly articulates and codifies everything that has been discussed to date into a legally binding agreement. If, down the road, the vendor isn’t living up to their promises, or the software can’t do what you were told it would do, then this is your recourse for getting out of an expensive project.

Contract negotiations can take time. It’s far more dangerous to sign a bad contract in the interest of expediency, though, than it is to delay a project while you ensure that both parties—you and the vendor—completely understand each other’s requirements. Don’t start planning the project until the papers have been signed.

A software contract should include a number of parts, including the actual agreement, the license, the scope of work and the RFP.

The Agreement

This is the legal document itself, with all of the mumbo jumbo about force majeure and indemnity. The key things to look for here are:

  • Equal terms and penalties. Are terms and penalties equally assessed? Vendors will write all sorts of terms into contracts that outline what you will do or pay if you don’t live up to your end of the agreement. But they’ll often leave out any equivalent controls on their behavior. You should find every “if this happens, customer will do this” clause and make sure the conditions are acceptable, and that there are complementary terms specified for the vendor’s actions.
  • Reasonable cancellation penalties. If there are penalties defined for canceling a consulting or integration contract, these should not be exorbitant. It’s reasonable for the vendor to impose a limited penalty to cover expenses incurred in anticipation of scheduled work, such as airfare purchased or materials procured. But unless this is a fixed cost agreement, which is highly unusual, don’t let them impose penalties for work they don’t have to do—for example, for a large percentage of the estimated project cost.
  • Agreement under the laws of a sensible state. If the vendor is in California, and you’re in California, then the agreement should be covered by California laws rather than some random other state. In particular, Virginia’s laws highly favor software companies and vendors. In most cases, you want the jurisdiction to be where you live, or at least where the vendor’s headquarters actually are.

The Software License

The license specifies the allowed uses of the software you’re purchasing. This, too, can contain some unacceptable conditions.

  • Use of your data. A software license should not restrict your rights to access or work with your data in any way you see fit. The license agreement will likely contain conditions under which the software warranty would be voided. It’s perfectly acceptable for a commercial software vendor to bar re-engineering their product, but it’s not acceptable for them to void the warranty if you are only modifying the data contained within the system. So conditions that bar the exporting, importing, archiving or mass updating of data should be challenged. If the system is hosted, the vendor should provide full access to your data, and the license should include language providing that the client shall have reasonable access for using, copying and backing up all customer information in the database. There should be no language in the contract implying that the vendor owns your data, or that they can use it for any additional purposes.
  • Responsibility avoidance. Software warranties should not include blanket “software provider is not responsible if nothing works” statements. This shouldn’t need to be said, but, sadly, there are often warranty sections in license agreements that say just that.
  • Back doors. The license should not allow for any post-sale reversals of licensing, such as language stating that the contract will be void if the customer uses the software in perfectly reasonable ways they don’t anticipate. For instance, if you want to use the CRM functions of your donor database to track contacts that aren’t potential donors, you shouldn’t sign a contract limiting use of the software to “fundraising purposes”. Also, there should not be any “back doors” programmed into the application that the vendor can maintain for purposes of disabling the software.

The Scope of Work

The Scope of Work (SOW) describes exactly what the project will consist of. It’s an agreement between the vendor and the customer as to what will happen, when, and how long it will take. Good scopes include estimates of hours and costs by task and/or stage of the project. The scope should be attached as a governing exhibit to the contract. Usually, this is negotiated prior to receiving the actual contract. By having it attached to the contract, the vendor is now legally obligated to, basically, do what they said they would do.

The RFP

Like the Scope of Work, the RFP should also be attached as a governing document that assures that the software does what the vendor claimed it would.

In Conclusion

For big ticket purchases, it’s well worth having an attorney review or assist in negotiations. Keep in mind that the goal is to end up with a contract that equally defends the rights of both parties. True success, of course, is a solid contract that is never revisited after signing. Litigation doesn’t serve anyone’s interest.

Bringing It Home

There’s a lot of talk and plenty of examples of technology jumpstarting an organization’s effectiveness. But if someone were to do the tally, there would probably be more stories of the reverse. All too often, organizations make decisions about their software based on uninformed recommendations or quick evaluations of the prospective solutions. Decisions are often based more on expediency than educated selection.

Rushing a major investment can be a critical error. Learn about the available options, thoroughly assess their suitability to your needs and prepare your staff to make the most of them. Then, sign a contract that protects you if, after all else is done, the application and/or vendor fails to live up to the promises. Finding the right application and setting it up to support, not inhibit, your workflow is a matter of finding something that really fits. You can’t do that with your eyes closed.


For More Information

Vendors as Allies: How to Evaluate Viability, Service, and Commitment
An Idealware article on how to think specifically about the less-concrete aspects of software selection.

How To Find Data-Exchange-Friendly Software
An overview of how to ensure you’re going to be able to get data in and out of a software package. (For much more detailed considerations, see our Framework to Evaluate Data Exchange Features.)


Peter Campbell is the director of Information Technology at Earthjustice, a nonprofit law firm dedicated to defending the earth, and blogs about NPTech tools and strategies at Techcafeteria.com. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo, and Marin Counties, and has been managing technology for non-profits and law firms for over 20 years.

Robert Weiner and Steve Heye also contributed to this article.


Here, There and Idealware

It’s official – I’m not even trying to keep this blog up to date anymore, because I accepted a volunteer gig blogging regularly at Idealware. As I’ve mentioned before, Idealware strives to be the Consumer Reports of nonprofit software, and, in my opinion, that description doesn’t do the site justice – it’s long been one of my most referenced resources; the place that a nonprofit can go to get focused, concise answers to those tricky questions like “What software is out there?”, “Which one fits my needs?” and “What are the best practices for deploying it?”.

I have two things up on Idealware this week: My new article, “The Perfect Fit: A Guide to Evaluating and Purchasing Major Software Systems” and my first blog entry “Smartphone Follies“.

Needless to say, I’m honored and excited to be publishing regularly to Idealware, and urge you all to go there and subscribe to the articles and blog, which features some very sharp friends of mine, as well: Steve Backman, Heather Gardner-Madras, Paul Hagen, Eric Leland, Michelle Murrain, and, of course, Laura Quinn, the founder and genius behind Idealware. See you over there!

Web Site Update

Over the weekend, I downsized Techcafeteria.com, something I probably should have done close to a year ago, when I started my job at Earthjustice. What’s left is pretty thin, and is less of a web site than it is a supplement to other things online.

Some say that we’re moving away from blogging to the next trend, dubbed “Lifestreaming“. But I wouldn’t call this a lifestream. “Stream-supplementing” might be more to the point. I hang out in a number of places online, the key ones being, in some kind of meaningful order:

LinkedIn – this is where I keep my resume and stay connected with people I know through work and community.

Twitter – This is where I do most of my online communication lately. My Twitter community is mostly made up of people I know through NTEN and other NPTech circles. You may think I’ve been pretty quiet in the two or three months since I last blogged, but I’ve published about 700 tweets.

NTEN, or, more accurately, the NTEN Groups like NTEN-Discuss and the SF-501TechClub. These are online lists, sponsored by NTEN. I’m also reasonably active on Deborah Elizabeth Finn‘s excellent Information Systems Forum, a Yahoo Group.

Idealware – Laura’s made me a staff writer, of sorts, and I should be contributing more articles this summer. I also comment on the blog regularly. Some of my Idealware articles are also picked up by Techsoup.

So, those are great places to find me. And this is where you come to contact me, or catch up on where I’ve been. I can’t call it “lifestreaming” – my life isn’t a show, and if it was, it wouldn’t be a very interesting one. But I do publish the pieces of it that I think might be valuable to others, and I’d rather publish them in places that others go, so it makes sense to have a web site that serves more as a signpost than a destination.

What I’ve been up to

Ah, poor, neglected blog. Wanted to post a few things here:

  • The Techcafeteria website has been cleaned up a bit – consulting pitch removed, as I’m fully employed at Earthjustice; I also beefed up the documents section. I was happy to find my Non-Profit Times article on Data Management Strategy is now available in their free archives.
  • Upcoming articles: I’ve submitted a draft of an article on Document Management to Idealware, which might see publication in the next month or two. I’m a big proponent of enhancing the process of saving and opening documents, and I have a lot of experience with it, having spent most of my career at law firms. I’m also one revision away from a good guide to dealing with your domain name – how to register it, what to look out for, and what to do if things go wrong. My impression is that this is a big headache for NPOs and I can’t find much written on it at Techsoup or other logical places.
  • The NTC is coming up quickly! I’m really looking forward to NTEN’s annual Non-Profit Technology Conference in New Orleans in March. I’m leading a panel on Change Management (“the human side of technology adoption”) and I’m participating in one or two Open API-related sessions, following up on my first Idealware article. I’ll say it again: Holly and the team at NTEN put on the absolute best event you can hope to go to. I’ve been to tech conferences put on by Microsoft, O’Reilly and others, and they should simply be ashamed of themselves. The planning and quality of the event, meals, sessions, locations for NTC always excel.
  • And I’m on the committee for NetSquared’s next Developer Challenge, tying in with the 3rd annual NetSquared Conference in May. Billy Bickett and others at Techsoup/Compumentor are looking to make it even more exciting this year than last, with a host of big name companies sponsoring and participating.

XML, API, CSV, SOAP! Understanding The Alphabet Soup Of Data Exchange

This article was originally published at Idealware in October of 2007.

Let’s say you have two different software packages, and you’d like them to be able to share data. What would be involved? Can you link them so they exchange data automatically? And what do all those acronyms mean? Peter Campbell explains.

There has been a lot of talk lately about data integration, Application Programming Interfaces (APIs), and how important these are to non-profits. Much of this talk has focused on the major non-profit software packages from companies like Blackbaud, Salesforce.com, Convio, and Kintera. But what is it really about, and what does it mean to the typical org that has a donor database, a web site, and standard business applications for Finance, Human Resources and payroll? In this article, we’ll bypass all of the acronyms for a while and then put the most important ones into perspective.

The Situation

Nonprofits have technology systems, and they live and die by their ability to manage the data in those systems to effectively serve their missions. Unfortunately, however, nonprofits have a history of adopting technology without a plan for how different applications will share data. This isn’t unique to the nonprofit sector: throughout the business world, data integration is often underappreciated.

Here’s a simple example: Your mid-sized NPO has five fundraising staff people that together bring in $3,000,000 in donations every year. How much more would you bring in with six fundraising people? How much less with four? If you could tie your staffing cost data to hours worked and donations received, you would have a payroll-to-revenue metric that could inform critical staffing decisions. But if the payroll data is in an entirely different database from the revenue data, they can’t be easily compared.

Similarly, donations are often tracked in both a donor database and a financial system. If you’ve ever had to explain to the board why the two systems show different dollar amounts (perhaps because finance operates on a cash basis while fund development works on accrual), you can see the value in having systems that can reconcile these differences.

How can you solve these data integration challenges? Short of buying a system that tracks every piece of data you may ever need, data exchange is the only option. This process of communicating data from one system to another could be done by a straightforward manual method, like asking a staff member to export data from one system and import it into another. Alternatively, automatic data transfers can save on staff time and prevent trouble down the road – and they don’t have to be as complex as you might think.

What does it take to make a data exchange work? What is possible with your software applications? This article explains what you’ll need to consider.


Components of Data Exchange

Let’s get down to the nitty-gritty. You have two applications, and you’d like to integrate them to share data in some way: to pull data from one into another, or to exchange data in both directions. What has to happen? You’ll need four key components:

  • An Initiating Action. Things don’t happen without a reason, particularly in the world of programming. Some kind of triggering action is needed to start the data interchange process. For an automatic data exchange, this is likely to be either a timed process such as a scheduler kicking off a program at 2AM every night, or a user action – for instance, a visitor clicking the Submit button on your website form.
  • A Data Format. The data to be transferred needs to be stored and transferred in some kind of logical data format – for instance, a comma-delimited text file – that both systems can understand.
  • A Data Transfer Mechanism. If both applications reside on your own network, then a transfer is likely to be straightforward – perhaps you can just write a file to a location where another application can read it. But if one or both applications live offsite, you might need to develop a process that transfers the data over the internet.
  • A Transformation and Validation Process. Data often needs cleanup before it can be loaded into another system: converting formats, preventing duplicates, and logging changes, as discussed later in this article.

Let’s look at each of these components in more detail.


Initiating Action

An initiating action is what starts things rolling in the data exchange process. In most cases, it would take one of three forms:

  • Human Kickoff. If you’re manually exporting and importing files, or need to run a process on a schedule that’s hard to determine in advance, regular old human intervention can start the process. An administrator might download a file, run a command line program, or click a button in an admin interface.
  • Scheduler. Many data exchanges rely on a schedule – checking for new information every day, every hour, every five minutes, or some other period. These kinds of exchanges are initiated by a scheduler application. More complex applications might have a scheduling application built-in, or might integrate with Windows Scheduler or Unix/Linux cron commands (see the example after this list).
  • End User Action. If you want two applications to be constantly in synch, you’ll need to try to catch updates as they happen. Typically, this is done by initiating a data exchange based on some end user action, such as a visitor clicking the Submit button on an online donation form.
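
For instance, on a Unix/Linux system, the scheduled variety can be as simple as a crontab entry; the script path below is a hypothetical placeholder:

# Run a nightly export-and-transfer script at 2 AM.
# Fields: minute hour day-of-month month day-of-week command
0 2 * * * /usr/local/bin/nightly_donor_export.sh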


Data Formats

In order to transfer data from one system to another, the systems need to have a common understanding of how the data will be formatted. In the old days, things were pretty simple: you could store data in fixed format text files, or as bits of information with standard delimiting characters, commonly called CSV for “Comma Separated Values”. Today, we have a more dynamic format called XML (eXtensible Markup Language).

An example fixed format file could be made up of three lines, each 24 characters long:

Name (20)           Gender (1)  Age (3)
Susan               f           25
Mark                m           37


A program receiving this data would have to be told the lengths and data types of each field, and programmed to receive data in that exact format.
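
For instance, a receiving program written in Python might hard-code those positions, as in this minimal sketch (assuming the 20/1/3 layout above):

# Parse one fixed-format record: name in columns 0-19,
# gender in column 20, age in columns 21-23.
def parse_record(line):
    name = line[0:20].strip()
    gender = line[20:21]
    age = int(line[21:24])
    return name, gender, age

record = "Susan".ljust(20) + "f" + " 25"
print(parse_record(record))  # ('Susan', 'f', 25)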


The same data in CSV format would look like this:

"Susan","f",25
"Mark","m",37

CSV is easier to work with than fixed formats, because the receiving system doesn’t have to be as explicitly informed about the incoming data. CSV is almost universally supported by applications, but it poses challenges as well. What if your data has quotes and commas in it already? And as with fixed formats, the receiving system will still need to be programmed (or “mapped”) to know what type of data it’s receiving.
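
Most languages ship with a CSV parser that handles those embedded quotes and commas for you; here’s a minimal sketch using Python’s standard csv module:

import csv
import io

# The second row has a comma inside a quoted field; the parser copes.
data = 'Susan,f,25\n"Mark, Jr.",m,37\n'
for name, gender, age in csv.reader(io.StringIO(data)):
    # The receiving system still has to know what each position means.
    print(name, gender, age)
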
CSV is the de facto data format standard for one-time exports and data migration projects. However, automating CSV transfers requires additional programming – batch files or scripts that will work with a scheduling function. Newer standards, like XML, are web-based and work in browsers, allowing for a more dynamic relationship with the data sets and less external programming.

The XML format is known as a “self-describing” format, which makes it a bit harder to look at but far easier to work with. The information about the data, such as field names and types, is encoded with the data, so a receiving system that “speaks” XML can dynamically receive it. A simple XML file looks like this:

<PEOPLE>
<PERSON>
<NAME>Susan</NAME>
<GENDER>f</GENDER>
<AGE>25</AGE>
</PERSON>
<PERSON>
<NAME>Mark</NAME>
<GENDER>m</GENDER>
<AGE>37</AGE>
</PERSON>
</PEOPLE>

An XML friendly system can use the information file itself to dynamically map the data to its own database, making the process of getting a data set from one application to another far less laborious than with a CSV or fixed width file. XML is the de facto standard for transferring data over the internet.
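
For example, here’s a minimal Python sketch that walks the sample file above with the standard xml.etree.ElementTree module; note that the field names travel with the data:

import xml.etree.ElementTree as ET

xml_data = """<PEOPLE>
  <PERSON><NAME>Susan</NAME><GENDER>f</GENDER><AGE>25</AGE></PERSON>
  <PERSON><NAME>Mark</NAME><GENDER>m</GENDER><AGE>37</AGE></PERSON>
</PEOPLE>"""

root = ET.fromstring(xml_data)
for person in root.findall("PERSON"):
    # Tag names identify each field, so no external mapping is required.
    print(person.find("NAME").text, person.find("AGE").text)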

Data Transfer Mechanisms

As we’ve talked about, an initiating action can spur an application to create a formatted data set. However, getting that data set from one application to another requires some additional work.

If both of your applications are sitting on the same network, then this work is likely pretty minimal. One application’s export file can easily be seen and uploaded by another, or you might even be able to establish a database connection directly from one application to another. However, what if the applications are in different locations? Or if one or both are hosted by outside vendors? This is where things get interesting.

There are multiple ways to exchange data over the web. Many of them are specific to the type of web server (Apache vs. Microsoft’s IIS) or operating system (Unix vs. Linux vs. Microsoft) you’re using. However, two standards – called “web services” – have emerged as by far the most common methods for simple transfers: SOAP (Simple Object Access Protocol) and REST (Representational State Transfer).

Both SOAP and REST transfer data via the standard transfer protocol mechanism of the web: HTTP. To explain the difference between REST and SOAP, we’ll take a brief detour and look at HTTP itself.

HTTP is a very simple-minded thing. It allows you to send data from one place to another and, optionally, receive data back. Most of it is done via the familiar Uniform Resource Identifier (URI) that is typed into the address bar of a web browser, or encoded in a link on a web page, with a format similar to:

http://www.somewhere.com?parameter1=something&parameter2=somethingelse

There are two methods built into HTTP for exchanging data: GET and POST.

  • GET exchanges data strictly through the parameters appended to the URI, which are always in “this equals that” pairs. It is a one-way communication method – once the information is sent to the receiving page, the web server doesn’t retain the parameter data or do anything else with it.
  • POST sends the transferred information in a packet that travels with the request rather than in the URI – you don’t see the information attached to the address bar. POST values can be altered by the receiving page and returned. In almost any situation where you’re creating an account on a web page or completing a shopping transaction, POST is being used.

The advantage of GET is that it’s very simple and easy to share. The advantages of POST are that it is more flexible and more secure. You can put a GET URI in any link, on or offline, while a POST transfer has to be initiated via an HTML form.
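To illustrate the two methods, here is a short Python sketch using the third-party requests library against the placeholder address from above; only the shape of the exchange matters, not the site:

import requests

# GET: the data rides in the URI itself as name=value pairs
reply = requests.get(
    "http://www.somewhere.com",
    params={"parameter1": "something", "parameter2": "somethingelse"},
)
print(reply.url)
# http://www.somewhere.com/?parameter1=something&parameter2=somethingelse

# POST: the same data travels in the body of the request,
# invisible in the address bar
reply = requests.post(
    "http://www.somewhere.com",
    data={"parameter1": "something", "parameter2": "somethingelse"},
)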
In broad terms, REST works much like GET, passing information through URIs, while SOAP works more like POST, wrapping the data in a structured XML envelope that travels with the request. Add to the mix that Microsoft was one of the principal developers of the SOAP specification, and that most Microsoft applications require you to use SOAP to transfer data. REST might be more appealing if you only need to do a simple data exchange, but if you’re working with Microsoft servers or applications, it is likely not an option.
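To give a feel for the difference, here is a hedged sketch in the same vein; the service URL, operation name, and envelope contents are invented for illustration. A REST exchange is often just a GET with parameters, while a SOAP exchange wraps the request in an XML envelope and POSTs it:

import requests

# REST-style: a plain HTTP GET with parameters in the URI
rest_reply = requests.get("http://api.example.org/donors", params={"id": "42"})

# SOAP-style: an XML envelope carried in the body of an HTTP POST
soap_envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetDonor xmlns="http://api.example.org/">
      <id>42</id>
    </GetDonor>
  </soap:Body>
</soap:Envelope>"""

soap_reply = requests.post(
    "http://api.example.org/soap",
    data=soap_envelope,
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://api.example.org/GetDonor",
    },
)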

Transformation and Validation Processes

While this article is focused on the mechanics of extracting and moving data, it’s important not to lose sight of the fact that data often needs a lot of work before it should be loaded into another system. Automated data exchange processes need to be designed with extreme care, as it’s quite possible to trash an entire application by corrupting data, introducing errors, or flooding the system with duplicates.
To get the data ready for upload, use transformation and validation processes. These processes can be kicked off either before or after the data transfer, and multiple processes can take place at different points in time. An automated process can be written in almost any programming language, depending on the requirements of your target applications and your technical environment. Typical steps include the following (a sketch of the duplicate check appears after the list):


  • Converting file formats. Often, one application will export a data file with a particular layout of columns and field names, while the destination application will demand another.
  • Preventing duplicates. Before loading in a new record, it’s important to ensure that it doesn’t already exist in the destination application.
  • Backup and logging. It’s likely a good idea to kick off a backup of your destination database before importing the data, or at least to log what you’ve changed.
  • User interface. For complex processes, it can be very useful to provide an administrative interface that allows someone to review what data will change and resolve errors prior to the import.
  • Additional data mining. If you’re writing a process that analyzes data, adding routines that flag unusual occurrences for review can be very useful. Or, if you’re uploading donation data that also has to go to Finance, why not concurrently save it to a CSV file that Finance can import into their system? There are plenty of organizational efficiencies that can be folded into this process.
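Here is the promised sketch of the duplicate check. It is a minimal illustration assuming an incoming CSV file with an email column; in practice, the set of existing keys would come from a query against the destination database:

import csv

# Addresses already present in the destination system (invented sample)
existing_emails = {"susan@example.org", "mark@example.org"}

new_records, duplicates = [], []
with open("incoming_donors.csv", newline="") as infile:
    for row in csv.DictReader(infile):  # assumes an "email" column
        key = row["email"].strip().lower()
        if key in existing_emails:
            duplicates.append(row)      # flag for human review
        else:
            existing_emails.add(key)    # also catches dupes within the file
            new_records.append(row)

print(f"{len(new_records)} to import, {len(duplicates)} flagged as duplicates")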

As described in the API section below, a sophisticated application may provide considerable functionality that will help in these processes.

Application Programming Interfaces (APIs)

What about APIs? How do they fit in? We’re hundreds of words into this article without even a mention of them – how can that be? Well, APIs are a fuzzy concept that might encompass all the aspects of data exchange we just discussed, or some of them, or none of them at all. Clear as mud, right?
An API is exactly what it says – an interface, or set of instructions, for interacting with an application via a programming language.
Originally, APIs were built so that third-party developers could more easily create integrating functions. For instance, a phone system vendor might write specific functions into their operating system so that a programmer for a voice mail company could easily import, extract, and otherwise work with the phone system data. This would usually be written in the same programming language as the operating system, and the assumption was that the third-party programmer knew that language. Operating systems like Unix and Windows have long had APIs, allowing third parties to develop hardware drivers and business applications that use OS functions, such as Windows’ file/open dialog boxes.


APIs are written to support one or more programming languages – such as PHP or Java – and require a programmer skilled in one of those languages. An API is also likely to be geared around specific data format and transfer standards – for instance, it may only accept data in a particular XML format, and only via a SOAP interface. In most cases, you’ll be limited to working with the supported standards for that API.


Choose Your Own Data Exchange Adventure

The type of data exchange that makes sense – and how complex it will be – varies widely, depending on a number of factors: the applications you would like to integrate, the available tools, the location of the data, and the platform (i.e., Windows, Linux, or the web) you’re using. For instance:

  • Stripped all the way down to the basics, manual data exchange is always an option. In this case, an administrator (a Human Kickoff initiating action) might download a file as CSV, save it to the network, perform some manual transformations to put it into the appropriate file format, and upload it into a different system.
  • For two applications on the same network, the process might not be too much more complex. In this case, a Scheduler initiating action might prompt one application to export a set of data as a CSV file and save it to a network drive. A transformation program might then manipulate the file and tell the destination application to upload the new data.
  • Many web-based tools offer simple ways to extract data. For instance, to get your blog’s statistics from the popular tracking service FeedBurner, you could use a scheduled initiating action to simply request a FeedBurner page via HTTP, which would then provide the statistics as an XML page. Your program could then parse and transform the data to load it into your own reporting application or show it on your own website (see the sketch after this list). Many public applications, such as Google Maps, offer similarly easy functionality to allow you to interact with them, leading to the popularity of mashups – applications that pull data (generally via APIs) from two or more websites.
  • If you are using a website Content Management System that is separate from your main constituent management system, you may find yourself with two silos containing constituent data – members who enrolled on your web site and donors tracked in a donor database. In this circumstance, you might set up a process that kicks off whenever someone submits the “Become a Member” form. This process could write the data for the new member into an XML file, transfer that file to your server, and there kick off a new process that imports the new members while checking for duplicates.
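Here is the sketch promised in the FeedBurner example above. It is a rough illustration only – the URL and the XML element names are placeholders, not FeedBurner’s actual interface:

import requests
import xml.etree.ElementTree as ET

# Placeholder address; the real service defines its own URL and parameters
STATS_URL = "http://stats.example.com/feed-awareness"

reply = requests.get(STATS_URL, params={"uri": "myblog"})  # the scheduled request
reply.raise_for_status()

root = ET.fromstring(reply.text)     # parse the XML reply
for entry in root.findall("entry"):  # element and attribute names assumed
    print(entry.get("date"), entry.get("circulation"))

# A scheduler (cron on Unix, Task Scheduler on Windows) could run this
# script nightly and load the results into a reporting database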

Finding Data-Exchange-Friendly Software

As is likely clear by now, the methods you can use to exchange data depend enormously on the software packages that you choose. The natural inclination when evaluating software is to look for the features that you require. That’s an important step in the process, but it’s only half of the evaluation. It’s also critical to determine how you can – or whether you can – access the data. Buying into systems that overcomplicate or restrict this access will limit your ability to manage your business.
Repeat this mantra: I will not pay a vendor to lock me out of my own data. Sadly, this is what a lot of data management systems do, either by maintaining poor reporting and exporting interfaces, or by including license clauses that void the contract if you try to interact with your data in unapproved ways (including leaving the vendor).
To avoid lock-in and ensure the greatest amount of flexibility when looking to buy any new application – particularly the ones that store your data off-site and give you web-based access to it – ask the following questions:

  • Can I do mass imports and updates on my data? If the vendor doesn’t allow you to add to or update the system in bulk with data from other systems, or their warranty prohibits mass updates, then you will have difficulty integrating data smoothly into the system.
  • Can I take a report or export file, make a simple change to it, and save my changes? The majority of customized formats are small variations on the standard formats that come with a system. But it’s shocking how many web-based platforms don’t allow you to save your modifications.
  • Can I create the complex data views that are useful to me? Most modern donor, client/case management, and other databases are relational: they store data in separate, related tables. That’s good – it allows these systems to be powerful and dynamic – but it complicates the process of extracting data and creating customized reports. A donor’s name, address, and donation total might be stored in three different but related tables. If that’s the case, and your reporting or export interface doesn’t allow you to report on multiple tables in one report, then you won’t be able to produce a report that extracts the names and addresses of all donors who contributed a certain amount or more. You don’t want to come up with a need for information and find that, although you’ve input all the data, you can’t get it out of the system in a useful fashion.
  • Does the vendor provide a data dictionary? A data dictionary is a chart identifying exactly how the database is laid out. If you don’t have this, and you don’t have ways of mapping the database, you will again be very limited in reporting on and extracting data from the application.
  • What formats can I export data to? As discussed, there are a number of formats that data can be stored in. You want a variety of options for industry-standard formats.
  • Can I connect to the database itself? Particularly if the application is installed on your own local network, you might be able to access the database directly. The ability to establish an ODBC connection to the data, for instance, can provide a comparatively easy way to extract or update data (a brief sketch follows this list). Consider, however, what will happen to your interface if the vendor upgrades the database structure.
  • Can I initiate data exports without human intervention? Check to see if there are ways to schedule exports, using built-in scheduling features or by saving queries that can be run by the Windows Scheduler (or something similar). If you want to integrate data in real time, determine what user actions you can use to kick off a process. Don’t allow a vendor to lock you out of the database administrator functions for a system installed on your own network.
  • Is there an API? APIs can save a lot of time if you’re building a complex data exchange. For some systems, it may be the only way to get data in or out without human intervention. Don’t assume any API is a good API, however – make sure it has the functions that will be useful to you.
  • Is there a data exchange ecosystem? Are there consultants who have experience working with the software? Does the software support third party packages that specialize in extracting data from one system, transforming it, and loading it into another? Is there an active community developing add-ons and extensions to the application that might serve some of your needs?
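As promised in the database-connection question above, here is a minimal sketch using Python’s pyodbc library. The DSN, credentials, and table and column names are all invented, and, as noted, a vendor upgrade could change the schema underneath you:

import pyodbc

# "donordb" is an ODBC data source configured on this machine (invented name)
connection = pyodbc.connect("DSN=donordb;UID=reporting;PWD=secret")
cursor = connection.cursor()

# Join the related name, address, and gift tables directly to pull
# donors over a threshold, bypassing the vendor's reporting interface
cursor.execute("""
    SELECT n.first_name, n.last_name, a.city, SUM(g.amount) AS total
    FROM names n
    JOIN addresses a ON a.name_id = n.id
    JOIN gifts g ON g.name_id = n.id
    GROUP BY n.first_name, n.last_name, a.city
    HAVING SUM(g.amount) >= 500
""")

for row in cursor.fetchall():
    print(row.first_name, row.last_name, row.city, row.total)

connection.close()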

Back to Reality

So, again, what does all of this really mean to a nonprofit organization? From a historical perspective, it means that despite the preponderance of acronyms and the lingering frustrations of some companies limiting their options, integration has gotten easier and better. If you picked up this article thinking that integrating and migrating data between applications and web sites is extremely complex, well, it isn’t, necessarily – it’s sometimes as simple as typing a line in your browser’s address bar. But it all depends on the complexity of the data that you’re working with, and the tools that your software application gives you to manage that data.


For More Information

An Introduction to Integrating Constituent Data: Three Basic Approaches
A higher level, less technical look at data integration options

The SOAP/XML-RPC/REST Saga
A blog article articulating the differences – from a more technical perspective – between REST and SOAP.

Mashup Tools for Consumers
New York Times article on the Mashup phenomenon

W3C Hardcore Data Standard Definitions
The W3C, the standards body for the Web. The hardcore references for HTTP, XML, SOAP, REST, and the other standards mentioned here.

Web API List
Techmagazine’s article linking to hundreds of applications that offer popular Web APIs

Peter Campbell is currently the Director of Information Technology at Earthjustice, a non-profit law firm dedicated to defending the earth. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo & Marin Counties, Inc. Peter has been managing technology for non-profits and law firms for over 20 years, and has a broad knowledge of systems, email and the web. In 2003, he won a “Top Technology Innovator” award from InfoWorld for developing a retail reporting system for Goodwill thrift stores. Peter’s focus is on advancing communication, collaboration and efficiency through creative use of the web and other technology platforms. In addition to his work at SF Goodwill, Peter maintains a number of personal and non-profit web sites; blogs on NPTech tools and strategies at http://techcafeteria.com; is active in the non-profit community as a member of NTEN; and spends as much quality time as possible with his wife, Linda, and eight-year-old son, Ethan.

Steve Anderson of ONE/Northwest, Steven Backman of Design Database Associates, Paul Hagen of Hagen20/20, Brett Meyer of NTEN, and Laura Quinn of Idealware also contributed to this article.

Data Exchange Article Up at Idealware

My article “XML, API, CSV, SOAP! Understanding the Alphabet Soup of Data Exchange” is up at idealware.org. It is intended as a primer for those of you trying to make sense of all of this talk about Application Programming Interfaces (APIs) and data integration. It discusses, with examples, the practical application of some of the acronyms, and suggests some recommended practices around data system selection and deployment. Credit has to go to Laura Quinn, webmaster at Idealware, who really co-wrote the article with me but didn’t take much credit, and to our reviewers, Paul Hagen, Steve Anderson, and Steven Backman, who added great insights to a pretty heady topic.

The article went through a lot of rewrites, and we had to cut out a fair amount in order to turn it into something cohesive, so I hope to blog a bit on some of the worthwhile omissions soon, but my day job at Earthjustice has been keeping me pretty busy.

What happened?

Well, work happened, and I have to admit that I am not the driven blogger who can maintain a steady flow of posts while working full-time. I’ve been doing a consulting/contracting gig in San Jose that not only keeps me busy, but takes huge chunks out of my day for the commute, so my attention to Techcafeteria has suffered unduly. I’ll be wrapping up the work in San Jose and transitioning to a new, full-time position over the next month or two, returning to the ranks of Non-Profit IT Directors that I didn’t imagine I’d stay out of for long. More on that position later – I’ve been asked to keep it under wraps for a week or so.

So I’ll be closing the consulting services section of Techcafeteria, but I’ll be keeping the website going as time affords. It’s been an interesting year for me, so far. From 1986 until 2007, I held three jobs. I stayed at each one for at least six years, and I secured the next one before leaving the prior. I hadn’t been unemployed (aka self-employed) in over two decades. But I have a bit of a self-imposed challenge – I want a job with deep business and technology challenges, at an organization with a worthwhile mission, at a pay scale that, while not extravagant, is enough to support my family living in the Bay Area, where my partner spends most of her time homeschooling our son. Those opportunities aren’t a dime a dozen. I reached a point early in the year where I was downright desperate to leave the job I was at (a long story that I have no intention of relating here!), and applied to some for-profit companies. I think I sabotaged myself in the interviews, because it eventually became clear to me that having day-to-day work that combats social or environmental injustice is a personal requirement of mine. My partner supports this – she was proud to tell people that I worked for Goodwill, and she’s even more excited about my new gig, which sports a killer tagline. So setting up the consulting practice was – and probably will be again – a means of staying solvent while I was very picky about what I applied for.

One job that I pursued was with an org called the Pachamama Alliance. They are a fascinating group of people. Their story is that the indigenous people of Ecuador put out a call for help to the Western world as they saw the earth and their culture being destroyed by the clearing of the rainforests. The group forming Pachamama answered that call, and their mission is to “change the dream of the western world” into one that is in harmony with nature, rather than one of dominance over and disrespect for it. They maintain that environmental injustice and social injustice are tied at the knees – where you find one, you’ll find the other. For those of you who saw Gore’s “An Inconvenient Truth”, you’ll recall that the main water source for the Sudan dried up a few years ago. That bit of trivia puts the subsequent genocide in Darfur in an interesting perspective. Pachamama has adopted Gore’s tactics with a multimedia presentation that both educates and inspires people to adopt a more sustainable dream. It’s a timely movement, as it’s becoming clear to all of us that our current rate of consumption of natural resources is having dramatic impacts on the environment. Pachamama spreads the word by training volunteers to share the presentation. Well worth checking out.

In other news, I’m hard at work on an article for Idealware that attempts to deflate all of this big talk about APIs and put it in terms that anyone can use to understand why they might want to migrate data and how they might do it. I’m also talking with my friends at NTEN about doing a webinar on the best practices for rolling out CRM at a non-profit. As long-time blog readers have probably picked up, I consider Constituent Relationship Management software to be the type of technology that, deployed correctly, completely alters the way a business is run. It’s not just about maintaining business relationships and tracking donors – it’s about working collaboratively and breaking down the silos of business relationships and data. So installing the software (if software even needs to be installed) is the least of it, and data migration is just a chore. But aligning business strategy to CRM technology is the real challenge.

So, I’ll post next week about my new gig, and look forward to a long life for Techcafeteria as a resource on non-profit technology, with less of the hawking of services.