
Architecting Healthy Data Management Systems

This article was originally published in the NTEN eBook “Collected Voices: Data-Informed Nonprofits” in January of 2014.

Introduction

The reasons why we want to make data-driven decisions are clear. The challenge, in our cash-strapped, resource-shy environments, is to install, configure and manage the systems that will allow us to easily and efficiently analyze, report on and visualize the data. This article will offer some insight into how that can be done, while being ever mindful that the money and time to invest are hard to come by. But we'll also point out where those investments can pay off in more ways than just the critical one: the ability to justify our mission-effectiveness.

Right off the bat, acknowledge that it might be a long-term project to get there.  But, acknowledge as well, that you are already collecting all sorts of data, and there is a lot more data available that can put your work in context.  The challenge is to implement new systems without wasting earlier investments, and to funnel data to a central repository for reporting, as opposed to re-entering it all into a redundant system.  Done correctly, this project should result in greater efficiency once it’s completed.

Consider these goals:

  • An integrated data management and reporting system that can easily output metrics in the formats that constituents and funders desire;
  • A streamlined process for managing data that increases the validity of the data entered while reducing the amount of data entry; and
  • A broader, shared understanding of the effectiveness of our strategic plans.

Here are the steps you can take to accomplish these goals.

Taking Inventory

The first step in building the system involves ferreting out all of the systems that you store data in today.  These will likely be applications, like case or client management systems, finance databases, human resources systems and constituent relationship management (CRM) systems.  It will also include Access databases, Excel spreadsheets, Word documents, email, and, of course, paper.  In most organizations (and this isn’t limited to nonprofits), data isn’t centrally managed.  It’s stored by application and/or department, and by individuals.

The challenge is to identify the data that you need to report on, wherever it might be hidden, and catalogue it. Write down what it is, where it is, what format it is in, and who maintains it.  Catalogue your information security: what content is subject to limited availability within the company (e.g., HR data and HIPAA-related information)? What can be seen organization-wide? What can be seen by the public?
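As a minimal sketch of what such a catalogue might look like once it leaves paper, here is an illustrative Python structure; the field names and example entries are hypothetical, not a standard:

```python
# A minimal, illustrative data inventory: one entry per data store.
# Fields mirror the questions above: what, where, format, owner, visibility.
inventory = [
    {
        "content": "Client intake records",
        "location": "Case management system",
        "format": "Database",
        "owner": "Program staff",
        "visibility": "restricted",  # e.g., HIPAA-related information
    },
    {
        "content": "Donor contact list",
        "location": "CRM",
        "format": "Database",
        "owner": "Development",
        "visibility": "organization-wide",
    },
    {
        "content": "Event sign-in sheets",
        "location": "Filing cabinet",
        "format": "Paper",
        "owner": "Front desk",
        "visibility": "organization-wide",
    },
]

# Quick summary: which content is subject to limited availability?
restricted = [e["content"] for e in inventory if e["visibility"] == "restricted"]
print(restricted)  # → ['Client intake records']
```

Even a simple structure like this makes the security audit a query rather than a meeting.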

Traditionally, companies have defaulted to securing data by department. While this offers a high level of security, it can stifle collaboration and result in data sprawl, as copies of secured documents are printed and emailed to those who need to see the information but don't have access. Consider a data strategy that keeps most things public (within the organization), and only secures documents when there is a clear reason to do so.

You’ll likely find a fair amount of redundant data.  This, in particular, should be catalogued.  For example, say that you work at a social services organization.  When a new client comes on, they’re entered into the case management system, the CRM, a learning management system, and a security system database, because you’ve given them some kind of access card. Key to our data management strategy is to identify redundant data entry and remove it.  We should be able to enter this client information once and have it automatically replicated in the other systems.
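The "enter once, replicate everywhere" idea can be sketched as a single point of entry that fans the record out to each downstream system. The system names and functions here are hypothetical stand-ins for whatever integration your platforms actually expose:

```python
# Illustrative sketch: register a client once and propagate the record
# to each downstream system. In practice each function would call that
# system's API; here they write to an in-memory dict for demonstration.

def add_to_case_management(client, store):
    store.setdefault("case_management", []).append(client["name"])

def add_to_crm(client, store):
    store.setdefault("crm", []).append(client["name"])

def add_to_security_db(client, store):
    store.setdefault("security", []).append(client["name"])

REPLICATORS = [add_to_case_management, add_to_crm, add_to_security_db]

def register_client(client, store):
    """Single point of data entry: every downstream system gets the record."""
    for replicate in REPLICATORS:
        replicate(client, store)

systems = {}
register_client({"name": "Jane Doe"}, systems)
print(sorted(systems))  # each system now holds the client record
```

The design point is that staff interact with one entry form, and the fan-out list is the only place that knows how many systems exist.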

Systems Integration

Chances are, of course, that all of your data is not in one system, and the systems that you do have (finance, CRM, etc.) don't easily integrate with each other. The first question to ask is: how are we going to get all of our systems to share with each other? One approach, of course, is to replace all of your separate databases with one database. Fortune 500 companies use products from Oracle and SAP to do this, systems that incorporate finance, HR, CRM and inventory management. Chances are that these will not work at your nonprofit; the software is expensive, and so are the developers who know how to customize it. More affordable options exist from companies like Microsoft, Salesforce, NetSuite and IBM, with special pricing for 501(c)(3)s.

Data Platforms

A data platform is one of these systems that stores your data in a single database, but offers multiple ways of working with the data. For example, a NetSuite platform can handle your finance, HR, CRM/donor management and e-commerce without maintaining separate data stores, allowing you to report on combined metrics, such as fundraiser effectiveness (donor management and HR) and mail vs. online donations (e-commerce and donor management). Microsoft's solution incorporates separate products, such as SharePoint, Dynamics CRM, and the Dynamics ERP applications (HR, finance). Solutions like Salesforce and NetSuite are cloud-only, whereas Microsoft and IBM can be installed locally or run from the cloud.

Getting from here to there

Of course, replacing all of your key systems overnight is neither a likely option nor an advisable one. Change like this has to be implemented over a period of time, possibly spanning years (for larger organizations where the system changes will be costly and complex). As part of the earlier system evaluation, you'll want to factor in the state of each system. Are some approaching obsolescence? Are some not meeting your needs? Prioritize based on the natural life of the existing systems and the particular business requirements. Replacing major data systems is difficult and complex; the point isn't to gloss over this. You need a strong plan that factors in budget, resources, and change management. Replacing too many systems too quickly can overwhelm both the staff implementing the change and the users of the systems being changed. If you don't have executive-level IT staff on board, working with consultants to accomplish this is highly recommended.

Business Process Mapping

(Figure: a simple business process map)

The success of the conversion is less dependent on the platform you choose than on the way you configure it. Systems optimize and streamline data management; they don't manage the data for you. To ensure that this investment is realized, a prerequisite investment is one in understanding how you currently work with data and optimizing those processes for the new platform.

To do this, take a look at the key reports and types of information in the list that you compiled and draw the process that produces each piece, whether it's a report, a chart, a list of addresses or a board report. Drawing processes, aka business process mapping, is best done with a flowcharting tool, such as Microsoft Visio. A simple process map is shown in the figure above.

In particular, look at the processes that are being done on paper, in Word, or in Excel that would benefit from being in a database. Aggregating information from individual documents is laborious; the goal is to store data in the data platform and make it available for combined reporting. If today's process involves cataloguing data in a word-processing table or a spreadsheet, then you will want to identify a data platform table that will store that information in the future.
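To make the payoff concrete, here is a small sketch of moving rows that today live in a spreadsheet into a database table, where aggregation becomes a one-line query. The column names and data are invented for illustration; SQLite stands in for whatever database your platform provides:

```python
import csv
import io
import sqlite3

# Hypothetical rows that today live in a spreadsheet, represented as CSV.
spreadsheet = io.StringIO(
    "client,service_date,outcome\n"
    "Jane Doe,2014-01-15,placed\n"
    "John Roe,2014-01-20,pending\n"
)

# Load the rows into a single database table for combined reporting.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE services (client TEXT, service_date TEXT, outcome TEXT)"
)
rows = [
    (r["client"], r["service_date"], r["outcome"])
    for r in csv.DictReader(spreadsheet)
]
conn.executemany("INSERT INTO services VALUES (?, ?, ?)", rows)

# Aggregation that was laborious across documents is now one query.
placed_count = conn.execute(
    "SELECT COUNT(*) FROM services WHERE outcome = 'placed'"
).fetchone()[0]
print(placed_count)  # → 1
```

Once the data lives in a table, every report in the earlier inventory can draw from the same source instead of a copy.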

Design Considerations

Once you have catalogued your data stores and the processes in place to interact with the data, and you’ve identified the key relationships between sets of data and improved processes that reduce redundancy, improve data integrity and automate repetitive tasks, you can begin designing the data platform.  This is likely best done with consulting help from vendors who have both expertise in the platform and knowledge of your business objectives and practices.

As much as possible, try to use the built-in functionality of the platform, as opposed to custom programming. A solid CRM like Salesforce or MS CRM will let you create custom objects that map to your data and then allow you to input, manage, and report on the data that is stored in them without resorting to actual programming in Java or .NET languages. Once you start developing new interfaces and adding functionality that isn't native to the platform, things become more difficult to support. Custom training is required; developers have to be able to fully document what they've done, or swear that they'll never quit, be laid off, or get hit by a bus. And you have to be sure that the data platform vendor won't release updates that break the home-grown components.

Conclusion

The end game is to have one place where all staff working with your information can sign on and work with the data, without worrying about which version is current or where everything might have been stored. Ideally, it will be a cloud platform that allows secure access from any internet-accessible location, with mobile apps as well as browser-based access. Further considerations might include restricted access for key constituents and integration with document management systems and business intelligence tools. But key to the effort is a systematic approach that includes a deep investment in taking stock of your needs and understanding what the system will do for you before the first keypress or mouse click occurs, and patience, so that you get it all and get it right. It's not an impossible dream.


What’s Up With The TechSoup Global/GuideStar International Merger?

This article was first published on the Idealware Blog in April of 2010.


TechSoup Global (TSG) merged with GuideStar International (GSI) last week. Idealware readers are likely well familiar with TechSoup, formerly CompuMentor, a nonprofit that supports other nonprofits, most notably through their TechSoup Stock software (and hardware) donation program, but also via countless projects and initiatives over the last 24 years. GuideStar International is an organization based in London that also works to support nonprofits by reporting on their efforts and promoting their missions to potential donors and supporters.

I spoke with Rebecca Masisak and Marnie Webb, two of the three CEOs of TechSoup Global (Daniel Ben-Horin is the founder and third CEO), in hopes of making this merger easier for all of us to understand. What I walked away with was not only context for the merger, but also a greater understanding of TechSoup’s expanded mission.

Which GuideStar was that?

One of the confusing things about the merger is that, if you digested the news quickly, you might be under the impression that TechSoup is merging with the GuideStar that we in the U.S. are well acquainted with. That isn't the case. GuideStar International is a completely separate entity from GuideStar US, but with some mutual characteristics:

  • Both organizations were originally founded by Buzz Schmidt, the current President of GuideStar International;
  • They share a name and some agreements as to branding;
  • They both report on the efforts of charitable organizations, commonly referred to as nonprofits (NPOs) in the U.S.; Civil Society Organizations (CSOs) in the U.K.; or Non Governmental Organizations (NGOs) across the world.

Will this merger change the mission of TechSoup?

TechSoup Global’s mission is working toward a time when every nonprofit and NGO on the planet has the technology resources and knowledge they need to operate at their full potential.

GuideStar International seeks to illuminate the work of every civil society organisation (CSO) in the world.

Per Rebecca, TechSoup’s mission has been evolving internally for some time. The recent name change from TechSoup to TechSoup Global is a clear indicator of their ambition to expand their effectiveness beyond the U.S. borders, and efforts like NGOSource, which helps U.S. Foundations identify worthy organizations across the globe to fund, show a broadening of their traditional model of coordinating corporate donors with nonprofits.

Unlikely Alliances

TechSoup opened their Fundacja TechSoup office in Warsaw, Poland two years ago, in order to better support their European partners and the NGOs there. They currently work with 32 partners outside of the United States. The incorporation of GSI's London headquarters strengthens their European base of operations, as well as their ties to CSOs, as both TechSoup and GSI have many established relationships. GSI maintains an extensive database, and TechSoup sees great potential in merging their strength, as builders of relationships between entities both inside and outside of the nonprofit community, with a comprehensive database of organizations and missions.

This will allow them, as Rebecca puts it, to leverage an “unlikely alliance” of partners from the nonprofit/non-governmental groups, corporate world, funders and donors, and collaborative partners (such as Idealware) to educate and provide resources to worthwhile organizations.

Repeatable Practices

After Rebecca provided this context of TSG’s mission and GSI’s suitability as an integrated partner, Marnie unleashed the real potential payload. The goal, right in line with TSG’s mission, is to assist CSOs across the globe in the task of mastering technology in service to their missions. But it’s also to take the practices that work and recreate them. With a knowledge base of organizations and technology strategies, TechSoup is looking to grow external support for the organizations they serve by increasing and reporting on their effectiveness. Identify the organizations, get them resources, and expose what works.

All in all, I’m inspired by TSG’s expanded and ambitious goals, and look forward to seeing the great things that are likely to come out of this merger.

Won’t You Let me Take You On A Sea Change?

This post was originally published on the Idealware Blog in December of 2009.


Last week, I reported that nonprofit assessors like Charity Navigator and GuideStar will be moving to a model of judging effectiveness (as opposed to thriftiness). The title of my post drew some criticism. People far more knowledgeable than I am on these topics questioned my description of this as a “sea change”, and I certainly get their point. Sure, the intention to do a fair job of judging nonprofits is sincere; but the task is daunting. As with many such efforts, we might well wind up with something that isn't a sea change at all, but, rather, a modified version of what we have today that includes some info about mission effectiveness, but still boils down to a financial assessment.

Why would this happen? Simple. Because metrics are numbers: ratios, averages, totals. It’s easy to make metrics from financial data.  It’s very difficult to make them out of less quantifiable things, such as measuring how successfully one organization changed the world; protected the planet; or stopped the spread of a deadly disease.

I used to work for an org whose mission was to end poverty in the San Francisco Bay Area. And, sure enough, at the time, poverty was becoming far less prevalent in San Francisco. So could we be judged as successful?  Could we grab the 2005 versus 2000 poverty statistics and claim the advances as our outcomes? Of course not. The reduction in poverty had far more to do with gentrification during the dotcom and real estate booms than our efforts.  Poverty wasn’t reduced at all; it was just displaced. And our mission wasn’t to move all of the urban poor to the suburbs; it was to bring them out of poverty.

So the announcement that our ratings will now factor in mission effectiveness and outcomes could herald something worse than we have today. The dangerous scenario goes like this:

  • Charity Navigator, GuideStar, et al., determine what additional info they need to request from nonprofits in order to measure outcomes.
  • They make that a requirement; nonprofits now have to jump through those hoops.
  • The data they collect is far too generalized and subjective to mean much; they draw conclusions anyway, based more on how easy it is to call something a metric than how accurate or valuable that metric is.
  • NPOs now have more reporting requirements and no better representation.

So, my amended title: “We Need A Sea Change In The Way That Our Organizations Are Assessed”.

I’m harping on this topic because I consider it a call to action; a chance to make sure that this self-assessment by the assessors is an opportunity for us, not a threat. We have to get the right people at the table to develop standardized outcome measurements that the assessing organizations can use.  They can’t develop these by themselves. And we need to use our influence in the nonprofit software development community to make sure that NPOs have software that can generate these reports.

The good news? Holly Ross of NTEN got right back to me with some ideas on how to get both of these actions going.  That’s a powerful start. We’ll need the whole community in on this.

Get Ready For A Sea Change In Nonprofit Assessment Metrics

This post was originally published on the Idealware Blog in December of 2009.


Last week, GuideStar, Charity Navigator, and three other nonprofit assessment and reporting organizations made a huge announcement: the metrics that they track are about to change.  Instead of scoring organizations on an “overhead bad!” scale, they will scrap the traditional metrics and replace them with ones that measure an organization’s effectiveness.

The new metrics will assess:

  • Financial health and sustainability;
  • Accountability, governance and transparency; and
  • Outcomes.

This is very good news. That overhead metric has hamstrung serious efforts to do bold things and have higher impact. An assessment that is based solely on annualized budgetary efficiency precludes many options to make long-term investments in major strategies.  For most nonprofits, taking a year to staff up and prepare for a major initiative would generate a poor Charity Navigator score. A poor score that is prominently displayed to potential donors.

Assuming that these new metrics will be more tolerant of varying operational approaches and philosophies, justified by the outcomes, this will give organizations a chance to be recognized for their work, as opposed to their cost-cutting talents.  But it puts a burden on those same organizations to effectively represent that work.  I’ve blogged before (and will blog again) on our need to improve our outcome reporting and benchmark with our peers.  Now, there’s a very real danger that neglecting to represent your success stories with proper data will threaten your ability to muster financial support.  You don’t want to be great at what you do, but have no way to show it.

More to the point, the metrics that value social organizational effectiveness need to be developed by a broad community, not a small group or segment of that community. The move by Charity Navigator and their peers is bold, but it’s also complicated.  Nonprofit effectiveness is a subjective thing. When I worked for a workforce development agency, we had big questions about whether our mission was served by placing a client in a job, or if that wasn’t an outcome as much as an output, and the real metric was tied to the individual’s long-term sustainability and recovery from the conditions that had put them in poverty.

Certainly, a donor, a watchdog, a funder, a nonprofit executive and a nonprofit client are all going to value the work of a nonprofit differently. Whose interests will be represented in these valuations?

So here’s what’s clear to me:

– Developing standardized metrics, with broad input from the entire community, will benefit everyone.

– Determining what those metrics are and should be will require improvements in data management and reporting systems. It's a bit of a chicken-and-egg problem, as collecting the data is a prerequisite to determining how to assess it, but standardizing the data will assist in developing the data systems.

– We have to share our outcomes and compare them in order to develop actual standards.  And there are real opportunities available to us if we do compare our methodologies and results.

This isn't easy. It will require that NPOs that have never had the wherewithal to invest in technology systems to assess performance do so. But, I maintain, if the world is going to start rating your effectiveness on more than the 990, that's a threat that you need to turn into an opportunity. You can't afford not to.

And I look to my nptech community, including Idealware, NTEN, TechSoup, Aspiration and many others — the associations, formal, informal, incorporated or not, who advocate for and support technology in the nonprofit sector — to lead this effort. We have the data systems expertise and the aligned missions to lead the project of defining shared outcome metrics. We're looking into having initial sessions on this topic at the 2010 Nonprofit Technology Conference.

As the world starts holding nonprofits up to higher standards, we need a common language that describes those standards. It hasn't been written yet. Without it, we'll trade the limited Form 990 assessments for something that might equally fail to reflect our best efforts and outcomes.