Architecting Healthy Data Management Systems

This article was originally published in the NTEN eBook “Collected Voices: Data-Informed Nonprofits” in January of 2014.

Introduction

The reasons why we want to make data-driven decisions are clear. The challenge, in our cash-strapped, resource-shy environments, is to install, configure, and manage the systems that will allow us to easily and efficiently analyze, report on, and visualize the data. This article offers some insight into how that can be done, while remaining mindful that the money and time to invest are hard to come by. But we'll also point out where those investments can pay off in more ways than just the critical one: the ability to demonstrate our mission-effectiveness.

Right off the bat, acknowledge that it might be a long-term project to get there.  But, acknowledge as well, that you are already collecting all sorts of data, and there is a lot more data available that can put your work in context.  The challenge is to implement new systems without wasting earlier investments, and to funnel data to a central repository for reporting, as opposed to re-entering it all into a redundant system.  Done correctly, this project should result in greater efficiency once it’s completed.

Consider these goals:

  • An integrated data management and reporting system that can easily output metrics in the formats that constituents and funders desire;
  • A streamlined process for managing data that increases the validity of the data entered while reducing the amount of data entry; and
  • A broader, shared understanding of the effectiveness of our strategic plans.

Here are the steps you can take to accomplish these goals.

Taking Inventory

The first step in building the system involves ferreting out all of the systems that you store data in today.  These will likely be applications, like case or client management systems, finance databases, human resources systems and constituent relationship management (CRM) systems.  It will also include Access databases, Excel spreadsheets, Word documents, email, and, of course, paper.  In most organizations (and this isn’t limited to nonprofits), data isn’t centrally managed.  It’s stored by application and/or department, and by individuals.

The challenge is to identify the data that you need to report on, wherever it might be hidden, and catalogue it: write down what it is, where it is, what format it is in, and who maintains it. Catalogue your information security as well: what content is subject to limited availability within the organization (e.g., HR data and HIPAA-related information)? What can be seen organization-wide? What can be seen by the public?
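As a sketch of what such a catalogue might look like in practice, here is a minimal inventory structure. The entries, field names, and visibility labels are all hypothetical examples, not a prescribed schema:

```python
# A minimal data inventory: each entry records what the data is, where it
# lives, what format it is in, who maintains it, and who may see it.
# All entries below are made-up examples for illustration.
from dataclasses import dataclass

@dataclass
class InventoryItem:
    name: str        # what the data is
    location: str    # where it is stored
    format: str      # database, spreadsheet, paper, etc.
    owner: str       # who maintains it
    visibility: str  # "restricted", "organization", or "public"

inventory = [
    InventoryItem("Client case records", "Case management system", "database", "Programs", "restricted"),
    InventoryItem("Donor contact list", "CRM", "database", "Development", "organization"),
    InventoryItem("Payroll history", "HR system", "database", "HR", "restricted"),
    InventoryItem("Event signup sheets", "Front desk binder", "paper", "Admin", "organization"),
]

# A quick report: flag everything that is locked down, since each restricted
# store is a candidate for the "is there a clear reason for this?" review.
restricted = [item.name for item in inventory if item.visibility == "restricted"]
print(restricted)
```

Even a flat list like this, kept current, answers the "what do we have and where is it?" question that the rest of the project depends on.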

Traditionally, companies have defaulted to securing data by department. While this offers a high level of security, it can stifle collaboration and result in data sprawl, as copies of secured documents are printed and emailed to those who need the information but don't have access. Consider a data strategy that keeps most things public (within the organization), and only secures documents when there is a clear reason to do so.

You’ll likely find a fair amount of redundant data.  This, in particular, should be catalogued.  For example, say that you work at a social services organization.  When a new client comes on, they’re entered into the case management system, the CRM, a learning management system, and a security system database, because you’ve given them some kind of access card. Key to our data management strategy is to identify redundant data entry and remove it.  We should be able to enter this client information once and have it automatically replicated in the other systems.
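The "enter once, replicate everywhere" idea above can be sketched in a few lines. The system classes here are stand-ins for the case management, CRM, learning management, and security databases; none of this reflects any real product's API:

```python
# Hypothetical stand-in for a downstream system that holds client records.
class System:
    def __init__(self, name):
        self.name = name
        self.records = []

    def add_client(self, client):
        self.records.append(client)

case_mgmt = System("Case management")
crm = System("CRM")
lms = System("Learning management")
security = System("Access cards")

def intake(client, systems):
    """Enter the client once; replicate the record to every connected system."""
    for system in systems:
        system.add_client(client)

# One entry point instead of four separate data-entry tasks.
intake({"name": "J. Doe", "id": 1001}, [case_mgmt, crm, lms, security])
print([len(s.records) for s in (case_mgmt, crm, lms, security)])
```

In a real deployment the `add_client` calls would be integrations (APIs, scheduled syncs, or platform-native replication), but the shape of the win is the same: one act of data entry, four consistent records.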

Systems Integration

Chances are, of course, that all of your data is not in one system, and that the systems you do have (finance, CRM, etc.) don't easily integrate with each other. The first question to ask is: how are we going to get all of our systems to share data with each other? One approach, of course, is to replace all of your separate databases with one database. Fortune 500 companies use products from Oracle and SAP to do this: systems that incorporate finance, HR, CRM, and inventory management. Chances are that these will not work at your nonprofit; the software is expensive, and so are the developers who know how to customize it. More affordable options exist from companies like Microsoft, Salesforce, NetSuite, and IBM, with special pricing for 501(c)(3)s.

Data Platforms

A data platform is a system that stores your data in a single database but offers multiple ways of working with it. For example, a NetSuite platform can handle your finance, HR, CRM/donor management, and e-commerce without maintaining separate data stores, allowing you to report on combined metrics such as fundraiser effectiveness (donor management and HR) or mail vs. online donations (e-commerce and donor management). Microsoft's solution incorporates separate products, such as SharePoint, Dynamics CRM, and the Dynamics ERP applications (HR, finance). Solutions like Salesforce and NetSuite are cloud-only, whereas Microsoft and IBM products can be installed locally or run from the cloud.
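The payoff of a single data store is that cross-domain reporting becomes a query rather than a manual merge of exports. A small illustration using an in-memory SQLite database; the table, donors, and figures are all invented:

```python
import sqlite3

# Hypothetical shared store: each donation lives alongside the channel it
# arrived through, so "mail vs. online" is a one-line aggregate instead of
# a reconciliation between two separate systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (donor TEXT, amount REAL, channel TEXT)")
conn.executemany(
    "INSERT INTO donations VALUES (?, ?, ?)",
    [("A. Smith", 100.0, "mail"),
     ("B. Jones", 250.0, "online"),
     ("C. Wong", 50.0, "online")],
)

# Combined metric: total giving by channel.
totals = dict(conn.execute(
    "SELECT channel, SUM(amount) FROM donations GROUP BY channel"
).fetchall())
print(totals)  # {'mail': 100.0, 'online': 300.0}
```

The same pattern extends to any pair of domains the platform holds, which is exactly what the fundraiser-effectiveness example above relies on.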

Getting from here to there

Of course, replacing all of your key systems overnight is neither a likely option nor an advisable one. Change like this has to be implemented over a period of time, possibly spanning years (for larger organizations, where the system changes will be costly and complex). As part of the earlier system evaluation, you'll want to factor in the state of each system. Are some approaching obsolescence? Are some not meeting your needs? Prioritize based on the natural life of the existing systems and your particular business requirements. Replacing major data systems is difficult and complex; the point isn't to gloss over this. You need a strong plan that factors in budget, resources, and change management. Replacing too many systems too quickly can overwhelm both the staff implementing the change and the users of the systems being changed. If you don't have executive-level IT staff on board, working with consultants to accomplish this is highly recommended.

Business Process Mapping

[Figure: BPM_Example, a simple business process map]

The success of the conversion depends less on the platform you choose than on the way you configure it. Systems optimize and streamline data management; they don't manage the data for you. To ensure that this investment is realized, a prerequisite investment is one in understanding how you currently work with data and optimizing those processes for the new platform.

To do this, take a look at the key reports and types of information in the list that you compiled, and draw the process that produces each piece, whether it's a report, a chart, a list of addresses, or a board report. Drawing processes, also known as business process mapping, is best done with a flowcharting tool, such as Microsoft Visio. A simple process map will look like this:

In particular, look at the processes being done on paper, in Word, or in Excel that would benefit from being in a database. Aggregating information from individual documents is laborious; the goal is to store data in the data platform and make it available for combined reporting. If today's process involves cataloguing data in a word processing table or a spreadsheet, then you will want to identify a data platform table that will store that information in the future.
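As a toy example of retiring a spreadsheet process: the rows get loaded into a platform table once, and the aggregation that used to mean opening every document becomes a query. The file contents and column names here are invented:

```python
import csv
import io
import sqlite3

# Stand-in for an exported attendance-tracking spreadsheet (made-up data).
spreadsheet = io.StringIO(
    "client,session_date,attended\n"
    "1001,2014-01-06,yes\n"
    "1001,2014-01-13,no\n"
    "1002,2014-01-06,yes\n"
)

# Load the rows into a database table once, instead of keeping them in a
# document that has to be re-aggregated by hand.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attendance (client TEXT, session_date TEXT, attended TEXT)")
conn.executemany(
    "INSERT INTO attendance VALUES (?, ?, ?)",
    [(r["client"], r["session_date"], r["attended"]) for r in csv.DictReader(spreadsheet)],
)

# Combined reporting: overall attendance rate across every row.
rate = conn.execute(
    "SELECT SUM(attended = 'yes') * 1.0 / COUNT(*) FROM attendance"
).fetchone()[0]
print(round(rate, 2))
```

On a real platform the table would be a custom object rather than raw SQL, but the before-and-after is the same: data entry happens once, and reporting reads from the shared store.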

Design Considerations

Once you have catalogued your data stores and the processes in place to interact with the data, and you’ve identified the key relationships between sets of data and improved processes that reduce redundancy, improve data integrity and automate repetitive tasks, you can begin designing the data platform.  This is likely best done with consulting help from vendors who have both expertise in the platform and knowledge of your business objectives and practices.

As much as possible, try to use the built-in functionality of the platform, as opposed to custom programming. A solid CRM like Salesforce or MS CRM will let you create custom objects that map to your data, and then allow you to input, manage, and report on the data stored in them without resorting to actual programming in Java or .NET languages. Once you start developing new interfaces and adding functionality that isn't native to the platform, things become more difficult to support: custom training is required; developers have to fully document what they've done, or swear that they'll never quit, be laid off, or get hit by a bus; and you have to be sure that the data platform vendor won't release updates that break the home-grown components.

Conclusion

The end game is to have one place where all staff working with your information can sign on and work with the data, without worrying about which version is current or where everything might have been stored. Ideally, it will be a cloud platform that allows secure access from any internet-accessible location, with mobile apps as well as browser-based access. Further considerations might include restricted access for key constituents and integration with document management systems and business intelligence tools. But key to the effort is a systematic approach that includes a deep investment in taking stock of your needs and understanding what the system will do for you before the first keypress or mouse click occurs, and patience, so that you get it all and get it right. It's not an impossible dream.


Biting The Hand – Conclusion

This article was originally published on the Idealware Blog in October of 2008.

This is the final post in a three-part series on Microsoft. Be sure to read Part 1, on the history and state of the Windows operating system, and Part 2, on developing for the Microsoft platform.

Two More Stories – A Vicious Exchange

In late 2006, I moved an organization of about 500 people from Novell GroupWise to Microsoft Exchange Server 2003. After evaluating our needs, I bought the Standard edition, which supported message storage of up to 16GB (our GroupWise message base took up about 4GB). A few days after we completed the migration, which included transferring the GroupWise messages to Exchange, an error popped up in the Event Viewer saying that our message store was larger than the 16GB limit, and, sure enough, it was. Who knew that Exchange messages were so much larger than GroupWise messages?

The next day, Event Viewer reported that our message store was too large and that it would be dismounted at 5:00 am, meaning that email would be, essentially, disconnected. Huh? I connected remotely the next morning and remounted it at about 5:10. I also scoured the Microsoft Knowledge Base looking for a recommendation on this, and found nothing. I called up my vendor and ordered the Enterprise edition of Exchange, which supports a much larger message store. A couple of days later, same thing; my new software hadn't arrived yet. The next day, the message changed, saying that our message store was too large and would be dismounted randomly! What!? This meant that the server could go down in the middle of the business day. The software arrived, and I tossed the box on my desk and scheduled to come in on Sunday (which happened to be New Year's Day, 2007) to do the upgrade. But when I opened the box, I discovered that my vendor had sent me Enterprise media, yes, but for the prior version of Exchange, not the one we were running. I was hosed.

Frantic, I went to Google instead of the Knowledge Base and searched. This yielded a blog entry explaining that, with Exchange Server 2003 Service Pack 2 (which I had applied as part of the initial installation), it was now legal to have message stores of up to 75GB. All I had to do was modify a registry entry on the server, and the problem was solved. Wow, who woulda thunk? Particularly if this had been documented anywhere in the Microsoft Knowledge Base?
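For the record, the registry change in question looks something like the fragment below (as documented for the Exchange Server 2003 SP2 store limit). The server name and the database GUID vary per installation; both are placeholders here:

```
Windows Registry Editor Version 5.00

; Raise the Standard Edition mailbox store limit; the post-SP2 maximum is
; 75GB (0x4b). "SERVERNAME" and the Private-<GUID> key name are
; placeholders that differ on every installation.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MSExchangeIS\SERVERNAME\Private-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx]
"Database Size Limit in GB"=dword:0000004b
```

One DWORD, set once, and the dismount threats stop. Which makes the week I spent on it all the more galling.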

But here’s my question: What Machiavellian mind came up with the compliance enforcement routine that I experienced, and why was my business continuity threatened by code designed to stop me from doing something perfectly legal under the Service Pack 2 licensing?  This was sloppy, and this was cruel, and this was not supportive of the customer.

Cheap ERP

In early 2007, I hired a consultant to help with assessing and developing our strategic technology plan. This was at a social services agency, and one of our issues was that, since we hired our clients, having separate databases to track client progress and Human Resources/payroll resulted in large amounts of duplicate data entry and difficult reporting. The consultant and I agreed that a merged HR/client management system would be ideal. So, at lunch one day, I nearly fell off my chair laughing when he suggested that we look at SAP. SAP, for those who don't know, is a database and development platform that large companies use to deploy highly customized and integrated business computing platforms. Commonly referred to as Enterprise Resource Planning (ERP) software, it's a good fit for businesses with the unique needs and ample budgets to support what is, at heart, an internally developed set of business applications. The reason I found this so entertaining was that, even if we could afford SAP, hiring the technical staff to develop it into something that worked for us would be well beyond our means. SAP developers make at least six figures a year, and we would have needed two or more to get anywhere in a reasonable amount of time. It's unrealistic for even a mid-sized nonprofit to consider that kind of investment in technology.

So Microsoft holds a unique position: like SAP or Oracle, they offer a class of integrated products that can run your business. Unlike SAP or Oracle, their products are pretty much what they are: you can customize and integrate them, at a cost, but you can't, for instance, extend Microsoft's Dynamics HR package into a client management system. But if you have both Dynamics and Social Solutions, which runs on Microsoft SQL Server, you'd have far more compatibility and integration capability than we had at our social services org, where our HR system was outsourced and proprietary and the client management software ran on FoxPro.

Bangs for the Buck

So this is where it leaves me: Microsoft is a large, bureaucratic mess of a company, with so many developers on each product that one will be focusing on how to punish customers for non-compliance while another is making those same customers compliant. Their product strategy is driven far less by customer demands than by marketing strategy. Their practices have been predatory, and, while that sort of thing seems to be easing, a lot of it is still ingrained in their culture. When they are threatened (and they are threatened, by Google and by the migration from the desktop to the cloud), they become more dangerous to their developers and customers, because they are willing to make decisions that better position them in the market at the cost of our investments.

But Microsoft offers a bargain to businesses that can't, and shouldn't, spend huge percentages of their budget on platform development. They do a lot out of the box, and they have a lot of products to do it with. Most of their mature products (Office, Exchange, SQL Server) are excellent; they're really good at what they do. The affordable alternative to commercial ERP systems like SAP and Oracle is open source, but open source platforms are still relatively immature. Building your web site on an open source CMS powered by PHP or Ruby on Rails might be a good, economical move that leaves you better off, in terms of ease of use and capabilities, than many expensive commercial options. But going open source for finance, HR, and client tracking isn't really much of an option yet. The possibly viable alternatives to Microsoft are commercial outsourcers like NetSuite, but how well they'll meet your full needs depends on how complex those needs are; one-size-fits-all solutions tend to work better for small businesses than for medium to large ones.

Finally, it's all well and good to talk about adopting Microsoft software strictly on its merits, but, for many of us, the decision has far more to do with the critical non-Microsoft applications we run that assume we're using Microsoft products. For many of us, alternatives like Linux for an operating system, OpenOffice or Google Apps for productivity, or PHP for a web scripting language are already nixed because our primary databases are all built on SQL Server and ASP. At the law firm where I work, we aren't about to swap out Word for an alternative without the legal document-specific features that Microsoft richly incorporates into its product. But it leaves me, as the technology planner, in a bit of a pickle. Windows XP, Office 2003/2007, Exchange 2007, SQL Server 2005, and Windows Server 2003 are all powerful, reliable products that we use and benefit from, and the price we paid for them, through TechSoup and Microsoft's charity licensing, is phenomenal. But as we look at moving to web-based computing, and as we embark on custom development to meet information management and communication needs that are very specific to our organization, we're now faced with adopting Microsoft's more dynamic and, in my opinion, dangerous technologies.

This would all be different if I had more reason to trust the vendor to prioritize my organization's stability and operating efficiency over their marketing goals. Or, put differently, if their marketing philosophy were based less on trying to trump the competition and more on being a dependable, trustworthy vendor. They're the big dog, just about impossible to avoid, and they make a very compelling financial argument, at first take, for nonprofits. But it's a far more complicated price break than it seems at first glance.