Tag Archives: software

How I Spent My 2015 Technology Initiative Grants Conference

I’m back from our (Legal Services Corporation) 15th annual technology conference, which ran from January 14th through the 16th  in San Antonio, Texas.  It was a good one this year, with a great location, good food, great people – nearly 300 of them, which is quite a record for us. There were plenty of amazing sessions, kicked off by a fascinating keynote on international access to justice web app partnerships. Slides and videos will be up soon on LSC’s website. But I did want to share the slides from my sessions, which all seemed to go very well.  I did three:

Are You Agile?

I kicked off the first morning doing a session on agile project management with Gwen Daniels of Illinois Legal Aid Online. My slides provided a basic overview of project management concepts, then Gwen did a live demo of how ILAO uses Jira and a SCRUM methodology to develop websites and applications. Having studied agile more than actually practicing it, I learned a lot from her.  The combined slides will be up on LSC’s site. I pulled my intro from this broader presentation that I did at the Nonprofit Technology Conference in 2013:

Shop Smart: How A Formal Procurement Process Can Safeguard Your Investments

On Thursday, I summarized everything I know about software and vendor selection, writing proposals, and negotiating contracts into this dense presentation on how to purchase major software systems.

Security Basics

And on Friday, usual suspect Steve Heye and I led a session on security, covering all of the things that we think orgs should know in an era of frequent, major breaches and distributed data.

I’ll hit some of these same themes in March at the Nonprofit Technology Conference, where I’ll be speaking on contract negotiations (cloud and otherwise) and information policies (with Johan Hammerstrom of CommunityIT). See you there?

13 Lessons On Building Your Nonprofit Technology Culture

This article originally appeared on the Exponent Partners blog on December 19th, 2014. It was written by Kerry Vineburg, based on a phone interview with me.

EXPONENT PARTNERS SERIES: SMART PRACTICES

Is your nonprofit thinking about implementing a large database project like Salesforce? Nonprofit and technology veteran Peter Campbell, CIO at Legal Services Corporation, recently shared his valuable insights on how to prepare your team and culture for long-term success. His organization, the country’s largest funder of civil legal aid for low-income Americans, is developing Salesforce as a data warehouse for their grantee information and document management.

We asked Peter to tell us more about what practices he uses to help ensure a successful technology implementation. As you’ll see, it’s just as much about working with people! 

Embarking On Your Project

1. When beginning a technology project, agree on a problem that all staff can relate to. Organizational readiness is critical. I’ve worked at organizations that didn’t recognize that their casual approach to data management was a problem, and they weren’t looking for a solution. If your staff don’t understand why they need an application, then you’re in danger of installing something that won’t be utilized. When starting a new organizational project, I identify 2-3 core bullet points that explain the goals of the project, and repeat them often. For example: “The new system will provide one-step access to all information and documents related to a grantee.” That’s the high-level goal. It should be something where the product users all agree, “Yes, I need that!”
2. When planning technology upgrades and projects, schedule the changes. Plan for gradual change. Early in my career, I had to deal with the Y2K bug and replace every system at a mid-sized law firm in a short period. It led me to this philosophy: replace only one major system each year. It’s a myth that people hate change — people hate disruption. Change is good, but it needs to be managed at a steady level. If you’re doing regular implementations every year, people can get used to that pace. If you do nothing for 3 years, then switch out everything: 1) you’re putting too much of a burden on your implementers to achieve everything at once and 2) you’re making too big of an imposition on staff. Suddenly, everything they know is gone and replaced.

Getting Buy-In

3. Gain full executive sponsorship. There’s a common misconception that a new system will just work for you once it’s installed. To fully realize the benefits of a CRM requires cultural change. Every level of the organization needs to buy into the project. You’ll need to harness a lot of attention and energy from your team to develop requirements, manage the project, learn the new system and adapt processes. Otherwise, you’ll invest in a big database implementation and only one or two people will use it.
The importance of a major system upgrade should be set by the executive director and/or board. Everyone should know that the system is a priority. At nonprofits, our executive directors are often better at fundraising than managing a business, and many are somewhat technophobic. They don’t need to be technology gurus, but they do need to understand what the technology should be doing for them, and to take ownership of those goals. The last thing that I want to hear from my boss is, “Here’s a budget — go do what you think is best.” Without their interest in my projects, I’m bound to fail.
4. If one buy-in approach needs help, try a combo. If you can’t convince your executive director or other leadership to be regular active participants, power users can sometimes help convince your team. I’m not recommending an either/or approach; there should be some of both, but power users can engage staff in cases where management isn’t setting clear expectations. For any project that impacts staff, I will invite key users to be on our evaluation team, help with product selection, and potentially be on our advisory committee during the project. For example, we have a grants department liaison, who is charged with getting the right people in the room when we need input from the staff that know much better than we do what the system should ultimately do for them.
5. Incorporate perspectives from around the table. In addition to power users, I also want feedback from “standard” users. Maybe they don’t love technology so much, and maybe they wouldn’t volunteer for this. But they have an important perspective: you need to understand their reactions and what they’re going to find difficult. As the IT director and CIO, I know important things about managing a project. The users know important things that I don’t. If we don’t have views from multiple sides of the table, the project will fail.

Working With Good People

6. Look for partners (vendors and consultants) who understand your mission, not just the technology. In the ideal situation, you want people who not only get database and programming work, but also really understand your mission and business priorities. I’m blessed to have developers on my team who not only understand grants management but are also sympathetic to what the people coming to them are trying to accomplish. When they get a request, they can prioritize with a good understanding of our organization’s requirements. They’re able to answer, “How can I make the most out of what this person needs with my available time?” while being skilled enough to capably choose between the technical options. Getting people that have a broader mindset than just technology is really important.
7. Vary your team and role strategy with your size. At nonprofits, we don’t usually have big internal teams. Someone becomes our accidental techie/database guru. Even large nonprofits are hurting for staff. It’s always been less “here’s the best practice and ideal way to staff this,” and more “let’s see what budget and people I have, and make it work as well as I can.” Not many nonprofits have developers on staff. Hiring can be challenging. It’s a popular skillset, and won’t be cheap. If you’re tiny, you probably won’t hire full-time, you’ll outsource to consultants. But if you have 30 people or more using the CRM, you might benefit from in-house expertise, even if it’s a half-time role.
If you already have developers on staff, that’s great. If they don’t have experience with, say, Salesforce, but they do know database design and a programming language or two, it’s not hard to pick up the concepts. You’re modeling a database, designing it, and then scripting on top of it in a similar language. They can probably adapt.
8. Practice good compensation and retention strategies for your technically savvy (and/or newly trained) staff. I’ve seen a trend over the past 10 years. A nonprofit decides to use a solution like Salesforce and they charge their accidental techie with the task of implementing. The accidental techie gets the implementation done, becomes a guru on it, trains all the users, and then because the organization is paying them an entry-level salary, they leave and go get a much higher paying job as a consultant! It’s a valuable skillset, so don’t be short-sighted about compensating them for what they do for you. You need to be careful and invest properly. Give them raises along with the skillset, to make sure they are fully motivated to stick with you.

Project Management

9. Avoid surprises with good communication. My rule of IT management now is: “No one should ever be surprised by anything I do.” From experience with good mentors, I learned important lessons about communication: if you’re going to make a change, communication is critical. Say it 3 times in 3 different mediums (in email, on the internet, on flyers on the wall on every floor!). Be sure staff know how the technology contributes to the well-being of the organization, rather than being a time-waster, so they are motivated to keep working with it. Communicate well.
10. If possible, hold out for the right team. I put off projects to have the right people in place, rather than hold tight to a project deadline with the wrong people in place. See above for how to find and keep the right people.

Training and Baking The Technology Into Your Culture

11. Don’t reinvent the wheel; take advantage of the ecosystem. It can be really common for staff not to reach out for help. They may feel like their job is to learn the technology on their own. They should know there are many resources available to them! For example, with Salesforce, I recommend making use of peer support in the community, the Power of Us Hub, and local user groups. When Salesforce rolls out its seasonal updates, it offers a lot of webinars and is good about providing information about how the app is growing. Salesforce also offers training (which the Salesforce Foundation discounts by half), and every consultant I’ve spoken with is capable of doing some customized training. I know that other technologies offer resources like this as well. It also behooves anybody on staff to learn the specifics of the implementation that you’ve done.
12. Allocate a training budget. I always push to have a staff training budget. For my organization, we even hired for a role of training and implementation specialist. We wanted to have a person on staff whose full-time job was training and strategizing how users use software and how to involve them in the implementation process. This should be part of your budget. I can’t emphasize enough how important it is to have people in your organization who know how to train on your applications.
13. Engage staff and help them understand the big picture of the technology. It’s good to get your team working with the database early on in the process, learning what it’s capable of and what it looks like. Engage your users: get people involved in every step of the process, from selecting products to implementation to training and rollout. Make the product demos big group activities, so that everyone can envision how similar systems work and what they might do with the product beyond what they’re doing today. Beta-test your implementations, giving staff lots of opportunities to provide input. Take an Agile approach of regularly showing what you’re developing to the people who will be using it, and adjusting your development per their feedback.
With a committed team that understands your mission, great communication, well-allocated resources, and gradual change, your organization can lay the foundations for a successful solution that will actually be adopted!
Thanks to Peter Campbell for these great insights. Peter also blogs at techcafeteria.com
For even more strategies on ensuring that your culture is ready for your system, check out our free report Nonprofit Technology Adoption: Why It Matters and How to Be Successful.

See more at: http://www.exponentpartners.com/building-your-nonprofit-technology-culture

Architecting Healthy Data Management Systems

This article was originally published in the NTEN eBook “Collected Voices: Data-Informed Nonprofits” in January of 2014.

Introduction

The reasons why we want to make data-driven decisions are clear.  The challenge, in our cash-strapped, resource-shy environments is to install, configure and manage the systems that will allow us to easily and efficiently analyze, report on and visualize the data.  This article will offer some insight into how that can be done, while being ever mindful that the money and time to invest is hard to come by.  But we’ll also point out where those investments can pay off in more ways than just the critical one: the ability to justify our mission-effectiveness.

Right off the bat, acknowledge that it might be a long-term project to get there.  But, acknowledge as well, that you are already collecting all sorts of data, and there is a lot more data available that can put your work in context.  The challenge is to implement new systems without wasting earlier investments, and to funnel data to a central repository for reporting, as opposed to re-entering it all into a redundant system.  Done correctly, this project should result in greater efficiency once it’s completed.

Consider these goals:

  • An integrated data management and reporting system that can easily output metrics in the formats that constituents and funders desire;
  • A streamlined process for managing data that increases the validity of the data entered while reducing the amount of data entry; and
  • A broader, shared understanding of the effectiveness of our strategic plans.

Here are the steps you can take to accomplish these goals.

Taking Inventory

The first step in building the system involves ferreting out all of the systems that you store data in today.  These will likely be applications, like case or client management systems, finance databases, human resources systems and constituent relationship management (CRM) systems.  It will also include Access databases, Excel spreadsheets, Word documents, email, and, of course, paper.  In most organizations (and this isn’t limited to nonprofits), data isn’t centrally managed.  It’s stored by application and/or department, and by individuals.

The challenge is to identify the data that you need to report on, wherever it might be hidden, and catalogue it. Write down what it is, where it is, what format it is in, and who maintains it.  Catalogue your information security: what content is subject to limited availability within the company (e.g., HR data and HIPAA-related information)? What can be seen organization-wide? What can be seen by the public?
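One way to make that catalogue concrete is a simple spreadsheet or script that records, for each data store, what it is, where it lives, what format it is in, who maintains it, and who may see it. Here is a minimal sketch in Python; the field names and example entries are illustrative, not from any real organization:

```python
import csv

# Each entry answers: what is it, where is it, what format is it in,
# who maintains it, and who is allowed to see it.
FIELDS = ["what", "where", "format", "maintainer", "visibility"]

inventory = [
    {"what": "Client intake records", "where": "Case management system",
     "format": "database", "maintainer": "Intake coordinator",
     "visibility": "case staff only"},
    {"what": "Donor mailing list", "where": "Development shared drive",
     "format": "Excel", "maintainer": "Development associate",
     "visibility": "organization-wide"},
    {"what": "Staff salary history", "where": "HR system",
     "format": "database", "maintainer": "HR director",
     "visibility": "HR only"},
]

# Write the catalogue out so it can be shared and reviewed.
with open("data_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(inventory)
```

The point isn’t the tooling — a shared spreadsheet works fine — but that every store gets the same five questions answered, which makes the redundancies and security gaps easy to spot.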

Traditionally, companies have defaulted to securing data by department. While this offers a high level of security, it can stifle collaboration and result in data sprawl, as copies of secured documents are printed and emailed to those who need to see the information, but don’t have access. Consider a data strategy that keeps most things public (within the organization), and only secures documents when there is clear reason to do so.

You’ll likely find a fair amount of redundant data.  This, in particular, should be catalogued.  For example, say that you work at a social services organization.  When a new client comes on, they’re entered into the case management system, the CRM, a learning management system, and a security system database, because you’ve given them some kind of access card. Key to our data management strategy is to identify redundant data entry and remove it.  We should be able to enter this client information once and have it automatically replicated in the other systems.
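That “enter once, propagate everywhere” idea can be sketched as a single intake function that fans the new record out to every connected system. In this toy Python version the register_* functions are placeholders for whatever your real systems expose (vendor connectors, web service calls, and the like):

```python
# Placeholder registration functions; in practice each would call the
# API of the case management system, CRM, LMS, or security database.
def register_in_case_management(client, store):
    store.setdefault("case_mgmt", []).append(client["name"])

def register_in_crm(client, store):
    store.setdefault("crm", []).append(client["name"])

def register_in_security_db(client, store):
    store.setdefault("security", []).append(client["name"])

DOWNSTREAM = [register_in_case_management, register_in_crm,
              register_in_security_db]

def add_client(client, store):
    """Enter the client once; every connected system gets the record."""
    for register in DOWNSTREAM:
        register(client, store)

store = {}
add_client({"name": "Jane Doe"}, store)
```

Whether the glue is a platform’s built-in integrations or middleware, the shape is the same: one point of entry, many subscribers, no re-keying.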

Systems Integration

Chances are, of course, that all of your data is not in one system, and the systems that you do have (finance, CRM, etc.) don’t easily integrate with each other.  The first question to ask is: how are we going to get all of our systems to share data with each other? One approach, of course, is to replace all of your separate databases with one database.  Fortune 500 companies use products from Oracle and SAP to do this, systems that incorporate finance, HR, CRM and inventory management.  Chances are that these will not work at your nonprofit; the software is expensive, and so are the developers who know how to customize it.  More affordable options exist from companies like Microsoft, Salesforce, NetSuite and IBM, at special pricing for 501(c)(3)s.

Data Platforms

A data platform is one of these systems that stores your data in a single database, but offers multiple ways of working with the data.  For example, a NetSuite platform can handle your finance, HR, CRM/Donor Management and e-commerce without maintaining separate data stores, allowing you to report on combined metrics on things like fundraiser effectiveness (Donor Management and HR) and mail vs. online donations (E-commerce and Donor Management).  Microsoft’s solution will incorporate separate products, such as SharePoint, Dynamics CRM, and the Dynamics ERP applications (HR, Finance).  Solutions like Salesforce and NetSuite are cloud-only, whereas Microsoft and IBM solutions can be installed locally or run from the cloud.

Getting from here to there

Of course, replacing all of your key systems overnight is neither a likely option nor an advisable one.  Change like this has to be implemented over a period of time, possibly spanning years (for larger organizations where the system changes will be costly and complex). As part of the earlier system evaluation, you’ll want to factor in the state of each system.  Are some approaching obsolescence?  Are some not meeting your needs? Prioritize based on the natural life of the existing systems and the particular business requirements. Replacing major data systems can be difficult and complex — the point isn’t to gloss over this.  You need to have a strong plan that factors in budget, resources, and change management.  Replacing too many systems too quickly can overwhelm both the staff implementing the change and the users of the systems being changed.  If you don’t have executive-level IT staff on board, working with consultants to accomplish this is highly recommended.

Business Process Mapping


The success of the conversion is less dependent on the platform you choose than it is on the way you configure it.  Systems optimize and streamline data management; they don’t manage the data for you.  In order to ensure that this investment is realized, a prerequisite investment is one in understanding how you currently work with data and optimizing those processes for the new platform.

To do this, take a look at the key reports and types of information in the list that you compiled and draw the process that produces each piece, whether it’s a report, a chart, a list of addresses or a board report.  Drawing processes, aka business process mapping, is best done with a flowcharting tool, such as Microsoft Visio. A simple process map traces each step, from the point where the data is first entered to the final output.

In particular, look at the processes that are being done on paper, in Word, or in Excel that would benefit from being in a database.  Aggregating information from individual documents is laborious; the goal is to store data in the data platform and make it available for combined reporting.  If today’s process involves cataloguing data in a word processing table or a spreadsheet, then you will want to identify a data platform table that will store that information in the future.
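To illustrate that move from spreadsheet to database table, here is a small sketch using Python’s built-in sqlite3 as a stand-in for the data platform; the table, columns, and figures are invented for the example:

```python
import sqlite3

# Rows that previously lived in a spreadsheet, one grant per row.
rows = [
    ("Grantee A", "2014-01-15", 50000),
    ("Grantee B", "2014-02-03", 75000),
]

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE grants (
    grantee TEXT, awarded DATE, amount INTEGER)""")
conn.executemany("INSERT INTO grants VALUES (?, ?, ?)", rows)

# Once the data is in a table, aggregation is a query,
# not a manual collation exercise.
total = conn.execute("SELECT SUM(amount) FROM grants").fetchone()[0]
# total == 125000
```

The same data that took copy-and-paste to summarize in a document can now be joined with other tables and reported on in combination.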

Design Considerations

Once you have catalogued your data stores and the processes in place to interact with the data, and you’ve identified the key relationships between sets of data and improved processes that reduce redundancy, improve data integrity and automate repetitive tasks, you can begin designing the data platform.  This is likely best done with consulting help from vendors who have both expertise in the platform and knowledge of your business objectives and practices.

As much as possible, try and use the built-in functionality of the platform, as opposed to custom programming.  A solid CRM like Salesforce or MS CRM will let you create custom objects that map to your data and then allow you to input, manage, and report on the data that is stored in them without resorting to actual programming in Java or .NET languages.  Once you start developing new interfaces and adding functionality that isn’t native to the platform, things become more difficult to support.  Custom training is required; developers have to be able to fully document what they’ve done, or swear that they’ll never quit, be laid off, or get hit by a bus. And you have to be sure that the data platform vendor won’t release updates that break the home-grown components.

Conclusion

The end game is to have one place where all staff working with your information can sign on and work with the data, without worrying about which version is current or where everything might have been stored.  Ideally, it will be a cloud platform that allows secure access from any internet-accessible location, with mobile apps as well as browser-based.  Further considerations might include restricted access for key constituents and integration with document management systems and business intelligence tools. But key to the effort is a systematic approach that includes a deep investment in taking stock of your needs and understanding what the system will do for you before the first keypress or mouse click occurs, and patience, so that you get it all and get it right.  It’s not an impossible dream.

 

The Increasing Price We Pay For The Free Internet

The Price of Freedom is Visible Here (picture: Rhadaway).

This is a follow-up on my previous post, A Tale Of Two (Or Three) Facebook Challengers. A key point in that post was that we need to be customers, not commodities.  In the cases of Facebook, Google and the vast majority of free web resources, the business model is to provide a content platform for the public and fund the business via advertising.  In this model, simply, our content is the commodity.  The customer is the advertiser.  And the driving decisions regarding product features relate more to how many advertisers they can bring on and retain than how they can meet the public’s wants and needs.

It’s a delicate balance.  They need to make it compelling for us to participate and provide the members and content that the advertisers can mine and market.  But since we aren’t the ones signing the checks, they aren’t accountable to us, and, as we’ve seen with Facebook, ethical considerations about how they use our data are often afterthoughts.  We’ve seen it over and over, and again this week when they backed off on a real names policy that many of their users considered threatening to their well-being.  One can’t help but wonder, given the timing of their statement, how much new competitor Ello’s surge in popularity had to do with the retraction. After all, this is where a lot of the people who were offended by the real names policy went.  And they don’t want to lose users, or all of their advertisers will start working on Ello to make the Facebook deal.

Free Software is at the Heart of the Internet

Freeware has been around since the ’80s, much of it available via Bulletin Boards and online services like CompuServe and AOL. It’s important to make some distinctions here.  There are several variants of freeware, and it’s really only the most recent addition that’s contributing to this ethically-challenged business model:

  • Freeware is software that someone creates and gives away, with no license restrictions or expectation of payment. The only business model that this supports is when the author has other products that they sell, and the freeware applications act as loss leaders to introduce their paid products.
  • Donationware is much like Freeware, but the author requests a donation. Donationware authors don’t get rich from it, but they’re usually just capitalizing on a hobby.
  • Freemium is software that you can download for free and use, but the feature set is limited unless you purchase a license.
  • Open Source is software that is free to download and use, as well as modify to better meet your needs. It is subject to a license that mostly ensures that, if you modify the code, you will share your modifications freely. The business model is usually based on providing training and support for the applications.
  • Adware is free or inexpensive software that comes with advertising.  The author makes money by charging the advertisers rather than the users.

Much of the Internet runs on open source: Linux, Apache, OpenSSL, etc. Early adopters (like me) were lured by the free software. In 1989, I was paying $20 an hour to download Novell networking utilities from CompuServe when I learned that I could get a command line internet account for $20 a month and download them from Novell’s FTP site. And, once I had that account, I found lots more software to download in addition to those networking drivers.

Adware Ascendant

Adware is now the prevalent option for free software and web-based services, and it’s certainly the model for 99% of the social media sites out there.  The expectation that software, web-based and otherwise, will be free originated with the freeware, open source and donationware authors. But the companies built on adware are not motivated by showing off what they’ve made or supporting a community.  Any company funded by venture capital is planning on making money off of their product.  Amazon taught the business community that this can be a long game, and there might be a long wait for a payoff, but the payoff is the goal.

Ello Doesn’t Stand A Chance

So Ello comes along and makes the same points that I’m making. Their revenue plan is to go to a freemium model, where basic social networking is free, but some features will cost money, presumably business features and, maybe, mobile apps. The problem is that the pricing has to be reasonable and, to many, any price is unreasonable, because they like being subsidized by the ad revenue. The expectation is that social media networks are free.  For a social network to replace something as established as Facebook, they will need to offer incentives, not disincentives, and, sadly, the vast majority of Facebook users aren’t going to leave unless they are severely inconvenienced by Facebook, regardless of how superior or more ethical the competition is.

So I don’t know where this is going to take us, but I’m tired of getting things for free.  I think we should simply outlaw Adware and return to the simple capitalist economy that our founders conceived of: the one where people pay each other money for products and services. Exchanging dollars for goods is one abstraction layer away from bartering. It’s not as complex and creepy as funding your business by selling the personal information of your users to third parties.  On the Internet, freedom’s just another word for something else to lose.

Why You Should Delete All Facebook Mobile Apps Right Now

It’s nice that Facebook is so generous and they give us their service and apps for free. One should never look a gift horse in the mouth, right? Well, if the gift horse is stomping through my bedroom and texting all of my friends while I’m not looking, I think it bears my attention.  And yours. So tell me why Facebook needs these permissions on my Android phone:

  • read calendar events plus confidential information
  • add or modify calendar events and send email to guests without owners’ knowledge
  • read your text messages (SMS or MMS)
  • directly call phone numbers
  • create accounts and set passwords
  • change network connectivity
  • connect and disconnect from Wi-Fi

This is a cut and pasted subset of the list, which you can peruse at the Facebook app page on Google Play. Just scroll down to the “Additional Information” section and click the “View Details” link under the “Permissions” header. Consider:

  • Many of these are invitations for identity theft.  Facebook can place phone calls, send emails, and schedule appointments without your advance knowledge or explicit permission.
  • With full internet access and the ability to create accounts and set passwords, Facebook could theoretically lock you out of your device and set up an account for someone else.

Now, I’m not paranoid — I don’t think that the Facebook app is doing a lot of these things.  But I have no idea why it requires the permissions to do all of this, and the idea that an app might communicate with my contacts without my explicit okay causes me great concern. Sure, I want to be able to set up events on my tablet.  But I want a box to pop up saying that the app will now send the invites to Joe, Mary and Grace; and then ask “Is that okay?” before it actually does it.  I maintain some sensitive business relationships in my contacts.  I don’t think it’s a reasonable thing for Facebook to have the ability to manage them for me.

This is all the more reason to be worried about Facebook’s plan to remove the messaging features from the Facebook app and insist that we all install Facebook Messenger if we want to share mobile pictures or chat with our friends.  Because this means we’ll have two apps with outrageous permissions if we want to use Facebook on the go.

I’ve always considered Facebook’s proposition to be a bit insidious. My family and friends are all on there.  I could announce that I’m moving over to Google Plus, but most of them would not follow me there.  That is the sole reason that I continue to use Facebook.

But it’s clear to me that Facebook is building its profit model on sharing a lot of what makes me a unique individual. I share my thoughts and opinions, likes and dislikes, and relationships on their platform. They, in turn, let their advertisers know that they have far more insight into who I am, what I’ll buy, and what my friends will buy than the average website.  Google’s proposition is quite similar, but Google seems to be more upfront and respectful about it, and the lure I get from Google is “we’ll give you very useful tools in return”.  Google respects me enough to show some restraint: the Google+ app on Play requires none of the permissions listed above. So I don’t consider Facebook to be a company that has much respect for me in the first place.  And that’s all the more reason not to trust them with my entire reputation on my devices.

Do you agree? Use the hashtag #CloseTheBook to share this message online.

The Future Of Technology

…is the name of the track that I am co-facilitating at NTEN’s Leading Change Summit. I’m a late addition, there to support Tracy Kronzak and Tanya Tarr. Unlike the popular Nonprofit Technology Conference, LCS (not to be confused with LSC, as the company I work for is commonly called, or LSC, my wife’s initials) is a smaller, more focused affair with three tracks: Impact Leadership, Digital Strategy, and The Future of Technology. The expectation is that attendees will pick a track and stick with it.  Nine hours of interactive sessions on each topic will be followed by a day spent at the Idea Accelerator, a workshop designed to jump-start each attendee’s work in their areas. I’m flattered that they asked me to help out, and excited about what we can do to help resource and energize emerging nptech leaders at this event.

The future of technology is also something that I think about often (hey, I’m paid to!), both in terms of what’s coming and how we (LSC and the nonprofit sector) are going to adapt to it. Here are some of the ideas that I’m bringing to LCS this fall:

  • At a tactical level, no surprise, the future is in the cloud; it’s mobile; it’s software as a service and apps, not server rooms and applications.
  • The current gap between enterprise and personal software is going to go away, and “bring your own app” is going to be the computing norm.
  • Software evaluation will look more at interoperability, mobile, and user interface than advanced functionality.  In a world where staff are more independent in their software use, with less standardization, usability will trump sophistication.  We’ll expect less of our software, but we’ll expect to use it without any training.
  • We’ll expect the same access to information and ability to work with it from every location and every device. There will still be desktop computers, and they’ll have more sophisticated software, but there will be fewer people using them.
  • A big step will be coming within a year or two, when mobile manufacturers solve the input problem. Today, it’s difficult to do serious content creation on mobile devices, due primarily to the clumsiness of the keyboards and, also, the small screens. They will come up with something creative to address this.
  • IT staffing requirements will change.  And they’ll change dramatically.  But here’s what won’t happen: the percentage of technology labor won’t be reduced.  The type of work will change, and the distribution of tech responsibility will be spread out, but there will still be a high demand for technology expertise.
  • The lines between individual networks will fade. We’ll do business on shared platforms like Salesforce, Box, and {insert your favorite social media platform here}.  Sharing content with external partners and constituents will be far simpler. One network, pervasive computing, no more firewalls (well, not literally — security is still a huge thing that needs to be managed).

This all sounds good! Less IT controlling what you can and can’t do. Consumerization demystifying technology and making it more usable.  No more need to toss around acronyms like “VPN.”

Of course, long after this future arrives, many nonprofits will still be doing things the old-fashioned ways.  Adapting to and adopting these new technologies will require some changes in our organizational cultures.  If technology is going to become less of a specialty and more of a commodity, then technical competency and comfort using new tools need to be common attributes of every employee. Here are the stereotypes that must go away today:

  1. The technophobic executive. You can no longer claim to be qualified to lead an organization or a department if you aren’t comfortable thinking about how technology supports your work. That discomfort disqualifies you.
  2. The control freak techie.  They will fight the adoption of consumer technology with tooth and claw, and use the potential security risks to justify their approach. Well, yes, security is a real concern.  But the risk of data breaches has to be balanced against the lost business opportunities we face when we restrict all technology innovation. I blogged about that here.
  3. The paper-pushing staffer. All staff should have basic data management skills: enough to use a spreadsheet to analyze information, and to understand when a spreadsheet won’t work as well as a database would.
  4. Silos, big and small. The key benefit of our tech future is the ability to collaborate, both inside our company walls and out. So data needs to be public by default, secured only when necessary. Policy and planning have to cross department lines.
  5. The “technology as savior” trope. Technology can’t solve your problems.  You can solve your problems, and technology can facilitate your solution. It needs to be understood that big technology implementations have to be preceded by business process analysis.  Otherwise, you’re simply automating bad or outdated processes.

I’m looking forward to the future, and I can’t wait to dive into these ideas and more about how we use tech to enhance our operations, collaborate with our community and constituents, and change the world for the better.   Does this all sound right to you? What have I got wrong, and what have I missed?

Working With Proposal Requests Collaboratively

Okay, I know that it’s a problem worthy of psychoanalysis that I’m so fascinated with the Request for Proposal (RFP) process. But, hey, I do a lot of them. And they do say to write about what you know.

The presentation that I gave at NTEN’s conference in March focused on the process of developing and managing RFPs. I made the case that you want to approach a vendor RFP very differently than you would a software/system RFP. I pushed for fewer fixed-bid proposals because, in many cases, asking for a fixed bid is simply asking for a promise that will be hard to keep. ROI involves far more than just the dollars spent on projects like CRM deployments and web site revamps.

In the session, I learned that Requests for Information (RFIs), which are simpler for vendors to respond to, can be a great tool for narrowing a field. Clients should also remember that vendors don’t get paid to respond to proposals; they only get paid if they win the bid. Showing respect on both sides, from the very first glimmer of an engagement, is a key step in developing a healthy relationship.

Since the conference, I’ve gotten a bit more creative about the software that we use to manage the RFP process, and I wanted to give a shout-out to the tools that have made it all easier.  There are alternatives, of course, and I still use the Microsoft apps that these have replaced on a daily basis for other work that they’re great at. But the key here is that these apps live in the cloud and support collaboration in ways that make a tedious process much easier.

Google Docs is replacing Microsoft Word as my RFP platform software. The advantages over Word are that I can:

  • Share the document with whomever I choose: the whole world or a select set of invitees. Google’s sharing permissions are very flexible. With Word, I had to email or upload a document; with Google Docs, I only have to share a link.
  • Share it as a read-only document that invitees can comment on. This simplifies the Q&A portion of the process while maintaining the important transparency, as all participants can see every question and response.

We recently did an RFI for web development (it’s closed now, sorry!) and here’s what it looked like, exactly.

Smartsheet is replacing Microsoft Excel as my response matrix platform.

Example of a Smartsheet matrix

The first step upon receiving responses to a request is always to put them all in a spreadsheet for easy comparison. Smartsheet beats Excel because it’s multi-user and collaborative. And since Smartsheet is a spreadsheet/form builder/project management mashup app, I can add checkboxes and multiple-choice fields to my matrix.

For simple proposals, you can also easily use Smartsheet to collect reviewer comments and votes: just add a few columns (two for each reviewer). This puts the matrix and evaluation criteria all in one place, which can easily be exported to a spreadsheet or PDF to document the decision.

Surveymonkey has replaced Excel for cases when the evaluation criteria are more complex than a yes/no vote. Using their simple but sophisticated questionnaire builder, you can ask a number of questions with weighted or scaled answers. The responses can be automatically tallied and, as with Smartsheet, exported to Excel for further analysis or published as charts to a PDF.
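To make the weighted-scoring idea concrete, here’s a minimal sketch in Python. The criteria, weights, vendors, and scores are all invented for illustration; this isn’t how SurveyMonkey tallies anything, just the arithmetic behind a weighted evaluation:

```python
# Hypothetical weighted scoring of vendor evaluation responses.
# Criteria, weights, vendors, and scores are made up for illustration.

CRITERIA = {            # criterion -> weight (weights sum to 1.0)
    "cost": 0.30,
    "functionality": 0.40,
    "support": 0.30,
}

# Each reviewer scores each vendor 1-5 per criterion.
responses = {
    "Vendor A": {"cost": [4, 5], "functionality": [3, 4], "support": [5, 4]},
    "Vendor B": {"cost": [5, 4], "functionality": [2, 3], "support": [3, 3]},
}

def weighted_score(scores):
    """Average each criterion's reviewer scores, then apply the weights."""
    return sum(
        CRITERIA[criterion] * (sum(vals) / len(vals))
        for criterion, vals in scores.items()
    )

# Rank vendors from highest weighted score to lowest.
ranked = sorted(responses, key=lambda v: weighted_score(responses[v]), reverse=True)
for vendor in ranked:
    print(f"{vendor}: {weighted_score(responses[vendor]):.2f}")
```

The point of weighting is that a vendor who aces a minor criterion can’t leapfrog one who is strong where it matters most; the ranking reflects your priorities, not just raw vote counts.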

Consultant Selection Chart

As I’ve ranted elsewhere, making a good investment in software and vendor evaluation has a big impact on how successful a project will be. Working with the staff who are impacted by the project to choose the partner or technology increases buy-in and the validity of the initiative in the eyes of the people who will make or break it. And a healthy process ensures that you are purchasing the right software or hiring the right people. These tools help me make that process easier and more transparent. What works for you?

Career Reflections: My Biggest Data Fail

This article was published on the NTEN Blog in February of 2014.  It originally appeared in the eBook “Collected Voices: Data-informed Nonprofits“.

Peter Campbell of Legal Services Corporation shares his biggest data fail, and what he’d do differently now.

Note: names and dates have been omitted to protect the innocent. 

Years ago, I was hired at an organization that had a major database that everyone hated. My research revealed a case study in itself: how not to roll out a data management system. Long story short, they had bought a system designed to support a different business model, and then paid integrators to customize it beyond recognition. The lofty goal was to have a system that would replace people talking to each other. And the project was championed by a department that would not have to do the data entry; the department identified to do all of the work clearly didn’t desire the system.

The system suffered from a number of problems. It was designed to be the kitchen sink, with case info, board updates, contact management, calendaring, web content management, and other functions. The backend was terrible: a SQL database with tables named after the tabs in the user interface. The application itself had miserable search functionality, no dupe checking, and little in the way of data quality control. Finally, there were no organizational standards for data entry. Some people regularly updated information; others only went near it when nagged before reporting deadlines. One person’s idea of an update was three to five paragraphs; another’s two words.

I set out to replace it with something better. I believed (and will always believe) that we needed to build a custom application, not buy a commercial one and tweak it. What we did was not the same thing that the commercial systems were designed to track. But I did think we’d do better building it with consultants on a high-level platform than doing it by ourselves from scratch, so I proposed that we build a solution on Salesforce. The system had over 150 users, so this would be relatively expensive.

Timing is everything: I made my pitch the same week that financial news indicated we were diving into a recession. Budgets were cut. Spending was frozen. And I was asked: could I build the system in Access instead? And this is when I…

…explained to my boss that we should table the project until we had the budget to support it.

Or so I wish I had. Instead, I dusted off my amateur programming skills and set out to build the system from scratch. I worked with a committee of people who knew the business needs, and I developed about 90% of a system that wasn’t attractive, but did what needed to be done reasonably well. The goals for the system were dramatically scaled back to simply what was required.

Then I requested time with the department managers to discuss data stewardship. I explained to the critical VP that my system, like the last one, would only be as good as the data put into it, so we needed to agree on the requirements for an update and the timeliness of the data entry. We needed buy-in that the system was needed, and that it would be properly maintained. Sadly, the VP didn’t believe that this was necessary, and refused to set aside time in any meeting to address it. Their take was that the new system would be better than the old one, so we should just start using it.

This was where I had failed. My next decision was probably a good one: I abandoned the project. While my system would have been easier to manage (due to the scaled-back functionality, a simple, logical database structure, and a UI that included auto-complete and dupe-checking), it was going to fail, too, because, as every techie knows, garbage in equals garbage out. And I wanted my system to be a success. We went on with the flawed original system, and eventually started talking about a replacement project, which might have happened, but I left the company before it did.

Lessons learned:

  1. If I’m the IT Director, I can’t be the developer. There was a lot of fallout from my neglected duties.
  2. Get the organizational commitment to the project and data quality standards confirmed before you start development.
  3. Don’t compromise on a vision for expediency’s sake.  There are plenty of times when it’s okay to put in a quick fix for a problem, but major system development should be done right.  Timing is everything, and it wasn’t time to put in a data management system at this company.

Trello: A Swiss Army Knife For Tasks, Prioritizing And Project Planning

This post was originally published on the LSC Technology Blog in May of 2013. Note that “LSC” is Legal Services Corporation, my current employer.

One of the great services available to the legal aid tech community (lstech) is LSNTAP’s series of webinars on tech tools.  I’ve somehow managed to miss every one of these webinars, but I’m a big fan of sharing the tools and strategies that allow us to more effectively get things done. In that spirit, I wanted to talk about my new favorite free online tool, Trello.

Trello is an online Kanban board.  If you’re unfamiliar with that term, you are still likely familiar with the concept: most TV cop shows have a board in the squad room with columns for new, open and closed cases.  Kanban is the name for these To Do/Doing/Done boards, and they are a powerful, visual tool for keeping track of projects.

You don’t need Trello — you can do it with a whiteboard and a marker. But Trello’s online version becomes very useful very fast. Like the best apps, the basic functionality is readily usable, but advanced functionality lurks under the hood. With no training, you can create a to-do list that tracks what’s coming up, what you’re working on, and what you’ve finished. Explore a little, and you learn that each task can have a description, a due date, an attached file, its own checklist, and one or more people assigned to it. Trello is just as good a one-person productivity tool as it is a team coordination tool.
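The Kanban concept itself is simple enough to sketch in a few lines of Python. This is a toy model of the To Do/Doing/Done board, not Trello’s actual data model or API:

```python
# A toy Kanban board: named columns, each holding an ordered list of cards.

class Board:
    def __init__(self, *columns):
        # e.g. Board("To Do", "Doing", "Done")
        self.columns = {name: [] for name in columns}

    def add(self, column, card):
        """Drop a new card at the bottom of a column."""
        self.columns[column].append(card)

    def move(self, card, src, dst):
        """Slide a card from one column to another."""
        self.columns[src].remove(card)
        self.columns[dst].append(card)

board = Board("To Do", "Doing", "Done")
board.add("To Do", "Deploy servers")
board.add("To Do", "Install phone system")

board.move("Deploy servers", "To Do", "Doing")  # work starts
board.move("Deploy servers", "Doing", "Done")   # work finishes

print(board.columns["Done"])  # ['Deploy servers']
```

The whole trick is that a card lives in exactly one column at a time, which is what makes the board readable at a glance.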

I can report that the IT team at LSC has dived into it.  Here are a few of the things we’re using it for:

  • Our project big board.  We keep all of our upcoming projects, with due dates and leads, in a Trello board.
  • Individual task lists.  The developers track their major deliverable dates, the rest of us the small things we’re working on.
  • Strategic Planning – anyone who has ever done a session involving slapping post-its on the wall will appreciate this simple, online version of that exercise.  SWOT analyses work particularly well.

At this year’s Nonprofit Technology Conference, where I first learned about Trello, it was successfully being used as a help desk ticket system. I’d recommend this only for small programs. A more powerful free ticket system like Spiceworks, or a commercial product, will handle the volume at a 50+ person company better than Trello can.

But here’s the real case for a tool like Trello: it goes from zero to compellingly useful in seconds. While I won’t knock enterprise project management systems, I lean toward the ones that give me great functionality without taking up a lot of my time. I’ve hit a couple of stages in my career where the immense workload begged for such tools, but implementing one was too big a project to add to the list. I bet that you’ve been there, too. Trello lacks the sophistication of a waterfall system like MS Project or an agile one, such as Jira. But it can get you organized in minutes. And, in our case, it doesn’t replace those more sophisticated systems; it supplements them at a high level. We do both traditional projects (deploy servers, install phone systems) and agile ones (build web sites, program our grants management system). We use the proper tools for those project plans, but keep the team coordinated with Trello.

Here’s our 2013 project board:

 

Note that we only assign the project leads, and the main use of this board is in the project review that kicks off our weekly staff meetings. But it’s helping us stay on task, and that is always the challenge.

What are your favorite tools for team coordination and project management?  Let us know in the comments.

Google Made Me Cry

Well, not real tears. But the announcement that Google Reader will no longer be available as of July 1st was personally upsetting news. Like many people, over the last eight years, I’ve let this application become as central a part of my online life as email. It is easily the web site that I spend the most time on, likely more than all of the other sites I frequent combined, including Facebook.

What do I do there? Learn. Laugh. Research. Spy. Reminisce. Observe. Ogle. Be outraged. Get motivated. Get inspired. Pinpoint trends. Predict the future.

With a diverse feed of nptech blogs,  traditional news,  entertainment, tech, LinkedIn updates, comic strips and anything else that I could figure out how to subscribe to,  this is the center of my information flow. I read the Washington Post every day,  but I skim the articles because they’re often old news. I don’t have a TV (well, I do have Amazon Prime and Hulu).

And I share the really good stuff.  You might say, “what’s the big deal? You can get news from Twitter and Facebook”  or “There are other feed readers.”

The big deal is that the other feed readers fall into three categories:

  1. Too smart: Fever
  2. Too pretty: Feedly, Pulse
  3. Too beta: Newsblur, TheOldReader
“Smart” readers hide posts that aren’t popular, assuming that I want to know what everyone likes instead of researching topics and discovering information on my own. There’s great value in knowing what others are reading; I use Twitter and Facebook both to share what I find and to read what my friends and nptech peers recommend. But I use my feed reader to discover things.
Pretty readers present feed items in a glossy magazine format that’s hard to navigate quickly and hell on my data plan.
The beta readers are the ones that look pretty good to me, until I have to wait 45 seconds for a small feed to refresh, or notice that their mobile client is just the desktop website, not even an HTML5 variant.

What made Google Reader the reader for most of us was the sheer utility.  My 143 feeds generate about 1000 posts a day.  On breaks or commutes, I scan through them, starring anything that looks interesting as I go.  When I get home from work, and again in the morning, I go through the starred items, finding the gems.

Key functionality for me is the mobile support. Just like the web site, the Google Reader Android app wins no beauty contests, but it’s fast and simple and supports my workflow.

At this point, I’m putting my hopes on Feedly, listed above as a “too pretty” candidate. It does have a list view that works more like Reader’s. The mobile client’s list view is still too graphical, but I’m optimistic that they’ll offer a fix for that before July. Currently, Feedly is a front-end to Google’s servers, which means that there is no need to export/import your feeds to join, and your actions stay synced with Google Reader (Feedly’s Saved Items are Google’s Starred, wherever you mark them). Sometime before July, Feedly plans to move to their own back-end, and the change should be seamless.

July is three months away. I’m keeping my eyes open.  Assuming that anyone who’s read this far is wrestling with the same challenge, please share your thoughts and solutions in the comments.