Monthly Archives: May 2009

The Road to Shared Outcomes

This post originally appeared on the Idealware Blog in May of 2009.

At the recent Nonprofit Technology Conference, I attended a somewhat misleadingly titled session called “Cloud Computing: More than just IT plumbing in the sky”. The cloud computing issues discussed were nothing like the things we blog about here (see Michelle’s and my recent “SaaS Smackdown” posts). Instead, this session was really a dive into the challenges and benefits of publishing aggregated nonprofit metrics. Steve Wright of the Salesforce Foundation led the panel, along with Lucy Bernholz and Lalitha Vaidyanathan. The session was video-recorded; you can watch it here.

Steve, Lucy and Lalitha painted a pretty visionary picture of what it would be like if all nonprofits standardized and aggregated their outcome reporting on the web. Lalitha presented a case study that hit on the key levels of engagement: shared measurement systems, comparative performance measurement, and a baked-in learning process. Steve made it clear that this is an iterative process that changes as it goes — we learn from each iteration and measure more effectively, or more appropriately for the climate, each time.

I’m blogging about this because I’m with them — this is an important topic, and one that gets lost amidst all of the social media and web site metrics focus in our nptech community. We’re big on measuring donations, engagement, and the effectiveness of our outreach channels, and I think that’s largely because there are ample tools and extra-community engagement with these metrics — every retailer wants to measure the effectiveness of their advertising and their product campaigns as well. Google has a whole suite of analytics available, as do other vendors. But outcomes measurement is more particular to our sector, and the tools live primarily in the reporting functionality of our case and client management systems. They aren’t nearly as ubiquitous as the web/marketing analysis tools, and they aren’t, for the most part, very flexible or sophisticated.

Now, I wholly subscribe to the notion that you will never get anywhere if you can’t see where you’re going, so I appreciate how Steve and crew articulated that this vision of shared outcomes is more than just a way to report to our funders; it’s also a tool that will help us learn and improve our strategies. Instead of seeing how your organization has done, and striving to improve upon your prior year’s performance, shared metrics will offer a window into others’ tactics, allowing us all to learn from each other’s successes and mistakes.

But I have to admit to being a bit overwhelmed by the obstacles standing between us and these goals. They were touched upon in the talk, but not heavily addressed.

  • Outcome management is a nightmare for many nonprofits, particularly those who rely heavily on government and foundation funding. My brief forays into shared outcome reporting were always welcomed at first, then shot down completely, the minute it became clear that joint reporting would require standardization of systems and compromise on the definitions. Our case management software was robust enough to output whatever we needed, but many of our partners were in Excel or worse. Even if they’d had good systems, they didn’t have in-house staff that knew how to program them.
  • Outcomes are seen by many nonprofit executives as competitive data. If we place ours in direct comparison with the similar NPO down the street, mightn’t we just be telling our funders that they’re backing the wrong horse?
  • The technical challenges are huge — of the NPOs that actually have systems that tally this stuff, the data standards are all over the map, and the in-house skill, as well as time and availability to produce them, is generally thin. You can’t share metrics if you don’t have the means to produce them.

A particular concern is that metrics can be fairly subjective, especially when the metrics produced are determined more by funding requirements than by the NPO’s own standards. When I was at SF Goodwill, our funders were primarily concerned with job placements and wages as proof of our effectiveness. But our mission wasn’t one of getting people jobs; it was one of changing lives, so the metrics that we put the most work into gathering were only partially reflective of our success – more outputs than outcomes. Putting those up against the metrics of an org with different funding, different objectives, and different reporting tools and resources isn’t exactly apples to apples.

The vision of shared metrics that Steve and crew held up is a worthwhile dream, but, to get there, we’re going to have to do more than hold up a beacon saying “This is the way”. We’re going to have to build and pave the road, working through all of the territorial disputes and diverse data standards in our path. Funders and CEOs are going to have to get together and agree that, in order to benefit from shared reporting, we’ll have to overcome the fact that these metrics are used as fodder in the battles for limited funding. Nonprofits and the ecosystem around them are going to have to build the tools and support the data management practices required. These aren’t trivial challenges.

I walked into the session thinking that we’d be talking about cloud computing; the migration of our internal servers to the internet. Instead, I enjoyed an inspiring conversation that took place, as far as I’m concerned, in the clouds. We have a lot of work to do on the ground before we can get there.

How Technology Might Shape The Future Of Our Cities

This was originally posted on the Earthjustice Blog in May of 2009.

The future is now — at least, the future is now in theaters. And what the future looks like, particularly what our cities will look like, is hotly disputed in the pop culture realm.

Take this article contrasting Star Trek‘s vision of San Francisco with Terminator: Salvation’s view of same. One movie envisions a future where the threat of global warming was either contained, or just not the threat that we know it is; the other a future where our technology stood up and ravaged the planet before climate change had a chance.

I’d say the chances that San Francisco will look as shiny and steely as Star Trek predicts are about as likely as the machines becoming sentient and taking over; we’re in for something different, and what our cities will look like depends heavily on how quickly and creatively we can harness technology to work with our planet, instead of against it.

Mitchell Joachim, one of the founders of Terrefuge, an Ecological Design Collaborative for Urban Infrastructure, Building, Planning, and Art, was on the Colbert Report recently, speaking about the radical work his group does in envisioning how an eco-friendly city might work.

It’s a vision that seems half scientific, half Dr. Seussian, but, given the impending dangers we face with climate change, seems particularly apt. We’re not going to solve these problems without a huge amount of creativity and a willingness to accept what would normally seem unacceptable. In that light, Joachim’s ideas are particularly refreshing. Consider these proposals:

The Fab Tree Hab is living, organic housing. Vegetation is prepped with technology that plots the growth; these homes are edible, producing food and shelter simultaneously. As Joachim explains it: “The Fab Tree Hab presents a sophisticated methodology to grow homes from living native trees. This 100% living habitat is prefabricated using Computer Numeric Controlled (CNC) reusable scaffolding, manufactured off-site in advance. These scaffold sections can be readily shipped and assembled to fit local tree and woody plant species. Therefore, we enable dwellings to be a fully integrated part of an ecological community.”

Joachim re-envisions transportation as something soft, squishy, and self-powering, in the form of SOFT Cars and Blimp Bumper Buses. S.O.F.T. stands for Sustainable Omni Flow Transport. Cars would be safer and recyclable, with most of their electronics stored in the wheels, allowing for comfortable rides, milder collisions, and stackable recharging stations.

The blimps would be made of organic materials and would be self-charging. Traveling at 15 miles an hour, they would let commuters simply hop on and off of the seats dangling down from the vehicles. The world that Joachim is pitching is not only one that is ecologically sustainable; it’s also pretty pleasant! It’s not a vision of “back to nature” as much as it’s a vision of moving forward with nature.

Of course, Joachim isn’t the only one thinking about cities and greenhouse gases. Cisco’s Urban Green IT Initiative proposes municipal wireless projects, enhanced public transportation, and environmentally focused building standards as immediate priorities. Per Gavin Newsom, mayor of San Francisco, one of the three cities kicking off the initiative:

Cities are responsible for 75 percent of the planet’s energy use. Sixty percent of the world will live in cities by 2030, and global electricity use will grow by more than 35 percent. We’ve got to get something started now to hold off detrimental effects to the environment that have already begun.

I’m as big a fan of the Hollywood sci-fi epics as anyone, but I hope we’re also paying attention to people like Mitchell Joachim and the others who are truly envisioning a future where the benefits of technology work in concert with the natural power and beauty of our planet to support a sustainable urban lifestyle.

As Earthjustice works to stem the damage being done to our planet, let’s concurrently focus on the improvements that we can make as we face the sometimes daunting challenge of climate change.

Flying in Place: Videoconferencing

This was originally posted on the Earthjustice Blog in May of 2009.

As an information technology director whose livelihood depends pretty heavily on the use of electricity, I’m constantly looking for meaningful ways that the technology I’m immersed in can contribute to the reduction of greenhouse gases. The saying “If you aren’t part of the solution you’re part of the problem” doesn’t even suffice — technology is part of the problem, period, and it behooves people like me, who trade in it, to use it in ways that offset its debilitating effects on our environment.

This is why I’m very excited about an initiative that we have taken on to deploy videoconferencing systems in each of our nine locations.

Per a May 2008 report by the Stockholm Environment Institute, aviation activities account for somewhere between 2% and 5% of total anthropogenic greenhouse gas emissions. Our organization, with offices stretching from Honolulu to Anchorage to NYC and down to Tallahassee, has a great opportunity to eliminate much of our substantial air travel. If you’re in a similar circumstance, I thought it might be helpful to offer a rundown of the options, ranging from free and easy to expensive but fantastic.

Cheap and easy means desktop video, which is far more suited for person-to-person chats at one’s desk than large meetings. While it’s certainly possible to hook up a PC to a projector and include someone in a conference room meeting this way, it’s a far cry from the experience you would have with actual videoconferencing equipment.

In general, the return on the investment will be in how successfully you can mimic being in the same room with your video attendees.

Only the richest of us can afford the systems that are installed as an actual wall of the conference room (commonly called “Telepresence”), connecting offices as if they were in the same place. But a mid-range system with a large TV screen will, at least, make important things like body language and facial expressions clear, and will be of a quality that syncs the voices to the images correctly. This makes a big difference in terms of the usefulness of the experience, and should be what justifies the expense over that of a simple conference phone.

The leader among the cheap and easy options is Skype. Once known as a way to make free phone calls over the Internet, Skype now does video as well. Of course, the quality of the call will vary greatly with the robustness of your internet connection, meaning it’s abysmal if a party is on dial-up and great if all callers have very fast DSL/cable connections or better.

Other free options might already be installed on your computer. Instant messaging applications like Windows Messenger, Yahoo! Messenger and iChat are starting to incorporate video as well.

There are two ways to do Conference Room Video, only one of which requires an investment of your own, at least in a large TV display. The first option is to do the conference in someone else’s room. FedEx Kinko’s is one of many businesses that rent space with video equipment and support (note: it isn’t available at all locations). If your needs are occasional, this might prove more affordable than flying.

For a more permanent arrangement in your own digs, you’ll want to look at purchasing your own video equipment. This is the route that Earthjustice is taking. Vendors in this space include (and aren’t limited to) Polycom, Cisco, Tandberg and LifeSize. Options range from a simple setup, with a basic system in each office, to a more dynamic one using a multi-point bridge (definition below!). The key questions you need to ask before deciding what to buy are:

  • How many locations do I want to have video in?
  • What is the maximum number of locations (“points”) that I want to connect in one call?
  • Do I want to regularly include parties from outside of my organization?
  • Do I have sufficient bandwidth to support this?
  • Do I want to incorporate presentations and computer access into the face-to-face meetings?
  • Do I want to support desktop computer connections to my system?
  • Do I want to have the ability to record conferences and optionally broadcast them over the web?

Standard videoconferencing equipment includes:

  • A Codec, which, much like a computer’s Central Processing Unit (CPU), is the brains of the equipment
  • One or two Displays (generally a standard TV set; for HD video an HDTV)
  • A Conference Phone
  • One or more Microphones
  • A Remote Control to control the camera and inputs
  • Cables to connect the network and optional input devices, such as a laptop computer

The Codec might be single point or multi-point, multi-point meaning that it is capable of connecting multiple parties to the conference. You might want an additional display if you regularly do computer presentations at your meetings, so you can dedicate one screen to the presentation and the other to the remote participants. Most modern systems have a remote control that can control not only your camera, but also the camera in the remote location(s), assuming all systems are made by the same vendor.

Another option is to purchase a Conference Bridge (aka MCU). A bridge is a piece of equipment that provides additional functionality to the Codecs on your network, such as multi-point conferencing, session recording, and, possibly, desktop video.

Key questions that we had when we evaluated systems were: “How many points do your codecs connect to before we need to add a bridge?” and, “If numerous parties are connected, how does your system handle the video quality?” Some systems brought all connections down to the quality of the poorest one; others were able to maintain different quality connections in different windows.

We also looked hard at the ease of use, but determined that all of these systems were about as complex as, say, adding a VCR or DVR to a cable TV setup. Some staff training is required.

On the really geeky side, we required that the systems support these protocols: Session Initiation Protocol (SIP) and H.323. These are the most common ways that one video system will connect with another over the Internet. By complying with these standards, we’ve had great success interoperating with other manufacturers’ systems.

Finally, we were able to go with a High Definition system, with great quality. This was largely enabled by the robust network we have here, as no system will work very well for you if you don’t have sufficient internet bandwidth to support this demanding application.
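If you want to put rough numbers behind the bandwidth question, here is a minimal back-of-the-envelope sketch in Python. The per-call rates, the headroom factor, and the function names are illustrative assumptions of mine, not specs from any particular vendor or codec; plug in the numbers your vendor actually quotes you.

    # Back-of-the-envelope sizing for simultaneous video calls.
    # All rates below are illustrative assumptions, not vendor specs.
    ASSUMED_RATE_MBPS = {
        "SD": 0.5,       # assumed sustained rate for a standard-definition call
        "HD_720p": 1.5,  # assumed sustained rate for a 720p call
    }

    # Assumption: let video use at most half the link, leaving headroom
    # for email, web traffic, and backups.
    VIDEO_SHARE = 0.5

    def required_mbps(simultaneous_calls, quality):
        """Estimated sustained bandwidth (each way) for the given calls."""
        return simultaneous_calls * ASSUMED_RATE_MBPS[quality]

    def link_is_sufficient(link_mbps, simultaneous_calls, quality):
        """True if the office link can carry the calls with headroom to spare."""
        return required_mbps(simultaneous_calls, quality) <= link_mbps * VIDEO_SHARE

    # Example: two simultaneous 720p calls on a 10 Mbps office connection.
    print(required_mbps(2, "HD_720p"))             # 3.0
    print(link_is_sufficient(10.0, 2, "HD_720p"))  # True

The point isn’t the particular numbers; it’s that the math should be done per simultaneous call, in each direction, before anyone commits to an HD purchase.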

Conclusion: This is a somewhat simple distillation of a fairly complex topic, and the proper solution and impact of using video will vary from organization to organization. In our case, this will pay for itself quickly, and be scored as an easy win in our goal to reduce our carbon footprint. Compelling technology that supports our planet. Who can’t appreciate that?

Meet The Idealware Bloggers Part 3: Peter Campbell

This interview was conducted by Heather Gardner-Madras and originally published on the Idealware Blog in May of 2009.

The third interview of the series is with Peter Campbell, and I had a good time putting a face to the Twitter conversations we’ve been having over the past year, as well as finding out more about how he came to write for the Idealware blog.

Peter Campbell

On Connecting Nonprofits & Technology
Peter’s decision to combine technology with nonprofit work was very deliberate. Well into a career as an IT director for a law firm in San Francisco, he had something of an epiphany and wanted to do something more meaningful in the social services sector. It took him nine months to find just the right job, and he landed at Goodwill. In both positions he was able to take advantage of good timing and the right executive support to create his own vision and really bring effective change to the organizations. At Goodwill Industries, Peter developed retail management software and introduced e-commerce. Now with Earthjustice, he is also sharing his experience with the broader community.

On Blogging
Although Peter always wanted to incorporate writing as a part of his work and wrote a good bit, the advent of blogs didn’t provide a lot of motivation for him because he wanted to be sure to have something worthwhile to say. A firm believer in blogging about what you know, he was intrigued by the opportunity to blog at Idealware since the topics and style were aligned with his knowledge and experience. So while the previous 3 years of blogging had only yielded about 50 entries, this was an opportunity to get on a roll, and if you have been following this blog you know that it has really paid off and provided a lot of great resources already.

The Magic Wand Question
One of the questions I asked in each interview was this: If you had a magic wand that could transform one aspect of nonprofit technology in an instant, what would it be and why?

Peter’s answer is simple and echoes a common thread in responses to this question: Change the way nonprofit management understands technology – help them realize the value it offers, the resources needed to get the most out of it, and how to use it.

The Next 5 Years
In response to a question about what he finds to be the most exciting trend in nonprofit technology in the next five years, Peter felt there are many things to be excited about right now.

He feels that transformations in technology are cropping up quickly and nonprofits have a real opportunity to be at the forefront of these changes. The data revolution and rise of cloud computing will liberate nonprofits and turn the things we struggle with now into affordable solutions. Virtualization, as well, will provide new freedom and efficiency. According to Peter, these trends will work together to change the way we manage and invest in technology. In his words – right now it’s still geeky and complex, but it will get easier.

Personal snapshots
First thing you launch on your computer when you boot/in the morning?
Twitter client, then Firefox with Gmail and Google Reader and two blogs open in tabs.

Is there a tech term or acronym that makes you giggle and why?
Not really, but there are some that infuriate me. I am a fan of BPM (Business Process Management) because it describes what you should do – manage your processes and realize that tech is the structure to do it with, not the brain.

Favorite non-technology related thing or best non-techy skill?
Besides technology, I hope my best skill is my writing.

Which do you want first – Replicator, holodeck, transporter or warp drive?
Transporter is the great one, but I don’t want to be the beta tester.

See previous posts to learn more about Steve Backman and Laura Quinn.

 

Oldstyle Community Management

This article was originally published on the Idealware Blog in May of 2009.

Photo by ferricide

It’s been a big month for Online Community Management in my circles. I attended a session at the Nonprofit Technology Conference on the subject; then, a few weeks later, ReadWriteWeb released a detailed report on the topic. I haven’t read the report, but people I respect who have are speaking highly of it.

Do you run an online community? The definition is pretty sketchy, ranging from a blog with active commenters to, say, America Online. If we define an online community as a place where people share knowledge, support, and/or friendship via communication forums on web sites or via email, there are plenty of web sites, NING groups, mailing lists and AOL chat rooms that meet those criteria.

The current interest is spurred by the notion that this is the required web 2.0/3.0 direction for our organizational web sites. We’ve made the move to social media (as this recent report suggests); now we need to be the destination for this online interaction. I don’t think that’s really a given, any more than it’s clear that diving into Facebook and Twitter is a good use of every nonprofit’s resources. It all depends on who your constituents are and how they prefer to interact with you. But, certainly, engagement of all types (charitable, political, commercial) is expanding on the web, and most of us have an audience of supporters that we can communicate with here.

Buried deep in my techie past is a three year gig as an online community manager. It was a volunteer thing. More honestly, a hobby. In 1988, I set up a Fidonet Bulletin Board System (BBS); linked it to a number of international discussion groups (forums); and built up a healthy base of active participants.

This was before the world wide web was a household term. I ran specific software that allowed people to dial in, via modem, to my computer, and either read and type messages online or download them into something called a “QWK reader”; read and reply offline, and then synchronize with my system later. There were about 1000 bulletin board systems within local calling distance in San Francisco at the time. Many of them had specific topics, such as genealogy or cooking; mine was a bit more generally focused, but I appealed to birdwatchers, because I published rare bird alerts, and to people who liked to talk politics. This was during the first Gulf War, and many of my friends’ systems were sporting American flags (in ASCII art), while my much more liberal board was the place to be if you were more critical of the war effort.

At the peak of activity, I averaged 200 messages a day in our main forum, and I’m pretty sure that the things that made this work apply just as much to the more sophisticated communities in play today. Those were:

    • Meeting a Need: There were plenty of people who desired a place to talk politics and share with a community, and there wasn’t a lot of competition. The bulk of my success was offering the right thing at the right time. It’s much tougher now to hang a shingle and convince people that your community will meet their needs when they have millions to choose from. How successful — and how useful — your community might be depends on how much of a unique need it serves.
    • Maintaining Focus: Many of the popular bulletin boards had forums, online gaming, and downloads. My board had forums. The handful of downloads were the QWK readers and supporting software that helped people use the forums. The first time you logged on, you were subjected to a rambling bit of required reading that said, basically, “if birdwatching and chatting about the issues of the day interests you, keep on reading”, and I saw numerous people hang up before getting through that, which I considered a very good thing. The ones that made it through tended to be civil and engaged by what they signed on for. By focusing more on what made for a quality discussion, as opposed to trying to attract a large, diverse crowd, my base grew much bigger than I ever imagined it would.
    • Tolerance and Civility: We had a few conservatives among our active callers, and that kept the conversation lively. But we had excellent manners, never resorting to personal attacks and sending lots of private messages to the contrarians supporting their involvement. We really appreciated them, and they appreciated semi-celebrity status. It was all about the arguments, not about the attitude. Mind you, this was 1989/90 — I’m not sure if it’s possible to have civil public political debates today…
    • Active moderation: My hobby was a full time job that I did on top of my full time job. I engaged with my callers as if they were sitting in my living room, being gracious and helpful while I participated fully in the main events. There was a little moderation required to keep the tone civil, and making the board safe for all — particularly the ones with the minority opinions — required having their trust that I wouldn’t let any attacks get through without my response.

I think that the biggest question today is whether you should be building a community on your own, or engaging your community in the ample public places (Facebook, Twitter, etc.) that they might already hang out in. In fact, I think that where you engage is a fairly moot point; what’s important is that you do engage and provide a forum that helps people cope with and learn about the issues that your organization is addressing. Pretty much all of the bulleted advice above will apply, whether you’re building your own community or engaging out in the broader one.

NPTech.Info Updated

NPTech Aggregator at http://nptech.info

Those of you familiar with my side project at http://nptech.info know that it has been reliably aggregating blog entries, photos and websites tagged with the term “nptech” for close to four years now. It’s been a little neglected of late, but after Annaliese over at NTEN gave it a shout-out, I figured it was due for some clean-up. Here’s what’s new:

  • About 25 blogs added to the NPTech Blogs section, and a broken link or two corrected on the existing ones;
  • Information from Twitter added to the main “Tagged items” feed that already grabs nptech items from Delicious, Flickr and Technorati;
  • New additions to the general tech section from sites like ReadWriteWeb and Mashable
  • A simple Facelift, primarily adding a little color and going for a more attractive font (fancy design is not a big priority here, particularly since my last big effort to pretty it up got creamed in a Drupal upgrade).

As usual, if you have a blog focused on Non-Profit Technology that you’d like added to the mix, let me know, but rest assured that, if you can find your blog on Technorati, we’re already grabbing the items that you tag or categorize as “nptech”.
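For the curious, tag-based aggregation like this boils down to polling a handful of feeds and merging the entries. Here’s a minimal sketch in Python using the feedparser library; the feed URLs are placeholders, and this is not the actual code behind the site (which runs on Drupal).

    # Minimal tag-aggregation sketch. The URLs are placeholders, not the
    # real sources that nptech.info polls.
    import feedparser

    TAG_FEEDS = [
        "http://example.com/service-one/tag/nptech/rss",  # placeholder
        "http://example.com/service-two/tag/nptech/rss",  # placeholder
    ]

    def collect_tagged_items(feed_urls):
        """Fetch each feed and return one de-duplicated list of items."""
        seen = {}
        for url in feed_urls:
            feed = feedparser.parse(url)
            for entry in feed.entries:
                link = entry.get("link")
                if link and link not in seen:
                    seen[link] = {
                        "title": entry.get("title", "(untitled)"),
                        "link": link,
                        "published": entry.get("published", ""),
                    }
        # Crude newest-first ordering by the published string, when present.
        return sorted(seen.values(), key=lambda item: item["published"], reverse=True)

    for item in collect_tagged_items(TAG_FEEDS)[:20]:
        print(item["published"], item["title"], item["link"])

In practice, a scheduled job that runs something like this and writes the results into the site’s database is most of the magic there is to an aggregator.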

The Silo Situation

This post originally appeared on the Idealware Blog in May of 2009.

The technology trend that defines this decade is the movement towards open, pervasive computing. The Internet is at our jobs, in our homes, on our phones, TVs, gaming devices. We email and message everyone from our partners to our clients to our vendors to our kids. For technology managers, the real challenges are less in deploying the systems and software than they are in managing the overlap, be it the security issues all of this openness engenders, or the limitations of our legacy systems that don’t interact well enough. But the toughest integration is not one between software or hardware systems, but, instead, the intersection of strategic computing and organizational culture.

There are two types of silos that I want to discuss: organizational silos, and siloed organizations.

An organizational silo, to be clear, is a group within an organization that acts independently of the rest of the organization, making their own decisions with little or no input from those outside of the group. This is not necessarily a bad thing; there are (although I can’t think of any) cases where giving a group that level of autonomy might serve a useful purpose. But, when the silo acts in an environment where their decisions impact others, they can create long-lived problems and rifts in critical relationships.

We all know that external decisions can disrupt our planning, be it a funder’s decision to revoke a grant that we anticipated or a legislature dropping funding for a critical program. So it’s all the more frustrating to have the rug pulled out from under us by people who are supposed to be on the same team. If you have an initiative underway to deploy a new email system, and HR lays off the organizational trainer, you’ve been victimized by a siloed decision. On the flip side, a fundraiser might undertake a big campaign, unaware that it will collide with a web site redesign that disables the functionality that they need to broadcast their appeal.

Silos thrive in organizations where the leadership is not good at management. Without a strong CEO and leadership team, departmental managers don’t naturally concern themselves with the needs of their peers. The expediency and simplicity of just calling the shots themselves is too appealing, particularly in environments where resources are thin and making overtures to others can result in those resources being gladly taken and never returned. In nonprofits, leaders are often more valued for their relationships and fundraising skills than their business management skills, making our sector more susceptible to this type of problem.

The most damaging result of operating in this environment is that, if you can’t successfully manage the silos in your organization, then you won’t be anything but a silo in the world at large.

We’ve witnessed a number of industries, from entertainment and newspapers to telephones and automobiles, as they allowed their culture to dictate their obsolescence. Instead of adapting their models to the changing needs of their constituents, they’ve clung to older models that aren’t relevant in the digital age, or appropriate for a global economy on a planet threatened by climate change. Since my focus is technology, I pay particular attention to the impacts that technological advancement, and the accompanying change in extra-organizational culture (e.g., the country, our constituents, the world) have on the work my organization does. Just in the past few years, we’ve seen some significant cultural changes that should be impacting nonprofit assumptions about how we use technology:

  • Increased regulation on the handling of data. We’re wrestling with the HIPAA laws governing handling of medical data and PCI standards for financial data. If we have not prioritized firewalls, encryption, and the proper data handling procedures, we’re more and more likely to be out of step with new laws. Even the 990 form we fill out now asks if we have a document retention plan.
  • Our donors are now quite used to telephone auto attendants, email, and the web. How many are now questioning why we use the dollars they donate to us to staff reception, hand write thank you notes, and send out paper newsletters and annual reports?
  • Our funders are seeing more available data on the things that interest them everywhere, so they expect more data from us. The days of putting out the success stories without any numbers to quantify them are over.

Are we making changes in response to these continually evolving expectations? Or are we still struggling with our internal expectations, while the world keeps on turning outside of our walls? We, as a sector, need to learn what these industrial giants refused to, before we, too, are having massive layoffs and closing our doors due to an inability to adapt our strategies to a rapidly evolving cultural climate. And getting there means paying more attention to how we manage our people and operations; showing the leadership to head into this millennium by mastering our internal culture and rolling with the external changes. Look inward, look outward, lead and adapt.

SaaS and Security

This post was originally published on the Idealware Blog in May of 2009.

My esteemed colleague Michelle Murrain lobbed the first volley in our debate over whether ’tis safer to host all of your data at home, or to trust a third party with it. The debate is focused on Software as a Service (SaaS) as a computing option for small to mid-sized nonprofits with little internal IT expertise. This would be a lot more fun if Michelle was dead-set against the SaaS concept, and if I was telling you to damn the torpedoes and go full speed ahead with it. But we’re all about the rational analysis here at Idealware, so, while I’m a SaaS advocate and Michelle urges caution, there’s plenty of give and take on both sides.

Michelle makes a lot of sound points, focusing on the very apt one that a lack of organizational technology expertise will be just as risky a thing in an outsourced arrangement as it is in-house. But I only partially agree.

  • Security: Certainly, bad security procedures are bad security procedures, and that risk exists in both environments. But beyond the things that could be addressed by IT-informed policies, there are also the security precautions that require money to invest in and staff to support, like encryption and firewalls. I reject the argument that the data is safer on an unsecured, internal network than it is in a properly secured, PCI-Compliant, hosted environment. You’re not just paying the SaaS provider to manage the servers that you manage today; you’re paying them to do a more thorough and compliant job at it.
  • Backups: Many tiny nonprofits don’t have reliable backup in place; a suitable SaaS provider will have that covered. While you will also want them to provide local backups (either via scheduled download or regular shipment of DVDs), even without that, it’s conceivable that the hosted situation will provide you with better redundancy than your own efforts.
  • Data Access: Finally, data access is key, but I’ve seen many cases where vendor licensing restricts users from working with their own data on a locally installed server. Being able to access your data, report on it, back it up, and, if you choose, globally update it is the ground floor that you negotiate to for any data management system, be it hosted or not. To counter Michelle, resource-strapped orgs might be better off with a hosted system that comes with data management services than an internal one that requires advanced SQL training to work with.

Where we might really not see eye to eye on this is in our perception of how “at risk” these small nonprofits are. I look at things like increasing governmental and industry regulation of internal security around credit cards and donor information as a time bomb for many small orgs, who might soon find themselves facing exorbitant fines or criminal charges for being your typical nonprofit: managing their infrastructure on a shoestring and, by necessity, skimping on some of the best practices. It’s simple – the more we invest in administration, the worse we look in our Guidestar ratings. In that scenario, outsourcing this expertise is a more affordable and reliable option than trying to staff to it, or, worse, hoping we don’t get caught.

But one point of Michelle’s that I absolutely agree with is that IT-starved nonprofits lack the internal expertise to properly assess hosting environments. In any outsourcing arrangement, the vendors have to be thoroughly vetted, with complete assurances about your access to data, their ability to protect it, and their plans for your data if their business goes under. Just as you wouldn’t delegate your credit card processing needs to some kid in a basement, you can’t trust your critical systems to some startup with no assurance of next year’s funding. So this is where you make the right investments, avail yourself of the type of information that Idealware provides, and hire a consultant.

To me, there are two types of risk: the type you take, and the type you foster by assuming that your current practices will suffice in an ever-changing world (more on this next week). Make no mistake, SaaS is a risky enterprise. But managing your own technology without tech-savvy staff on hand is something worse than taking a risk – it’s setting yourself up for disaster. While there are numerous ways to mitigate that, none of them are dollar- or risk-free, and SaaS could prove to be a real bang-for-your-buck alternative, in the right circumstances.

Technology and Risk: Are you Gathering Dust?

This post originally appeared on the Idealware Blog in April of 2009.

Last week I had the thrill of visiting a normally closed-to-the-public Science Building at UC Berkeley, and getting a tour of the lab where they examine interstellar space dust collected from the far side of Mars. NASA spent five or six years, using some of the best minds on the planet and $300,000,000, to develop the probe that went out past Mars to zip (at 400 miles a second) through comet tails and whatever else is out there, gathering dust. The most likely result of the project was that the probe would crash into an asteroid and drift out there until it wasted away. But it didn’t, and the scientists that I met on Saturday are now using these samples to learn things about our universe that are only speculative fiction today.

So, what does NASA know that we don’t about the benefits of taking risks?

In my world of technology management, it seems to be primarily about minimizing risk. We do multiple backups of critical data to different media; we lock down the internet traffic that can go in and out of our network; we build redundancy into all of our servers and systems, and we treat technology as something that will surely fail if we aren’t vigilant in our efforts to secure it. Most of our favorite adages are about avoiding risk: “If it ain’t broke, don’t fix it!” and “Nobody was ever fired for buying IB.. er, Microsoft.”

On Monday, I’ll be presenting on my chapter of NTEN’s book “Managing Technology to Meet Your Mission” at the Nonprofit Technology Conference in San Francisco. My session, and chapter, is about mission-focused technology planning and the art of providing business-class systems on a nonprofit budget. That’s certainly about finding sustainable and dependable options, but my case is that nonprofits, in particular, need to identify the areas where they can send out those probes and gamble a bit. For many nonprofits, technology planning is a matter of figuring out which systems desperately need upgrading and living with a lot of systems and applications that are old and semi-functional. I argue that there’s a different approach: we should spend like a regular business on the critical systems, but be creative and take risks where we can afford to fail a bit, on the chance that we’ll get far more for less money than we would playing it “safe” with inadequate technology. It’s a tough sell, yes, but I frame it in my belief that, when your business is changing the world, your business plan has to be bold and creative. As I mention often, the web is, right now, a platform rife with opportunity. We will miss out on great chances to significantly advance our missions if we just treat it like another threat to our stability.

We need stable systems, and we often struggle with inadequate funding and the technical resources simply to maintain our computer systems. I say that, as hard as that is, we need to invest in exploration. It’s about maximizing potential at the same time as you minimize risk. And it’s all about the type of dust that you want to gather.

NTC (Just) Past and Future

Photo by Andrew J. Cohen of Forum1

Here it is Saturday, and I’m still reeling from the awesome event that was the Nonprofit Technology Conference, put on by org of awesomeness NTEN. First things first, if you attended, live or virtually, and, like me, you not only appreciate, but are pretty much astounded by the way Holly, Anna, Annaliese, Brett and crew get this amazing event together and remain 100% approachable and sociable while they’re keeping the thing running, then you should show your support here.

We had 1400 people at the sold-out event, and if that hadn’t been a capacity crowd, I’m pretty sure we had at least 200 more people that were turned away. What does that say about this conference in a year when almost all of us have slashed this type of budget in response to a dire economic situation? I think it says that NTEN is an organization that gets, totally, and phenomenally, what the web means to cash-strapped, mission-focused organizations, and, while we have all cut spending, sometimes with the painful sacrifice of treasured people and programs, we know that mastering the web is a sound strategic investment.

Accordingly, social media permeated the event, from the Clay Shirky plenary, to the giant screen of tweets on the wall, to the 80% penetration of social media as a topic in the sessions. As usual, I lit a candle for the vast majority of nonprofit techies who are not on Twitter, don’t have an organizational Facebook page, and, instead, spend their days troubleshooting Windows glitches and installing routers. My Monday morning session, presented with guru Matt Eshleman of CITIDC, was on Server Virtualization. If you missed it, @jackaponte did such a complete, accurate transcription that you can feel like you were there just by reading her notes (scroll down to 10:12) and following along with the slides.

My dream — which I will do my best to make reality — is that next year will include a Geek Track that focuses much harder on the traditional technology support that so many NPTechs need. I stand on record that I’m willing to put this track together and make it great!

I was also quite pleased to do a session on How to Decide, Planning and Prioritizing, based on my chapter of NTEN’s book, Managing Technology to Meet Your Mission. It was really great to start the session with a question that I’ve always dreamed I’d be able to ask: “Have you read my book?”. I’m indebted to NTEN for that opportunity!

The biggest omission at this event (um, besides reliable wifi, but what can you do?) was a space for our Twitter names on our ID badges. Twitter provided a number of things to the — by my estimation — half of the attendees who hang out there.

  • Event anticipation buildup, resource sharing, session coordination and  planning, ride and room sharing and other activities were all rife on Twitter as the conference approached.
  • Session tweeting allowed people both in other sessions and at home to participate and share in some of the great knowledge shared.
  • For me, as a Twitter user who has been on the network for two years and is primarily connected to NTEN members, Twitter did something phenomenal. Catching up with many of my “tweeps”, we just skipped the formalities and dived into the conversations. So much ice is broken when you know who works where, what they focus on in their job, if they have partners and/or kids, what music tastes you share, that catching up in person means diving in deeper. The end result is clear — #09ntc is still an active tag on Twitter, and the conference continues there, and will continue until it quietly evolves into #10ntc.

One thing, however, worries me. This was the tenth NTC, my fifth, but it was the first NTC that the online world noticed. Tuesday, on Twitter, we were the second most popular trend (the competing pandemic outranked us). NTEN’s mission is to help nonprofits use technology to further their missions. But, as said above, this conference was, in many ways, a social media event. I’m hoping that Holly and crew will review their registration process next year to ensure that early spots in what is sure to be an even more popular event aren’t filled up by people who really aren’t as committed to changing the world as they are to keeping up with this trend.

But, concerns aside, we need to send that team to a week-long spa retreat, and be proud of them, and proud of ourselves for not only being a community that cares, but being one that shares. I urge even the most skeptical of you to jump on the Twitter bandwagon; we’re not on there discussing what we had for breakfast. We’re taking the annual event and making it a perpetual one, with the same expertise sharing, querying, peer support and genuine camaraderie that makes the nptech community so unique – and great. Come join us!