Tag Archives: innovation

Finding Aid To Improve Find Legal Aid

This post was originally published on the LSC Technology Blog in January of 2014. LSC is Legal Services Corporation, my employer.

[Image: Find Legal Aid example screenshot]

Hands down, the most popular feature on LSC’s website is our Find Legal Aid lookup, which directs you to the LSC-funded legal services provider in your service area. I’m happy to announce that we’ve given this lookup a refresh while simplifying its use. But we didn’t do it alone, and the story of how we got this project going is one that I really want to share with our community.

As I’ve blogged about before, our service areas are a unique geography that doesn’t lend itself to easy data integration. This became a problem when we started looking at the possibility of sharing our data with the hacker community, in hopes that they would use it to develop apps that further equal justice goals. Simply put, our territories sometimes run within county and city boundaries, making it difficult to align them to standard geographical data. This also meant that our Find Legal Aid tool was a complicated piece of code that was never entirely accurate (it was right 99.8% of the time, and, otherwise, the people who answered calls could redirect someone to the proper legal services provider).

Our desire was to have Find Legal Aid work the same way that any major retailer’s “Find a Store” lookup would, with no more input required than a zip code. We didn’t have the internal expertise to do this on our own. Then we learned of a group called the DC Legal Hackers, and we introduced ourselves. DC Legal Hackers is one of a number of Legal Hacker groups in the US and Canada. Legal hackers work at the intersection of law and technology, looking for ways to improve public access and address inequities in the system via the web. Access to Justice is one of the areas that they focus on. When the group held their first hackathon, we pitched revamping our lookup as one of the projects. Glenn Rawdon, Jessie Posilkin and I attended the hackathon on a Saturday and assisted where we could. We watched as some brilliant people took the shapefiles that LSNTAP made of the LSC service areas and mashed them up in such a way that, by about 2:00 in the afternoon, we had a working prototype.
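At its core, a lookup like this is a point-in-polygon test: geocode the zip code to a coordinate, then check which service-area polygon contains that point. Here's a minimal sketch of the idea in Python; the service-area polygons, zip-code centroids, and provider names below are invented for illustration, not real LSC data, and a real application would geocode zips and load polygons from the actual shapefiles.

```python
# Sketch: zip code -> service area via a ray-casting point-in-polygon test.
# All data below is hypothetical, for illustration only.

def point_in_polygon(lon, lat, polygon):
    """Return True if (lon, lat) falls inside polygon, a list of
    (lon, lat) vertices, using the even-odd ray-casting rule."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from the point cross this edge?
        if (y1 > lat) != (y2 > lat):
            cross_x = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < cross_x:
                inside = not inside
    return inside

# Hypothetical service areas: provider name -> (lon, lat) vertices.
SERVICE_AREAS = {
    "Example Legal Aid North": [(-77.2, 39.0), (-76.8, 39.0),
                                (-76.8, 39.4), (-77.2, 39.4)],
    "Example Legal Aid South": [(-77.2, 38.6), (-76.8, 38.6),
                                (-76.8, 39.0), (-77.2, 39.0)],
}

# Hypothetical zip-code centroids (a real app would geocode these).
ZIP_CENTROIDS = {"20001": (-77.0, 38.9), "20850": (-77.15, 39.1)}

def find_legal_aid(zip_code):
    """Return the name of the service provider covering a zip code."""
    lon, lat = ZIP_CENTROIDS[zip_code]
    for name, polygon in SERVICE_AREAS.items():
        if point_in_polygon(lon, lat, polygon):
            return name
    return None
```

Because our territories don't follow county or city lines, the polygon test is what makes this approach work where simple zip-to-county tables failed.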

It took a bit more time for LSC staff members Peter Larsen, Christina Sanabria and Alex Tucker to take it from prototype to a fully-functional application. We gained a lot more internal expertise in working with mapping technology. It’s important to note, though, that this took time, building the skillset as we completed the application and kept up with other priorities. These projects work best when the deadlines are loose.

We did face some choices. The lookup does not return office addresses or info about branches. We assume that the service providers may prefer to start with telephone screening before directing the public to a particular office location. We are contemplating adding links to online intake systems and statewide web sites relevant to the results. And we’re looking to see if an SMS text-based version of Find Legal Aid might be easy to produce.

We’re grateful to the DC Legal Hackers for taking us halfway there, and over the programming hump that was beyond us. There’s a great community out there willing to work with us.

Wave Impressions

This post originally appeared on the Idealware Blog in November of 2009.

A few months ago, I blogged a bit about Google Wave, and how it might live up to the hype of being the successor to email. Now that I’ve had a month or so to play with it, I wanted to share my initial reactions. Short story: Google Wave is an odd duck that takes some getting used to. As it is today, it is not that revolutionary; in fact, it’s kind of redundant. The jury is still out.

Awkwardness

To put Wave in perspective, I clearly remember my first exposure to email. I bought my first computer in 1987: a Compaq “portable”. The thing weighed about 60 pounds, sported a tiny green-on-black screen, and had two 5¼-inch floppy drives for applications and storage. Along with the PC, I got a 1200 bps modem, which allowed me to dial up local bulletin boards. And, as I poked around, I discovered the 1987 version of email: the line editor.

On those early BBSes, emails were sent by typing one line (80 characters, max) of text and hitting “enter”. Once “enter” was pressed, that line was sent to the BBS. No correcting typos, no rewriting the sentence. It was a lot like early typewriters, before they added the ability to correct previously typed text.

But, regardless of the primitive editing capabilities, email was a revelation.  It was a new medium; a form of communication that, while far more awkward than telephone communications, was much more immediate than postal mail.  And it wasn’t long before more sophisticated interfaces and editors made their way to the bulletin boards.

Google Wave is also, at this point, awkward. To use it, you have to be somewhat self-confident right from the start, as others are potentially watching every letter that you type. And while it’s clear that the ability to co-edit and converse about a document in the same place is powerful, it’s messy. Even if you get past the sprawling nature of the conversations, which are only minimally better than what you would get with ten to twenty-five people all conversing in one Word document, the lack of navigational tools within each wave is a real weakness.

Redundant?

I’m particularly aware of these faults because I just installed and began using Confluence, a sophisticated enterprise wiki (free for nonprofits), at my organization. While we’ve been told that Wave is the successor to email, Google Docs and, possibly, SharePoint, I have to say that Confluence does pretty much all of those things and is far more capable. All wikis, at their heart, offer collaborative editing, but the good ones also allow for conversations, plug-ins and automation, just as Google Wave promises. But with a wiki, the canvas is large enough and the tools are there to organize and manage the work and conversation. With Wave, it’s awfully cramped, and somewhat primitive in comparison.

Too early to tell?

Of course, we’re looking at a preview.  The two things that possibly differentiate Wave from a solid wiki are the “inbox” metaphor and the automation capabilities. Waves can come to you, like email, and anyone who has tried to move a group from an email list to a web forum knows how powerful that can be. And Wave’s real potential is in how the “bots”, server-side components that can interact with the people communicating and collaborating, will integrate the development and conversation with existing data sources.  It’s still hard to see all of that in this nascent stage.  Until then, it’s a bit chicken and egg.

Wave starting points

There are lots of good Wave resources popping up, but the best, hands down, is Gina Trapani’s Complete Guide, available online for free and in book form soon. Gina’s blog is a must-read for people who find the types of things I write about interesting.

Once you’re on Wave, you’ll want to find waves to join, and exactly how you do that is anything but obvious. The trick is to search for a term such as “nonprofit” or “fundraising” and add the phrase “with:public”. A good nonprofit wave to start with is titled, appropriately, “The Nonprofit Technology Wave”.

If you haven’t gotten a Wave invite and want to, now is the time to query your Twitter and Facebook friends, because invites are being offered and we’ve passed the initial “gimme” stage.  In fact, I have ten or more to share (I’m peterscampbell on most social networks and at Google’s email service).

The Road to Shared Outcomes

This post originally appeared on the Idealware Blog in May of 2009.

At the recent Nonprofit Technology Conference, I attended a somewhat misleadingly titled session called “Cloud Computing: More than just IT plumbing in the sky”. The cloud computing issues discussed were nothing like the things we blog about here (see Michelle’s and my recent “SaaS Smackdown” posts). Instead, this session was really a dive into the challenges and benefits of publishing aggregated nonprofit metrics. Steve Wright of the Salesforce Foundation led the panel, along with Lucy Bernholz and Lalitha Vaidyanathan. The session was video-recorded; you can watch it here.

Steve, Lucy and Lalitha painted a pretty visionary picture of what it would be like if all nonprofits standardized and aggregated their outcome reporting on the web. Lalitha had a case study that hit on the key levels of engagement: shared measurement systems, comparative performance measurement, and a baked-in learning process. Steve made it clear that this is an iterative process that changes as it goes; we learn from each iteration and measure more effectively, or more appropriately for the climate, each time.

I’m blogging about this because I’m with them — this is an important topic, and one that gets lost amidst all of the social media and web site metrics focus in our nptech community. We’re big on measuring donations, engagement, and the effectiveness of our outreach channels, and I think that’s largely because there are ample tools and extra-community engagement with these metrics — every retailer wants to measure the effectiveness of their advertising and their product campaigns as well. Google has a whole suite of analytics available, as do other manufacturers. But outcomes measurement is more particular to our sector, and the tools live primarily in the reporting functionality of our case and client management systems. They aren’t nearly as ubiquitous as the web/marketing analysis tools, and they aren’t, for the most part, very flexible or sophisticated.

Now, I wholly subscribe to the notion that you will never get anywhere if you can’t see where you’re going, so I appreciate how Steve and crew articulated that this vision of shared outcomes is more than just a way to report to our funders; it’s also a tool that will help us learn and improve our strategies. Instead of only seeing how your organization has done, and striving to improve upon your prior year’s performance, shared metrics will offer a window into others’ tactics, allowing us all to learn from each other’s successes and mistakes.

But I have to admit to being a bit overwhelmed by the obstacles standing between us and these goals. They were touched upon in the talk, but not heavily addressed.

  • Outcome management is a nightmare for many nonprofits, particularly those who rely heavily on government and foundation funding. My brief forays into shared outcome reporting were always welcomed at first, then shot down completely, the minute it became clear that joint reporting would require standardization of systems and compromise on the definitions. Our case management software was robust enough to output whatever we needed, but many of our partners were in Excel or worse. Even if they’d had good systems, they didn’t have in-house staff that knew how to program them.
  • Outcomes are seen by many nonprofit executives as competitive data. If we place ours in direct comparison with the similar NPO down the street, mightn’t we just be telling our funders that they’re backing the wrong horse?
  • The technical challenges are huge — of the NPOs that actually have systems that tally this stuff, the data standards are all over the map, and the in-house skill, as well as time and availability to produce them, is generally thin. You can’t share metrics if you don’t have the means to produce them.

A particular concern is that metrics can be fairly subjective, particularly when the metrics produced are determined more by funding requirements than by the NPO’s own standards. When I was at SF Goodwill, our funders were primarily concerned with job placements and wages as proof of our effectiveness. But our mission wasn’t one of getting people jobs; it was one of changing lives, so the metrics that we spent the most work gathering were only partially reflective of our success: more outputs than outcomes. Putting those up against the metrics of an org with different funding, different objectives and different reporting tools and resources isn’t exactly apples to apples.

The vision of shared metrics that Steve and crew held up is a worthwhile dream, but, to get there, we’re going to have to do more than hold up a beacon saying “This is the way”. We’re going to have to build and pave the road, working through all of the territorial disputes and diverse data standards in our path. Funders and CEOs are going to have to get together and agree that, in order to benefit from shared reporting, we’ll have to overcome the fact that these metrics are used as fodder in the battles for limited funding. Nonprofits and the ecosystem around them are going to have to build tools and support the art of data management required. These aren’t trivial challenges.

I walked into the session thinking that we’d be talking about cloud computing; the migration of our internal servers to the internet. Instead, I enjoyed an inspiring conversation that took place, as far as I’m concerned, in the clouds. We have a lot of work to do on the ground before we can get there.

The ROI on Flexibility

This post originally appeared on the Idealware Blog in April of 2009.

Nonprofit social media maven Beth Kanter blogged recently about starting up a residency at a large foundation, and finding herself in a stark transition from a consultant’s home office to a corporate network. This sounds like a great opportunity for corporate culture shock. When your job is to download many of the latest tools and try new things on the web that might inform your strategy or make a good topic for your blog, encountering locked-down desktops and web filtering can be, well, annoying is probably way too soft a word. Beth reports that the IT team was ready for her, guessing that they’d be installing at least 72 things for her during her nine-month stay. My question to Beth was, “That’s great, but are they just as accommodating to their full-time staff, or is flexibility reserved for visiting nptech dignitaries?”

The typical corporate desktop computer is restricted by group policies and filtering software. Management, along with the techs, justifies these restrictions in all sorts of ways:

  • Standardized systems are easier, more cost-effective to manage.
  • Restricted systems are more secure.
  • Web filtering maximizes available bandwidth.

This is all correct. In fact, without standardization, automation, group policies that control what can and can’t be done on a PC, and some protection from malicious web sites, any company with 15 to 20 desktops or more is really unmanageable. The question is, why do so many companies take this control of functionality to extremes?

Because, in most cases, the restrictions put in place are far broader than necessary to keep things manageable. Web filtering not only blocks pornography and spyware, but continues on to sports, entertainment, politics, and social networking. Group policies restrict users from changing their desktop colors or setting the system time. And using these standardization tools to intensively control computer usage most often results in IT working just as hard or harder to manage the exceptions to the rules (like Beth’s 72, above) as they would dealing with the tasks that the automation simplifies in the first place.

Restricting computer and internet use is driven by a management and/or IT assumption that the diverse, dynamic nature of computing is either a distraction or a problem. The opportunity to try something new is an opportunity to waste time or resources. By locking down the web and eliminating a user’s ability to install applications or even access settings, PCs can be engineered back down to the limited functionality of the office equipment that they replaced, such as typewriters, calculators and mimeograph machines.

In this environment, technology is much more of a controlled, predictable tool. But what’s the cost of this predictability?

  • Technology is not fully appreciated, and computer literacy is limited in an environment where users can’t experiment.
  • Strategic opportunities that arise on the web are not noticed and factored into planning.
  • IT is placed in the role of organizational nanny, responsible for curtailing computer use, as opposed to enabling it.

Cash and resource-strapped, mission-focused organizations only need look around to see the strategic opportunities inherent in the web. There are an astounding number of free, innovative tools for activism and research. Opportunities to monitor discussion of your organization and issues, and meaningfully engage your constituents, are huge. And all of this is fairly useless if your staff are locked out of the web and discouraged from exploring it. Pioneers like Beth Kanter understand this. They seek out the new things and ask, how can this tool, this web site, this online community serve our sector’s goals to ease suffering and promote justice? More specifically, can you end hunger in a community with a widget? Or bring water to a parched village via Twitter? If our computing environment is geared to stifle innovation in the name of security, are we truly supporting technology?

As the lead technologist at my organization, I want to be an enabler. I want to see our attorneys use the power of the web to balance the scales when we go to court against far better resourced corporate and government counsel. In this era of internet Davids taking down Goliaths from the RIAA to the mainstream media, I don’t want my co-workers to miss out on any opportunities to be effective. So I need the flexibility and perspective to understand that security is not something that you maintain with a really big mallet, lest you stamp out innovation and strategy along with the latest malware. And, frankly, cleaning a case of the Conficker worm off of the desktop of an attorney who just took down a set of high-paid corporate attorneys with data grabbed from some innovative mapping application that our web-filtering software would have mistakenly identified as a gaming site is well worth the effort.

Flexibility has its own Return on Investment (ROI), particularly at nonprofits, where we generally have a lot more innovative thinking and opportunistic attitude than available budget. IT has to be an enabler, and every nonprofit CIO or IT Director has to understand that security comes at a cost, and that cost could be the mission-effectiveness of our organizations.

Lessons Learned: Effective Practices In IT Management

This article was first published on the NTEN Blog in May of 2007.

Peter Campbell, TechCafeteria.com

I’ve spent more than 20 years in the sometimes maddening, sometimes wonderful, world of IT management. Along the way I’ve worked under a variety of CEOs with very diverse styles, and I’ve developed, deployed and maintained ambitious technology platforms. In order to survive, I put together three basic tenets to live by.

1. Management is 360 degrees: managing your superiors and peers is a bigger challenge than managing your staff.

2. To say anything effectively in an organization, you have to say it at least three times in three different media.

3. Follow Fidonet’s basic social guideline, “Do not be excessively annoying and do not become excessively annoyed.”

At a high level:

  • Work for the mission. Even in for-profit environments, I’ve managed to the organizational goals, not the individual personalities. You will avoid more political damage and navigate your way around the politics far more easily if you do the same. Don’t be scared of board or boss, and don’t cave in easily. This doesn’t mean that you countermand direct orders, but it does mean that you speak up if they don’t make sense to you. If you are in a political environment where, at the top, personality and ego trump mission in setting organizational priorities, then get out.
  • Make your priorities well known. Don’t ever assume that people are reading your business plans and proposals, and know for a fact that they haven’t read your emails. The key to successful project planning is communication, and that means face to face discussions with all parties with a stake in the project, especially those that you don’t particularly mesh with. Avoiding people who factor into your ability to succeed is a sure way to fail.
  • Take every opportunity to educate. Successful deployment of technology depends on joint ownership between the technology users and purveyors. Staff won’t own the technology if they don’t know what it does for them. In order to successfully manage technology, you need to constantly inform all parties as to what it can do for them.

Some other handy practices:

  • Run the IT Department as a lab – give your staff ample voice, diverse projects, and credit when they succeed. IT people, particularly in non-profits, are far more motivated by learning and accomplishing things than they are by money.
  • Value people skills, especially among your staff. The ability and willingness to communicate can be a more valuable talent than the ability to configure a Cisco 1750 blindfolded.
  • Marketing is not a dirty word! Sell your initiatives with PowerPoint, Project, and whatever else wows the suits.
  • Design for your users, not yourself. Stay aware that techies do not use the technology the way that everyone else does, and there is nothing wrong with everyone else – they just aren’t techies. So make sure that the software is configured to their needs and desires, not yours.
  • Consultants Rock! (and I’m not just saying that because I’m now a consultant). If you are doing your job well, a consultant can help you build resources and improve your status with management. Simple fact: The CEO will always listen to the consultant say exactly what you’ve been saying for years.
  • Be opportunistic. Apply for grants – you don’t have to wait for the grant writer to do it. Call different people at that vendor that you’re seeking a charitable discount from, not just the ones who think it will lower their commission. And then, back to marketing – let the CEO know every time you succeed.

Peter Campbell is a Business Technology Consultant focused on assisting members of the nonprofit/social services community with revenue-generating projects and promoting organizational self-sufficiency.