Monthly Archives: January 2014

Notes From Here And There

Long time no blog, but I have good excuses.  Moving cross-country, even with a modest family of three, is no picnic, and we are now, over 13 months since I was offered the job in DC, starting to see the light at the end of the tunnel. Since summer, I’ve been frantically house hunting and, since December, busy relocating (for the third time) to our new, tree-laden home in Reston.

This, however, doesn’t mean that I’ve stopped writing or totally neglected my nptech duties. So here are some things to look forward to:

#ntcbeer. First and foremost. The annual Nonprofit Technology Conference runs here in DC from March 13th to 15th, and the 6th Annual #ntcbeer will take place, as always, the night prior (Wednesday, 3/12, 7pm).  This year we’re at the Black Squirrel, a bar in the trendy Adams Morgan district that’s a 15-minute stroll from the hotel, with three stories and 80 craft beers, which one would hope will meet the requirements. But I’m willing to bet (seriously!  Who wants to get in the pool?) that we will top their max standing room of about 200 people.  Here’s my logic: we averaged about 175 people last year in Minneapolis and the year prior in SF.  Minneapolis likely would have been bigger, but a lot of planes were delayed by weather.  This year, we’re in DC, and that means two things: first, this is the largest center for NPOs in the world, so a lot more of the attendees live here. Second, it’s a very social place.  So I think it’s not only likely that we’ll top 200; I don’t think 300 is out of range. We’ll have the Facebook page up in a week or two, and we can hammer it all out there.

Also, #ntcbeer has sponsors this year.  We’ve been bought out by Blackbaud. (kidding!). Blackbaud and CommunityIT will be on hand with snacks and possible giveaways.  We’re figuring all of that out. Sponsorship is good, because this year we did manage to find a bar that doesn’t require a financial commitment up front, but I don’t think that will be possible in SF next year, given what a hard time we had finding a location in 2012.

Related (details to come): prior to #ntcbeer on the 12th, I’ll be hosting a pre-conference workshop on IT Leadership with Richard Wollenberger and Katie Fritz.

As to that writing, keep your eyes open this week and next for NTEN’s release of “Collected Voices: Data-Driven Nonprofits.” I spent 2013 participating in NTEN and Microsoft’s Communities of Impact program, where I joined 17 other nonprofit staff in diving into the challenges of managing, maximizing, and sharing data in our sector.  We had two in-person, two-day meetings; numerous calls with bright presenters; and active, professional facilitation by Julia Smith, NTEN’s Program Director. This publication is the final product.  In addition to a few case studies and short pieces, I contributed an article on “Architecting Healthy Data Management Systems.” This is really the focus of my career, whether it was unifying the database backend and building a portal to all client data at a law firm in the 90’s, developing an open source retail data warehouse at Goodwill, or migrating and connecting all of LSC’s grantee data and documents to a Salesforce instance at my current job. It’s the work that I think I do best, and I have a lot of best practices to share, so I’m somewhat proud and happy to be publishing this article. It will be a free download for NTEN members.

Speaking of LSC, I’ve been busy there as well. We held our 14th annual technology conference two weeks ago, with record attendance. Among the crowd were frequent collaborators of mine like Laura Quinn of Idealware and Matt Eshleman of CommunityIT. It was a great time, with a lot of valuable sessions and discussions on data, internet security, and business process mapping.  We held a “Meet the Developer” session where our grantees, for the first time, got to speak directly with the guy who programs our online applications and give him some direct feedback. I attended in order to both facilitate and act as a human shield.  😉

The conference followed the release of our report on the two year technology summit that we hosted.  This consisted of two gatherings of leaders in the access to justice community from legal aid law firms, the courts, the ABA, the State Department, and the NLADA, along with key application developers and strategic thinkers.  We worked on a goal:

“to explore the potential of technology to move the United States toward providing some form of effective assistance to 100% of persons otherwise unable to afford an attorney for dealing with essential civil legal needs.”

Currently, the research shows that only 20% of those who qualify for and need the legal assistance that our funding provides are being served by the limited pool of attorneys and resources dedicated to this work. The report makes the case that, with innovative use of technology, 100% can receive some level of assistance, even if that isn’t actual legal representation.  We are working from the assertion that some help is better than no help, which is what 80% of those who need it get today.

The key strategies include:

  • using statewide portals effectively to connect people to the available resources
  • maximizing the use of document assembly to assist individuals in preparing court forms (a goal that lives or dies by the standardization of such forms, which is currently a big challenge)
  • expanding the use of mobile and SMS (many of the people who need assistance lack computers and smartphones, but can text)
  • applying business process analysis, to ensure that we are efficiently delivering any and all services, and
  • building expert systems and intelligent checklists, to equip individuals and attorneys to navigate the legal system.

As I mention here often, the right to an attorney only applies to criminal cases, not civil, but the peril for low income families and individuals from civil lawsuits is apparent.  You could lose your house, your children, your job, or your health if you can’t properly defend yourself against a wealthier accuser.  Equal justice is a cornerstone of American ethics. Take a look at the best thinking on how technology can help to restore it.

How I Learned To Stop Worrying and Love The RFP

This article was originally posted on the NTEN Blog in January of 2014.

Requests for Proposals (RFPs) seem like they belong in the world of bureaucratic paperwork rather than in a lean, tech-savvy nonprofit. But there’s a lot to be said for an RFP when both sides understand how useful a tool it can be, even for tech-savvy nonprofits.

Here’s a safe bet: preparing and/or receiving Requests for Proposals (RFPs) is not exactly your favorite thing. Too many RFPs seem like the type of anachronistic, bureaucratic paperwork more worthy of the company in Office Space than a lean, tech-savvy nonprofit. So you may wonder why I would pitch a 90-minute session on the topic for this year’s Nonprofit Technology Conference. I’d like to make the case for you to attend my session: Requests for Proposals: Making RFPs Work for Nonprofits and Vendors.

The problems with RFPs are numerous, and many of you have tales from the trenches that could fill a few horror anthologies.  I’ll be the first to agree that they often end up doing more harm than good for a project.  But I believe that this is due to a poor understanding of the purpose of the RFP, and a lack of expertise and creativity in designing them. What a successful RFP does is help a client assess the suitability of a product or service to their needs long before they invest more serious resources in the project. That’s very useful.

The mission of the RFP is two-fold: a well-written RFP will clearly describe the goals and needs of the organization/client and, at the same time, ask the proper questions that will allow the organization to vet the product or consultant’s ability to address those needs. Too often, we think that means the RFP has to ask every question that will need to be asked and result in a detailed proposal with a project timeline and fixed price. But the situations where we know exactly, at the outset, what the new website, donor database, phone system, or technology assessment will and should look like are pretty rare.

For a consultant, receiving an RFP for a web site project that specifies the number of pages, color scheme, section headings and font choices is a sign of serious trouble, because they know, from experience, that those choices will change. Pitching a fixed price for such a project can be dangerous: as the web site is built, the client might find that they missed key components, or that the choices they made were wrong. It does neither party any good to agree to terms that are based on unrealistic projections, and project priorities often change, particularly with tech projects that include a significant amount of customization.

So you might be nodding your head right now and saying, “Yeah, Campbell, that’s why we all hate those RFPs. Why use ’em?” To which I say, “Why write them in such a way that they’re bound to fail?”

The secret to successful RFP development is in knowing which questions you can ask that will help you identify the proper vendor or product. You don’t ask how often you’ll be seeing each other next spring on the first date. Why ask a vendor how many hours they project it will take them to design each custom object in your as yet un-designed Salesforce installation? Some information will be more relevant — and easier to quantify — as the relationship progresses.

At the RFP session, we’ll dive into the types of questions that can make your RFP a useful tool for establishing a healthy relationship with a vendor. We’ll learn about the RFPs that consultants and software vendors love to respond to.  We’ll make the case for building a critical relationship in a proactive and organized fashion.  And maybe, just maybe, we’ll all leave the session with a newfound appreciation for the much-maligned Request for Proposal.

Don’t miss Peter’s session at the 14NTC on Friday, March 14, 3:30pm -5:00pm.

Peter Campbell is a nonprofit technology professional, currently serving as Chief Information Officer at Legal Services Corporation, an independent nonprofit that promotes equal access to justice and provides grants to legal aid programs throughout the United States. Peter blogs and speaks regularly about technology tools and strategies that support the nonprofit community.

Making Your Website More Useful For More People

This post was originally published on the LSC Technology Blog in January of 2014. LSC is Legal Services Corporation, my employer.

At LSC, we’ve been taking a critical look at our web site to see if we can make it more useful by factoring in all of the ways that people might want to view or use our information. In these days of big data and small screens, we realize that we have to be much more attentive to the ways that we present data than we have in the past.

Identifying the different visitors who frequently use our site, we took a closer look at their needs, and how we could improve our delivery of information to them. For example, visitors to LSC’s web site could be:

  • reporters or Hill staffers looking for a quick cut and paste of data on the site that is hard to get out of a linked PDF;
  • general public looking for data to pull into a spreadsheet, who would also be disappointed to find that data in a PDF;
  • visually or physically impaired, and therefore not able to view web content that isn’t compliant with the standards that their specialized software requires;
  • accessing the site on a mobile device that doesn’t display Flash or video and has no capability to display a PDF.

The PDF Problem

Adobe has done great things with the Portable Document Format, opening it up as a public standard and continually improving its functionality. But PDF is not an optimal format for web-based content: PDFs require additional software in order to be viewed, and making them compatible with accessibility standards requires a solid understanding of how to prepare them. Our goal is to ensure that content is delivered optimally, in a format that makes it easy to access for anyone and everyone visiting our site.

In the past, we’ve relied heavily on publishing web content via PDF, and we now have a backlog of documents that aren’t as widely usable as we would like. Our plan is to immediately make two changes:

  1. Use PDF sparingly and thoughtfully as we move forward: as optional downloads for content that is also displayed in HTML, or as appropriate downloads for white papers and legal reports that aren’t the types of things users will want to quote or edit; and design PDFs that are compatible with the Section 508 standards for web accessibility.
  2. Determine which of our existing PDFs need to be republished in more accessible formats, and convert them. We don’t have the resources to fix everything, but we have good statistical data from Google Analytics to tell us which PDFs our visitors look at, and a good idea of how to prioritize this content.
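The analytics-driven prioritization in step 2 can be sketched in a few lines. This is a hypothetical illustration (the file names and view counts are invented, not real LSC data): given pageview counts per PDF, pick the smallest set of documents that accounts for, say, 80% of all PDF traffic, and convert those first.

```python
# Hypothetical pageview counts per PDF, e.g. exported from Google Analytics.
PDF_VIEWS = {
    "annual-report-2013.pdf": 5200,
    "grant-terms.pdf": 2400,
    "fact-book.pdf": 1300,
    "board-minutes-06.pdf": 150,
    "old-newsletter.pdf": 40,
}

def conversion_priorities(views, coverage=0.8):
    """Return the most-viewed PDFs, in order, until `coverage` of all
    PDF traffic is accounted for."""
    total = sum(views.values())
    ranked = sorted(views.items(), key=lambda kv: kv[1], reverse=True)
    picked, covered = [], 0
    for name, count in ranked:
        if covered / total >= coverage:
            break
        picked.append(name)
        covered += count
    return picked

print(conversion_priorities(PDF_VIEWS))
```

With the sample numbers above, just two documents cover over 80% of the traffic, which is exactly why a limited conversion budget can still fix what most visitors actually see.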

Open Web

As a nonprofit that allocates federal funds, we have a responsibility to make data available to the public. But a commitment to open data means more than just making the data available; it needs to be available in formats that people can easily use. Data stored in an HTML table can be copied and pasted into Excel. Data in PDF and image formats can’t be, at least not easily. As David Ottewell recently tweeted, a PDF of a spreadsheet is not a spreadsheet. These efforts dovetail with our broader efforts to make data available in manipulable formats.
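The difference is easy to demonstrate. Here's a small sketch (standard library only; the table contents are invented, not actual LSC figures) showing why HTML tables count as open data: a few lines of parsing turn one into CSV-ready rows, something no amount of copy-and-paste will reliably do with a PDF.

```python
from html.parser import HTMLParser

# A hypothetical funding table, as it might appear in a web page.
SAMPLE = """
<table>
  <tr><th>State</th><th>Grantees</th></tr>
  <tr><td>Virginia</td><td>9</td></tr>
  <tr><td>Maryland</td><td>3</td></tr>
</table>
"""

class TableExtractor(HTMLParser):
    """Collect the text of each <td>/<th> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []          # completed rows
        self._row = None        # row currently being built
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

parser = TableExtractor()
parser.feed(SAMPLE)
for row in parser.rows:
    print(",".join(row))   # each line pastes straight into a spreadsheet
```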

Wild, Wild Web

It is also important that our web site deliver the same user experience on smartphones or tablets as it does when viewed in desktop or laptop browsers. This wasn’t high on our radar in 2011, when we redesigned our website in the Drupal content management system. At the time, we developed a mobile site as a separate, fractional copy of our main site.

Looking ahead

A modest revamp of LSC.GOV is planned for the second half of 2014 to improve the site’s navigation and responsiveness on multiple devices (i.e., one site that alters its navigational elements and appearance to properly utilize the screen that it’s displayed on). We also won’t forget the visitors who don’t have smartphones, and how best to make information available to them.

Having a website that anticipates the diverse needs of our online visitors is our goal. What’s yours? What are your current challenges?

Finding Aid To Improve Find Legal Aid

This post was originally published on the LSC Technology Blog in January of 2014. LSC is Legal Services Corporation, my employer.


Hands down, the most popular feature on LSC’s website is our Find Legal Aid lookup, which directs you to the LSC-funded legal services provider in your service area. I’m happy to announce that we’ve given this lookup a refresh while simplifying its use. But we didn’t do it alone, and the story of how we got this project going is one that I really want to share with our community.

As I’ve blogged about before, our service areas are a unique geography that doesn’t lend itself to easy data integration. This became a problem when we started looking at the possibility of sharing our data with the hacker community, in hopes that they would use it to develop apps that further equal justice goals. Simply put, our territories sometimes run within county and city boundaries, making it difficult to align them to standard geographical data. This also meant that our Find Legal Aid tool was a complicated piece of code that was never entirely accurate (it was right 99.8% of the time, and, otherwise, the people who answered calls could redirect someone to the proper legal services provider).

Our desire was to have Find Legal Aid work the same way that any major retailer’s “Find a Store” lookup would, with no more input required than a zip code. We didn’t have the internal expertise to do this on our own. Then we learned of a group called DC Legal Hackers, and we introduced ourselves. DC Legal Hackers is one of a number of Legal Hacker groups in the US and Canada. Legal hackers work at the intersection of law and technology, looking for ways to improve public access and address inequities in the system via the web. Access to Justice is one of the areas that they focus on. When the group held their first hackathon, we pitched revamping our lookup as one of the projects. Glenn Rawdon, Jessie Posilkin and I attended the hackathon on a Saturday and assisted where we could. We watched as some brilliant people took the shapefiles that LSNTAP made of the LSC service areas and mashed them up in such a way that, by about 2:00 in the afternoon, we had a working prototype.
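The core of a lookup like this is a point-in-polygon test: convert the visitor’s zip code to a point, then find the service-area polygon that contains it. Here’s a simplified sketch of that idea; the polygons, area names, and coordinates below are invented stand-ins (the real prototype used the LSNTAP shapefiles of LSC service areas, which are far more complex shapes).

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a ray from (x, y)
    heading right would cross. An odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # The x-coordinate where the edge crosses that line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical service areas as simple polygons of (lon, lat) pairs.
SERVICE_AREAS = {
    "VA-West": [(-83.0, 36.5), (-79.0, 36.5), (-79.0, 39.0), (-83.0, 39.0)],
    "VA-East": [(-79.0, 36.5), (-75.0, 36.5), (-75.0, 39.0), (-79.0, 39.0)],
}

def find_service_area(lon, lat):
    """Return the name of the first service area containing the point."""
    for name, polygon in SERVICE_AREAS.items():
        if point_in_polygon(lon, lat, polygon):
            return name
    return None

print(find_service_area(-77.3, 38.9))  # a point in the eastern half
```

In practice a geocoding step maps the zip code to a representative point first, and the irregular real-world boundaries are exactly why having ready-made shapefiles of the service areas made the hackathon prototype possible in an afternoon.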

It took a bit more time for LSC staff members Peter Larsen, Christina Sanabria and Alex Tucker to take it from prototype to a fully-functional application. We gained a lot more internal expertise in working with mapping technology. It’s important to note, though, that this took time, building the skillset as we completed the application and kept up with other priorities. These projects work best when the deadlines are loose.

We did face some choices. The lookup does not return office addresses or information about branches; we assume that the service providers may prefer to start with telephone screening before directing the public to a particular office location. We are contemplating adding links to online intake systems and statewide web sites relevant to the results. And we’re looking to see if an SMS text-based version of Find Legal Aid might be easy to produce.

We’re grateful to DC Legal Hackers for taking us halfway there, and over the programming hump that was beyond us. There’s a great community out there willing to work with us.