
Year-end Reflections

This post was originally published on the NTEN Blog on December 24th, 2015.

As years go, 2015 was a significant one in my career. The work of a CIO, or IT Director, or whatever title you give the person primarily responsible for IT strategy and implementation, is (ideally) two parts planning and one part doing. So in 2015—my third year at Legal Services Corporation—we did a couple of the big things that we’d been planning in 2013 and 2014.

First and foremost, we (and I do mean we—I play my part, but I get things done with an awesome staff and coworkers) rolled out the first iteration of our “Data Portal.” The vision for the Data Portal is that, as a funder that works primarily with 134 civil legal aid firms across the U.S. and territories, we should be able to access the relevant information about any grantee quickly and easily without worrying about whether we have the latest version of a document or report. To reach this vision, we implemented a custom, merged Salesforce/Box system. This entailed about a year of co-development with our partner, Exponent Partners, and a move from in-house servers to the Cloud. We’ll complete our Cloud “trifecta” in early 2016, when we go to Microsoft’s Office 365.

This was particularly exciting for me, because I have been envisioning and waiting for technology to reach a level of maturity and… collegiality that makes the vision of one place where documents and databases can co-exist a reality. Integration, and one-stop access to information, have always been the holy grails that I’ve sought for the companies that I’ve worked for; but the quests have been Monty Python-esque through the days when even Microsoft products weren’t compatible with each other, much less compatible with anything else. What we’ve rolled out is more of a stump than a tree; but in the next year we’ll grow a custom grants management system on top of that; and then we’ll incorporate everything pertinent to our grantees that currently hides in Access, Excel, and other places.

I’m working on a much more detailed case study of this project for NTEN to publish next year.

Secondly, we revamped our website, doing a massive upgrade from Drupal 7 to… Drupal 7! The website in place when I came to LSC was content-rich, navigation-challenged, and not too good at telling people what it is that we actually do. The four separate websites that made up our entire site weren’t even cross-searchable until we addressed that problem in early 2014. Internal terminology and acronyms existed on the front page and in the menus, making some things incomprehensible to the public, and others misleading. For example, we often refer to the law firms that we fund as “programs.” But, in the funding world, a “program” is a funding category, such as “arts” or “environment.” Using that terminology, along with burying the explanation that what we actually do is allocate funding, not practice law ourselves, led many people to assume that we were the parent office of a nationwide legal aid firm, which we aren’t.

The new site, designed by some incredibly talented people at Beaconfire-RedEngine (with a particular call out to Eve Simon, who COMPLETELY got the aesthetic that we were going for and pretty much designed the site in about six hours), tells you up front who we are, what we do, and why civil legal aid is so important in a country where the right to an attorney is only assured in criminal cases, while civil cases include home foreclosures, domestic violence, child custody, and all sorts of things that can devastate the lives of people who can’t afford an attorney to defend them. This new site looks just as good on a phone as on a computer, a requirement for the Twenty-Teens.

My happiness in life directly correlates to my ability to improve the effectiveness of the organizations that I work for, with meaningful missions like equal justice for all, defense against those who pollute the planet, and the opportunity to work, regardless of your situation in life. At my current job, we’re killing it.

Happy 10th Anniversary!

Just a quick post to commemorate ten years of blogging here at Techcafeteria. That’s 268 entries, averaging out to almost 27 posts per year, or better than two posts a month, which is not too shabby for a guy with a family and a demanding day job. The most popular stuff all now lives in my Recommended Posts section.

The goal here has never been much more than to share what I hope is useful and insightful knowledge on how nonprofits can make good use of technology, peppered with the occasional political commentary or rant, but I try to restrain myself from posting too many of those. After my recent reformat, I think I’ve made it much easier for visitors to find the content that interests them, so if you’re one of my many RSS subscribers, and you haven’t actually visited the site for some time, you should take a look.

I’m ever thankful to Idealware, NTEN, Techsoup, CommunityIT, and many others in the nptech community for giving me the opportunity to write for their blogs and republish here (about two thirds of the content, I suspect). And I’m happy to be part of this global, giving community.

Here’s to the next ten years!

What Is Nonprofit Technology – The Director’s Cut

This article was originally published on the NTEN Blog on March 10th, 2015, where it was edited for length. As with any director’s cut, their version might be better than this one! But this is how it was originally composed. Click here for more context.

For the past 14 years, I’ve been working for 501(c)(3) corporations, commonly referred to as nonprofits.  I’ve also become active in what we call the “nptech” community — “nptech” being shorthand for “nonprofit technology”.  But nonprofits, which comprise about 10% of all US businesses, have wildly diverse business models.  To suggest that there is a particular type of technology for nonprofits is akin to saying that all of the businesses in downtown Manhattan have similar technology needs. So what is nonprofit technology?  Less of a platform and more of a philosophy.

Snowflakes? No flakes.

It’s often said that each nonprofit is unique, like a snowflake, with specific needs and modes of operation. Let’s just remember that, as unique as a snowflake is, if you lay about a million of them next to each other on a field, you cannot tell them apart.

Nonprofits do not use any technology that is 100% unique to the nonprofit sector.  Fundraising systems operate exactly like sales systems, with leads, opportunities, campaigns and sales/donations. Similarly, advocacy applications are akin to marketing software. What nonprofits call Constituent Relationship Management Systems are called Customer Relationship Management systems everywhere else.  I want to make it clear that the technology used by nonprofits is analogous enough to what for-profits use to be nearly indistinguishable.
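To make the analogy concrete, here’s a minimal sketch in Python (the field names are my own illustration, not any particular vendor’s schema) of how a fundraising record maps, field for field, onto a sales CRM opportunity:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: hypothetical field names, not a real vendor schema.
# A fundraising "gift" carries the same fields a sales CRM calls an opportunity.
@dataclass
class Opportunity:
    contact: str      # sales: "lead"; nonprofit: "constituent" or "donor"
    campaign: str     # sales: marketing campaign; nonprofit: appeal
    amount: float     # sales: deal size; nonprofit: gift amount
    close_date: date  # sales: date won; nonprofit: date received
    stage: str        # sales: "Closed Won"; nonprofit: "Received"

# A fundraising system just relabels the same structure:
Donation = Opportunity

gift = Donation("Jane Donor", "Year-End Appeal", 250.00, date(2015, 12, 24), "Received")
print(f"{gift.contact} gave ${gift.amount:.2f} via the {gift.campaign} campaign")
```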

Also, whether small or large, most businesses operate under tight margins. They keep overhead to a minimum. They make decisions based on a scarcity of funding. Nonprofits are not unique in their lack of sizable technology budgets.

No Margin for Investment.

The most significant difference between a nonprofit and a for-profit, from a business perspective, is this:

A for-profit holds to tight budgets in order to maximize profit. A nonprofit holds to tight budgets in order to remain funded.

Of course, for-profits can go under by getting their overhead ratio wrong.  But where they have room to move, and, say, invest 30% in overhead one year in order to boot up a long-term, profitable strategy, they can.  They can make the case to their board. Their customers will likely not even know how much they spent on technology, marketing, or extra staff.

If a nonprofit decides to boost the overhead rate by 30% for a year in order to boot up a long-term, mission-effective strategy, then Guidestar, Charity Navigator, the Better Business Bureau and their own website will, basically, tell their donors that they’re a bad investment, and the drop in donations might well sink them. 501(c)(3)’s are required to publish their financial statements for public review annually, and this is the data that they are primarily assessed on. The effectiveness of a nonprofit’s strategy is harder to quantify than a retailer’s or a manufacturer’s.

Customers don’t care how Target and Walmart run their businesses; they care that they can buy anti-bacterial wipes at competitive prices. Constituents care deeply about how much of their donation is going to the people or cause that a nonprofit serves, as opposed to the operating expense of the nonprofit.

All businesses want to minimize expenses and increase profitability (even nonprofits!). But nonprofits must minimize those expenses; they have no strategic breathing room when it comes to funding operations.

Management is not the priority, fundraising is.

So, for a nonprofit, a CEO’s primary job is to maintain the funding.  In many cases, this means that the qualifications of a nonprofit CEO have a lot to do with their networking and fundraising skills.  Many nonprofits are run by people who don’t have extensive training or experience in business management.

Nonprofit IT Staff aren’t your typical techies

Nonprofits have lower IT budget and staff ratios than a typical for-profit. The average nonprofit IT budget is 1% to 2% of the entire budget, per the NTEN Staffing Survey; the for-profit average is 2% to 3%, per Gartner. IT salaries are consistently below the market rate, and they vary wildly, with some nonprofits paying far below market, others at market. A common scenario at a nonprofit is for the technical staff to include, if not be totally made up of, “accidental techies”: people who were hired for clerical or administrative work, had a knack for tech, and became the de facto tech person, sometimes also getting a title that reflects that. This is more common in smaller organizations, but it can happen anywhere that the administrative staffing is a small percentage of the overall staff and/or the CEO doesn’t know to hire IT professionals.

Is that a bad thing? Yes and no.  Accidental techies are often the people who had good, strategic notions about how technology could be applied to achieve objectives.  They tend to be smart, autonomous, good learners and teachers.  But they are more likely to be reactive and opportunistic in their approach to their work. IT also benefits from planning and consistency.  Truthfully, you need both styles on a healthy IT team.

So what is “Nonprofit Technology”?

It’s both a class of software and an approach to technology deployment.

Nonprofit technology includes fundraising, advocacy, grants management and other applications that support the primary technology needs, such as donor management and promotion of causes. In some cases, the same systems that salespeople and marketers use can suffice, as evidenced by the popularity of Salesforce in the nonprofit space. But the nonprofit sector has its own terminology around revenue processes, so, if commercial software is used, it’s modified to address that. In the Salesforce case, a nonprofit will either use the Nonprofit Starter Pack, which “skins” Salesforce to feel more like a fundraising system, or purchase an actual fundraising application developed for the platform, such as Roundcause or Blackbaud’s Luminate. Idealware, a nonprofit dedicated to helping nonprofits make good software choices, publishes a booklet listing the types of software that nonprofits use.

Outside of those specialty applications, nonprofits use fairly standard stuff from Microsoft, Adobe, Google and other big companies. Many of these companies offer charity pricing, and further discounts are available to 501(c)(3)’s through Techsoup, a company that provides a transaction layer to vendors who want to donate software to charities. A seasoned IT staffer knows how to cut through the front line salespeople and find the person at a company that might make a donation or discount software or hardware.

But purchasing software is actually the easiest part. Deploying it is the challenge: with little IT staff and less time to focus on how systems should be implemented, technology rollouts are often done on the fly. Where a for-profit might invest significant time up front analyzing the business processes that the system will address, evaluating products, and training staff, these steps are a hard sell in an understaffed environment where people always have at least five other things to prioritize.

Taking the NPTech Challenge

So if you are thinking of working at a nonprofit as an IT implementer (System Manager, IT Director, CIO), take heart: the work is rewarding, because the motivations are broader than just bringing home a paycheck. The people are nice, and most nonprofits recognize that, if they’re going to pay poorly, they should let people have their nights and weekends. There are opportunities to learn and be creative. The constrained environment rewards inventive solutions. If you’re a tech strategist, you can try things that a more risk-averse for-profit wouldn’t, as long as the risk you’re taking isn’t too costly. For example, I built a retail reporting data warehouse at a Goodwill in 2003, saving us about $100,000 on what it would have cost to buy a good reporting system. I also pitched a business plan and started up ecommerce there, and I don’t have a college degree. If money isn’t your motivation, but accomplishing things that make a difference in people’s lives does excite you, this is a fertile environment.

That said, if you don’t like to talk to people, and you don’t think that marketing should be part of your job, think twice. Successful technology implementations at nonprofits are done by people who know how to communicate. The soft skills matter even more than the tech skills, because you will likely be reporting to people who don’t understand what tech does. If you can’t justify your projects in terms that they’ll understand, they won’t consider funding them.

You should be as good at the big pictures as you are at the small ones. NPTech is all about fixing the broken routers while you configure the CRM and interpret the Google Analytics. You have to be good at juggling a lot of diverse tasks and projects, and conversant in multiple technologies.

Creativity trumps discipline. If you strictly follow the best ITIL policies and governance, be afraid. Strict adherence to for-profit standards requires staffing and budget that you aren’t likely to have. Good technology governance at nonprofits is a matter of setting priorities and making strategic compromises.

Collaboration and delegation are key. Nonprofits have a lot of cross-department functionality.  If you are all about IT controlling the systems, you’re going to have more work on your plate than you can handle and a frustrated user-base to work with.  Letting those who can do tech do tech — whether or not they have the credentials or report to you — is a key strategy towards getting it done.

NPTech is not just a job, it’s a community.

If some of what I’ve described above sounds discouraging, then know that the challenges are shared by a committed group of tech practitioners that is welcoming and relatively free of ego. Nobody has to take on the battle of improving nonprofit technology alone. Search the #nptech hashtag on Google or Twitter and you’ll find groups, blogs and individuals who see this challenge as a shared one for our sector. Make the time to check out an NTEN 501 Tech Club meeting in your city or, better yet, attend their annual conference. Read all of the articles at Idealware. Join the forums at Techsoup. If this work is for you, then you are one of many who support each other, and each other’s organizations’ missions, and we’ll help you along the way.

Pre-Post On What Is Nonprofit Technology

Early next week, I’m going to publish the “director’s cut” of my recent NTEN.ORG article, “What Is Nonprofit Technology”. But I wanted to talk about it a little first.

The story behind this article is that, late in 2014, I was approached by some online tech e-mag to write an article for them. I thought, why not tell all of the for-profit techies what it’s really like working in our sector? And I wrote a solid first draft. Then I started researching the magazine, and couldn’t find much. There was little in the way of a FAQ, so I couldn’t ascertain things like, “Who owns the content submitted?” I decided against publishing there. I sent it on to Amy at NTEN, and she came back with the suggestion that they publish it in March, shortly after the NTEN conference, as the March theme is Nonprofit Management. And we did that.

The article has gone over really well with the nonprofit community, and is still being actively shared and liked across social media platforms nine days in.  I’m really flattered.  I think the strengths of the article are that it, first, distills a lot of my thinking over the last ten years or so about what we, as nonprofit technologists, do, and what our challenges are. I’ve been drafting this article in my head for a long, long time. But I think it also benefits from the fact that I wrote it for a different audience — one that doesn’t know our sector and our challenges well. And I both think and hope that this is a large part of why the article is resonating so well with the community. This is something that you can share with people outside of the sector that explains a lot about us.

That’s my goal, at least — I hope it’s true.  And I hope that it’s useful for you, particularly if you have friends that you’re trying to recruit into the side that promotes social good.

The “director’s cut” story is simple. Steph at NTEN admitted that her edits were primarily focused on shortening the article in order to fit NTEN’s max post length.  She did a great job — there is no point that I wanted to make missing from the NTEN version. But there are a few areas where the grammar got a little confused. My rendering is more spacious, with a few more examples.  So I decided to print it as originally written and let you decide which one you prefer.

Career Reflections: My Biggest Data Fail

This article was published on the NTEN Blog in February of 2014. It originally appeared in the eBook “Collected Voices: Data-informed Nonprofits”.

Peter Campbell of Legal Services Corporation shares his biggest data fail, and what he’d do differently now.


Note: names and dates have been omitted to protect the innocent. 

Years ago, I was hired at an organization that had a major database that everyone hated. My research revealed a case study in itself: how not to roll out a data management system. Long story short, they had bought a system designed to support a different business model, and then paid integrators to customize it beyond recognition. The lofty goal was to have a system that would replace people talking to each other. And the project was championed by a department that would not have to do the data entry; the department tasked with all of that work clearly didn’t want the system.

The system suffered from a number of problems. It was designed to be the kitchen sink, with case info, board updates, contact management, calendaring, web content management, and other functions. The backend was terrible: a SQL database with tables named after the tabs in the user interface. The application itself had miserable search functionality, no dupe checking, and little in the way of data quality control. Finally, there were no organizational standards for data entry. Some people regularly updated information; others only went near it when nagged before reporting deadlines. One person’s idea of an update was three to five paragraphs; another’s two words.
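For readers who haven’t built one, “dupe checking” just means testing for a likely existing record before inserting a new one. A minimal sketch in Python follows; the schema and the matching rule are hypothetical, purely for illustration, not the actual system’s:

```python
import sqlite3

# Hypothetical schema and matching rule, for illustration only; the system
# described above had no check like this, so duplicates piled up unchecked.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

def add_contact(name: str, email: str) -> bool:
    """Insert a contact unless a likely duplicate already exists."""
    dupe = conn.execute(
        "SELECT id FROM contacts WHERE lower(email) = lower(?) OR lower(name) = lower(?)",
        (email, name),
    ).fetchone()
    if dupe:
        return False  # flag for human review rather than inserting blindly
    conn.execute("INSERT INTO contacts (name, email) VALUES (?, ?)", (name, email))
    return True

print(add_contact("Jane Smith", "jane@example.org"))   # True: new record
print(add_contact("JANE SMITH", "jane@example.org"))   # False: caught as a dupe
```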

I set out to replace it with something better. I believed (and will always believe) that we needed to build a custom application, not buy a commercial one and tweak it. What we did was not the same thing that the commercial systems were designed to track. But I did think we’d do better building it with consultants on a high-level platform than doing it by ourselves from scratch, so I proposed that we build a solution on Salesforce. The system had over 150 users, so this would be relatively expensive.

Timing is everything: I made my pitch the same week that financial news indicated that we were diving into a recession. Budgets were cut. Spending was frozen. And I was asked if I could build the system in Access instead. And this is when I…

…explained to my boss that we should table the project until we had the budget to support it.

Or so I wish. Instead, I dusted off my amateur programming skills and set out to build the system from scratch. I worked with a committee of people who knew the business needs, and I developed about 90% of a system that wasn’t attractive, but did what needed to be done reasonably well. The goals for the system were dramatically scaled back to simply what was required.

Then I requested time with the department managers to discuss data stewardship. I explained to the critical VP that my system, like the last one, would only be as good as the data put into it, so we needed to agree on the requirements for an update and the timeliness of the data entry. We needed buy-in that the system was needed, and that it would be properly maintained. Sadly, the VP didn’t believe that this was necessary, and refused to set aside time in any meeting to address it. Their take was that the new system would be better than the old one, so we should just start using it.

This was where I had failed. My next decision was probably a good one: I abandoned the project. While my system would have been easier to manage (due to the scaled-back functionality, a simple, logical database structure, and a UI that included auto-complete and dupe-checking), it was going to fail, too, because, as every techie knows, garbage in equals garbage out. I wanted my system to be a success. We went on with the flawed original system, and eventually started talking about a new replacement project, which might have happened, but I left the company.

Lessons learned:

  1. If I’m the IT Director, I can’t be the developer. There was a lot of fallout from my neglected duties.
  2. Get the organizational commitment to the project and data quality standards confirmed before you start development.
  3. Don’t compromise on a vision for expediency’s sake.  There are plenty of times when it’s okay to put in a quick fix for a problem, but major system development should be done right.  Timing is everything, and it wasn’t time to put in a data management system at this company.

How I Learned To Stop Worrying and Love The RFP

This article was originally posted on the NTEN Blog in January of 2014.

Requests for Proposals (RFPs) seem like they belong in the world of bureaucratic paperwork instead of a lean, tech-savvy nonprofit. But there’s a lot to be said for an RFP when both sides understand how useful a tool it can be – even to tech-savvy nonprofits.

Here’s a safe bet: preparing and/or receiving Requests for Proposals (RFPs) is not exactly your favorite thing. Too many RFPs seem like the type of anachronistic, bureaucratic paperwork more worthy of the company in Office Space than a lean, tech-savvy nonprofit. So you may wonder why I would pitch a 90-minute session on the topic for this year’s Nonprofit Technology Conference. I’d like to make the case for you to attend my session: Requests for Proposals: Making RFPs Work for Nonprofits and Vendors.

The problems with RFPs are numerous, and many of you have tales from the trenches that could fill a few horror anthologies. I’ll be the first to agree that they often end up doing more harm than good for a project. But I believe that this is due to a poor understanding of the purpose of the RFP, and a lack of expertise and creativity in designing them. What a successful RFP does is help a client assess the suitability of a product or service to their needs long before they invest more serious resources into the project. That’s very useful.

The mission of the RFP is two-fold: a well-written RFP will clearly describe the goals and needs of the organization/client and, at the same time, ask the proper questions that will allow the organization to vet the product or consultant’s ability to address those needs. Too often, we think that means that the RFP has to ask every question that will need to be asked and result in a detailed proposal with a project timeline and fixed price. But the situations where we know exactly, at the outset, what the new website, donor database, phone system or technology assessment will and should look like before the project has begun are pretty rare.

For a consultant, receiving an RFP for a web site project that specifies the number of pages, color scheme, section headings and font choices is a sign of serious trouble, because they know, from experience, that those choices will change. Pitching a fixed price for such a project can be dangerous, because as the web site is built, the client might find that they missed key components, or that the choices that they made were wrong. It does neither party any good to agree to terms that are based on unrealistic projections, and project priorities often change, particularly with tech projects that include a significant amount of customization.

So you might be nodding your head right now and saying, “Yeah, Campbell, that’s why we all hate those RFPs. Why use ’em?” To which I say, “Why write them in such a way that they’re bound to fail?”

The secret to successful RFP development is in knowing which questions you can ask that will help you identify the proper vendor or product. You don’t ask how often you’ll be seeing each other next spring on the first date. Why ask a vendor how many hours they project it will take them to design each custom object in your as yet un-designed Salesforce installation? Some information will be more relevant — and easier to quantify — as the relationship progresses.

At the RFP session, we’ll dive into the types of questions that can make your RFP a useful tool for establishing a healthy relationship with a vendor. We’ll learn about the RFPs that consultants and software vendors love to respond to.  We’ll make the case for building a critical relationship in a proactive and organized fashion.  And maybe, just maybe, we’ll all leave the session with a newfound appreciation for the much-maligned Request for Proposal.

Don’t miss Peter’s session at the 14NTC on Friday, March 14, 3:30pm -5:00pm.

Peter Campbell is a nonprofit technology professional, currently serving as Chief Information Officer at Legal Services Corporation, an independent nonprofit that promotes equal access to justice and provides grants to legal aid programs throughout the United States. Peter blogs and speaks regularly about technology tools and strategies that support the nonprofit community.

A Brief History of Nonprofit Technology Leadership, And a Call to Action for New Circuit Riders

This article was first published on the NTEN Blog in June of 2013.

When someone asked me, “What is the role of circuit riders today?” I didn’t have an immediate answer. But the question stuck with me, and I have an idea that I want to share, appropriately, with the NTEN community.

A month or two ago, a friend of mine asked me a great question: “What is the role of circuit riders today?” I didn’t have an immediate answer. But the question stuck with me, and I have an idea that I want to share, appropriately, with the NTEN community.

We speak a lot here about nonprofit technology, more affectionately known as “nptech.” The origins of nptech lie in the tradition of circuit riding. The circuit riders who founded NTEN were a loosely affiliated group of people who saw the need for technology at nonprofits before the nonprofits did. As with the Methodist ministers from whom they borrowed the “circuit rider” name, these people weren’t motivated by money, but by missions: the missions of the numerous nonprofits that they served.

The typical services that a circuit rider would provide included setting up basic PC networks, installing phone systems, and designing Access or FileMaker databases to replace paper donation records. While NPOs still need some help getting their basic technical plumbing in order, that work is now simpler and help is easier to find than it was in the ’90s. And anyone designing an Access database for an NPO today should be spanked!

In the early ’90s, we were hitting the turning point where PCs went from specialized systems to commodity equipment. Prior to that, a telephone was on every desk, but there wasn’t necessarily a computer. And, even if there was one there, it wasn’t turned on every day. Today you don’t even need a phone if you have a computer, a VoIP service, and a headset. So we hire people trained in setting up specific systems, or we pay a professional company, rather than relying on volunteers, because it’s more critical to get it right.

So what is the role of the circuit rider in a world where we hand the networking to tech integrators and subcontract database design to specialized Blackbaud and Salesforce consultants? By nature, the role of the New Circuit Rider should be short-term engagements that offer high value. It should capitalize on a technical skill set that isn’t readily available, and it should be informed by a thorough understanding of nonprofit needs.

It’s a type of technology leadership – maybe even a stewardship of technology leadership. I say that it starts with technology assessments. What small to mid-sized nonprofits need most is good advice about what to prioritize, what to budget for, how to staff IT, and how to support technology. The modern circuit riders’ legacy should be a track record of leaving their clients with a solid understanding of how to integrate technology staff, systems, and strategy into their work. There’s a great need for it.

Just this month I’ve heard stories of NPO leaders who have no idea how to title, compensate, or describe the duties of the IT leader that they know they need to hire; I’ve met newly promoted accidental techies charged with huge integration projects and no strategic plan in place; and I’ve seen a $15 million social services org scraping by with two full-time IT staff supporting their five-office enterprise. These organizations need some guidance and advice.

So I’m opening the floor for strategies as to how we build a New Circuit Rider Network to fill this immediate need, and I’m proposing we start helping nonprofits do more than invest in technology, that we help them plan for it and resource it proactively.

Peter Campbell is a nonprofit technology professional, currently serving as Chief Information Officer at Legal Services Corporation, an independent nonprofit that promotes equal access to justice and provides grants to legal aid programs throughout the United States. Peter blogs and speaks regularly about technology tools and strategies that support the nonprofit community.

Best Of 2012: Nonprofit Technology Grows Up

This article was first published on the NTEN Blog in December of 2012.

I think that the best thing that happened in 2012 was that some of the 2010-2011 “bleeding edge” conceptual technologies stood up and proved they weren’t fads.

When NTEN asked me to write a “best tech of 2012” post, I struggled a bit. I could tell you about the great new iPads and Nexus tablets; the rise of the really big phones; the ascendancy of Salesforce; and the boundary-breaking, non-gaming uses of Microsoft’s Kinect. These are all significant product developments, but I think that the David Pogues and Walter Mossbergs out there will have them covered.

I think that the best thing that happened in 2012 was that some of the 2010-2011 “bleeding edge” conceptual technologies stood up and proved they weren’t fads. These aren’t new topics for NTEN readers, but they’re significant.

Cloud computing is no longer as nebulous a thing as, say, an actual cloud. The question has moved somewhat soundly from “Should I move to the cloud?” to “Which cloud should I move to and when?” Between Microsoft’s Cloud Services, Google Apps, and a host of additional online suites, there’s a lot to choose from.

Similarly, virtualization is now the norm for server rooms, and the new frontier for desktops. The ultimate merger of business and cloud computing will be having your desktop in the cloud, loadable on your PC, laptop, tablet or smartphone, from anywhere that you have an internet connection. Key improvements in Microsoft’s latest server platforms support these technologies, and Citrix and VMware are still growing and innovating, as Amazon, Google, Rackspace and others improve the net storage systems where our desktops can be housed.

Social networks aren’t the primary fodder for late night comedians anymore. Maybe there are still people ridiculing Twitter, but they aren’t funny, particularly when every product and place on earth now has its own Facebook page and hashtag. I mean, hashtags were created by geeks like us, and now you see one superimposed on every TV show! I remember joining Facebook in 2007 and calling it “The Great Trivializer,” because the bulk of what I saw was my smart, committed NPTech friends asking me which five albums I would bring with me to a deserted island. Today, Facebook is a place where we communicate and plan. It’s grown in ways that make it a far more serious and useful tool. Mind you, some of that growth was spurred by adding Google+ features, which are more geared toward real conversation.

But the big winner in 2012 was data. It was the year of Nate Silver and the infographic. Nate (as countless post-election pundits have pointed out), via his fivethirtyeight blog at the New York Times, proved that properly analyzed data can predict the future. This is the power of aggregation: his perfect electoral college score was built on an aggregated analysis of multiple individual polls. I think this presents a clear challenge to nonprofits: you should keep doing your surveying, but for useful data on the demographics that fuel your mission, you need to partner with similar orgs and aggregate those results for more accurate analysis.
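Silver’s actual models are far more sophisticated, accounting for things like pollster house effects and time decay, but the core idea of aggregation can be sketched as a simple sample-size-weighted average (the numbers below are made up for the example):

```python
# Made-up numbers, purely to illustrate the principle: a sample-size-weighted
# average of several noisy polls is steadier than any single poll.
polls = [
    (51.0, 800),   # (candidate's share in %, poll sample size)
    (48.5, 1200),
    (52.5, 600),
    (49.8, 1000),
]

estimate = sum(share * n for share, n in polls) / sum(n for _, n in polls)
print(f"Aggregated estimate: {estimate:.1f}%")  # ~50.1%, though polls span 48.5-52.5
```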

Infographics make data poignant and digestible. They tell the stories behind the data in picture book format. Innovative storytellers have used videos, cartoons and comic books to make their points, but nothing is as succinct at telling a data-based story as an infographic. There should be one or more in your next annual report.

Peter starts as Chief Information Officer at Legal Services Corporation in January.

Virtualization: The Revolution in Server Management and Why You Should Adopt It

This article was co-written by Matt Eshleman of Community IT Innovators and first published on the NTEN Blog in June of 2009.


Peter Campbell, Earthjustice and Matthew Eshleman, Community IT Innovators

This year’s Nonprofit Technology Conference offered a good chance to discuss one of the most important — but geeky — developments in the world of computers and networks: server virtualization.

Targeting a highly technical session to an NTEN audience is kind of like cooking a gourmet meal with one entrée for 1000 randomly picked people. We knew our attendees would include as many people who were new to the concepts as there were tech-savvy types looking for tips on resolving cache conflicts between the SAN, DBMS and Hypervisor. We aimed to start very broad, focus on use cases, and leave the real tech stuff to the Q&A. We’ll try to do the same in this article.

We’ve already summarized the view from the top in a quick, ignite-style presentation, available wherever fine NTC materials are found (and also on Slideshare). In a nutshell, virtualization technology allows many computers to run concurrently on one server, each believing it’s the sole occupant. This allows for energy and cost savings, greater efficiency, and some astounding improvements in the manageability of your networks and backups, as servers can be cloned or dragged, dropped and copied, allowing for far less downtime when maintenance is required and easy access to test environments. It accomplishes this by making the communication between an operating system, like Windows or Linux, and the underlying hardware generic and hardware-independent.
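To make “many computers on one server” tangible, here’s a minimal sketch that wasn’t part of the session, assuming a Linux host running the KVM hypervisor with the libvirt Python bindings installed; it simply lists the guest servers sharing one physical box:

```python
# Minimal sketch, assuming a Linux host with KVM and the libvirt-python
# package installed; it lists the virtual servers sharing one physical box.
import libvirt

conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
for dom in conn.listAllDomains():
    state, max_mem_kb, _, vcpus, _ = dom.info()
    status = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {vcpus} vCPU(s), {max_mem_kb // 1024} MB, {status}")
conn.close()
```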

Most of the discussion related to virtualization has been centered on large data centers and enterprise implementations, but a small network can also take advantage of the benefits that virtualization has to offer. Here are three common scenarios:

  • Using a new server running a virtualization hypervisor to migrate an existing server
  • Using a new server to consolidate 3-4 physical servers to save on electric & warranty expenses
  • Using a storage area network (SAN) to add flexibility and expandability to the infrastructure

In the first scenario, an existing server is converted into a virtual server running on new physical hardware. Tools from VMware and other vendors allow disks to be resized, additional processor cores to be assigned and RAM to be added. The benefit of this process is that the physical server now exists on a new hardware platform with additional resources. End users are shielded from major disruptions, and IT staff are not required to make any changes to scripts or touch workstations.

The second scenario, much like the first, starts with the addition of new physical hardware to the network. Today’s servers are so powerful that it’s unlikely that more than 5% of their total processing power is used. That excess capacity allows an organization to use virtualization to lower their hardware expenses by consolidating multiple servers on one hardware platform. Ideal candidates are servers that run web & intranet applications, antivirus management, backup, directory services, or terminal services. Servers that do a lot of transactional processing, such as database & email servers, can also be virtualized, but require a more thoughtful network architecture.

The final scenario involves taking the first step toward a more traditional enterprise implementation, incorporating two physical servers connected to a SAN. In this scenario, the hardware resources continue to be abstracted from the virtual servers. The SAN provides much more flexibility in adding storage capacity and assigning it to the virtual servers as required. Adding multiple server heads onto the SAN will also provide the capacity to take advantage of advanced features such as High Availability, Live Server Migration, and Dynamic Resource Scheduling.

The space for virtualization software is highly competitive. Vendors such as Microsoft, VMware, Citrix and Virtual Iron continue to lower their prices or provide their virtualization software for free. Using no-cost software, an organization can comfortably run a virtual server environment of 16 virtual servers on 3 physical machines.

The session was followed by a healthy and engaging Q&A, and we were fortunate to have it all transcribed by the incredibly talented Jack Aponte. Scroll down to 10:12 in her NTC Live Blog for a full re-enactment of the session. We can also start a new Q&A, in comments, below.

And stay tuned for more! The biggest paradigm shift from virtualization is related to the process surrounding the backup and recovery of virtual servers. We’ll be writing an article for the November NTEN newsletter with some detailed scenarios related to backup & disaster recovery in the virtual environment.

The Five Best Tools For Quick And Effective Project Management

This article was first published on the NTEN Blog in March of 2011.

The keys to managing a successful project are buy-in and communication. Projects fail when all participants are on different pages. You want to use tools that your project participants can access easily, preferably ones they’re already using.

As an IT Director, co-workers, peers, and consultants frequently ask me, “Do you use Microsoft Project?” The answer to that question is a resounding no.

Then I elaborate with my true opinion of Project: it’s a great tool if you’re building a bridge or a luxury hotel. But my Project rule of thumb is, if the budget doesn’t justify a full-time employee to manage the Project plan (e.g., keep the plan updated, not manage the project, necessarily), then MS Project is overkill. Real world projects require far more agile and accessible tools.

The keys to managing a successful project are buy-in and communication. The people who run the organization need to support the project, and the people it is being planned for need to be expecting and anticipating the end result. Projects fail when all participants are on different pages: vague or different ideas of what the goals are; different levels of commitment; poor understanding of the deadlines; and poorly set expectations. Gantt charts are great marketing tools — senior executives never fail to be impressed by them — but they don’t tell the Facilities Coordinator in clear language that you need the facility booked by March 10th, or the designer that the web page has to be up by April 2nd.

You want to use tools that your project participants can access easily, preferably ones they’re already using. Here are five tools that are either free or you’ve already obtained, which, used together, will be far more effective than MS Project for the typical project at a small to mid-sized organization:

  • GanttProject. GanttProject is an open source, cross-platform project management tool. Think of it as MS Project lite. While the feature set includes identifying project resources, allocating time, tracking completion, etc., it excels at creating Gantt charts, which can then be used to promote and communicate about the project. People appreciate visual aids, and Gantt charts visually identify the key tasks, milestones and timeframes. I don’t recommend diving into the resource allocations and the like, as I think that’s the point where managing the project plan starts becoming more work than managing the project.
  • Your email app. It’s all about communication: setting expectations, managing expectations, reminding and checking on key contributors so that deadlines are met. Everyone already lives in their email, so you want to visit them where they live. Related tool: the telephone.
  • MeetingWizard, Doodle, etc. We might gripe about meetings, but email alone does not cut it. If you want people to understand what you’re trying to accomplish — and care — they need to see your face and hear the inflections in your voice when you tell them about it. By the same token, status updates, and working out schedules where one person’s work depends on others completing theirs, benefit greatly from face-to-face planning.
  • Excel (or any spreadsheet). Budgets, check off lists, inventory — a spreadsheet is a great tool for storing the project data. Worthy alternatives (and superior, because they’re multi-user): Sharepoint or Open Atrium.
  • Socialcast (or Yammer). Socialcast is Facebook for organizations. Share status, links, and files in a microblogging client. You can create categories and assign posts to them. The reasoning is the same as for email, and email might be your fallback if your co-workers won’t take to microblogging; but if they’re open to it, it’s a great way to keep a group of people easily informed.

It’s not that there aren’t other good ways to manage projects. Basecamp, or one of the many similar web apps might be a better fit, particularly if the project team is widely dispersed geographically. Sharepoint can replace a number of the tools listed here. But you don’t really have to spend a penny. You do need to plan, promote, and communicate.

Projects don’t fail because you’re not using capital “P” Project. They fail when there isn’t buy-in, shared understanding, and lots of interaction.

Peter Campbell is currently the Director of Information Technology at Earthjustice, a non-profit law firm dedicated to defending the earth. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo & Marin Counties, Inc. Peter has been managing technology for non-profits and law firms for over 20 years, and has a broad knowledge of systems, email and the web. In 2003, he won a “Top Technology Innovator” award from InfoWorld for developing a retail reporting system for Goodwill thrift. Peter’s focus is on advancing communication, collaboration and efficiency through creative use of the web and other technology platforms.