Tag Archives: data

Talking Databases For A Change

NTEN’s new issue of Change is out and I got a chance to sound off to Idealware’s Chris Bernard about the dream of “one database to rule them all” — doing all of your organization’s Constituent Relationship Management (CRM) in a single system. My interview is on page 22, but the whole issue is a dream for NPOs struggling to wrangle information.

Suggestion: use a big monitor to view this. Change is a great magazine, but the Bluetoad viewer is somewhat tough to use on small screens.

NTEN Change, Issue 4

Tech Tips From The Nonprofit Technology Conference

This article was first published on the Idealware Blog in May of 2010.

Last month, I reported on the first annual Tech Track, a series of sessions presented at the April, 2010 Nonprofit Technology Conference. In that post I listed the topics covered in the five session track. Today I want to discuss some of the answers that the group came up with.

Session 1: Working Without a Wire

This session covered wireless technologies, from cell phones to laptops. Some conclusions:

The state of wireless is still not 100%, but it’s better than it was last year and it’s still improving. Major metropolitan areas are well covered; remote areas (like Wyoming) are not. There are alternatives, such as satellite, but that still requires that your location be in unobstructed satellite range. All in all, we can’t assume that wireless access is a given, and the challenge is more about managing staff expectations than installing all of the wireless ourselves. It will get there.
Wireless security options are improving. Virtual Private Networks (VPNs) and remote access solutions (such as Citrix, VNC and Terminal Services) are being provided for more devices and platforms, and the major smartphone companies are supporting enterprise features like remote device wipes.
Policy-wise, more orgs are moving to a model where staff buy their own smartphones and the companies reimburse a portion of the bill to cover business use. Some companies set strict password policies for accessing office content; others don’t.

Session 2: Proper Plumbing

This session was pitched as covering virtualization and other server room technologies, but when we quizzed the participants, virtualization was at the top of their list, so that’s what we focused on.

We established that virtualizing servers is a recommended practice. If you have a consultant recommending it and you don’t trust the recommendation, find another consultant and have them virtualize your systems; the recommendation is a good one, and it’s a problem that you don’t trust your consultant!
The benefits of virtualization are numerous: reduced budgets, reduced carbon footprints, instant testing environments, and 24/7 availability (upgrading a copy of a server and then swapping it back into production live is an advanced virtualization feature).
There’s no need to rush it — it’s easier on the budget and the staff, as well as the environment, to replace standalone servers with virtualized ones as the hardware fails.
On the planning side, bigger networks do better by moving all of their data to a Storage Area Network (SAN) before virtualizing. This allows for even more flexibility and reduced costs, as servers are strictly operating systems with software and data is stored on fast, redundant disk arrays that can be accessed by any server, virtual or otherwise.

Session 3: Earth to Cloud

The cloud computing session focused a lot on comparisons. While the general concern is that hosting data with a third party is risky, is it any more risky than hosting it on our own systems? Which approach is more expensive? Which affords the most freedom to work with our data and integrate systems? How do we manage disaster recovery and business continuity in each scenario?

Security – Everyone is hackable, and Google and Salesforce have a lot more expertise in securing data systems than we do. So, from a “is your data safe?” perspective, it’s at least a wash. But if you have sensitive client data that needs to be protected from subpoenas, as well as or more than from hackers, then you might be safer hosting your own systems.
Cost – We had no final answers; it will vary from vendor to vendor. But the cost calculation needs to figure in more than dollars spent — staff time managing systems is another big expense of technology.
Integration and Data Management – Systems don’t have to be in the same room to be integrated; they have to have robust APIs. And internal systems can be just as locked down as external ones if your contract with the vendor doesn’t give you full access and control over your data. This, again, was a wash.
Risk Management – There’s a definite risk involved if your outsourced host goes out of business. But there are advantages to being hosted, as many providers offer multiply-redundant systems. Google, in particular, writes every save on a Google Doc or GMail to two separate server farms on two different continents.
It all boils down to assessing the maturity of the vendors and negotiating contracts carefully, to cover all of the risks. Don’t sign up with the guy who hosts his servers from his basement; and have a detailed continuity plan in place should the vendor close up shop.
If you’re a small org (15 staff or fewer), it’s almost a no-brainer that it will be more cost-effective and safer to host your email and data in the cloud, as opposed to running your own complex CRMs and Exchange servers. If you’re a large org, it might be much more complex, as larger enterprise apps sometimes depend on that Exchange server being in place. But, all in all, cloud computing is a viable option that might be a good fit for you — check it out, thoroughly.
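On the integration point: “robust APIs” is doing a lot of work in that sentence, so here’s a minimal sketch of what pulling constituent records from a hosted vendor’s REST API might look like. The endpoint, auth scheme, and field names here are all hypothetical, not any real vendor’s API:

```python
import json
import urllib.request

API_BASE = "https://api.example-crm.org/v1"  # hypothetical hosted-CRM endpoint


def parse_constituents(payload: str) -> list[dict]:
    """Extract name/email pairs from a (hypothetical) CRM API response."""
    records = json.loads(payload)["constituents"]
    return [{"name": r["name"], "email": r["email"]} for r in records]


def fetch_constituents(token: str) -> list[dict]:
    """Fetch constituents over HTTPS; the bearer-token header is an assumption."""
    req = urllib.request.Request(
        f"{API_BASE}/constituents",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_constituents(resp.read().decode())


# Offline demonstration with a canned response:
sample = '{"constituents": [{"name": "A. Donor", "email": "a@example.org"}]}'
print(parse_constituents(sample))
```

The point of the session discussion holds either way: whether the system answering that request sits in your server room or in a vendor’s data center matters far less than whether the API (and your contract) gives you full access to your data.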

I’ll finish this thread up with one more post on budgeting and change management in the next few weeks.

Blog Policy on Recent Racist Comments

This blog doesn’t get a ton of comments – the most active posts tend to be the ones leading up to this week’s Nonprofit Technology Conference. But I’ve been getting a bunch lately that I’ve decided not to post, as comments, at least. So this is to clarify the comment policy, and respond to some borderline conversational/offensive comments left in the last day or so.

Comments are moderated here, mainly in order to weed out the obvious spam that slips through my Akismet filter on occasion.  I don’t publish spam or link spam, so if you’re one of the people leaving innocuous comments about my writing style, note that I don’t believe that you’re sincere, and I won’t publish your link to your viagra site.

But the comments I received this week aren’t spam. Instead, they appear to be the work of someone looking to provoke me. They’re in reply to my post “The Offensive Bardwell Defense”, in which I spoke about segregation, my marriage, and the legal battle underway to allow same sex marriage. The first message was easy to ignore, because it was pure vitriol, equating my interracial marriage with numerous controversial sex acts. The writer, one “DMTS” of gmail, followed that up with a more measured comment that, while continuing to make personal comments about my marital status, argued that, while it’s fine for me to “hook up” with people of non-white ancestry, I have no right to blog about it. “Don’t ask, don’t tell”, as it were. The full comment went:

“Peter Campbells marriage (if still intact) is just an exception to the way things really work in mixed marriages. I don’t want to deny him any success or happiness with his nice wife and child pictured (great pic btw), but he does not have any rights defending something that is clearly wrong for the majority, when he is in the minority of working mixed marriages(for now). If I hook up with a different race partner, I will just do it, and not advertise it as normal, or make a big deal and use someones legit comment as a scapegoat. WHO CARES ANYWAY PETER? no one is making laws that specify you can’t hook up with dreadlocks, beehives, or skinheads, so what are you worried about? when has anyone persecuted mixed racials? sounds to me you are looking to MAKE TROUBLE by drawing sympathy to yourself that is totally unjustified. Blog about something else that is important, like what your son is planning to do with his future, to help make this a better world without blog script shills making trouble for all races. Shalom”

I’d point out two things to Mr. (I presume) DMTS. The first is that, while he can suggest that my marriage is some kind of exception to the rule, I’m not aware of any evidence that it is.  Divorce is rampant in this country, but I’ve never seen a statistic that suggests that it’s higher among interracial couples than same race. Mr. Bardwell didn’t cite any statistics for his assumptions, either.

The second thing I’d point out is that DMTS completely missed my point. I used my interracial marriage, and interracial marriage in general, to point out that the same sex marriage debate underway in this country is a parallel, and that, as with interracial marriage in the 60’s, the bigots, among whom I assume DMTS counts himself, are going to lose the battle. He seems to have skimmed my message and missed my conclusion that this type of bigotry — be it about race or sexual orientation — will be overcome. It’s a slow process. It clearly still exists, as DMTS chooses to illustrate. But, today, his attitudes and comments are sad. In 30 years’ time, they’ll be outrageous. Racism, and hatred and bigotry based on assumptions about race (or race relations), are on the wane. Interracial marriage is now accepted in the U.S. It’s a slower course for a lot of the institutionalized racism in our schools and justice system. But most of the vitriol comes from old, white men, and two trends are clear: whites as a percentage of our population are shrinking, and old people will die sooner than the more enlightened young ones.

As to publishing comments like this: I’m interested in dialogue, and if DMTS responds to this without language that I wouldn’t want my Mom (who reads this blog) to see, I’ll certainly approve it. If he provides some backing for his unverified claim that interracial (“mixed” is an offensive term) marriages are at higher risk of failure than same race marriages, a claim that I find very suspect and unlikely, I might even reply. But if DMTS actually isn’t invested in his arguments, and is just trying to get a rise out of me, it only takes a second to mark a comment as spam. And rude, unconstructive conversation, like DMTS’s first message, which I will not publish, is spam here; that’s the policy.

The Ethnic Check

Yesterday I received a letter from the State of California alerting me that my Census form is due next week and that I should be sure to fill it out and return it, as is decidedly my intention. That form will include the page that drives many Americans crazy — the one that offers you a bunch of ethnic backgrounds that you can identify yourself on. As my spouse of African-Cherokee-Jamaican-German and who knows what else descent says, this is not a multiple choice question for many of us. Personally, I always check the “white” box, which is not lying, although I always have a nagging doubt that the Semitic parts of my genetic makeup aren’t fairly represented by that choice.

Today, skimming through my news feed, I starred this article by Michelle Malkin, passed on by Google Reader’s “Cool” feed, and I just found time to read it. The gist of the article is that Census filler-outers should refrain from allowing the government to peg us by ethnicity, instead choosing “Other” and filling in the comment squares with “American”. Take that, Gubmint statisticians!

Now, this is interesting, because while Ms. Malkin proudly describes herself as a Fox News Commentator, I don’t think this question lands on a liberal/conservative scale. Discomfort with being pegged by race straddles all ideological outposts, as it should. But data is data, and the ethnic makeup of our country by geographic area is a powerful set of data. If we don’t know that a neighborhood is primarily Asian, White, Black or Hispanic, we don’t know if the schools are largely segregated. We don’t know if the auto insurance rates are being assessed with a racial bias. We don’t know if elected officials are representative of the districts they serve. And these are all very important things to know.

It might seem that, by eschewing all data about race, we can consider ourselves above racism. But we can board our windows and doors and dream that the world outside is made of candy, too. It won’t make the world any sweeter. If we don’t have any facts about the ethnic makeup and the conditions of people in this country, then we can’t discuss racial justice and equality in any meaningful fashion. We might hate to take something as personal as the genetic, geographic path that brought us to this country and made us the unique individuals that we are and dissect it, analyze it, generalize about it and draw broad conclusions. It is uncomfortable and, in a way, demeaning. But it’s not as uncomfortable and demeaning as being broadly discriminated against. And without evidence of abuse, and of progress, we can’t end discrimination. We can only board up the windows that display it.

So, I’m not going to take Ms. Malkin’s advice on this one, and I’m going to urge my multi-racial wife and kid to be as honest as they can with the choices provided to them. Because we want the government to make decisions based on facts and data, not idealizations, even if it means being a little blasé about who we really are.

Won’t You Let Me Take You On A Sea Change?

This post was originally published on the Idealware Blog in December of 2009.


Last week, I reported that nonprofit assessors like Charity Navigator and Guidestar will be moving to a model of judging effectiveness (as opposed to thriftiness). The title of my post drew some criticism. People far more knowledgeable than I am on these topics questioned my description of this as a “sea change”, and I certainly get their point. Sure, the intention to do a fair job of judging nonprofits is sincere; but the task is daunting. As with many such efforts, we might well wind up with something that isn’t a sea change at all, but, rather, a modified version of what we have today that includes some info about mission effectiveness but still boils down to a financial assessment.

Why would this happen? Simple. Because metrics are numbers: ratios, averages, totals. It’s easy to make metrics from financial data.  It’s very difficult to make them out of less quantifiable things, such as measuring how successfully one organization changed the world; protected the planet; or stopped the spread of a deadly disease.
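To make that contrast concrete: the financial side really is one line of arithmetic. A sketch of the classic overhead metric (the figures below are invented, not from any real 990):

```python
# Hypothetical figures, as might appear on a Form 990. The point is how
# trivially financial data reduces to a single, comparable metric --
# unlike "did we change the world?", which doesn't.
program_expenses = 850_000
admin_expenses = 90_000
fundraising_expenses = 60_000

total = program_expenses + admin_expenses + fundraising_expenses
overhead_ratio = (admin_expenses + fundraising_expenses) / total
print(f"Overhead: {overhead_ratio:.1%}")  # prints "Overhead: 15.0%"
```

That one number is easy to compute, easy to rank, and, as the rest of this post argues, easy to mistake for a measure of effectiveness.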

I used to work for an org whose mission was to end poverty in the San Francisco Bay Area. And, sure enough, at the time, poverty was becoming far less prevalent in San Francisco. So could we be judged as successful?  Could we grab the 2005 versus 2000 poverty statistics and claim the advances as our outcomes? Of course not. The reduction in poverty had far more to do with gentrification during the dotcom and real estate booms than our efforts.  Poverty wasn’t reduced at all; it was just displaced. And our mission wasn’t to move all of the urban poor to the suburbs; it was to bring them out of poverty.

So the announcement that our ratings will now factor in mission effectiveness and outcomes could herald something worse than we have today. The dangerous scenario goes like this:

  • Charity Navigator, Guidestar, et al, determine what additional info they need to request from nonprofits in order to measure outcomes.
  • They make that a requirement; nonprofits now have to jump through those hoops.
  • The data they collect is far too generalized and subjective to mean much; they draw conclusions anyway, based more on how easy it is to call something a metric than how accurate or valuable that metric is.
  • NPOs now have more reporting requirements and no better representation.

So, my amended title: “We Need A Sea Change In The Way That Our Organizations Are Assessed”.

I’m harping on this topic because I consider it a call to action; a chance to make sure that this self-assessment by the assessors is an opportunity for us, not a threat. We have to get the right people at the table to develop standardized outcome measurements that the assessing organizations can use.  They can’t develop these by themselves. And we need to use our influence in the nonprofit software development community to make sure that NPOs have software that can generate these reports.

The good news? Holly Ross of NTEN got right back to me with some ideas on how to get both of these actions going.  That’s a powerful start. We’ll need the whole community in on this.

Get Ready For A Sea Change In Nonprofit Assessment Metrics

This post was originally published on the Idealware Blog in December of 2009.


Last week, GuideStar, Charity Navigator, and three other nonprofit assessment and reporting organizations made a huge announcement: the metrics that they track are about to change.  Instead of scoring organizations on an “overhead bad!” scale, they will scrap the traditional metrics and replace them with ones that measure an organization’s effectiveness.

The new metrics will assess:

  • Financial health and sustainability;
  • Accountability, governance and transparency; and
  • Outcomes.

This is very good news. That overhead metric has hamstrung serious efforts to do bold things and have higher impact. An assessment that is based solely on annualized budgetary efficiency precludes many options to make long-term investments in major strategies. For most nonprofits, taking a year to staff up and prepare for a major initiative would generate a poor Charity Navigator score, one that is prominently displayed to potential donors.

Assuming that these new metrics will be more tolerant of varying operational approaches and philosophies, justified by the outcomes, this will give organizations a chance to be recognized for their work, as opposed to their cost-cutting talents.  But it puts a burden on those same organizations to effectively represent that work.  I’ve blogged before (and will blog again) on our need to improve our outcome reporting and benchmark with our peers.  Now, there’s a very real danger that neglecting to represent your success stories with proper data will threaten your ability to muster financial support.  You don’t want to be great at what you do, but have no way to show it.

More to the point, the metrics that value social organizational effectiveness need to be developed by a broad community, not a small group or segment of that community. The move by Charity Navigator and their peers is bold, but it’s also complicated.  Nonprofit effectiveness is a subjective thing. When I worked for a workforce development agency, we had big questions about whether our mission was served by placing a client in a job, or if that wasn’t an outcome as much as an output, and the real metric was tied to the individual’s long-term sustainability and recovery from the conditions that had put them in poverty.

Certainly, a donor, a watchdog, a funder, a nonprofit executive and a nonprofit client are all going to value the work of a nonprofit differently. Whose interests will be represented in these valuations?

So here’s what’s clear to me:

– Developing standardized metrics, with broad input from the entire community, will benefit everyone.

– Determining what those metrics are and should be will require improvements in data management and reporting systems. It’s a bit of a chicken-and-egg problem, as collecting the data is a precondition for determining how to assess it, but standardizing the data will assist in developing the data systems.

– We have to share our outcomes and compare them in order to develop actual standards.  And there are real opportunities available to us if we do compare our methodologies and results.

This isn’t easy. It will require that NPOs who have never had the wherewithal to invest in technology systems to assess performance do so. But, I maintain, if the world is going to start rating your effectiveness on more than the 990, that’s a threat that you need to turn into an opportunity. You can’t afford not to.

And I look to my nptech community, including Idealware, NTEN, Techsoup, Aspiration and many others — the associations, formal, informal, incorporated or not, who advocate for and support technology in the nonprofit sector — to lead this effort.  We have the data systems expertise and the aligned missions to lead the project of defining shared outcome metrics.  We’re looking into having initial sessions on this topic at the 2010 Nonprofit Technology Conference.

As the world starts holding nonprofits up to higher standards, we need a common language that describes those standards. It hasn’t been written yet. Without it, we’ll trade the limited Form 990 assessments for something that might equally fail to reflect our best efforts and outcomes.

The Idealware Research Fund

Fans of this blog are likely fans of the other site I blog at, Idealware. So you already know that Idealware offers a rich, valuable service to the nonprofit community with its reports, webinars, trainings and programs that help nonprofits make smart decisions about software. One of the big challenges that Idealware faces is maintaining a high level of independence in its reporting. If your goal is to be the Consumer Reports of nonprofit software, and you need funding in order to do that, you also need to be very careful about how you receive that funding, in order to make sure that no bias creeps through to your reporting. Laura Quinn, Idealware’s founder and primary force, has come up with a few clever models for eliminating such bias, but today she launched a more sustainable approach to funding that will greatly simplify the process.

The Idealware Research Fund will provide basic, pooled funding for the great work that Idealware does, keeping it independent, unbiased, and resourced to provide the critical insight that smooths the stormy waters when we embark on big and small technology projects. The fund was kicked off today with a goal of raising $15,000 by December 31st.  Please let people know about Idealware’s work and this opportunity to support them, and consider supporting them yourself, if you can afford to.

Note that my self-interest is minimal here.  I’m an unpaid, volunteer blogger at Idealware and will remain such.  I have been paid (via Techsoup) for a couple of articles I’ve written.  But my support and pitch here is based solely on my belief that Idealware does great, effective work and needs our support.

Why Geeks (like Me) Promote Transparency

This post was originally published on the Idealware Blog in November of 2009.
Public Domain image by Takada

Last week, I shared a lengthy piece that could be summed up as:

“in a world where everyone can broadcast anything, there is no privacy, so transparency is your best defense.”

(Mind you, we’d be dropping a number of nuanced points to do that!)

Transparency, it turns out, has been a bit of a meme in nonprofit blogging circles lately. I was particularly excited by this post by Marnie Webb, one of the many CEOs at the uber-resource provider and support organization Techsoup Global.

Marnie makes a series of points:

Meaningful shared data, like the miles-per-gallon ratings on new car stickers or the calorie counts on food packaging, helps us make better choices;

But not all data is as easy to interpret;

Nonprofits have continually been challenged to quantify the conditions that their missions address;

Shared knowledge and metrics will facilitate far better dialog and solutions than our individual efforts have;

The web is a great vehicle for sharing, analyzing and reporting on data;

Therefore, the nonprofit sector should start defining and adopting common data formats that support shared analysis and reporting.
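To make the common-data-format idea concrete, here’s a minimal sketch of what a shared outcome record might enable. The field names are invented for illustration; no such sector standard exists yet, which is exactly Marnie’s point:

```python
import json

# A hypothetical shared outcome record. Field names are invented for
# illustration, not drawn from any actual sector standard.
outcome_record = {
    "org_ein": "00-0000000",
    "program": "job-training",
    "period": "2009-Q4",
    "metric": "participants_placed",
    "value": 42,
}

# A second org reporting in the same format:
records = [
    outcome_record,
    {**outcome_record, "org_ein": "11-1111111", "value": 35},
]

# Once the format is shared, cross-organization aggregation is trivial:
total = sum(r["value"] for r in records if r["metric"] == "participants_placed")
print(json.dumps({"metric": "participants_placed", "sector_total": total}))
```

The code itself is the easy part; the hard part, as the rest of this post argues, is getting a whole sector to agree on what the fields and metrics should be.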

I’ve made the case before for shared outcomes reporting, which is a big piece of this. Sharing and transparency aren’t traditional approaches to our work. Historically, we’ve siloed our efforts, even to the point where membership-based organizations are guarded about sharing with other members.

The reason that technologists like Marnie and I end up jumping on this bandwagon is that the tech industry has modeled the dysfunction of a siloed approach better than most. Early computing was an exercise in cognitive dissonance. If you regularly used Lotus 123, Wordperfect and dBase (three of the most popular business applications circa 1989) on your MS-DOS PC, then “/“, F7 and “.” were the keystrokes you needed to know in order to close those applications, respectively. For most of my career, I stuck with PCs for home use because I needed compatibility with work, and the Mac operating system, prior to OSX, just couldn’t easily provide that.

The tech industry has slowly and painfully progressed towards a model that competes on the sales and services level, but cooperates on the platform side. Applications, across manufacturers and computing platforms, function with similar menus and command sequences. Data formats are more commonly shared. Options are available for saving in popular, often competitive formats (as in Word’s “Save As” offering Wordperfect and Lotus formats). The underlying protocols that fuel modern operating systems and applications are far more standardized. Windows, Linux and MacOS all use the same technologies to manage users and directories, network systems and communicate with the world. Microsoft, Google, Apple and others in the software world are embracing open standards and interoperability. This makes me, the customer, much less of an innocent bystander who is constantly sniped by their competitive strategies.

So how does this translate to our social service, advocacy and educational organizations? Far too often, we frame cooperation as the antithesis of competition. That’s a common, but crippling, mistake. The two can and do coexist in almost every corner of our lives. We need to adopt a “rising tide” philosophy that values the work that we can all do together over the work that we do alone, and have some faith that the sustainable model is an open, collaborative one. That means looking at each opportunity to collaborate from the perspective of how it will enhance our ability to accomplish our public-serving goals, and trusting that this won’t result in the similarly-focused NGO down the street siphoning off our grants or constituents.

As Marnie is proposing, we need to start discussing and developing data standards that will enable us to interoperate on the level where we can articulate and quantify the needs that our mission-focused organizations address. By jointly assessing and learning from the wealth of information that we, as a community of practice, collect, we can be far more effective. We need to use that data to determine our key strategies and best practices. And we have to understand that, as long as we’re treating information as competitive data, as long as we’re keeping it close to our vests and looking at our peers as strictly competitors, the fallout of this cold war is landing on the people that we’re trying to serve. We owe it to them to be better stewards of the information that lifts them out of their disadvantaged conditions.

Security and Privacy in a Web 2.0 World

This post originally appeared on the Idealware Blog in November of 2009.
A Tweet from Beth

Yes, we do Twitter requests!

To break down that tweet a bit, kanter is the well-known Beth Kanter of Beth’s blog. pearlbear is former Idealware blogger and current contributor Michelle Murrain, and Beth asked us, in the referenced blog post, to dive a bit into internet security and how it contrasts with internet privacy concerns. Michelle’s response offers excellent and concise definitions of security and privacy as they apply to the web, and then sums up with a key distinction: security is a set of tools for protecting systems and information. The sensitivity of that data (and need for privacy) is a matter of policy. So the next question is, once you have your security systems and policies in place, what happens when the policies are breached?

Craft a Policy that Minimizes Violations

Social media is casual media. The Web 2.0 approach is to present a true face to the world, one that interacts with the public and allows for individuals, with individual tastes and opinions, to share organizational information online. So a strict rule book and mandated wording for your talking points are not going to work.

Your online constituents expect your staff to have a shared understanding of your organization’s mission and objectives. But they also expect the CEO, the Marketing Assistant and the volunteer Receptionists to have real names (and real pictures on their profiles); their own online voices; and interests they share that go beyond the corporate script. The risk isn’t really that staff will venture too far off message; in fact, staying too close to the prepared scripts can be as much of a problem. The tone that works is that of a human being sharing their commitment and excitement about the work that they (and you) do.

Expect that the message will reflect individual interpretations and biases. Manage the messaging to the key points, and make clear the areas that shouldn’t be discussed in public. Monitor the discussion, and proactively mentor (as opposed to chastising) staff who stray in ways that violate the policy, or seem capable of doing so.

The Case for Transparency

Transparency assumes that multiple voices are being heard; that honest opinions are being shared, and that organizations aren’t sweeping the negative issues under the virtual rug. Admittedly, it’s a scary idea that your staff, your constituents, and your clients should all be free to represent you. The best practice of corporate communications, for many years, was to run all messaging through Marketing/Communications experts and tightly control what was said. I see two big reasons for doing otherwise:

  • We no longer have a controlled media.

Controlled messaging worked when opening your own TV or Radio Station was prohibitively expensive. Today, YouTube, Yelp and Video Blogs are TV Stations. Twitter and Facebook Status are radio stations. The investment cost to speak your mind to a public audience has just about vanished.

  • We make more mistakes by under-communicating than we do by over-communicating.

Is the importance of hiding something worth the cost of looking like you have something to hide? At the peak of the dot com boom, I hired someone onto my staff at about $10k more (annually) than current staff in similar roles were making. An HR clerk accidentally sent the offer letter to my entire staff. The fallout was that I had meaningful talks about compensation with each of my staff; made them aware that they were getting market (or better) in a rapidly changing market, and that we were keeping pace on anniversary dates. Prior to the breach, a few of my staff had been wrongly convinced that they were underpaid in their positions. The incident only strengthened the trust between us.

The Good, the Bad, and the Messenger

Your blog should allow comments, and — short of spam, personal attacks and incivility — shouldn’t be censored. A few years ago, a former employee of my (former) org managed to register the .com extension of our domain name and put up a web site criticizing us. While the site didn’t get a lot of hits, he did manage to find other departed staff with axes to grind, and his online forum was about a 50-50 mix of people trashing us and others defending. After about a month, he went in and deleted the 50% of forum messages that spoke up for our organization, leaving the now one-sided, negative conversation intact. And that was the end of his forum; nobody ever posted there again.

There were some interesting lessons here for us. He had a lot of inside knowledge that he shared, with no concern for or allegiance to our policy. And he was motivated and well-resourced to use the web to attack us. But, in the end, we didn’t see any negative impact on our organization. The truth was, it was easy to separate his bias from his “inside scoops”, and hard to paint us in a very negative light, because the skeletons that he let out of our closet were a lot like anybody else’s.

What this proves is that the messenger matters as much as the message. Good and bad tweets and blog posts about your organization will be weighed by the position and credibility of the tweeter or blogger.

Transparency and Constituent Data Breaches

Two years ago, a number of nonprofits were faced with a difficult decision when a popular hosted eCRM service was compromised, and account information for donors was stolen by one or more hackers. Thankfully, this wasn’t credit card information, but it included login details, and I’m sure that we all know people who use the same password for their online giving as they do for other web sites, such as, perhaps, their online banking. This was a serious breach, and there was a certain amount of disclosure from the nonprofits to their constituents that was mandated.

Strident voices in the community called for full disclosure, urging affected nonprofits to put a warning on the home pages of their web sites. Many of the organizations settled for alerting every donor who was potentially compromised via phone and/or email, determining that their unaffected constituents might not be clear on how the breach happened or what the risks were, and would simply take the home page warning as a suggestion not to donate online.

To frame this as a black and white issue, demanding that it be treated with no discretion, is extreme. The seriousness and threat that resulted from this particular breach was not a simple thing to quantify or explain. So it boils down to a number of factors:

  • Scope: If all or most of your supporters are at risk, or the number at risk is in the six-figure range, it’s probably more responsible, in the name of protecting them, to broadcast the alert widely. If, as in the case above, those impacted are only the ones who donate online, then that’s probably not close to the number that would fully warrant broad disclosure, as even the strident voices pointed out.
  • Risk: Will your constituents understand that the notice is informational, and not an admission of guilt or irresponsibility in handling their sensitive data? Alternatively, if this becomes public knowledge, would your lack of transparency look like an admission of guilt? You should be comfortable with your decision, and able to explain it.
  • Consistency: Some nonprofits have more responsibility to model transparency than others. If the Sunlight Foundation was one of the organizations impacted, it’s a no-brainer. Salvation Army? Transparency isn’t referenced on their “Positions” page.
  • Courtesy: Some constituencies are more savvy about this type of thing than others. If the affected constituents have all been notified, and they represent a small portion of the donor base, it’s questionable whether scaring your supporters in the name of openness is really warranted.

Since alternate exposure, in the press or community, is likely to occur, the priority is to have a consistent policy about how and when you broadcast information about security breaches. Denying in any public forum that something happened would be irresponsible and unethical, and would most likely come right back at you. Not being able to explain why you chose not to publicize it on your website could also have damaging consequences. Erring on the side of alerting and protecting those impacted by security breaches is the better way to go, but the final choice has to weigh all of the risks and factors.

Conclusion

All of my examples assume you’re doing the right things. You have justifiable reasons for doing things that might be considered provocative. Your overall efforts are mission-focused. And the reasons for privacy regarding certain information are that it needs to be private (client medical records, for example); it supports your mission-based objectives by being private, and/or it respects the privacy of people close to the information.

No matter how well we protect our data, the walls are much thinner than they used to be. An unfortunate tweet can “go viral”. We can’t put a lock on our information that will truly secure it. So it’s important to manage communications with an understanding that information will be shared. Protect your overall reputation, and don’t sweat the minor slips that reveal, mostly, that you’re not a paragon of perfection but a group of human beings, struggling to make a difference under the usual conditions.

How and Why RSS is Alive and Well

This post was first published on the Idealware Blog in September of 2009.

[Image: RSS logo. Credit: SRD]

RSS, one of my favorite protocols, has been taking a beating in the blogosphere. Steve Gillmor, in his blog TechCrunchIT, declared it dead in May, and many others have followed suit.

Did Twitter Kill it?

The popular theory is that, with social networks like Twitter and Facebook serving as link referral tools, there’s no need to set up and look at feeds in a reader anymore. And I agree that many people will forgo RSS in favor of the links that their friends and mentors tweet and share. But this is kind of like saying that, if more people shop at farmers’ markets than supermarkets, we will no longer need trucks. Dave Winer, arguably the founder of RSS, and our friends at ReadWriteWeb have leapt to RSS’s defense with similar points. Winer puts it best, saying:

“These protocols…are so deeply ingrained in the infrastructure they become part of the fabric of the Internet. They don’t die, they don’t rest in peace.”

My arguments for the defense:

1. RSS is, and always has been, about taking control of the information you peruse. Instead of searching, browsing, and otherwise separating a little wheat from a load of chaff, you use RSS to subscribe to the content that you have vetted as pertinent to your interests and needs. While that might cross over a bit with what your friends want to share on Facebook, it’s you determining the importance, not your friends. For those of us who use the internet for research, brand monitoring, or other explicit purposes, a good RSS reader still offers the best productivity boost out there.

2. Where do you think your friends get those links? It’s highly likely that most of them — before the retweets and the sharing — grabbed them from an RSS feed. I post links on Twitter and Facebook, and I get most of them from my Google Reader flow.

3. It’s not the water, it’s the pipe. The majority of those links referred by Twitter are fed into Twitter via RSS. Twitterfeed, the most popular tool for feeding RSS data to Twitter, boasts about half a million feeds. Facebook, Friendfeed and their ilk all allow importing from RSS sources to profiles.
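To make the “pipe” concrete: the plumbing behind tools like Twitterfeed is just reading `<item>` entries out of an RSS 2.0 document and passing the titles and links along. Here’s a minimal sketch using only Python’s standard library; the feed XML and URLs are invented for illustration.

```python
import xml.etree.ElementTree as ET

# An invented RSS 2.0 document, inlined so the sketch is self-contained.
RSS_SAMPLE = """<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item>
      <title>First post</title>
      <link>http://example.org/first</link>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.org/second</link>
    </item>
  </channel>
</rss>"""

def extract_links(rss_xml):
    """Return (title, link) pairs for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

# This list of (title, link) pairs is exactly what a feed-to-Twitter
# bridge would post on your behalf.
for title, link in extract_links(RSS_SAMPLE):
    print(f"{title}: {link}")
```

A real aggregator would fetch the feed over HTTP and remember which items it has already seen, but the core transformation is this simple.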

So, here are some of the ways I use RSS every day:

Basic Aggregation with Drupal

My first big RSS experiment built on the nptech tagging phenomenon. Some background: About five years ago, with the advent of RSS-enabled websites that allowed for storing and tagging information (such as Delicious, Flickr and most blogging platforms), TechSoup CEO Marnie Webb had a bright idea. She started tagging articles, blog posts, and other content pertinent to those working in or with nonprofits and technology with the tag “nptech”. She invited her friends to do the same. And she shared with everyone her tips for setting up an RSS newsreader and subscribing to things marked with our tag. Marnie and I had lunch in late 2005 and agreed that the next step was to set up a web site that aggregated all of this information. So I put up the nptech.info site, which continues to pull nptech-tagged blog entries from around the web.
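The heart of that kind of tag-based aggregation is filtering feed items by their `<category>` elements. Here’s a hedged sketch of the idea, again in stdlib Python with an invented feed; a production aggregator like the one behind nptech.info would do this across many fetched feeds.

```python
import xml.etree.ElementTree as ET

# Invented feed: one item tagged "nptech", one that isn't.
FEED = """<rss version="2.0">
  <channel>
    <title>Some Blog</title>
    <item>
      <title>CRM tips for small nonprofits</title>
      <link>http://example.org/crm</link>
      <category>nptech</category>
    </item>
    <item>
      <title>My favorite pie recipe</title>
      <link>http://example.org/pie</link>
      <category>cooking</category>
    </item>
  </channel>
</rss>"""

def tagged_items(rss_xml, tag):
    """Return titles of <item>s carrying the given <category> tag."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if tag in [c.text for c in item.findall("category")]]

print(tagged_items(FEED, "nptech"))
```

Run across every subscribed feed on a schedule, this filter is essentially what turns a pile of personal blogs into a single tag-focused aggregate.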

Other Tricks

Recently, I used Twitterfeed to push the nptech aggregated information to the nptechinfo Twitter account. So, if you don’t like RSS, you can still get the links via Twitter. But be aware that they get there via RSS!

I use RSS to track Idealware comments, Idealware mentions on Twitter, and I subscribe to the blog, of course, so I can see what my friends are saying.

I use RSS on my personal website to do some lifestreaming, pulling in Tweets and my Google Reader favorites.

But I’m pretty dull — what’s more exciting is the way that Google Reader let me create a “bundle” of all of the nptech blogs that I follow. You can sample a bunch of great Idealware-sympatico bloggers just by adding it to your reader.

Is RSS dead? Not around here.