Doesn’t it figure that this would arrive after this morning’s post?
Let’s get this out of the way first: Thank you so much, NTEN, for the award. And great thanks to all of my nptech peers for the kind words and overdone Star Wars references here — I think my 11-year-old enjoyed the video as much as I did (although he dozed off during the part where I was talking). And a whole level of thanks to my dear friend Deborah Finn, who made sure that anyone within a ten-mile radius of someone who knows what “NPTech” means heard about my award (and Deborah hates awards!).
Winning an award is great. Even better is knowing that personal efforts of mine to increase NPTech awareness of good technology and beer carried on undaunted in my absence. Carie Lewis, David Krumlauf and Jenn Howard possibly doubled attendance at the Pre-NTEN Beer Bash. Tracy Kronzak and a host of smart people pulled off the second Tech Track to good crowds and reviews. Look forward to an even bigger bash on April 2nd, 2012, on my home turf in San Francisco (official conference dates are 4/3-5), and Judi Sohn has stepped up to the plate as organizer for the 2012 Tech Track (now you’re officially on the hook, Judi).
Feedback on this year’s conference has only served to reinforce my opinion that we need to do more outreach to the technical staff at nonprofits and bring them more into the mix of fundraisers, web developers and social media strategists that make up the NTEN community. The tech staff attending are looking for deeper conversations, and it’s a challenge to offer beginning and advanced topics when the techie attendance (or perception of same) is still moderate to low. It’s a chicken-and-egg problem: it’s hard for a sysadmin or IT support person to look at session after session on using Twitter and 4Square and then explain to their boss why they need to go to NTEN. But the crowd-sourced session input is dominated by people who find subjects like virtualization and network security kind of dull. I might find myself challenging NTEN’s session selection methods this year, not to hijack the content but to make it more democratic. Nonprofit technical staff need a technology network, too.
See you in 2012. I won’t miss it!
This year’s Nonprofit Technology Conference offered a good chance to discuss one of the most important — but geeky — developments in the world of computers and networks: server virtualization.
Targeting a highly technical session to an NTEN audience is kind of like cooking a gourmet meal with one entrée for 1000 randomly picked people. We knew our attendees would include as many people who were new to the concepts as there were tech-savvy types looking for tips on resolving cache conflicts between the SAN, DBMS and Hypervisor. We aimed to start very broad, focus on use cases, and leave the real tech stuff to the Q&A. We’ll try to do the same in this article.
We’ve already summarized the view from the top in a quick, Ignite-style presentation, available wherever fine NTC materials are found (and also on Slideshare). In a nutshell, virtualization technology allows many virtual servers to run concurrently on one physical machine, each believing it’s the sole occupant. This allows for energy and cost savings, greater efficiency, and some astounding improvements in the manageability of your networks and backups: servers can be cloned or dragged, dropped and copied, allowing for far less downtime when maintenance is required and easy access to test environments. It accomplishes this by making the communication between an operating system, like Windows or Linux, and the underlying hardware generic and hardware-independent.
Most of the discussion related to virtualization has been centered on large data centers and enterprise implementations, but a small network can also take advantage of the benefits that virtualization has to offer. Here are three common scenarios:
In the first scenario, an existing server is converted into a virtual server running on new physical hardware. Tools from VMWare and other vendors allow disks to be resized, additional processor cores to be assigned and RAM to be added. The benefit to this process is that the physical server now exists on a new hardware platform with additional resources. End users are shielded from major disruptions and IT staff are not required to make any changes to scripts or touch workstations.
The second scenario, much like the first case, starts with the addition of new physical hardware to the network. Today’s servers are so powerful, it’s unlikely that more than 5% of their total processing power is used. That excess capacity allows an organization to use virtualization to lower their hardware expenses by consolidating multiple servers on one hardware platform. Ideal candidates are servers that run web & intranet applications, antivirus management, backup, directory services, or terminal services. Servers that do a lot of transactional processing, such as database & email servers, can also be virtualized but require a more thoughtful network architecture.
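That 5% figure invites a quick back-of-the-envelope consolidation estimate. The sketch below uses hypothetical numbers (a 30% headroom reserve, whole-percent units) purely to illustrate the arithmetic, not any vendor’s sizing guidance:

```python
def max_guests(host_pct: int = 100, avg_guest_pct: int = 5, headroom_pct: int = 30) -> int:
    """Estimate how many lightly loaded servers fit on one physical host.

    All values are whole percentages of the host's CPU capacity;
    headroom_pct is held back to absorb load spikes.
    """
    usable = host_pct - headroom_pct
    return usable // avg_guest_pct

# Guests averaging 5% utilization, with 30% of the host reserved:
print(max_guests())  # 14
```

Real consolidation planning also has to weigh RAM, disk I/O, and peak (rather than average) load, which is why transactional workloads like databases and email deserve the more thoughtful architecture mentioned above.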
The final scenario involves taking the first step toward a more traditional enterprise implementation, incorporating two physical servers connected to a SAN. In this scenario, the hardware resources continue to be abstracted from the virtual servers. The SAN provides much more flexibility in adding storage capacity and assigning it to the virtual servers as required. Adding multiple server heads onto the SAN will also provide the capacity to take advantage of advanced features such as High Availability, Live Server Migration, and Dynamic Resource Scheduling.
The space for virtualization software is highly competitive. Vendors such as Microsoft, VMWare, Citrix and Virtual Iron continue to lower their prices or provide their virtualization software for free. Using no-cost software, an organization can comfortably run a virtual server environment of 16 virtual servers on 3 physical machines.
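A related sizing question for a small cluster like the 16-virtual-servers-on-3-hosts example: can the surviving machines absorb every guest if one host fails? The per-host cap below is a hypothetical number, not a vendor figure; this is just the arithmetic behind a High Availability admission check:

```python
def survives_host_failure(total_vms: int, hosts: int, vms_per_host: int) -> bool:
    """True if the remaining hosts can carry every VM after one host fails."""
    return (hosts - 1) * vms_per_host >= total_vms

# 16 VMs on 3 hosts, each host able to run at most 8 VMs:
print(survives_host_failure(16, 3, 8))  # True: the 2 surviving hosts hold 16
print(survives_host_failure(16, 3, 7))  # False: only 14 slots remain
```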
The session was followed by a healthy and engaging Q&A, and we were fortunate to have it all transcribed by the incredibly talented Jack Aponte. Scroll down to 10:12 in her NTC Live Blog for a full re-enactment of the session. We can also start a new Q&A in the comments, below.
And stay tuned for more! The biggest paradigm shift from virtualization is related to the process surrounding the backup and recovery of virtual servers. We’ll be writing an article for the November NTEN newsletter with some detailed scenarios related to backup & disaster recovery in the virtual environment.
This article was first published on the NTEN Blog in March of 2011.
As an IT Director, I’m frequently asked by co-workers, peers, and consultants, “Do you use Microsoft Project?” The answer is a resounding no.
Then I elaborate with my true opinion of Project: it’s a great tool if you’re building a bridge or a luxury hotel. But my Project rule of thumb is this: if the budget doesn’t justify a full-time employee to manage the Project plan (that is, to keep the plan updated, not necessarily to manage the project itself), then MS Project is overkill. Real-world projects require far more agile and accessible tools.
The keys to managing a successful project are buy-in and communication. The people who run the organization need to support it, and the people the project is being planned for need to be anticipating the end result. Projects fail when participants are on different pages: vague or conflicting ideas of what the goals are, different levels of commitment, poor understanding of the deadlines, and poorly set expectations. Gantt charts are great marketing tools — senior executives never fail to be impressed by them — but they don’t tell the Facilities Coordinator in clear language that you need the facility booked by March 10th, or the designer that the web page has to be up by April 2nd.
You want to use tools that your project participants can access easily, preferably ones they’re already using. Here are five tools, either free or ones you’ve already obtained, that, used together, will be far more effective than MS Project for the typical project at a small to mid-sized organization:
It’s not that there aren’t other good ways to manage projects. Basecamp, or one of the many similar web apps, might be a better fit, particularly if the project team is geographically dispersed. SharePoint can replace a number of the tools listed here. But you don’t really have to spend a penny. You do need to plan, promote, and communicate.
Projects don’t fail because you’re not using capital “P” Project. They fail when there isn’t buy-in, shared understanding, and lots of interaction.
Peter Campbell is currently the Director of Information Technology at Earthjustice, a non-profit law firm dedicated to defending the earth. Prior to joining Earthjustice, Peter spent seven years serving as IT Director at Goodwill Industries of San Francisco, San Mateo & Marin Counties, Inc. Peter has been managing technology for non-profits and law firms for over 20 years, and has a broad knowledge of systems, email and the web. In 2003, he won a “Top Technology Innovator” award from InfoWorld for developing a retail reporting system for Goodwill thrift. Peter’s focus is on advancing communication, collaboration and efficiency through creative use of the web and other technology platforms.
This article was first published on the Idealware Blog in March of 2011.
NPTech maven Deborah Elizabeth Finn started a blog last week called “No Nonprofit Spam”. As a well-known NPTech consultant, Deborah is far from alone in finding herself regularly subscribed to nonprofit email lists that she has never opted into. But rather than just complaining about what is, by anyone’s definition (except possibly the sender’s), unsolicited commercial email, Deborah took the opportunity to try to educate. It’s a controversial undertaking. Nobody likes spam. Many of us like nonprofits and aren’t going to hold them to the same level of criticism as we would that anonymous meds or mortgages dealer, and the measures that we take against the seamy spammers are pretty harsh. Even if nonprofits are guilty of the spamming crime, should they be subject to the same punishments?
Spam, like beauty, is in the eye of the beholder. So, for the purposes of this conversation, let’s agree on a definition of nonprofit spam. Sending one email to someone that you have identified as a potential constituent, either by engaging them in other media or purchasing their name from a list provider, is, at worst, borderline spam, and not something that I would join a campaign to complain about. If I delete the message and don’t hear from the NPO again, no big deal. But subscribing me to a recurring list without my express buy-in is what I consider spamming. And that action, the step from email engagement to email abuse, is the focus of Deborah’s blog (which is naming names) and of this post.
In my post to the No Nonprofit Spam website, I made the point that we’re all inundated with email and we can only support so many orgs, so NPOs would do better to build their web site and their Charity Navigator rating than to push their messages, uninvited, into our inboxes. It’s a matter of being respectful of constituent priorities.
There are two motivations for overdoing it on the emails. One is the mildly understandable, but not really forgivable, mistake of overenthusiasm for one’s mission: believing that the work you do is so important that subscribing people who have expressed no interest to your list is warranted. That’s a mistake of naivety more than anything else.
The less forgivable excuse is the typical spam calculation: no matter how many people you offend, enough people will click on it to justify the excess. After all, it’s cost-justified by the response rate, right?
The downside in both cases is that, if you only count the constituents you gained, you’re missing something of great import to nonprofits and little import to viagra salesmen. The people you offended might have otherwise been supporters. The viagra spammer isn’t going to pitch their product through other avenues; it’s a low investment, so any yield is great gain. But you likely have people devoting their full hearts to your cause. You’re in the business of building relationships, not burning them. And you will never know how many constituents you might have gained through more respectful avenues if you treat them callously with your email initiatives.
Worse, the standard ways that individuals deal with spam could be very damaging to an NPO. In the comments to my No Nonprofit Spam post, some people advocated not just marking the messages as spam, but also reporting the offending orgs to SpamCop, which then lists them with Spamhaus, the organization that maintains block lists of known spammers that large ISPs subscribe to. By overstepping the bounds of net courtesy, you could not only alienate individuals, but wreak havoc with your ability to reach people by email at all. My take is that reporting NPOs — even the ones who, by my above definition, spam — is unduly cruel to organizations who do good in the world. But I’m a nonprofit professional. Many of the people that we might be offending aren’t going to be so sympathetic.
So, what do you think? Is spam from a nonprofit any different from spam from a commercial vendor? Should nonprofits be held to the same level of accountability as viagra spammers? Are even single unsolicited emails spam, or are they permissible? I searched for some nonprofit-focused best practices before completing this article, and didn’t come up with anything that differentiated our industry from the commercial ones, but I think there’s a difference. Just as nonprofits are exempt from the Do Not Call lists, I think we deserve some exemptions in email. But I could be wrong, and what would serve us all well is a clear community policy on email engagement. Does anyone have any to recommend?
Cartoon borrowed from Rob Cottingham’s Noise To Signal collection.