Tag Archives: virtualization

My Tips For Planning Successful NTEN Tech Sessions

NTEN needs good tech sessions at the 2014 conference. Submissions are open.  Here’s a pitch for any tech-savvy NTENdees to dive in and present, followed by my lessons learned (from 20+ sessions at eight NTCs) for successfully presenting technical topics to the diverse audience that shows up at NTC.  Simply put, there are ways to do great sessions that meet the needs of staff from large and small, advanced and tech-challenged nonprofits in attendance. I’ll outline the ones that have worked for me below.

The IT Staff track is the place to submit the infrastructure-related sessions. The other tracks receive a lot more submissions than the IT Staff track (as much as five times the number!), even though 53% of the 13NTC attendees surveyed say they want more technical content. My take is that the problem is that techies generally aren’t all that interested in standing up in front of crowds and presenting. That’s less of a problem for the Communications and Leadership tracks. All I can say to those of you who have the subject expertise but lack the desire and/or confidence to present is that we all stand to gain if you step outside of that comfort zone. NTEN will have the range of sessions that NPOs struggling with cloud, wireless, business intelligence and unified communications projects need to move forward. You’ll add public speaking to your resume, which is a great thing to have there. And I’ll help.

Over the last few years, I’ve presented on topics like server virtualization, VOIP, and project management.  These sessions have averaged 50-60 attendees, and every audience has ranged from complete novices to old hands at the subject matter. To my mind, the biggest (and most common) mistake that presenters make is to choose a target audience (e.g. they’re all newbies, or they’re all intermediate) and stick with that assumption. Simply put, the attendees will be forgiving if you spend some time addressing the needs of the others in the room, as long as you also address theirs.  They’ll be pissed if they spend the whole session either out of their depth or bored out of their minds.

There are two key ways that you can address a range of audiences: structure the session in beginner, intermediate and advanced topics, or break the attendees into groups by org size.  The latter will require co-presenters; the former keeps that as an option.

In 2010, Matt Eshleman and I did a session on Server Virtualization, an incredibly geeky topic, and it was the third highest rated session that year. We didn’t break up the audience into groups. Instead, I gave a 15-minute PowerPoint presentation that introduced the concepts, doing my best to bring anyone who didn’t know what virtualization was up to speed. Matt then outlined three virtualization scenarios: one for a small org, one for a medium org, and one for a large one. We left about 30 minutes for questions, and some of those hit on the really advanced questions that the experts had. By that point, the novices were grounded enough not to be thrown by the advanced conversation.

In 2012, I designed a session on VOIP and Videoconferencing.  Knowing that small orgs and large orgs have dramatically different needs in this area, I drafted Matt again, as well as Judi Sohn.  This time, we split the room into two groups, and had two very different conversations, both of which were quite valuable for the attendees.  I never heard how this session was rated, but I think it’s the best of the 20 or so I’ve done. My measure is: did the attendees walk out of the session with substantial, practical knowledge that they didn’t have when they walked in, that they can use to support their NPO(s)?

Two big tips:

  1. Don’t get too wonky with the slides.  IDC and Microsoft have a ton of diagrams outlining server setups that you can download, but they are not what an NTEN crowd wants to see.  Nobody wants to stare at a Visio diagram with 16 objects, 10 arrows and tiny, tiny labels saying what they all mean.
  2. Mine the wisdom of the crowd.  Most people attend sessions to learn, but some attend because they love the topic and have a lot of expertise in it.  The best Q&A (which should never be less than 30 minutes) is one that the presenter facilitates, encouraging dialogue among the attendees.  As the presenter, you can reply (or weigh in), as you’ll have relevant expertise that the audience might lack, but it’s often the case that someone else in the room knows what you know, and more.

I hope this is helpful, but, even more, I hope that you’ll submit a session and make 14NTC the most rewarding yet for the IT staff that attend. It’s in my neighborhood next year (DC), so come early and have a beer with me beforehand.

Best Of 2012: Nonprofit Technology Grows Up

This article was first published on the NTEN Blog in December of 2012.

I think that the best thing that happened in 2012 was that some of the 2010-2011 “bleeding edge” conceptual technologies stood up and proved they weren’t fads.

When NTEN asked me to write a “best tech of 2012” post, I struggled a bit. I could tell you about the great new iPads and Nexus tablets; the rise of the really big phones; the ascendancy of Salesforce; and the boundary-breaking, non-gaming uses of Microsoft’s Kinect. These are all significant product developments, but I think that the David Pogues and Walter Mossbergs out there will have them covered.

I think that the best thing that happened in 2012 was that some of the 2010-2011 “bleeding edge” conceptual technologies stood up and proved they weren’t fads. These aren’t new topics for NTEN readers, but they’re significant.

Cloud computing is no longer as nebulous a thing as, say, an actual cloud. The question has moved somewhat soundly from “Should I move to the cloud?” to “Which cloud should I move to, and when?” Between Microsoft’s Cloud Services, Google Apps, and a host of additional online suites, there’s a lot to choose from.

Similarly, virtualization is now the norm for server rooms, and the new frontier for desktops. The ultimate merger of business and cloud computing will be having your desktop in the cloud, loadable on your PC, laptop, tablet or smartphone, from anywhere that you have an internet connection. Key improvements in Microsoft’s latest server platforms support these technologies, and Citrix and VMWare are still growing and innovating, as Amazon, Google, Rackspace and others improve the net storage systems where our desktops can be housed.

Social networks aren’t the primary fodder for late night comedians anymore. Maybe there are still people ridiculing Twitter, but they aren’t funny, particularly when every product and place on earth now has its own Facebook page and hashtag. I mean, hashtags were created by geeks like us and now you see one superimposed on every TV show! I remember joining Facebook in 2007 and calling it “The Great Trivializer”, because the bulk of what I saw was my smart, committed NPTech friends asking me which five albums I would bring with me to a deserted island. Today, Facebook is a place where we communicate and plan. It’s grown in ways that make it a far more serious and useful tool. Mind you, some of that growth was spurred by adding Google+ features, which are more geared toward real conversation.

But the big winner in 2012 was data. It was the year of Nate Silver and the Infographic. Nate (as countless post-election pundits have pointed out), via his fivethirtyeight blog at the New York Times, proved that properly analyzed data can predict the future. This is the power of aggregation: his perfect electoral college score was built on an aggregated analysis of multiple individual polls. I think this presents a clear challenge to nonprofits: you should keep doing your surveying, but for useful data on the demographics that fuel your mission, you need to partner with similar orgs and aggregate those results for more accurate analysis.
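To make the aggregation point concrete, here’s a tiny, hypothetical sketch in Python. The orgs, sample sizes and percentages are invented for illustration; the only real idea is weighting each survey by how many people it reached.

    # Pool hypothetical survey results from three partner orgs, weighting
    # each result by its sample size. All numbers are made up.
    surveys = [
        # (org, respondents, share answering "yes" to a shared question)
        ("Org A", 120, 0.62),
        ("Org B", 450, 0.55),
        ("Org C", 80, 0.70),
    ]

    total_n = sum(n for _, n, _ in surveys)
    pooled = sum(n * p for _, n, p in surveys) / total_n

    print(f"Pooled estimate across {total_n} respondents: {pooled:.1%}")

No single org in that example surveyed more than 450 people, but the pooled estimate rests on 650 responses, and margins of error shrink roughly with the square root of the sample size. That’s the Nate Silver trick, scaled down to nonprofit size.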

Infographics make data poignant and digestible. They tell the stories behind the data in picture book format. Innovative storytellers have used videos, cartoons and comic books to make their points, but nothing is as succinct at telling a data-based story as an infographic. There should be one or more in your next annual report.

Peter starts as Chief Information Officer at Legal Services Corporation in January.

Talking NPTech in Marin

Yesterday I joined my frequent collaborators John Kenyon and Susan Tenby at the Marin Nonprofit Conference, where we presented a 90 minute panel on nptech, from servers to tweets. John deftly dished out the web strategy while Susan flooded us with expert advice on how to avoid social media pitfalls. I opened up the session with my thesis: “You have too many servers, even if you have just one.” I made the case that larger orgs can consolidate with virtualization tech and smaller orgs should be moving to the cloud. The crowd in Marin was mostly from smaller orgs, so I focused the talk more on the cloud option, and that’s where most of the conversation went. My goal with the slides was to do a semi “ignite”, given that I only had 25 minutes and I value the Q&A over the talking head time.

Virtualization: The Revolution in Server Management and Why You Should Adopt It

This article was co-written by Matt Eshleman of Community IT Innovators and first published on the NTEN Blog in June of 2009.

  

Peter Campbell, Earthjustice and Matthew Eshleman, Community IT Innovators

This year’s Nonprofit Technology Conference offered a good chance to discuss one of the most important — but geeky — developments in the world of computers and networks: server virtualization.

Targeting a highly technical session to an NTEN audience is kind of like cooking a gourmet meal with one entrée for 1000 randomly picked people. We knew our attendees would include as many people who were new to the concepts as there were tech-savvy types looking for tips on resolving cache conflicts between the SAN, DBMS and Hypervisor. We aimed to start very broad, focus on use cases, and leave the real tech stuff to the Q&A. We’ll try to do the same in this article.

We’ve already summarized the view from the top in a quick, ignite-style presentation, available wherever fine NTC materials are found (and also on Slideshare).  In a nutshell, virtualization technology allows many computers to run concurrently on one server, each believing it’s the sole occupant. This allows for energy and cost savings, greater efficiency, and some astounding improvements in the manageability of your networks and backups, as servers can be cloned or dragged, dropped and copied, allowing for far less downtime when maintenance is required and easy access to test environments.  It accomplishes this by making the communication between an operating system, like Windows or Linux, generic and hardware-independent.
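For readers who want to see what that looks like in practice, here’s a minimal sketch that asks a hypervisor which virtual machines it’s currently running. It assumes a Linux host running KVM/QEMU with the libvirt-python bindings installed and a standard qemu:///system connection; treat it as an illustration, not a recipe.

    # List the virtual machines a libvirt-managed hypervisor knows about,
    # along with the CPU and memory each one believes it owns.
    import libvirt

    conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
    try:
        for dom in conn.listAllDomains():
            state, max_mem_kb, mem_kb, vcpus, cpu_time = dom.info()
            status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
            print(f"{dom.name():20} {status:8} {vcpus} vCPU(s), {mem_kb // 1024} MB")
    finally:
        conn.close()

Each entry in that list is a complete Windows or Linux server that thinks it has the hardware to itself; the hypervisor is the layer doing the generic translation described above.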

Most of the discussion related to virtualization has been centered on large data centers and enterprise implementations, but a small network can also take advantage of the benefits that virtualization has to offer. Here are three common scenarios:

  • Using a new server running a virtualization hypervisor to migrate an existing server
  • Using a new server to consolidate 3-4 physical servers to save on electric & warranty expenses
  • Using a storage area network (SAN) to add flexibility and expandability to the infrastructure

In the first scenario, an existing server is converted into a virtual server running on new physical hardware. Tools from VMWare and other vendors allow disks to be resized, additional processor cores to be assigned and RAM to be added. The benefit to this process is that the physical server now exists on a new hardware platform with additional resources. End users are shielded from major disruptions and IT staff are not required to make any changes to scripts or touch workstations.

The second scenario, much like the first case, starts with the addition of new physical hardware to the network. Today’s servers are so powerful that it’s unlikely that more than 5% of their total processing power is used. That excess capacity allows an organization to use virtualization to lower their hardware expenses by consolidating multiple servers on one hardware platform. Ideal candidates are servers that run web & intranet applications, antivirus management, backup, directory services, or terminal services.  Servers that do a lot of transactional processing, such as database & email servers, can also be virtualized but require a more thoughtful network architecture.
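Here’s what that consolidation math looks like as a back-of-the-envelope sketch in Python. The utilization figures are hypothetical placeholders; plug in your own monitoring data, and remember that RAM and disk I/O deserve the same treatment as CPU.

    import math

    # Hypothetical average CPU utilization for six single-purpose servers,
    # each expressed as a fraction of one comparable physical box.
    servers = {
        "intranet": 0.05, "antivirus": 0.03, "backup": 0.08,
        "directory": 0.04, "terminal services": 0.10, "web": 0.06,
    }

    headroom = 0.70  # plan to load the new host to at most 70% of capacity

    total_load = sum(servers.values())               # 0.36 "boxes" of CPU demand
    hosts_needed = math.ceil(total_load / headroom)  # one physical host, in this case

    print(f"Combined load: {total_load:.0%} of one box")
    print(f"Hosts needed at {headroom:.0%} target utilization: {hosts_needed}")

With numbers like these, six single-purpose machines fit comfortably on one well-specced host, which is exactly the kind of ratio this scenario describes.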

The final scenario involves taking the first step toward a more traditional enterprise implementation, incorporating two physical servers connected to a SAN. In this scenario, the hardware resources continue to be abstracted from the virtual servers. The SAN provides much more flexibility in adding storage capacity and assigning it to the virtual servers as required. Adding multiple server heads onto the SAN will also provide the capacity to take advantage of advanced features such as High Availability, Live Server Migration, and Dynamic Resource Scheduling.

The space for virtualization software is highly competitive. Vendors such as Microsoft, VMWare, Citrix and Virtual Iron continue to lower their prices or provide their virtualization software for free. Using no-cost software, an organization can comfortably run a virtual server environment of 16 virtual servers on 3 physical machines.

The session was followed by a healthy and engaging Q&A, and we were fortunate to have it all transcribed by the incredibly talented Jack Aponte. Scroll down to 10:12 in her NTC Live Blog for a full re-enactment of the session. We can also start a new Q&A, in comments, below.

And stay tuned for more! The biggest paradigm shift from virtualization is related to the process surrounding the backup and recovery of virtual servers. We’ll be writing an article for the November NTEN newsletter with some detailed scenarios related to backup & disaster recovery in the virtual environment.

Tech Tips From The Nonprofit Technology Conference

This article was first published on the Idealware Blog in May of 2010.

Last month, I reported on the first annual Tech Track, a series of sessions presented at the April, 2010 Nonprofit Technology Conference. In that post I listed the topics covered in the five-session track. Today I want to discuss some of the answers that the group came up with.

Session 1: Working Without a Wire

This session covered wireless technologies, from cell phones to laptops. Some conclusions:

The state of wireless is still not 100%, but it’s better than it was last year, and it’s still improving. Major metropolitan areas are well covered; remote areas (like Wyoming) are not. There are alternatives, such as satellite, but that still requires that your location be in unobstructed satellite range. All in all, we can’t assume that wireless access is a given, and the challenge is more about managing staff expectations than installing all of the wireless ourselves. It will get there.
Wireless security options are improving. Virtual Private Networks (VPNs) and remote access solutions (such as Citrix, VNC and Terminal Services) are being provided for more devices and platforms, and the major smartphone companies are supporting enterprise features like remote device wipes.
Policy-wise, more orgs are moving to a model where staff buy their own smartphones and their organizations reimburse a portion of the bill to cover business use. Some orgs set strict password policies for accessing office content; others don’t.

Session 2: Proper Plumbing

This session was pitched as covering virtualization and other server room technologies, but when we quizzed the participants, virtualization was at the top of their list, so that’s what we focused on.

We established that virtualizing servers is a recommended practice. If your consultant recommends it and you don’t trust the recommendation, find another consultant and have them virtualize your systems: the recommendation is a good one, but it’s a problem if you don’t trust your consultant!
The benefits of virtualization are numerous — reduced budgets, reduced carbon footprints, instant testing environments, and 24/7 availability (an advanced virtualization feature lets you upgrade a copy of a server and then swap it in live).
There’s no need to rush it — it’s easier on the budget and the staff, as well as the environment, to replace standalone servers with virtualized ones as the hardware fails.
On the planning side, bigger networks do better by moving all of their data to a Storage Area Network (SAN) before virtualizing. This allows for even more flexibility and reduced costs: servers become strictly operating systems and software, while data is stored on fast, redundant disk arrays that can be accessed by any server, virtual or otherwise.

Session 3: Earth to Cloud

The cloud computing session focused a lot on comparisons. While the general concern is that hosting data with a third party is risky, is it any more risky than hosting it on our own systems? Which approach is more expensive? Which affords the most freedom to work with our data and integrate systems? How do we manage disaster recovery and business continuity in each scenario?

Security – Everyone is hackable, and Google and Salesforce have a lot more expertise in securing data systems than we do. So, from an “is your data safe?” perspective, it’s at least a wash. But if you have sensitive client data that needs to be protected from subpoenas as much as (or more than) from hackers, then you might be safer hosting your own systems.
Cost – We had no final answers; it will vary from vendor to vendor. But the cost calculation needs to figure in more than dollars spent — staff time managing systems is another big expense of technology.
Integration and Data Management – Systems don’t have to be in the same room to be integrated; they have to have robust APIs. And internal systems can be just as locked down as external ones if your contract with the vendor doesn’t give you full access and control over your data. This, again, was a wash.
Risk Management – There’s a definite risk involved if your outsourced host goes out of business. But there are advantages to being hosted, as many providers offer multiply-redundant systems. Google, in particular, writes every save on a Google Doc or GMail to two separate server farms on two different continents.
It all boils down to assessing the maturity of the vendors and negotiating contracts carefully, to cover all of the risks. Don’t sign up with the guy who hosts his servers from his basement; and have a detailed continuity plan in place should the vendor close up shop.
If you’re a small org (15 staff or fewer), it’s almost a no-brainer that it will be more cost-effective and safer to host your email and data in the cloud, as opposed to running your own complex CRMs and Exchange servers. If you’re a large org, it might be much more complex, as larger enterprise apps sometimes depend on that Exchange server being in place. But, all in all, cloud computing is a viable option that might be a good fit for you — check it out, thoroughly.
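Because the cost calculation above has to account for staff time as well as dollars, here’s a rough, purely illustrative sketch of a three-year comparison for a hypothetical 15-person org. Every figure is a placeholder; swap in your own quotes, salaries and subscription rates before drawing any conclusions.

    # Rough three-year cost comparison for a hypothetical 15-person org.
    # All figures are illustrative placeholders, not vendor quotes.
    years = 3
    staff = 15
    admin_hourly = 60  # loaded cost of IT staff or consultant time

    on_premise = {
        "server hardware": 5000,                   # one-time
        "server software licenses": 3000,
        "backup hardware and software": 1500,
        "admin time": 120 * years * admin_hourly,  # ~120 hrs/yr of care and feeding
    }

    hosted = {
        "subscriptions": staff * 8 * 12 * years,   # $8/user/month, made up
        "migration and setup": 2500,               # one-time
        "admin time": 30 * years * admin_hourly,   # far less server babysitting
    }

    for label, costs in (("On-premise", on_premise), ("Hosted", hosted)):
        total = sum(costs.values())
        print(f"{label}: ${total:,} total (${total / years:,.0f}/year)")

The pattern that usually falls out of this kind of exercise is that the staff-time line dwarfs the hardware line, which is what tips the scale toward hosted services for smaller orgs.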

I’ll finish this thread up with one more post on budgeting and change management in the next few weeks.

My Full NPTech Dance Card

Congress can take a vote and change the time that the sun goes down.  So why can’t they give me the 10 additional hours in each day that I keep lobbying for?

In addition to my fulfilling work at Earthjustice and the quality time at home with my lovely wife and Lego-obsessed 10 year old, here are some of the things that are keeping me busy that might interest you as well:

  • Blogging weekly at Idealware, as usual. This is one of those rare entries that shows up here at Techcafeteria, but not there.  And I’m joined at Idealware by a great group of fellow bloggers, so, if you only read me here, you might get more out of reading me there.
  • I recently joined the GreenIT Consortium, a group of nonprofit professionals committed to spreading environmental technology practices throughout our sector.  I blog about this topic at Earthjustice.  Planned (but no dates set) is a webinar on Server Virtualization, a technology that can reduce electrical use dramatically while making networks more manageable.  This will be similar to the session I did at the Nonprofit Technology Conference in April, and I’ll be joined again by Matt Eshleman of CITIDC. I’m also helping Ann Yoders, a consultant at Informatics Studio, with an article on green technology for Idealware.
  • On September 9th, I’ll be recording another episode of Blackbaud‘s Baudcast with other friends, including Holly Ross of NTEN. The topic this time is technology management, a subject I don’t ever shut up about.
  • Saving the big ones for last, NTEN’s first Online Conference is themed around the book, Managing Technology To Meet Your Mission. This one takes place September 16th and 17th, and I’ll be leading the discussion on my chapter: How to Decide: Planning and Prioritizing.
  • In early 2010, Aspiration will bring my pitch to life when we hold a two day conference that is truly on nonprofit technology, geared towards those of us who manage and support it. I’ve been known to rant about the fact that the big nptech shindigs — NTEN’s NTC and Techsoup’s Netsquared — focus heavily on social media and web technologies, with few sessions geared toward the day to day work that most nptechs are immersed in.  The goal of the event is to not only share knowledge, but also to build the community.  With so many nptech staff bred in the “accidental” vein, we think that fostering mentoring and community for this crowd is a no-brainer.
  • Further out, at the 2010 Nonprofit Technology Conference, I’ll be putting together a similar tech-focused sub-track.  Since the Aspiration event will be local (in the SF Bay), this will be a chance to take what we learn and make it global.
  • My nptech friends will forgive me for declaring my extra-curricular dance card otherwise closed — this is enough work to drop on top of my full-time commitments!

The Sky is Calling

This post originally appeared on the Idealware Blog in February of 2009.

My big post contrasting full blown Microsoft Exchange Server with cloud-based Gmail drew a couple of comments from friends in Seattle. Jon Stahl of One/Northwest pointed out, helpfully, that MS sells its Small Business Server product to companies with a maximum of 50 employees, and that greatly simplifies and reduces the cost of Exchange. After that, Patrick Shaw of NPower Seattle took it a step further, pointing out that MS Small Business Server, with a support arrangement from a great company like NPower (the “great” is my addition – I’m a big fan), can cost as little as $4000 a year and provide Windows Server, Email, Backup and other functions, simplifying a small office’s technology and outsourcing the support. This goes a long way towards making the chaos I described affordable and attainable for cash and resource strapped orgs.

What I assume NPower knows, though, and hope that other nonprofit technical support providers are aware of, is that this is the outdated approach. Nonprofits should be looking to simplify technology maintenance and reduce cost, and the cloud is a more effective platform for that. As ReadWriteWeb points out, most small businesses — and this can safely be assumed to include nonprofits — are completely unaware of the benefits of cloud computing and virtualization. If your support arrangement is for dedicated, outsourced management of technology that is housed at your offices, then you still have to purchase that hardware and pay someone to set it up. The benefits of virtualization and fast, ubiquitous Internet access offer a new model that is far more flexible and affordable.

One example of a company that gets this is MyGenii. They offer virtualized desktops to nonprofits and other small businesses. As I came close to explaining in my Lean, Green, Virtualized Machine post, virtualization is technology that allows you to, basically, run many computers on one computer. The environmental and financial benefits of doing what you used to do on multiple systems all on one system are obvious, but there are also huge gains in manageability. When a PC is a file that can be copied and modified, building new and customized PCs becomes a trivial function. Take that one step further – that this virtual PC is stored on someone else’s property, and you, as a user, can load it up and run it from your home PC, laptop, or (possibly) your smartphone, and you now have flexible, accessible computing without the servers to support.

On the tech support service’s side, they either run large servers with virtualization software (there are many powerful commercial and open source systems available), or they use an outsourced computing platform like Amazon’s EC2 service. In addition to your servers, they also house your desktop operating systems. Running multiple servers and desktops on single physical machines is far more economical; it better utilizes the available server power, reducing electricity costs and helping the environment; and backups and maintenance are simplified. The cost savings of this approach should benefit both the provider and the client.

In your office, you still need networked PCs with internet access. But all you need on those computers is a basic operating system that can boot up and connect to the hosted, virtualized desktop. Once connected, that desktop will recognize your printers and USB devices. If you make changes, such as changing your desktop wallpaper or adding an Outlook plugin, those changes will be retained. The user experience is pretty standard. But here’s a key benefit — if you want to work from home, or a hotel, or a cafe, then you connect to the exact same desktop as the one at work. It’s like carrying your computer everywhere you go, only without the carrying part required.

So, it’s great that there are mission focused providers out there who will affordably support our servers. But they could be even more affordable, and more effective, as cloud providers, freeing us from having to own and manage any servers in the first place.

Greening Your Gadgets

This was originally published on the Earthjustice Blog in December of 2008.

It’s a conundrum: how can you reduce your carbon footprint without giving up all of your nifty electronic gadgets?  And, if this isn’t your conundrum, it’s surely your spouse’s, or your kid’s or your cousin’s, right? Cell phones, iPods, PCs, laptops, TVs, DVDs, VCRs, DVRs, GPSs, radios, stereos, and home entertainment systems are just a fraction of the energy-leaking devices we all have a mix of these days.  While selling them all on Ebay is an option, it might not be the preferred solution.  So here are some tips on how to reduce the energy output of those gadgets.

Shop Smart.  Look for energy-saving features supported by the product, some of which will be listed as such, some not.

1.    Energy Star compliance.  Dell and HP sell lots of systems, and some are designed to operate more efficiently.  The Energy Star program sets environmental standards for technology and certifies them for compliance.  You can browse Energy-Star compliant products at their web site.

2.    EPEAT. The Electronic Product Environmental Assessment Tool is a website that, like Energy Star, rates products according to environmental standards.  Focused on computers, laptops and monitors, this is another great resource for identifying green products.

Use Only What You Have To.  Most electronics continue to draw power after you turn them off.  This “feature” is designed to allow them to boot up faster and be more responsive, but it’s been widely deployed with no sensitivity to environmental or even budgetary concerns about idle power use.

1.    Truly turn off devices. Newer electronics, such as DVD Players and stereos, offer options to truly turn off when the power isn’t on, with an accompanying warning that the product might take longer to start up.  It’s worth the wait.

2.    Convenient, green charging. Of course, you unplug your phone or iPod charger from the wall when you’re done with it, but that makes it easy to plug the cord into your phone later without remembering to plug in the other end.  Look for devices that can charge via the USB ports on your computer, instead of a wall charger, not because that takes less energy to charge them, but because it eliminates the need to plug and unplug the wall charger.

Be Virtual.  If there’s a way to do what you want to do without buying another electrical device, go for it!

1.    Backup online. Instead of buying a backup machine or drive for your computer, use an online backup service like Mozy or Carbonite (There are many more online backup options, as well – these are two popular ones).

2.    Squeeze multiple computers into one.  Sound like magic?  It’s not.  If you use a Mac and a PC (say, because you love the Mac but need a Windows machine for work compatibility), pick up Parallels or VMWare Fusion, programs that allow you to run multiple computer operating systems on one computer, and retire the second machine.

Go Solar. Costco, Target and other retailers are starting to carry affordable solar chargers, $30 to $50 devices that can replace your wall sockets as the power sources to charge your phones and iPods.

Be Vigilant.  Turn things off when they’re not in use, aggressively tweak the power settings on your systems, and make green computing a habit, not a special project.

Take it from a techie like me: we don’t want to abandon the 21st century in order to ensure that there’s a 22nd.  But we do want to curtail our energy use as much as possible.  These are relatively easy first steps in our personal efforts to stop global warming.

The Lean, Green, Virtualized Machine

This post was originally published on the Idealware Blog in November of 2008.
I normally try to avoid being preachy, but this is too good a bandwagon to stay off of. If you make decisions about technology, at your organization, as a board member, or in your home, then you should decide to green your IT. This is socially beneficial action that you can take with all sorts of side benefits, such as cost savings and further efficiencies. And it’s not so much of a new project to take on as it is a set of guidelines and practices to apply to your current plan. Even if my day job wasn’t at an organization dedicated to defending our planet, I’d still be writing this post, I’m certain.

I’ve heard a few reports that server rooms can account for 50% or more of a company’s entire energy use; PC Magazine puts them at 30-40% on average. If you work for an organization of 50 people or more, then you should look at this metric: how many servers did you have in 2000, and how many do you have now? If the volume hasn’t at least doubled, then you’re the exception to a very bloated rule. We used to pile multiple applications and services onto each server, but the model for the last decade or so has been one server per database, application, or function. This has resulted in a boom of power usage and inefficiency. Another metric that’s been quoted to me by IDC, the IT research group, is that, on average, we use 10% of any given server’s processing power. So the server sits there humming 24/7, outputting carbon and ticking up our power bills.

So what is Green IT? A bunch of things, some very geeky, some common sense. As you plan for your technology upgrades, here are some things that you can consider:

1. Energy-Saving Systems. Dell, HP and the major vendors all sell systems with energy-saving architecture. Sometimes they cost a little more, but that cost should be offset by savings on the power bills. Look for free software and other tools that will help users manage and automate the power use of their workstations.

2. Hosted Applications. When it makes sense, let someone else host your software. The scale of their operation will ensure that the resources supporting your application are used far more efficiently than they would be on a dedicated server in your building.

3. Green Hosting. Don’t settle for any host – if you have a hosting service for your web site, ask them if they employ solar power or other alternative technologies to keep their servers powered. Check out some of the green hosting services referenced here at Idealware.

4. Server Virtualization. And if, like me, you have a room packed with servers, virtualize. Virtualization is a geeky concept, but it’s one that you should understand. Computer operating system software, such as Windows and Linux, is designed to speak to a computer’s hardware and translate the high-level activities we perform to machine code that the computer’s processor can understand. When you install Windows or Linux, the installation process identifies the particular hardware on your system–the type of processor, brand of graphics card, number of USB ports–and configures the operating system to work with your particular devices.

Virtualization is technology that sits in the middle, providing a generic hardware interface for the operating system to speak with. Why? Because, once the operating system is speaking to something generic, it no longer cares what hardware it’s actually installed on. So you can install your Windows 2003 server on one system. Then, if a component fails, you can copy that server to another system, even if it’s radically different – say, a Mac – and it will still boot up and run. More to the point, you can boot up multiple virtual servers on one actual computer (assuming it has sufficient RAM and processing power).

A virtual server is, basically, a file. Pure and simple: one large file that the computer opens up and runs. While running, you can install programs, create documents, change your wallpaper and tweak your settings. When you shut down the server, it will retain all of your changes in the file. You can back that file up. You can copy it to another server and run it while you upgrade components on its home server, so that your users don’t lose access during the upgrade. And you can perform the upgrade at 1:00 in the afternoon, instead of 1:00 in the morning.
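Since the server really is just a file (or a small bundle of files), even a backup can be a plain file copy. Here’s a minimal sketch along those lines; the paths and the .qcow2 image name are hypothetical, and in practice you’d shut the VM down or snapshot it first so the copy is consistent.

    # Copy a powered-off virtual disk image to a backup location with a
    # timestamp in the name. Paths and file names are placeholders.
    import shutil
    from datetime import datetime
    from pathlib import Path

    vm_image = Path("/var/lib/libvirt/images/fileserver.qcow2")  # the whole server
    backup_dir = Path("/mnt/backup/vm-images")

    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M")
    target = backup_dir / f"{vm_image.stem}-{stamp}{vm_image.suffix}"

    shutil.copy2(vm_image, target)  # that's the backup; copy it back to restore
    print(f"Copied {vm_image} to {target}")

Copy that same file onto different hardware and boot it there, and you’ve done the 1:00-in-the-afternoon upgrade described above.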

So, this isn’t just cool. This is revolutionary. Need a new server to test an application? Well, don’t buy a new machine. Throw a virtualized server on an existing machine.

Don’t want to mess with installing Windows server again? Keep a virtualized, bare bones server file (VM) around and use it as a template.

Don’t want to install it in the first place? Google “Windows Server VM”. Pre-configured virtual machines are available for download for just about every operating system.

Want to dramatically reduce the number of computers in your server room, thereby maximizing the power usage of the remaining systems? Develop a virtualization strategy as part of your technology plan.

This is just the surface of the benefits of virtualization. There are some concerns and gotchas, too, that need to be considered, and I’ll be blogging more about it.

But the short story is that we have great tools and opportunities to make our systems more supportive of our environment, curbing the global warming crisis one server room at a time. Unlike a lot of these propositions, this one comes with cost reductions and efficiencies built-in. It’s an opportunity to, once in place, lighten your workload, strengthen your backup strategy, reduce your expenses on hardware and energy, and, well — save the world.