November 20 2014

Why I’m Intrigued By Google’s Inbox

Here we go again! Another communication/info-management Google product that is likely doomed to extinction (much like the recent social networks I’ve been blogging about), and I can’t help but find it significant and important, just as I did Google Wave, Google Buzz, and the much-loved Google Reader. I snagged an early invite to Google’s new “Inbox” front-end to Gmail, and I’ve been agonizing over it for a few weeks now. This app really appeals to me, but I’m totally on the fence about actually using it, for a few reasons:

  • This is either a product that will disappear in six months, or it’s what Gmail’s standard interface will evolve into.  It is absolutely an evolved version of recent trends, notably the auto-sorting tabs they added about a year ago.
  • The proposition is simple: if you let Google sort your mail for you, you will no longer have to organize your mail.

I’ve blogged before about how expensive email is to maintain, time-wise, as an application. We get tons of email (I average over a hundred messages a day between work and home), and every message needs to be managed (deleted, archived, labeled, dragged to a folder, etc.), unlike texts and social media, which you can glance at and either reply to or ignore. The average email inbox is flooded with a wide assortment of information, some useless and offensive (“Meet Beautiful Russian Women”), some downright urgent (“Your Aunt is in the Hospital!”), and a range of stuff in between. If you get 21 messages while you’re at an hour-long meeting, and the first of the 21 is time-sensitive and critical, it’s probably not the first one that you are going to read, because it has scrolled below the visible part of your screen. The handful of needles in the crowded haystack can easily be lost forever.

Here’s how Inbox tries to make your digital life easier and less accident-prone:

  • Inbox assumes (incorrectly) that every email has three basic responses: you want to deal with it soon (keep it in the inbox); you want to deal with it later (“snooze” it with a defined time to return to the inbox); or you want to archive it. They left out deleting it, which is currently buried under a pop-up menu. That annoys me, because permanently deleting the 25% of my email that can be glanced at (or not even opened) is a cornerstone of my inbox management strategy. But, that nit aside, I really agree with the premise.
  • Messages fall into categories, and you can keep a lot of the incoming mail a click away from view, leaving the prime inbox real estate to the important messages. Inbox accomplishes this with “Bundles,” which are the equivalent of the presorted tabs in classic Gmail. Your “Promotions,” “Updates,” and “Social” bundles (among other predefined ones) group messages, as opposed to putting each incoming message on its own inbox line. I find the in-list behavior more intuitive than the tabs. You can create your own bundles and teach them to auto-sort — I immediately created one for Family and added the primary email addresses of my immediate loved ones. We’ll see what it learns.
  • Mail doesn’t need to be labeled (you can still label messages, but it’s not nearly as simple a task as it is in classic Gmail). This is the thing I’m wrestling with most — I use my labels. I have tons of filters defined that pre-label messages as they come in, and my mailbox cleanup process labels what’s missed. I go to the labels often to narrow searches. I totally get that this might not be necessary — Google’s search might be good enough that my labeling efforts are actually more work than just searching the entire inbox each time. But I’m heavily invested in my process.
  • “Highlights” act a bit like Google Now, popping up useful info like flight details and package tracking.

One important note: Inbox does nothing to alter or replace your Gmail application; it’s an alternative interface. When you archive, delete, or label a message in Inbox, it gets archived, deleted, or labeled in Gmail as well. But Gmail knows nothing about bundles and, therefore, doesn’t reflect them, and not one iota of Gmail functionality changes when you start using Inbox. You do start getting double notifications, and Inbox offered to turn off Gmail notifications for me if I wanted to fix that. I turned Inbox down, and I’m waiting for Gmail to make a similar offer.  ;-)

So what Inbox boils down to is a streamlined, Getting Things Done (GTD) front-end for Gmail that removes email clutter, eases email management, and highlights the things that Google thinks are important. If you think Google can do that for you reasonably well, then it might make your email communication experience much saner, and you might want to switch to it. The worst that can happen is that it goes away, in which case Gmail will still be there.

I have invites.  Leave a comment or ping me directly if you’d like one.

If you’re using Inbox already, tell me, has it largely replaced GMail’s frontend for you?  If so, why? If not, why not?

 

November 17 2014

Should You Outsource Your IT Department?

This post was originally published on the MAP Techworks Blog in November of 2014. 

For a nonprofit that’s reached a size of 25 or more staff, a key question revolves around how to support technology that has grown from a few laptops and PCs to a full-blown network, with all of the maintenance and troubleshooting that such a beast requires. Should you hire internal IT staff or outsource to a more affordable vendor for that support? I’d say that the key question isn’t should you — that’s more a matter of finances and personal preferences. But what you outsource and how you go about it are critical factors.

The IT departments that I’ve worked in provided a range of services, which I’ve always broken down into two broad categories. The first is the plumbing: computer maintenance, installation, database input, training, and tech support. These functions can, with a few caveats, be successfully outsourced. The caveats:

  • You can’t just hire the outsourced IT firm and expect them to understand your needs after an initial meeting and walk-through.  They should be micro-managed for the first month or two.  Their inclination will be to offer a generic level of support that may or may not work for your application mix or your company culture. Orient them; set clear expectations and priorities; and check their work for a good while. If you don’t, your staff might immediately lose faith in them, setting up a situation where they don’t use the service you’re paying for and, when they do interact, do it begrudgingly.  The outsourced staff should be on your team, and you need to invest in onboarding them.
  • Everyone has to remember that it’s your network. Don’t give the outsourced service the keys to your kingdom. You should keep copies of all passwords, and they should understand that changing a system password without your prior knowledge, consent, and an updated password list is a fireable offense. And be ready to fire them — have a backup vendor lined up.

The other bucket is strategic tech planning. In-house infrastructure or cloud. Data management strategy. How tech integrates into a broader strategic plan and supports the mission.  How tech plays into the strategies of our partners, our clients, and our communities. These components can benefit from the advice of a good consultant, but are too integral to the work and culture of an organization to be handed off to outsiders wholesale.

Outsourcing your tech strategy can be a dangerous gamble.  If you have a great consultant who really cares about your mission, they can offer some good advice. But, in most cases, the consultants are more interested in pushing their tech strategy than developing one that works well with your organizational culture.  I find that my tech strategy is heavily informed by my understanding of my co-workers, their needs, and their ability to cope with change.  To get all that from outside of an organization requires exceptional insight.

Let me make that point another way — if you don’t have a tech strategist on your internal, executive team, you’re crippled from the start. These days, it’s as essential as having a development director and a finance person. Consultants can inform and vet your ideas, but you can’t outsource your tech strategy wholesale to them. It’s core to the functionality of any successful nonprofit.

The right outsourcer can be cost-effective and meet your needs. But be very thorough in your selection process and, again, do some serious onboarding, because your dissatisfaction will be tied completely to their lack of understanding of your business and your needs. There are a number of nonprofit-specific vendors (Map for Nonprofits, the former NPower affiliates, and others, like DC’s Community IT) that get us and are, in general, better choices than the commercial services.

November 10 2014

Does Your Request For Proposal (RFP) Ask The Right Questions?

This post was originally published on the Community IT Innovators Blog in November of 2014.

Requests for Proposals (RFPs) are a controversial topic in the nonprofit sector. While governmental and corporate organizations use them regularly as a tool to evaluate products and services, their use in our sector is haphazard. I spoke recently about the RFP process and how it could work for us at the 2014 Nonprofit Technology Conference. My slides from that talk are here, along with this blog post outlining my key arguments in favor of RFPs. But a recent conversation on NTEN’s DC community list really summed up the topic.

A member posted an RFP for CRM consulting and asked why he was getting scant responses from the vendors. I looked over the RFP, and saw that it requested a fixed bid quote for work that was not well defined. I popped back into the forum with some comments:

“This five page RFP contains about a tenth of the information that a decent consultant would need in order to propose a meaningful bid for the work. If you’ve received any such bids already, I would advise you to throw them out, because those bids are wild guesses, and you will either be paying more than you need to, or setting yourself up for a combative relationship with a vendor who is angry that the project is taking far more hours than they guessed that it would. Decent consultants are passing on the RFP because it lacks so much specificity. There are two ways you could address this problem:

  1. Significantly beef up the RFP. If I were to go this route, I might hire a consultant to help me write the RFP, because they can better communicate the requirements than I could.
  2. Stop asking for a fixed bid. Query their expertise in the areas that need it, and request ample examples of work they’ve done. Also, ask for their hourly fees by role. The RFP can provide a fairly high-level overview of the project, as you won’t be asking them to generate a meaningful estimate. Instead, do reference checks and ask specific questions about their billing in order to vet that they are honest and sensitive to nonprofit budgets.

Many consultants would pop in here and say “forget the RFP – let us come talk with you and get a sense of the project and we can go from there.” As a customer, not a consultant, I wouldn’t go for that. A good RFP, sent before any face to face meetings, can tell you a lot about the professionalism, insight and care that a company will bring to your project.  Rapport and relationship are also critical, but assessing those elements is the second step. (And when it comes to that step, insist that you are meeting the people who you would almost certainly be working with). An RFP response can also be attached to the contract to make sure that the vendor is obliged to live up to their claims.

I do fixed-bid quotes for phone systems and virtualization projects, where I can tell them exactly what the project would entail. I don’t for websites and software development, because not only do I not know what the ultimate product will look like or require, I shouldn’t – a lot of learning takes place during the project that will shape the requirements further. Once I’ve hired a good consultant, we can do a defined discovery phase that will allow them to provide a fixed quote — or reasonable range — for the rest of the work.  It’s a much better way to set up the relationship than by basing it in unrealistic projections.”

Subsequently, a consultant posted a reply suggesting that RFPs are a pain, and they should really just hire a consultant they like and see if it works out, perhaps after doing a small Request for Information (RFI) to learn more about the consultants available. I replied:

I did say that consultants will often diss the RFP process and say, “just hire us and see if it works out.” It certainly is easier on the consultant. As Clint Eastwood would say, the question is, “do you feel lucky?” Because if you feel lucky, then you can just find a suitable-looking consultant and hope that they are ethical, not over-booked (and therefore liable to under-prioritize your project), experienced with the technology that they’re deploying, etc., and do a discovery phase that will cost you x thousands of dollars, and then find out if they are the right consultant for you. Or you can do an RFP, throw out the responses that clearly don’t match your requirements, throw out the ones that don’t seem interested or well-resourced enough to respond fully, and interview the two to four consultants that look like good matches. It’s more work up front than hiring someone and hoping they’ll work out, true, but here’s what it gets you:

  1. Focused. Just writing the RFP gets you more in touch with the goals and requirements for the project.
  2. Informed. The RFP review and interviews are a chance for the project team to explore the project possibilities with various experts.
  3. Confident. Without investing thousands of dollars into a “vendor test,” you will know who has the right experience and a compatible approach. For me, it’s often less about the skill and experience than the approach (e.g., we want a collaborative partner that would teach as they go, rather than experts to outsource the work to).
  4. Accountable. The RFP can be a contractual document, so if the vendor lied about what they can do, they can be held responsible for that lie. And, not all consultants lie, but some do. I’ve caught them at it.  ;-)
  5. Documented. In the future, after you’ve left the organization, your successors might wonder why you selected the partner that you did. The RFP process leaves a knowledge management trail for key organizational decision making.

And finally, RFI vs RFP is a question of scale.  For smaller projects, without much associated risk, RFI. The investment in doing a full RFP does have to be justified by the cost and complexity of the project. For big projects, doing an RFI in order to identify who you want to include in an RFP can be helpful.

——————–

Community IT Innovators is a consulting and outsourcing firm located in Washington, DC. Their blog is a great source of good tech advice, covering similar themes to mine (but with more expert advice and less over-indulgent opinion).

October 28 2014

How Easy Is It For You To Manage, Analyze And Present Data?

I ask because my articles are up, including my big piece from NTEN’s Collected Voices: Data-Informed Nonprofits on Architecting Healthy Data Management Systems. I’m happy to have this one available in a standalone, web-searchable format, because I think it’s a bit of a signature work. I consider data systems architecture to be my main talent, and the most significant work that I’ve done in my career.

  • I integrated eleven databases at the law firm of Lillick & Charles in the late ’90s, using Outlook as a portal to the intranet, CRM, documents, and voicemail. We had single entry of all client and matter data that was then pushed, through SQL Server triggers, to the other databases that shared the data. This is what I call the “holy grail” of data: entered once by the person who cares most about it, distributed to the systems that use it, and then easily accessible by staff. No misspelled names or redundant data entry chores. (There’s a rough sketch of the trigger approach after this list.)
  • In the early 2000s, at Goodwill, I developed a retail data management system on open source software (MySQL and PHP, primarily) that put drill-down reporting in a web browser, updated by 6:00 am every morning with the latest sales and production data. We were able to use this data in ways that were revolutionary for a budget-challenged Goodwill, and we saw impressive financial results.
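For readers who haven’t worked with them, here is a minimal sketch of the kind of trigger that supports that single-entry approach. The table and column names are invented for illustration (the actual law firm systems were considerably more involved, and this assumes both databases live on the same SQL Server), but the idea is the same: one system is the point of entry, and the database propagates the data everywhere else it’s needed.

```sql
-- Hypothetical example (SQL Server syntax, invented names): when a client
-- record is added to the CRM's Clients table, copy the core fields to the
-- document management system's client table so nobody keys the name twice.
CREATE TRIGGER trg_Clients_Insert
ON dbo.Clients
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO docmgmt.dbo.Clients (ClientId, ClientName, MatterNumber)
    SELECT ClientId, ClientName, MatterNumber
    FROM inserted;   -- "inserted" holds the rows that were just added
END;
```

Multiply that by every system that shares the data, and you get the single-entry, many-destination setup described above.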

The article lays out the approach I’m taking at Legal Services Corporation to integrate all of our grantee data into a “data portal”, built on Salesforce and Box. It’s written with the challenges that nonprofits face front and center: how to do this on a budget, and how to do it without a team of developers on staff.

At a time when, more and more, our funding depends on our ability to demonstrate our effectiveness, we need the data to be reliable, available and presentable.  This is my primer on how you get there from the IT viewpoint.

I also put up four articles from Idealware. These are older (2007 to 2009), but they’re all still pretty relevant, although some of you might debate me on the RSS article.

This leaves only one significant piece of my nptech writing missing from the blog, and that’s my chapter on strategic planning in NTEN’s “Managing Technology To Meet Your Mission” book. Sorry, you gotta buy that one. However, a PowerPoint that I based on my chapter is here.

October 27 2014

Architecting Healthy Data Management Systems

This article was originally published in the NTEN eBook “Collected Voices: Data-Informed Nonprofits” in January of 2014.

Introduction

The reasons why we want to make data-driven decisions are clear. The challenge, in our cash-strapped, resource-shy environments, is to install, configure, and manage the systems that will allow us to easily and efficiently analyze, report on, and visualize the data. This article will offer some insight into how that can be done, while remaining ever mindful that the money and time to invest are hard to come by. But we’ll also point out where those investments can pay off in more ways than just the critical one: the ability to justify our mission effectiveness.

Right off the bat, acknowledge that it might be a long-term project to get there. But acknowledge as well that you are already collecting all sorts of data, and that there is a lot more data available that can put your work in context. The challenge is to implement new systems without wasting earlier investments, and to funnel data to a central repository for reporting, as opposed to re-entering it all into a redundant system. Done correctly, this project should result in greater efficiency once it’s completed.

Consider these goals:

  • An integrated data management and reporting system that can easily output metrics in the formats that constituents and funders desire;
  • A streamlined process for managing data that increases the validity of the data entered while reducing the amount of data entry; and
  • A broader, shared understanding of the effectiveness of our strategic plans.

Here are the steps you can take to accomplish these goals.

Taking Inventory

The first step in building the system involves ferreting out all of the systems that you store data in today.  These will likely be applications, like case or client management systems, finance databases, human resources systems and constituent relationship management (CRM) systems.  It will also include Access databases, Excel spreadsheets, Word documents, email, and, of course, paper.  In most organizations (and this isn’t limited to nonprofits), data isn’t centrally managed.  It’s stored by application and/or department, and by individuals.

The challenge is to identify the data that you need to report on, wherever it might be hidden, and catalogue it. Write down what it is, where it is, what format it is in, and who maintains it.  Catalogue your information security: what content is subject to limited availability within the company (e.g., HR data and HIPAA-related information)? What can be seen organization-wide? What can be seen by the public?

Traditionally, companies have defaulted to securing data by department. While this offers a high level of security, it can stifle collaboration and result in data sprawl, as copies of secured documents are printed and emailed to those who need to see the information but don’t have access. Consider a data strategy that keeps most things public (within the organization) and only secures documents when there is a clear reason to do so.

You’ll likely find a fair amount of redundant data.  This, in particular, should be catalogued.  For example, say that you work at a social services organization.  When a new client comes on, they’re entered into the case management system, the CRM, a learning management system, and a security system database, because you’ve given them some kind of access card. Key to our data management strategy is to identify redundant data entry and remove it.  We should be able to enter this client information once and have it automatically replicated in the other systems.

Systems Integration

Chances are, of course, that all of your data is not in one system, and the systems that you do have (finance, CRM, etc.) don’t easily integrate with each other. The first question to ask is: how are we going to get all of our systems to share data with each other? One approach, of course, is to replace all of your separate databases with one database. Fortune 500 companies use products from Oracle and SAP to do this, systems that incorporate finance, HR, CRM, and inventory management. Chances are that these will not work at your nonprofit; the software is expensive, and the developers who know how to customize it are as well. More affordable options exist from companies like Microsoft, Salesforce, NetSuite, and IBM, at special pricing for 501(c)(3)s.

Data Platforms

A data platform is one of these systems that stores your data in a single database but offers multiple ways of working with the data. For example, a NetSuite platform can handle your finance, HR, CRM/donor management, and e-commerce without maintaining separate data stores, allowing you to report on combined metrics for things like fundraiser effectiveness (donor management and HR) and mail vs. online donations (e-commerce and donor management). Microsoft’s solution incorporates separate products, such as SharePoint, Dynamics CRM, and the Dynamics ERP applications (HR, finance). Solutions like Salesforce and NetSuite are cloud only, whereas Microsoft and IBM can be installed locally or run from the cloud.
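To make “combined metrics” a little more concrete, here is a hedged sketch of the kind of query a shared database makes possible. The table and column names are invented, and on a real platform you would probably build this in the report designer rather than write it by hand, but the point stands: because donations and staffing data live in one place, they can simply be joined.

```sql
-- Illustrative only: table and column names are made up.
-- With donations and fundraiser (HR) records in one database, a single
-- query can report revenue per fundraiser, broken out by channel.
SELECT s.StaffName,
       d.Channel,                  -- e.g. 'mail' or 'online'
       SUM(d.Amount) AS TotalRaised,
       COUNT(*)      AS GiftCount
FROM   Donations d
JOIN   Staff s ON s.StaffId = d.SolicitedByStaffId
WHERE  d.GiftDate >= '2014-01-01'
GROUP BY s.StaffName, d.Channel
ORDER BY TotalRaised DESC;
```

The same join would be painful, or impossible, if donations lived in one application and staff records in another.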

Getting from Here to There

Of course, replacing all of your key systems overnight is neither a likely option nor an advisable one. Change like this has to be implemented over a period of time, possibly spanning years (for larger organizations where the system changes will be costly and complex). As part of the earlier system evaluation, you’ll want to factor in the state of each system. Are some approaching obsolescence? Are some not meeting your needs? Prioritize based on the natural life of the existing systems and the particular business requirements. Replacing major data systems can be difficult and complex — the point isn’t to gloss over this. You need to have a strong plan that factors in budget, resources, and change management. Replacing too many systems too quickly can overwhelm both the staff implementing the change and the users of the systems being changed. If you don’t have executive-level IT staff on board, working with consultants to accomplish this is highly recommended.

Business Process Mapping

The success of the conversion is less dependent on the platform you choose than on the way you configure it. Systems optimize and streamline data management; they don’t manage the data for you. In order to ensure that this investment is realized, a prerequisite investment is one in understanding how you currently work with data and optimizing those processes for the new platform.

To do this, take a look at the key reports and types of information in the list that you compiled and draw the process that produces each piece, whether it’s a report, a chart, a list of addresses, or a board report. Drawing processes, aka business process mapping, is best done with a flowcharting tool, such as Microsoft Visio. A simple process map will look like this: [Figure: a simple business process map]

In particular, look at the processes that are being done on paper, in Word, or in Excel that would benefit from being in a database. Aggregating information from individual documents is laborious; the goal is to store data in the data platform and make it available for combined reporting. If today’s process involves cataloguing data in a word processing table or a spreadsheet, then you will want to identify a data platform table that will store that information in the future.
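As a rough sketch of that last point, assuming a hypothetical spreadsheet of workshop attendance (the names are invented; the real design would come out of your own process mapping), the equivalent platform table might look something like this:

```sql
-- Hypothetical example: a hand-maintained attendance spreadsheet
-- becomes a platform table that reports can draw on directly.
CREATE TABLE WorkshopAttendance (
    AttendanceId  INT          PRIMARY KEY,
    ClientId      INT          NOT NULL,  -- ties back to the client record entered once
    WorkshopName  VARCHAR(100) NOT NULL,
    AttendedOn    DATE         NOT NULL,
    Completed     BIT          NOT NULL DEFAULT 0
);
```

Once the data lives there, the attendance figures can show up in the same reports as everything else, instead of being copied and pasted out of a spreadsheet.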

Design Considerations

Once you have catalogued your data stores and the processes in place to interact with the data, and you’ve identified the key relationships between sets of data and improved processes that reduce redundancy, improve data integrity and automate repetitive tasks, you can begin designing the data platform.  This is likely best done with consulting help from vendors who have both expertise in the platform and knowledge of your business objectives and practices.

As much as possible, try and use the built-in functionality of the platform, as opposed to custom programming.  A solid CRM like Salesforce or MS CRM will let you create custom objects that map to your data and then allow you to input, manage, and report on the data that is stored in them without resorting to actual programming in Java or .NET languages.  Once you start developing new interfaces and adding functionality that isn’t native to the platform, things become more difficult to support.  Custom training is required; developers have to be able to fully document what they’ve done, or swear that they’ll never quit, be laid off, or get hit by a bus. And you have to be sure that the data platform vendor won’t release updates that break the home-grown components.

Conclusion

The end game is to have one place where all staff working with your information can sign on and work with the data, without worrying about which version is current or where everything might have been stored. Ideally, it will be a cloud platform that allows secure access from any internet-accessible location, with both mobile apps and browser-based access. Further considerations might include restricted access for key constituents and integration with document management systems and business intelligence tools. But key to the effort is a systematic approach that includes a deep investment in taking stock of your needs and understanding what the system will do for you before the first keypress or mouse click occurs, and patience, so that you get it all and get it right. It’s not an impossible dream.

 

October 27 2014

Techcafeteria’s Week Of Added Content

As promised, I added about 40 of my guest posts here from the NTEN, Idealware, Earthjustice and LSC blogs. I also completely redid my categories and retagged every item, which is something I’d never done properly, so that, if you visit the blog, you can use the new sidebar category and tag cloud displays to find content by topic.

Included is my “Recommended Posts” category, which collects the posts that I think are among the best and most valuable of what I’ve written. These are mostly nptech-related, with a few of the personal posts thrown in, along with some humor.

The newly-added content that is also in recommended posts includes:

Everything has been published by its original date, though, so if you’re really curious, you can find all the new stuff at these links:

I’m not finished — NTEN and Idealware have both given me permission to republish the longer articles that I’ve written, so I will do that on a new “Articles” page. These will include write-ups on document management, major software purchasing, data integration standards, RSS, and system architecture. Look for them this week.

October 24 2014

Incoming Content – Apologies In Advance!

RSS subscribers to this blog should take note that I’m apt to flood your feeds this weekend. Over the past few weeks, I’ve gathered 35 to 40 posts that I’ve written for other blogs that I’m adding here. These are primarily posts that I wrote for the NTEN, Idealware, Earthjustice and Legal Services Corporation blogs, but neglected to cross-post here at the time. The publish dates run from mid-2006 to a few months ago. I’m also seeking permission to republish some of my larger articles that are out there, so you’ll be seeing, at least, my guide on “Architecting Systems to Support Outcomes Management”, which has only been available as part of NTEN’s ebook “Collected Voices: Data-Informed Nonprofits”.

Another part of this project is to rewrite my tags from scratch and re-categorize everything on the blog in a more useful fashion. With about 260 blog posts, this is a sizeable book now; it just lacks a good table of contents and index.

I’ll follow the flood with a post outlining what’s most worthwhile in the batch. Look, too, for upcoming posts on the Map for Nonprofits and Community IT Innovators blogs on outsourcing IT and RFPs, respectively, which I’ll also cross-post here. Plans for upcoming Techcafeteria posts include the promised one on gender bias in nptech. I’m also considering doing a personal series on the writers and artists who have most influenced me. Thoughts?

October 10 2014

It’s Time For A Tech Industry Intervention To Address Misogyny

News junkie that I am, I see a lot of headlines. And four came in over the last 30 hours or so that paint an astonishing picture of a tech industry that is in complete denial about the intense misogyny that permeates it. Let’s take them in the order that they were received:

First, programmer, teacher and game developer Kathy Sierra.  In 2007, she became well known enough to attract the attention of some nasty people, who set out to, pretty much, destroy her.  On Tuesday, she chronicled the whole sordid history on her blog, and Wired picked it up as well (I’m linking to both, because Kathy doesn’t promise to keep it posted on Serious Pony).  Here are some highlights:

  • The wrath of these trolls was incurred simply because she is a woman and she was reaching a point of being influential in the sector;
  • They threatened her with rape and dismemberment, and they threatened her family;
  • They published her address and contact information all over the internet;
  • They made up offenses to attribute to her and maligned her character online;
  • Kathy suffers from epileptic seizures, so they uploaded animated GIFs of the sort that can trigger seizures to epilepsy support forums (Kathy’s particular form of epilepsy isn’t subject to those triggers, but many of the forum members were).

The story gets more bizarre, as the man she identified as the ringleader became a sort of hero to the tech community in spite of this abhorrent behavior. Kathy makes a strong case that the standard advice of “don’t feed the trolls” is bad advice.  Her initial reaction to the harassment was to do just what they seemed to desire — remove herself from the public forums.  And they kept right after her.

Next, Adria Richards, a developer who was criticized, attacked, and harassed for calling out sexist behavior at a tech conference, recounted her experiences on Twitter and storified them here. Her attackers didn’t stop at the misogyny; they noted that she is black and Jewish as well, and unloaded as much racist sentiment as sexist. And her experience was similar to Kathy Sierra’s.

These aren’t the only cases of this, by far. Last month, Anita Sarkeesian posted a video blog asking game developers to curb their use of the death and dismemberment of female characters as the “go-to” method of demonstrating that a bad guy is bad. The reaction to her request was the same onslaught of rape and violence threats, outing of her home address, and threats to go to her house and kill her and her children.

So, you get it — these women are doing the same thing that many people do: developing their expertise, building communities on Twitter, and getting some respect and attention for that expertise. And ferocious animals on the internet are making their lives a living hell for it. And it’s been going on for years.

Why hasn’t it stopped? Maybe it’s because the leadership in the tech sector is in pretty complete denial about it. This was made plain today, as news came out about two events at the Grace Hopper Celebration of Women in Computing conference running this week. The first was a “White Male Allies Plenary Panel” featuring Facebook CTO Mike Schroepfer; Google’s SVP of search, Alan Eustace; Blake Irving, CEO of GoDaddy; and Tayloe Stansbury, CTO of Intuit. These “allies” offered the usual assurances that they are trying to welcome women at their companies, even though a series of recent tech diversity studies shows that there is a lot of work to be done there. But, despite all of the recent news about Zoe Quinn, Anita Sarkeesian, and others, Eustace still felt comfortable saying:

“I don’t think people are actively protecting the [toxic culture] or holding on to it … or trying to keep [diverse workers] from the power structure that is technology,”

Later in the day, Satya Nadella, CEO of Microsoft, stunned the audience by stating:

“It’s not really about asking for the raise, but knowing and having faith that the system will actually give you the right raises as you go along.”

Because having faith has worked so well for equal pay in the last 50 years? Here’s a chart showing how underpaid women are throughout the U.S. Short story? 83% of men’s wages in the best places (like DC) and 69% in the worst.

Nadella did apologize for his comment. But that’s not enough, by a long shot, for him, or Eric Schmidt, or Mark Zuckerberg, or any of their contemporaries. There is a straight line from the major tech exec who is in denial about the misogyny that is rampant in his industry to the trolls who are viciously attacking women who try to succeed in it. As long as they can sit, smugly, on a stage in front of a thousand women in tech and say “there are no barriers, you just have to work hard and hope for the best,” they are undermining the efforts of those women and cheering on the trolls. This is a crisis that needs to be resolved with leadership and action. Americans are being abused and denied the opportunity that is due to anyone in this country. Until the leaders of the tech industry stand up and address this blatant discrimination, they are condoning the atrocities detailed above.

Postnote: the nonprofit tech sector is a very different ballpark when it comes to equity among the sexes. That’s not to say that it’s perfect, but it’s much better, and certainly less vicious. I’m planning a follow-up post on our situation, and I’ll be looking for some community input on it.

 

October 3 2014

The Increasing Price We Pay For The Free Internet

[Image: “The Price of Freedom is Visible Here.” Picture: Rhadaway.]

This is a follow-up to my previous post, A Tale Of Two (Or Three) Facebook Challengers. A key point in that post was that we need to be customers, not commodities. In the cases of Facebook, Google, and the vast majority of free web resources, the business model is to provide a content platform for the public and fund the business via advertising. In this model, simply, our content is the commodity, and the customer is the advertiser. Decisions about product features are driven more by how many advertisers the platform can bring on and retain than by how well it can meet the public’s wants and needs.

It’s a delicate balance. They need to make it compelling for us to participate and provide the members and content that the advertisers can mine and market. But since we aren’t the ones signing the checks, they aren’t accountable to us, and, as we’ve seen with Facebook, ethical considerations about how they use our data are often afterthoughts. We’ve seen it over and over, and again this week, when Facebook backed off a real names policy that many of its users considered threatening to their well-being. One can’t help but wonder, given the timing of the statement, how much new competitor Ello’s surge in popularity had to do with the retraction. After all, Ello is where a lot of the people who were offended by the real names policy went. And Facebook doesn’t want to lose users, or its advertisers will start courting Ello for the same deal.

Free Software is at the Heart of the Internet

Freeware has been around since the ’80s, much of it available via bulletin boards and online services like CompuServe and AOL. It’s important to make some distinctions here. There are several variants of freeware, and it’s really only the most recent addition that’s contributing to this ethically challenged business model:

  • Freeware is software that someone creates and gives away, with no license restrictions or expectation of payment. The only business model that this supports is when the author has other products that they sell, and the freeware applications act as loss leaders to introduce their paid products.
  • Donationware is much like freeware, but the author requests a donation. Donationware authors don’t get rich from it; they’re usually just capitalizing on a hobby.
  • Freemium is software that you can download for free and use, but the feature set is limited unless you purchase a license.
  • Open Source is software that is free to download and use, as well as modify to better meet your needs. It is subject to a license that mostly ensures that, if you modify the code, you will share your modifications freely. The business model is usually based on providing training and support for the applications.
  • Adware is free or inexpensive software that comes with advertising. The author makes money by charging the advertisers, not necessarily the users.

Much of the Internet runs on open source: Linux, Apache, OpenSSL, etc. Early adopters (like me) were lured by the free software. In 1989, I was paying $20 an hour to download Novell networking utilities from CompuServe when I learned that I could get a command-line internet account for $20 a month and download them from Novell’s FTP site. And, once I had that account, I found lots more software to download in addition to those networking drivers.

Adware Ascendant

Adware is now the prevalent option for free software and web-based services, and it’s certainly the model for 99% of the social media sites out there.  The expectation that software, web-based and otherwise, will be free originated with the freeware, open source and donationware authors. But the companies built on adware are not motivated by showing off what they’ve made or supporting a community.  Any company funded by venture capital is planning on making money off of their product.  Amazon taught the business community that this can be a long game, and there might be a long wait for a payoff, but the payoff is the goal.

Ello Doesn’t Stand A Chance

So Ello comes along and makes the same points that I’m making. Their revenue plan is to go to a freemium model, where basic social networking is free but some features will cost money, presumably business features and, maybe, mobile apps. The problem is that the pricing has to be reasonable, and, to many, any price is unreasonable, because they like being subsidized by the ad revenue. The expectation is that social media networks are free. For a social network to replace something as established as Facebook, it will need to offer incentives, not disincentives, and, sadly, the vast majority of Facebook users aren’t going to leave unless they are severely inconvenienced by Facebook, regardless of how superior or more ethical the competition is.

So I don’t know where this is going to take us, but I’m tired of getting things for free. I think we should simply outlaw adware and return to the simple capitalist economy that our founders conceived of: the one where people pay each other money for products and services. Exchanging dollars for goods is one abstraction layer away from bartering. It’s not as complex and creepy as funding your business by selling your users’ personal information to third parties. On the Internet, freedom’s just another word for something else to lose.

September 27 2014

A Tale Of Two (Or Three) Facebook Challengers

For a website that hosts so many cute pet videos, Facebook is not a place that reeks of happiness and sincerity. It’s populated by a good chunk of the world, and it’s filled with a lot of meaningful moments captured in text, photos, and video by people who know that, more and more every day, this is where you can share those moments with a broad segment of your friends and family. And that’s the entire hook of Facebook — it’s where everybody is. The feature set is not the hook, because Google Plus and a variety of other platforms offer similar feature sets. And many of those competitors, including Google’s offering, are more sensitive to the privacy concerns of their users and less invasive about how they share your data with advertisers.

Many of my professional acquaintances are on both Facebook and Google Plus, but they comprise only about a third of my Facebook friends. So I check Facebook almost every day; I go to Google Plus on rare occasions.

Facebook has a well-known history of overstepping. From the numerous poorly thought-out schemes to court advertisers, like letting them tell the world what lingerie we’re buying or using our photos in sidebar advertising, to the constant updating of security settings that always seems to result in less security, it’s clear to most of us that Facebook is trying to please its advertisers first, and that we are more the commodity that they broker than the clientele that they serve.

A few years ago, some people who valued Facebook but were fed up with these concerns developed Diaspora, the anti-Facebook — a network that is built on open source software, distributed, and highly respectful of our right to own and control our content. Diaspora does this by storing the data in “pods,” which are individual data stores hosted by users. You can join a friend or neighbor’s pod, or start your own. The pods, which work a lot like peer-to-peer apps such as BitTorrent, communicate with each other, but the people who run Diaspora do not control that data. You can blow away your pod from your file manager or command line if you care to, and nobody is going to stop you. If these networks were fictional, Facebook would have been created by Andy Warhol and Diaspora by Ursula K. Le Guin.

And this week’s big news is Ello, which, like Diaspora, has defined itself in relation to Facebook as the user-focused alternative. Ello is, at present, a rough beta network that shows glimmers of elegance. Their manifesto is poetry to BoingBoing readers like me:

“Your social network is owned by advertisers.

Every post you share, every friend you make, and every link you follow is tracked, recorded, and converted into data. Advertisers buy your data so they can show you more ads. You are the product that’s bought and sold.

We believe there is a better way. We believe in audacity. We believe in beauty, simplicity, and transparency. We believe that the people who make things and the people who use them should be in partnership.

We believe a social network can be a tool for empowerment. Not a tool to deceive, coerce, and manipulate — but a place to connect, create, and celebrate life.

You are not a product.”

But let’s be clear about Ello. It’s centralized, like Facebook, not distributed, like Diaspora. It was built with about half a million dollars of venture capital funding, and it will need to make money at some point in order to provide a return on that investment. As we watch Twitter get more and more commercialized, we know that this is a story just waiting to happen.

So, what am I saying? That we should skip Ello and proceed to Diaspora? Sadly, no. While Diaspora has the model that I believe is viable to sustain a non-commercial, user-focused network, Grandma isn’t going to host her own server pod. Peer-to-peer technology is not ready for prime time yet. So I don’t see a Facebook killer here, or there, or anywhere in sight. I see people who understand that the crass pimping of our personal lives that Mark Zuckerberg calls a business model is problematic and worth replacing. We can’t replace it with something too geeky for the masses, nor can we replace it with a clone that kinda hopes it will have a better business model (but will likely only have a less abrasive version of the same one, much like Google Plus).

I have a lot of high hopes lately.

I hope that we can curtail this trend of training our local police to be paramilitary units and champion nationwide community policing instead, because a community controls and reduces crime, while a military goes to war.

I hope that we can reverse the damage that was done when TV news programs became subject to Nielsen ratings. I consider that to have been a dark day for our society. It was the hard turn that steered us to a place where news is available through whatever biased lens you prefer to view it.

And I hope that somebody will develop a Facebook competitor with a viable business model and a compelling feature set that will yank all of my friends and family out of their complacent acceptance of Facebook’s trade-offs. In this digital era, this is insanely important. We commune online; we share our most treasured moments. We sway each other’s attitudes on important matters.  The platform has to be agnostic, and it has to be devoted to our goals, not those of a third party, such as advertisers.  We have enough problems with societal institutions that have a stated purpose, but answer to people with different aims.

These are all realistic dreams.  But they seem pretty far away.
