There's a new code.org celebrity video making the rounds that features tech luminaries such as Bill Gates, Mark Zuckerberg, and Will.i.am (wait, you didn't know he was a technologist?) advocating the idea that all kids should learn to code. It's a well-produced video, and it comes on the heels of an ongoing debate about whether all of us, even the mayor of New York City, should be able to program software.
I have two worries about the video: 1) it's wrong, and 2) peers in the social sector will take it as proof positive that to have social impact in the 21st century you need to be able to build an app. I think there's a better approach: focus on empowering technology users, especially in the social sector, to make more intelligent decisions about their technology choices, while leaving the actual coding to specialists.
Why You Shouldn't Learn to Code
The thrust of the code.org argument in favor of learning to code is that software is everywhere and runs everything, so if you don't know how to program software you are as dysfunctional as someone who can't read or write. This argument is flawed for more than one reason, not least of which is that it implies software programming is the single most important skill set in our contemporary world.
As a thought experiment, here are a few other skill sets that are arguably just as important as coding (if not more so) in that very same contemporary world:
1) The ability to understand legalese. If databases and code rule the world, the Terms of Service that govern them are equally important. Facebook, Microsoft, and Google make money only because you let them, by accepting those dozens of pages of dense, massively lawyered TOS. And in the most recent wave of technology patent wars, it is lawyers, not programmers, who have taken the driver's seat in dictating the evolution of the technology sector.
2) The ability to read a balance sheet. Basic accounting and finance skills are no less essential to survival in the current era than computer programming. What nearly caused a second Great Depression was not bad code but bad finance and shoddy accounting.
3) The ability to write well. It may no longer happen in book form, but the ability to lay out and defend a cogent point of view in writing is an increasingly (and worrisomely) rare skill. The medium may have shifted to blogs, but thought leaders still shape the dialogue and push big new ideas into the public consciousness because they are stellar writers, not because they can code. And big ideas still matter.
Is software an extremely important facet of contemporary life? Absolutely. Is it clearly more important than writing well, calling BS on an income statement, or pushing back intelligently when a lawyer pressures you to sign something? Absolutely not. So why aren't we pressuring everyone to go to law school, get an MBA, or take writing workshops?
A Happy Medium
Rather than advocate the absolutist position that everyone should code, a more realistic and smarter tactic would be to encourage everyone to understand the basic tenets of computer programming, especially object-oriented programming. This is akin to being able to push back on the lawyer even though you didn't go to law school, or to sensing that a balance sheet smells fishy even though you're not a CFO. Exposing everyone to Computer Programming 101 strikes me as a perfectly plausible idea because it would empower lay users to make more intelligent decisions about technology without having to absorb the massive opportunity costs of actually mastering the various programming languages.
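To make that concrete, here is a minimal sketch (in Python, with names I've invented purely for illustration) of the kind of idea a Programming 101 course covers: a class bundles data with behavior, and a subclass inherits and can override that behavior. The goal isn't for you to write this yourself; it's to recognize these concepts when your developers talk about them.

    # A class bundles data (attributes) with behavior (methods).
    class Donor:
        def __init__(self, name, total_given):
            self.name = name
            self.total_given = total_given

        def acknowledge(self):
            return f"Thank you, {self.name}!"

    # A subclass inherits from Donor and overrides one method.
    class MajorDonor(Donor):
        def acknowledge(self):
            return f"Thank you, {self.name}, for your extraordinary support!"

    # The same call behaves differently depending on the object.
    for d in [Donor("Ada", 50), MajorDonor("Grace", 50000)]:
        print(d.acknowledge())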
Where we want to be, I would argue, is a place where more people understand the importance and relevance of key programming concepts such as data models, system architecture, APIs, and database structure, rather than a place where non-specialists are pressured into mastering Java, PHP, or C++. We should strive for an army of informed and empowered users who can make intelligent decisions about technology and, when necessary, intelligently hire and oversee the specialists who do the actual coding.
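To illustrate the level of literacy I have in mind: you don't need to build an API to understand what one hands back. This short sketch (Python again; the data is made up) shows that what an API returns is structured text whose shape is dictated by an underlying data model, which is exactly the kind of thing an informed client should be able to ask a vendor about.

    import json

    # Imagine this string is the body of a response from a web API.
    response_body = '{"id": 42, "name": "Clean Water Project", "donors": 128}'

    # Parse the JSON into a dictionary; the field names come from
    # the data model the developers chose.
    project = json.loads(response_body)
    print(project["name"], "has", project["donors"], "donors")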
What It Means for the Social Sector
The universe I inhabit, the social sector, is notorious for being behind the curve when it comes to technology. The visceral response is often to assume that we must therefore embrace technology (whatever that means), and plenty of money and energy is wasted in quixotic attempts to turn advocates into hackers.
Rather than overreact to the code.org video (and the larger argument that everyone should learn to code), smart social sector actors would be wise to focus on becoming well-informed clients who can weigh the costs and benefits of key technology strategies and investments without having to implement the solutions themselves. In other words: become familiar enough with the jargon, terminology, products, and platforms to make an informed decision…and stop there. To take a hypothetical: I don't want Human Rights Watch to suddenly launch a "labs" division with dozens of programmers cranking out apps. Instead, I want them to focus on what they do well (campaigning and legal work) while making smart, informed decisions about a) whether they need an app at all, b) whom to hire to build it, and c) how they, as the client, stay integral to the discovery and needs-assessment process.
The bottom line: we don't all need to code. But we do all need to know when to call the coders.