Why I’m Leaving Facebook

The costs associated with the free social platform are just too high

There is a point beyond which a service becomes too expensive. The latest round of news about Facebook has made it clear to me that we have passed that point. The social impact and privacy costs of Facebook are too high for me. So, at the end of this month, I will be deleting my Facebook and Instagram accounts.

Many people have written lately about making the same decision out of concern over how much social media dominates their time. Although that is a serious question, it is not my concern with Facebook. In fact, the app hasn’t been installed on my phone for months. I am deleting my Facebook and Instagram accounts because of Facebook’s clear record of violating the public trust. They know more about me than any other organization in the world. And I don’t trust them with that knowledge.

My goal is not to make a convert out of you. I’m not looking to spark a boycott or any sort of public demonstration. This post is to explain my thinking and maybe prompt you to think about your relationship with this company. That is all.

The issue I have with Facebook is cut and dried: They know too much about us (even if you don’t have an account), and they lie about what they do with that data.

Over the course of 2018, we learned a lot about how Facebook really operates, and it generated numerous scandals. Not only does Facebook follow you all over the web and track what you read, watch, and listen to, they also know what you purchase, where you go, and who you hang out with. Facebook has built a surveillance infrastructure that rivals in reach and depth anything George Orwell or the CIA could have dreamt up. And they have convinced us all to willingly hand over this information. Every day, millions of people share private and intimate details of their lives with this company.

The surveillance is not even the worst part. What’s actually worse is the purpose for which they use our private data. They are not using it to identify terrorists, study social ills, or encourage a constructive conversation in the public square. They are not aiming for any number of controversial, yet arguably defensible goals. No, they are using it to sell us T-shirts, insurance, and rugs. And of course in the process they are actually contributing to many of our societal ills. Smartphone addiction, a poisonous political discourse inflamed by filter bubbles, and foreign interference in our elections are just some of the problems that are being made significantly worse by Facebook. And in the last few weeks we’ve even heard about how they police political speech on their platform, with no transparency.

In tech punditry circles there was a popular phrase for a while: if you aren’t paying for the service, you are the product. That’s not exactly true, but it does broadly capture the strategic reasons behind the success of companies like Facebook, Google, and others. My private data and ad impressions are Facebook’s inventory. I am their product. And that is a reality I can no longer abide.

By willingly handing over private information to these companies for a free service that is not essential—especially services that can be so destructive in how they can capture our attention—we are opening ourselves and our culture up to potentially massive problems. What we have seen with the Cambridge Analytica incident, Russian interference in our elections, and other Facebook scandals is just the tip of a very, very big iceberg. And I don’t think they can change course.

So I’m getting off the boat.

If you would like to keep up with what I’m doing after I leave Facebook, the best ways are to follow me on Micro.blog or Twitter, or to sign up for my email newsletter.


Why I Switched to Jekyll from WordPress

This post is the first post on the new version of this site. If you are a regular reader then I’m sure you noticed the much cleaner and more mobile-friendly design. But what you probably didn’t notice—unless you routinely scrutinize URLs—is that this is now a static site. I left WordPress.

Now, for most readers of this site this won’t matter at all. Load times will be much faster, but it wasn’t noticeably slow before. Other than that, moving from a dynamic, server-side site to a static HTML site won’t make one bit of difference to you. But it does to me.

WordPress is not terrible. But it’s not good, either.

Since I started blogging in 2003, I’ve used almost every major platform. Blogger, WordPress, Squarespace, and then WordPress again. I’ve never been happy with any of them, really. So when I started my tech blog last year I wanted something different.

For that site I chose Jekyll. Jekyll is a static site generator, not a blog platform, per se. Using text files written in Markdown, Liquid templates, HTML, JavaScript, and CSS, it generates my whole website in a few seconds. From there I host it on Amazon’s S3 for a few cents a month. Last month it cost me just under $0.50 to host and serve the site.
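
Jekyll itself is written in Ruby and uses Liquid templates and Markdown, but the core idea of any static site generator is simple. Here is a rough, toy sketch in JavaScript of what happens at build time, with plain string substitution standing in for the real template engine (all names and content here are made up for the example):

```javascript
// Toy static site "build step": content entries plus a template
// produce finished HTML files. No server is involved at request
// time; the pages are generated once and uploaded as-is.
function render(template, page) {
  return template
    .replace('{{ title }}', page.title)
    .replace('{{ body }}', page.body);
}

function buildSite(pages, template) {
  // Each content entry becomes one static HTML file.
  return pages.map((page) => ({
    filename: page.slug + '.html',
    html: render(template, page),
  }));
}

const template =
  '<html><head><title>{{ title }}</title></head>' +
  '<body>{{ body }}</body></html>';

const site = buildSite(
  [{ slug: 'hello-world', title: 'Hello World', body: '<p>Hi.</p>' }],
  template
);

console.log(site[0].filename); // → hello-world.html
```

The output is just files, which is why hosting on S3 costs pennies: there is no application server to run, only static assets to serve.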

Once I got it in place it worked perfectly. Pages loaded fast, and I could customize the site using the tools I’ve used for years as a web developer, not some janky themes or plugins. I was able to create a clean, readable, and responsive site quickly and host it cheaply. After using Jekyll for just a few months, I began converting this site as well.

For me, the decision was easy. I went with Jekyll for five reasons:

  1. I first went with WordPress to use some fancy, SEO-optimized themes. The theme(s) were not worth the money
  2. Beyond those themes, all of the “features” of WordPress were not only unnecessary, they made it harder for me to create the kind of site I wanted. The platform is overgrown and cluttered with features that don’t add much discernible value
  3. I don’t need to post “on-the-go” from my phone
  4. I do all my writing in Markdown, and WordPress did not offer an acceptable workflow for it. Yes, even with the fancy plugins and Markdown “integration”
  5. Finally, and most decisively, I had the technical skills to code my own site

This last reason is why most people will not be able to make the same choice I did. Even with great guides out there like the one I used, implementing a custom Jekyll site is still beyond the average blogger’s reach.

But I think static sites like this one are (or should be) the future for most independent writers on the web. Depending on your business model, there are very few downsides. If you need a membership site or other server-side features this would not be a good choice, but otherwise static sites have huge advantages.

I’m not making a living off this site, so I don’t have those needs. With Jekyll I can run this blog cheaply and easily, and it fits right into my existing writing workflow. It really is perfect for me.


→ What the iOS App Store needs is better search

Bloomberg reported that Apple has a team working on monetizing App Store search, à la Google ads. The idea is that developers could pay Apple for better placement in user searches. From the story:

Among the ideas being pursued, Apple is considering paid search, a Google-like model in which companies would pay to have their app shown at the top of search results based on what a customer is seeking. For instance, a game developer could pay to have its program shown when somebody looks for “football game,” “word puzzle” or “blackjack.”

Paid search, which Google turned into a multibillion-dollar business, would give Apple a new way to make money from the App Store. The growing marketing budgets of app developers such as “Clash of Clans” maker Supercell Oy have proven to be lucrative sources of revenue for Internet companies, including Facebook Inc. and Twitter Inc.

John Gruber has it right:

This sounds like a terrible idea. The one and only thing Apple should do with App Store search is make it more accurate. They don’t need to squeeze any more money from it. More accurate, reliable App Store search would help users and help good developers. It’s downright embarrassing that App Store search is still so bad. Google web search is better for searching Apple’s App Store than the App Store’s built-in search. That’s the problem Apple needs to address.


Quick Grades v. 1.0

Today I launched my first independent software project, Quick Grades. Quick Grades is a free easy-grader iPhone app built for teachers. I created this app for my wife several years ago, and I kept it updated for her. This fall I rewrote it from scratch for iOS 8. My wife, the original tester, loves the app and uses it often—and my hope is that many other teachers discover it and feel the same way.

I will post more of the technical details later, but I want to give you a short overview of the app. For those of you who aren’t teachers, let me introduce you to a tool that nearly every teacher in America knows well: the EZ Grader.

A vintage cardboard EZ Grader

This tool allows a teacher to set the total number of points for an assignment, and then quickly refer back to it to get the correct score for each graded paper. These graders have been around forever. I remember my mom using hers in the early ’80s, and it is the same one that’s sold today. It’s a useful item for teachers, but it’s also one more thing to keep track of.

Many teachers now use apps instead of the old physical EZ Graders. It makes perfect sense, but most of the apps aren’t very good. The one my wife used before was not user-friendly. The font was hard to read, the color scheme was distracting and hurt readability, and it did not have sensible settings or options. After watching her struggle with the app one too many times, I decided to build her a new one.

After three years of updates just for her, and all the changes from iOS 5 through iOS 8, the app evolved quite a bit. The functionality never changed, but the simple premise of this app offered lots of room to experiment and find the right design.

A four-column design with a scrolling table for the scores turned out to be the right approach. Other applications jam too much information into one view. The choice to use a scrolling table view allows for larger and more readable text, and scrolling to find the right score is intuitive and fast.

The second major design choice was to allow the user to decide exactly what columns they want. Every teacher likes to see a different view based on their grading systems and preferences. Allowing the user to remove certain columns from the view also improves readability.

Additionally, I made the easy decision not to interrupt my users and ask them to rate my app. This is a topic that has generated a lot of conversation in the last year, and I land firmly in the ‘anti-prompt’ camp. Yes, App Store ratings are important. Yes, I really do hope my users like the app enough to rate it favorably. But no, I will not be interrupting their work to ask them to do so.

This app is small and focused. It’s simple by design, and a perfect use case for a mobile app. Your phone is always with you, so you don’t have to keep up with the old cardboard EZ Grader. On a phone the simple interface is better than a more complicated one. Clean design and a focus on the user’s needs yields a simple, yet very useful app.

That’s my hope for every teacher who downloads Quick Grades—that it makes their work just a little bit easier. Our teachers work so hard, for so little pay, and with many challenges in their way. If I can help by adding something to their toolbox that is just a bit better than the alternatives, then I will consider this project a success.

Quick Grades is now available for free on the iTunes App Store.


For New Projects, Swift or Objective-C?

In the coming days I will be launching a new app on the App Store. I built this application for my wife three years ago after watching her use another app to help grade papers. Because of my employment at the time I could not release it, but now I will be doing so.

This fall I rewrote it from scratch in Swift for iOS 8, and I really enjoyed it. Learning Swift, the good and the bad, was fun. Now, I am looking down the barrel at two new independent projects, one iOS and one OS X.

Objective-C was never a language I felt fully comfortable with. I was competent and was able to build several projects, but I never loved it. I enjoyed the power and flexibility of the Cocoa frameworks, I loved making apps for the iPhone and iPad, but Objective-C never quite fit me. It was a means to an end.

Swift is a different story. While there are still wrinkles in the language and the community is working through best practices and conventions, it is surprisingly capable for a new language. Apple made a smart move in using the Objective-C runtime and working to make sure it was interoperable with Cocoa frameworks written in Objective-C. This is certainly where the power and flexibility come from. But the language itself is comfortable. It takes many of the things I like from other languages and melds them into a tool that really works for Cocoa.

I will almost certainly write my next two projects in Swift, unless I run into a technical limitation, because it’s simply more fun. I am considering Objective-C for one of them, solely to make sure I stay sharp with the language, but I’d be surprised if I do.

It may be reckless to adopt a new language so readily, and I may regret it, but I don’t think so. I cannot imagine Apple pulling back on Swift support after the start it’s had. For the time being I will be plunging into the near future with Swift. We’ll see how it goes.


→ A valid counterpoint

From Poul-Henning Kamp:

That is the sorry reality of the bazaar Raymond praised in his book: a pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT “professionals” who wouldn’t recognize sound IT architecture if you hit them over the head with it. It is hard to believe today, but under this embarrassing mess lies the ruins of the beautiful cathedral of Unix, deservedly famous for its simplicity of design, its economy of features, and its elegance of execution.

One of Brooks’s many excellent points is that quality happens only if somebody has the responsibility for it, and that “somebody” can be no more than one single person—with an exception for a dynamic duo. I am surprised that Brooks does not cite Unix as an example of this claim, since we can pinpoint with almost surgical precision the moment that Unix started to fragment: in the early 1990s when AT&T spun off Unix to commercialize it, thereby robbing it of its architects.

It’s very easy for me to get overly confident in the open source movement and ethos as the only way forward. It’s good to hear a valid counterpoint every once in a while.


Observations from a Node newbie

It’s been a few months since I started playing around with Node.js, and in the last few weeks I started writing my first full application in Node. I’ve found myself really enjoying it, even if I have not figured out all the nooks and crannies yet.

In this time I’ve come to realize that Node is very different from other languages/frameworks I’ve coded in. Most of these differences impressed me, but some of them led to quite a few frustrating moments. The list below is a subset of my observations from a few months of working with Node—all just my personal opinions, of course.

  • Everything in Node just feels, well, fast. Most of my web development work lately has been with Rails, and I know this probably feeds into the old stereotypes of Ruby, but Node feels faster. Not that Ruby is actually slow, but Node feels lightning quick
  • Everything about Node is flexible. Because of the nature of Node and its implementation, it is incredibly flexible. Even within most of the established frameworks it is incredibly easy to write your application exactly as you want it. In many ways Node feels like the dream of those who love higher level languages. Throw the right npm modules into your package.json, and off you go
  • Everything about Node is almost too flexible. For good or for bad, it feels like I can do anything I want. From reading blogs and other community discussions, it seems like there is not yet a “Node way” in all but the most fundamental aspects of the language
  • It’s still JavaScript. I don’t mean that solely as a knock. Much of the power and flexibility of Node comes from the language underneath it. But, all the things about JavaScript that bug me are still there, and they still bug me. But there is enough value in the whole package to make it worth it. To like it even
  • Asynchronous, event-driven programming is a great fit for the server side. Especially for APIs. Today, I wouldn’t pick a different tool
  • YMMV, but for me the community is still coalescing, and there is not yet a consensus on the “Node way” for many things. One of the things I earnestly love about Rails is the fact that the community has a cohesive point of view on how to use the framework. You don’t have to follow it, but it makes discovering and adopting best practices easy, which is good for the coder and their users
  • I have not found the right way for me to do TDD in Node yet. Maybe it’s a lack of tools, or maybe I just haven’t found what works for me, but testing is very important for me and I’m not yet comfortable

→ Realism on Swift

David Owens:

Swift has it’s warts. It’s a baby that’s trying to grow up quickly in a world that is harsh. The question I’ve been asking myself, both through these posts and throughout the weeks is really this: are Swift’s warts and ugly places worse than those of Objective C’s? Will they be tomorrow?

For the last week I’ve worked bit by bit on rewriting my first iOS app in Swift. Several years ago I wrote this app for my wife because she could not find a grade scale app on the App Store that was easy to use and not ugly. A few weeks later I installed this app on her phone, and she’s been using it ever since.

When I first created the app it was for iOS 5. As the years have gone on I did a few things to keep it up to date, particularly for iOS 7, but I never moved to Auto Layout. With the launch of the iPhones 6 and 6 Plus, I decided it was time, and I figured I’d use Swift. Why not, right?

It’s been a good exercise for me, and I’m learning to really like the language. I am more comfortable in Swift with my little bit of knowledge than I ever was in Objective-C, which I know far better. I enjoy it, but it’s not perfect. That’s why I appreciated this post from David Owens so much. It’s not perfect, but it’s what we’ve got.


I can’t escape the analog world

I work in a company that has been paperless for two decades. At home we have a ton of devices, more than one for every use case. I had a Palm Pilot in college. I haven’t sent a letter in the mail in over a decade. All of my to-dos and calendars are digital. And yet, I can’t escape the analog world of pen and paper.

Of course, the truth is that I don’t want to escape it. I love pens and paper. I love the feel of writing with a good pen. I love the physicality of a notebook. I carry a Field Notes book with me everywhere, every day.

I’m completely stuck in between analog and digital. If I take notes digitally they are easy to find, and easy to copy and send to others. If I take them with pen and paper I remember them more easily, and I am prompted to act every time I open my current notebook.

I don’t have a good solution to this problem. I don’t know how to get the best of both worlds. And yes, I’ve googled this many, many times. I’ve spent hours on the blogs of the similarly afflicted. I’ve found solutions that come close, particularly archiving notebooks via photo in Evernote, but none of them have been practical for me.

So today I will go to work with my good pens and my notebook. I will sit down in my first meeting with my laptop open, and paper and pen beside it. I will continue living in between these worlds, because I don’t have another option.


The spirit of open source and the question of ownership

Update: Jeff Atwood and company have updated the name to ‘CommonMark’ in consultation with John Gruber. My thoughts below the original post.

Today Jeff Atwood (I’m a fan) and a working group announced a new fork of Markdown (which I am writing in, on a keyboard Atwood designed), a text markup language created by John Gruber (also a fan, a big fan)1. They created a new spec, intended to remove the ambiguity of Gruber’s original spec, and a standard set of test suites. They call it “Standard Markdown.”

They created this new project without Gruber’s involvement, and it looks like without his blessing—both of which are just fine in the open source world. I haven’t read the new spec in detail yet, but I think the standard tests are a great idea. This project was clearly conceived with purpose, out of a felt need. I think this is a good idea in many ways.

But the name is terrible. “Standard Markdown” is absolutely the wrong name for this project. And this isn’t nit-picking, either: the name is so bad it casts a pall over the whole project.

First, reusing the name Markdown is a poor choice. I understand that we are all tired of “X flavored Markdown” as a naming approach, but using the name Markdown on a fork of the syntax clearly marks this effort as trying to take the place of the original spec. It’s unnecessarily aggressive.

But beyond using that familiar moniker for the project, the team went a step further and claimed the mantle of “Standard.” This decision cannot be interpreted in any way other than an attempt to wrest ownership of the thing called Markdown from the guy who invented it. It’s an overly bold move, full of hubris.

I’m an open source proponent, but just not a Stallmanist2. I affirm this team’s right to take the Markdown language and syntax and make it better. I affirm the right of others to take that and use it instead of Markdown. That would be a good thing, in fact that is how technology advances. So don’t misread me, I’m in favor of this project existing.

But don’t call it Markdown, and certainly don’t call it “Standard Markdown.” Claiming something new to be a “Standard” version of an existing open source project is poor form. Gruber created the Markdown syntax and its first parser, he claimed a copyright, and released it to the world. It is open source. It has been built on, modified and extended well beyond its original incarnation. That’s all well and good. In a real sense, Gruber has a form of ownership over both the Markdown default implementation and syntax, open source though it is.

The chosen name for this project seems to reveal the intentions of the team behind it: take over the intangible ownership of a successful idea, and its brand, without the permission or help of the creator. Building on top of Gruber’s work is a good thing, it’s a great example of the open source ethos. But trying to claim a “standard” that is divorced from the original spec is not.

The fix is simple. Keep the new spec. Keep the test suite. Keep all the good work that moves Markdown forward. But change the name. It’s not the “standard” Markdown, and calling it that is dishonest.

Updated

On the second try, the project formerly known as “Standard Markdown” has been renamed “CommonMark.” The first rename was to “Common Markdown,” a name that carried many of the same issues as the original.

Ultimately, this is the right outcome. There was lots of back and forth on Twitter between Atwood and Gruber, and of course their respective teams, and it looks like the name contention was eventually resolved in private.

Given the people and platforms involved, there was no chance this would be resolved completely behind the scenes, but this result is good for everyone involved. Certainly CommonMark is not the best name ever, and some time spent thinking about alternatives could have produced something better, but it works. I’m glad the team decided to change the name, and I’m glad the web has a strict Markdown implementation. In fact, I’ll be investigating the test suite myself.

  1. This is a weird perfect storm of fandoms and respect. I’ve got so many horses in this race I think it all evens out. I have so many conflicting biases, I feel unbiased. 

  2. To be honest, I don’t even know what his opinion on this would be. I just think of him as the far-out open source radical. I bet he’d just think we’re all a bunch of sell outs. Which is cool, and maybe right. ;) 


→ Are we prepared for automation?

An automated world is closer than most of us realize.

This is not the Jetsons future I was promised, but that’s okay. There is a problem, though, as the video points out: the economics.

When I talk about automation leading to efficiencies, I often frame it in terms of freeing resources to do more interesting work. I still think that’s true, but this video offers a great rebuttal. There will be people doing more interesting and innovative work than before, but not enough to head off potentially disastrous levels of unemployment.

I really appreciate the summary though, because the automated future is not necessarily bad, but it will be if we are not prepared. As someone who is pursuing automation right now, I must admit we need to spend more time thinking about how to better handle the challenges that will result.


→ Is the App Store better than a minimum wage job?

Brent Simmons:

My city (Seattle) is in the process of raising its minimum wage to $15 an hour, which is about $30,000 for a full-time job. Many iOS indies would do better at minimum wage jobs here than on the App Store.

Ouch. If true, this is cold water on the dreams of many wannabe indie developers. And that’s too bad.

For software developers, the dream of a one-man dev shop, or at least a small team, seems like nirvana. The problem is, nirvana may have a ‘no vacancy’ sign.


→ Confessions of an ex-developer

If you attend an iOS/Mac dev meetup and hang around long enough, you’ll start to hear the whispers and the nervous laughter. There are people making merry in the midst of plenty, but each of them occasionally steps away to the edge of the room, straining to listen over the music, trying to hear the barbarians at the gates. There’s something wrong with the world, Neo.

Great read from Matt Gemmell. I’ve been in this spot and gone back to development, so I get where he is coming from.

There is something on the horizon, although I don’t know exactly what it is. There are so many changes afoot in the development world and in the industry that I don’t think anyone can see very far down the road anymore. Developers have become a recognized asset in the broader world, yet it seems like there are lots of people who don’t want to let that stand. The next decade will be fascinating.


→ DevOps and ‘The Phoenix Project’

When a colleague handed me The Phoenix Project I was very skeptical. I’m a reader and a writer, and if I can be honest, I’m more snobby about it than I should be. I was skeptical of the book, but I read it.

I read it in one day. I really liked it.

It’s not great literature. The criticisms of the storytelling and writing in the reviews I’ve read are dead on. That’s okay by me though, that’s not the point of the book. Sure, in an ideal world the level of art would be higher and match the lofty ideals of IT presented here, but it didn’t, and I’m okay with that.

The book tells the story of an all-too-typical IT shop. Everyone is under pressure, well intended processes are ignored because they are too cumbersome, and IT can’t deliver critical features to the business. Our hero is thrown into a battlefield promotion, and must save the day. He learns DevOps from the ground up, by thinking it through (with a touch of help from an all-knowing operations zen master-type), without ever using the term.

If you want to learn about DevOps, and why everyone thinks it’s a big deal, read this. If you need to convince someone else that DevOps is a good idea, hand them this book. It’s not the best novel, but it gets its job done. Actually, all things considered, it does its job well.


Leaders want the ball

Leading a technical organization is dangerous work. Every week there are literally hundreds of small decisions that must be made without full comprehension of the associated risks or benefits. Each one of these decisions could lead your team down the path to success or failure. Oh, and everyone on your team has a different opinion. That makes decision making very hard, and if we are honest, scary.

Yet leadership is not about decisions. Trying to lead solely through decisions is a recipe for failure, no matter how good they are. Effective decision making is simply management, but leadership is so much more.

Leaders want the ball

In college I played intramurals for my fraternity in every sport I could. I spent a lot of time on the ‘B’ teams, and later served in leadership on our executive council. I loved it all. In that fraternity I learned about leadership firsthand, and it started on the intramural fields.

In the sports I was confident in, I wanted the ball. I wanted to be the one who had the last shot, or who sparked the final goal. I wanted to lead the team, to influence victory. In those sports we saw good victories over the years, and I loved every one of them.

But in the sports I was not good at, I hung back. I passed the ball quickly, I looked to sub out at the end of a close game. It wasn’t cowardice exactly, but it was close. I knew the decisive moment was near, and I didn’t trust myself with it. Victory in those moments did not feel good, but instead, like relief.

As the years went on and my influence in the fraternity grew, I saw the same thing with my fellow officers. The good ones wanted to lead. They wanted the ball. As I got older and took on more serious positions, I started to want the ball, too.

When our biggest decisions came about, the same pattern emerged, and our best leaders stepped up. And in time, they helped others step up. When the biggest decision of my time in school came around, it was my best friend and I who stepped up and led our chapter to the right decision, one which has benefited us every year since.

When I left school the fraternity was in much better shape than when I arrived. It wasn’t me—it was those around me. We had great leaders, and they helped everyone grow. The stakes weren’t as big as they are for me now, and the current decisions are certainly more complex, but the truth remains: leaders want the ball.

Effective leaders build culture

In soccer, the jersey number 10 is sacred. Number 10 is the playmaker, he sparks the great plays. Number 10 is who you want on the ball when the clock goes past 90 minutes. Number 10 scores some goals himself, sure, but Number 10’s value is his ability to rack up the assists.

The technical leader should be the team’s Number 10. They should rack up the assists. They set the team up to win, and do it by building a winning culture.

If you reflect on your time on technical teams, both effective and otherwise, I bet you notice a simple pattern: teams evolve in the image of their leaders. The values and goals of the leader directly set the tone for the team. What the leader recognizes and rewards, the team produces more of. More importantly, the choices a leader makes, their team emulates.

Good leaders want to influence their teams. Good leaders want to model the right behaviors and set a winning tone for their team. Bad leaders simply want to manage the status quo, and eke out a little bit of a performance gain. Good leaders build culture; bad leaders make excuses for why the culture can’t change.

Next time the big decision comes down the line, don’t just make a choice—lead. Bet on your team. Bet on your people. Have the courage to want the ball.

In the end, don’t manage. Lead.

Management presides over bureaucracy. Leadership empowers people.

Management dictates policy. Leadership expects cooperation and accountability.

Management enshrines standardization. Leadership models flexibility.

Management trusts process. Leadership trusts people.

Managers pass the buck. Leaders want the ball.


Evolutionary change fuels revolutions

We commonly draw a distinction between revolutionary and evolutionary change. We categorize some changes, like the invention of the PC, as revolutionary, and others, like a new development framework, as evolutionary. There is something to that idea: some changes do have immediate and wide-ranging effects. But I’ve never seen this as a useful distinction when you are trying to effect change.

This distinction is commonly used to comment on big changes in an industry, product category or culture. Commentators see the big, revolutionary change leapfrogging over many smaller changes. They sort changes into these buckets as an either/or proposition. A given advance must fit into one bucket or the other. There is another group besides pundits that also commonly uses this paradigm to think about how change works: managers.

This kind of hard, binary distinction is something that only an observer, not a creator, could come up with. Creators know that these kinds of changes are really two sides of the same coin. That’s not to say that we don’t need both managers and commentators, but their perspective is, for the most part, that of an outsider to the creative process. Creators know that there is a very fine line, if a distinguishable one at all, between the evolutionary and the revolutionary.

A revolutionary change can only be sustainably made after many smaller changes lay a foundation. Wi-Fi was revolutionary, but it was built on the back of other technologies. The TCP/IP protocol, radio communication, low-power radio receivers, fast high-quality encryption and more led the way to making Wi-Fi possible. For the outside observer, this appears to be a revolutionary change out of the blue, but to anyone in the networking sphere it was the inevitable result of many smaller (from this perspective) innovations.

We must keep this in mind when we are responsible for changing systems, be they software, hardware or processes. Small evolutionary changes fuel the big, revolutionary change. New principles and ideas build on what came before them. To effect change in a large-scale system, leaders and designers must emulate this evolutionary process.

I think the easiest example here is contemplating how a large development organization might make the move to an agile methodology. The leaders certainly can decide to just institute the process changes—daily stand-ups, backlogs, fast iterations and the like—and train the staff in how it should work. This is the way many companies do it. But it is usually not very effective. Because agile is not just a way of running projects, it’s a way of thinking about software, a very different way than many people were trained to think about it. Implementing the changes to the process without changing the way people think, without trying to change the culture, will not result in a meaningful change.

A small change—like your team seeing the wisdom in the agile principles—is not always easy, and it does not occur overnight. A change like this is evolutionary by its very nature. By constantly fostering small changes you can build the foundation for the revolutionary change you want.

Big changes are possible—they happen every day—you just can’t skip building the foundation for them.


For software, stasis equals death

“Stasis equals death.” I picked up this phrase from a really good book on screenwriting, and found that it absolutely applies to the software development world. It applies because it is utterly true. For developers of modern software, products are either under active development or they are dying.

This dynamic was clear in Apple’s transition last year to iOS 7. Within a few weeks of the update, most of the top apps were updated to the new look, and those developers that didn’t felt heavy pressure from their users. Look at anyone’s iPhone home screen months later and there are likely no iOS 6 apps on it.

The apps that were not updated now feel old. They still work as intended, but they no longer meet the expectations of their users, and as a result, they might as well not exist.

In the world of traditional enterprise software we see the same issues. Projects to develop or enhance an application finish, the operations team is (hopefully) trained, and the development team moves on to the next project. Sometimes on the same application, sometimes not. Either way, the project in production might as well not exist to the developers unless it breaks. After all, they have the next big project to worry about. The application enters stasis.

This leaves the users out in the cold. If they have changes for the application or issues that they need fixed, they must submit their issues to the ops team and feature requests to the product manager, or the roadmap committee, or whatever bureaucratic madness the governance folks can cook up. The timeline on addressing those requests then tends to run in months, not days. Even good software at this point starts to feel old and broken.

It’s not just the users who are left out, either. As requests for work on the app stack up, and the next big project holds all the attention, technical debt stacks up on the application in production. The issues may not be visible to the project team, or they may be known but “de-prioritized” in favor of the existing roadmap. Either way, this team runs a very real risk of launching a new iteration of the product that is even further away from what the users actually want and need. Even scarier is this simple fact: the longer these projects take to complete, the farther they can stray from the users’ actual needs.

I know, right now you’re shaking your head and thinking, “It’s a good thing I don’t have this problem, because we are Agile.” Well, you might be right. Maybe.

Agile as an idea is a good one, maybe even the right one. But ‘Agile’ as sold by consultants and book publishers is not always the answer. There are a lot of people who will disagree with this, but it’s usually because they are selling Some Agile Way™.

Agile is nothing more than a set of ideas, ideas that build a culture. Merlin Mann talks about this concept in the latest episode of Back to Work: the culture you create will define every aspect of your organization. Adopting Agile methodologies can help, but only if you are committed to building a culture that really believes in the idea of agile. Either way, an agile approach will not solve our problem by default.

So our problem remains. How do you avoid software stasis? By focusing your work on the product and not the project. This idea is not new, but it should be reconsidered by most of the software development world. Your primary concern should not be the project; it should be the application.

There are many implications of these ideas, but product ownership can be implemented through a few simple concepts:

  • Organizing development teams by product. Each application or service has its own dedicated team
  • Tracking and scoping requirements by product, not project. For strategic initiatives you certainly need to have some mapping of the initiative’s requirements to the product requirements, but the system of record should be at the product level
  • Projects will come and go, but the product teams endure, as does the product feature and fix backlog
  • The product team must really own the product, and be in a direct relationship with its users.
  • The development process must trust the team with ownership of the product

This breaks applications out of stasis. In this model, the product is never “done”; it evolves. With the team and work focused around the product, the developer and his users can really own and drive the product forward together, without the long lag times of the project-focused timeline. It puts the software and its use at the center of development activities.

This idea does not solve all the problems of a dev team. Like all things in life, there is no silver bullet. But by viewing products as the key organizing factor, an organization can focus on the constant changes required to keep up with users’ evolving needs and demands. Isn’t that, after all, the key mission of a development team?


Simple is hard work

When solving any problem, we tend to look to the simple answers first. This is a good response, and one that should be nurtured. But we can easily be misled by that simplicity. This typically happens in two ways. First, we think the simple answer means less time than other answers. We tend to equate simple with fast. Second, we assume that a simple answer is easy to implement. We assume that what is easily understood can be easily done. We equate simple with easy. As a result, we often misunderstand the impact of a simple answer to a hard problem. Because, truthfully, simplicity is hard work.

Let me make this more concrete. Take the idea of a microservices architecture, something I have been thinking about a lot lately. The microservices architecture is “an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API.”

This is a simple idea. In fact, most laymen will be able to grasp it with about 10 minutes of explanation, and an IT professional should be able to grok it in a matter of minutes. The idea is straightforward and elegant. Break a single system down into multiple, small systems. Put application logic into the services, not the data or transport layers—“smart endpoints and dumb pipes,” as Martin Fowler puts it. Call the services asynchronously, and run each of them independently. The concept is not complicated.
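To show how little the core idea demands, here is a minimal sketch of one such small service: a single-responsibility process that owns its own data and exposes it over a lightweight HTTP resource API (the “smart endpoint, dumb pipe” shape). The inventory example, routes, and names are all illustrative assumptions, not from the text, and this is a toy using only the Python standard library, not a production service.

```python
# Sketch of one "small service": it owns its data store outright and
# offers it to other services only through a plain HTTP/JSON resource.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# This service's private data store; no other service touches it directly.
INVENTORY = {"sku-1": 12, "sku-2": 0}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /inventory/<sku> -> JSON resource, or 404 if unknown.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "inventory" and parts[1] in INVENTORY:
            body = json.dumps({"sku": parts[1], "on_hand": INVENTORY[parts[1]]})
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode())
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        # Silence per-request logging to keep the sketch quiet.
        pass

def run(port=8000):
    # Each service runs in its own process; the pipe stays dumb (plain HTTP).
    HTTPServer(("127.0.0.1", port), InventoryHandler).serve_forever()
```

The logic lives entirely in the endpoint; the transport carries nothing but plain HTTP and JSON. It is exactly this apparent triviality that sets up the hard questions that follow.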

But implementing this idea is hard work. How do you deal with decentralized data stores? How do you coordinate operations between services without tightly coupling them? What about managing API documentation? Automated testing and deployments on multiple frameworks or platforms? All of these questions sit squarely in the middle of the microservices concept and must be answered.

With all of those questions in mind, is the idea still a simple one? I’d say so. The idea is still clear and easy to understand. But it doesn’t look easy now, does it?

One of the best lessons a good developer or architect can learn is how to continually ask the question, “then what?” Always push to the next question, to the next decision. Consider each one fully in turn, and then in light of the whole. You cannot be satisfied with the simple, obvious answer if you cannot work through all of its consequences. Don’t be fooled: in the software world, simple is never easy. It’s hard work.