This marvelous graph was produced by Michael DeGusta in an erudite post on The Understatement blog on the disruption the recorded music industry has endured in the last 40 years. It’s a gloomy picture, and working for a publisher (of books, magazines, video, apps and eBooks) gave us cause to ruminate on some of the underlying drivers of the collapse of music publishing.
In a nutshell, after a decade of economic crisis the music publishing industry no longer sells a lot of CDs, album sales have been replaced by sales of singles on iTunes, and piracy has been widespread. The only legacy format remaining is kept alive by the vinyl nerds (and I’m looking at you, James).
A plethora of moving economic variables is at play in this chart from the time CD sales peaked around 2000, but we think it’s possible to turn it into a mathematical formula that can be further explored by publishers in books, video, and newspapers. We call it the DPQ, short for the Dalton-Pierce quotient:
m State of economy (misery index): the misery index is the inflation rate added to the unemployment rate. It is a raw but effective index of economic suffering.
f Format of content: in music, the arrival of the widely agreed standard of the MP3 file enabled recording, storage, playback, sharing and commercial transactions to take place over a single song.
a Atomisation of content into chunks: the single has replaced the album as the unit of consumption in the music industry. You can read all about that on the original post.
d Devices for consuming content: in music the arrival of MP3 players (notably the iPod, but remember the Rio?) heralded a major change. Cheap, portable players were supplemented mid-decade by cheap, gargantuan hard disk drives that could store a whole music collection. If you doubt the impact hardware can have on an industry, check out the arrival of the Sony Walkman and the consequent fattening up of the cassette market on the chart between 1982 and 1985.
c Control of distribution: a trip to the music store to buy the latest album or 45 was a great adventure for me in the 1970s and 1980s. Music publishers grew strong and controlled the retail supply chain with an iron fist, including the complementary industries in radio and TV for promotion of songs and albums. Nobody controls the internet – the best you can hope for is to control part of it – like Amazon music and Apple iTunes.
So the maths is simple: disruption is accelerated overall by the context of poor economic times, when consumers are motivated to change their spending habits. When the denominator in the equation gets smaller (as in, the internet becomes the channel and you lose control), disruption grows dramatically. The multiplier effect of the three components of the numerator is self-explanatory – and in music publishing all three were impacted. Hence, massive disruption.
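Written out as an equation – and to be clear, the exact form below is my reconstruction from the description above, not a published formula – the quotient looks something like this:

```latex
\mathrm{DPQ} = m \times \frac{f \times a \times d}{c},
\qquad \text{where } m = \text{inflation rate} + \text{unemployment rate}
```

A shrinking denominator c (losing control of distribution) or a rising misery index m both push the quotient up, and disruption compounds when all three numerator terms – format, atomisation, devices – move at once.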
The same formula can easily be applied to other publishers and media – I’m presenting a short paper on this subject at the 2011 AIMIA V21 conference (Digital DNA) in Melbourne on the 12th of April, and look forward to a robust debate. The good news is there is a solution to making the quotient work for you, not against you.
Governance and funding of business investment using agile development methods are two hotly debated and misunderstood topics among the agile community.
With governance principles originally borrowed from the exacting fields of construction engineering and accounting, where things can often be calculated to two decimal places, it seems odd that in software ‘engineering’ we more often than not disappoint our customers by delivering a different result from the one that was blueprinted or planned.
Fred Brooks’s 1975 treatise The Mythical Man-Month was probably the first to hint at the core differences between engineering a bridge and the knowledge work of solving a problem by creating code. The 20th anniversary edition includes his 1986 essay ‘No Silver Bullet’, in which he further reflected on the shortcomings of applying a mechanical engineering metaphor to software development. It is a must-read book for everyone involved in agile or lean development – and if you’ve read Clay Shirky’s 2008 book Here Comes Everybody you’ll recognise some early sources of Clay’s work on the complexity of organisations.
By the way, Clay Shirky also has my favourite definition of governance – ‘rules for losing’.
Fred aside, one of my greatest heroes of the software world is Tom DeMarco, who wrote the seminal text on project and engineering control, Controlling Software Projects: Management, Measurement, and Estimation (Prentice Hall/Yourdon Press, 1982). But this one is a book you probably do not want to own. Why? Well, approaching 70, Tom gave us the benefit of his time to reflect in the newsletter of the IEEE Computer Society in 2009, downloadable here:
My favourite quote, which describes in a nutshell why measurement and management aren’t as closely linked as he surmised 27 years earlier:
“Imagine you’re trying to control a teenager’s upbringing. The very idea of controlling your child ought to make you at least a little bit queasy.
Yet the stakes for control couldn’t be higher… now apply “You can’t control what you can’t measure” to the teenager. Most things that really matter–honor, dignity, discipline, personality, grace under pressure, values, ethics, resourcefulness, loyalty, humor, kindness–aren’t measurable.”
And a lot of product deliveries and projects I’ve been involved with acted way more like moody, irrational, changeable teenagers than respectable septuagenarians.
Who hid the mouse? A wonderful question posed by a 4-year-old child trying to interact with the television on which her boring Dad insists on watching 1970s sitcoms. It is wonderful because it portends a generation of people who will not tolerate passive consumption of broadcast information jammed at them on a fixed schedule every evening after the news has finished.
This world is coming faster than we think, with the launch of smart (read: internet-connected) televisions in the next few months from mainstream manufacturers. Anyway, we commend this pair of videos to you as well worth investing some time watching:
Imagine a typical job interview situation. Bright young thing (BYT) in the chair opposite you (as the hiring manager), with a resume to die for and two open-source hackapps in the local market; they’ve survived the pair-programming test with your wiliest developer, and you’re secretly very happy with the skills and experience you are about to steal from a rival in a limited pool of technical talent.
So you pop one final question: “tell us why you are thinking of leaving your current employer?”
If they shoot back “well, I’ve really stopped learning there”, the interview is over. Do not hire that person.
Now, in the current over-cooked Australian market for tech and product talent, you’d say I was crazy and irresponsible to offer that advice. Let me offer my defence.
People leave their education and take the skills they have gained to their first employer. The mixture of their personal traits (intelligence, customer focus, self-motivation etc), skills gained from academia, and background enable them to slowly master the work at hand with plenty of guidance. Soon enough though, the job gets stressful, repetitive, and money becomes an issue. So they jump. First job syndrome.
At job number 2, the workplace is different. Nobody knows precisely what our new hire doesn’t actually know, and with the likely change in corporate culture, along with expanded duties and responsibilities (to justify that pay rise), they will likely get by, meeting expectations with the skills they brought with them. They will spend a lot of energy just fitting in with the new people, and may well apply their limited skills to the new tasks and environment and think they are learning new stuff. But they get tired, are a bit too busy with their social life to read much, and the work starts to feel a bit repetitive.
Soon enough, maybe a year later, maybe two, they jump ship to you for more money and ‘opportunity’ (or whatever you put in that job advert ;-). And they give you the dreaded line “I stopped learning there”, making them sound ambitious and intelligent all in one go.
Now, when did they actually stop learning stuff? Last week? Last job? The one before that? Or at university? For me, ‘I stopped learning’ is a lame-ass excuse and a mealy-mouthed defence to a recruiter. Learning starts with the individual; it is their own responsibility. In this world, it is almost impossible to stop learning given universal access to information. It is cheaper than ever through e-books, blogs, tweets, and ahem web pirates. And if they’re from an agile employer, something is badly wrong – they should be learning every time they have a retro or pair program.
Past behaviour is definitely the best predictor of future behaviour.
I suggest a follow-up question that gets to the heart of the problem in a tight labour market: ‘what have you read lately that helped you in your job?’ or ‘what have you found interesting on the web lately in your field?’
No good answer, no hire. Nerf them on the spot.
Software Education run an annual conference in both Wellington and Sydney that focuses on the changing role of Business Analysts in the agile sphere – a tricky and much debated topic. I had the honour of being invited to present a keynote at the official dinner where I wove together tales of life in an agile world at Lonely Planet, with some key lessons from commentators ranging from the 1970s to the present day. Plus shot Martyn with a nerf gun. Excellent evening indeed.
We will post the presentation and speaking notes in a separate post.
Key Reading and Resources
- From Gutenberg to Zuckerberg, Gus Balbontin from Lonely Planet presenting at O’Reilly Tools of Change, 2011
- The Innovator’s Dilemma, Clayton Christensen, 1997
- How Great Leaders Inspire Action, Simon Sinek at TED, 2010
- Dan Pink on the Surprising Science of Motivation, TED Talk, 2009
- Steve Blank on Customer Development (The Four Steps to the Epiphany)
- The Entrepreneur’s Guide to Customer Development, Brant Cooper and Patrick Vlaskovits, 2010
- Rocket Surgery Made Easy, Steve Krug, 2009
- The Mythical Man-Month, Fred Brooks, 1975
- Steve Hayes’ Blog on The End of Passion.
My thanks to a wonderfully receptive audience and our fine hosts Softed.
A picture I have used often to talk about the journey to Agile ways of working shows a series of beautiful chrysalises hanging from a branch.
The audience are generally lulled into a serene state as they presume I’m saying the change will occur silently, secretly before their eyes.
“Rivers knew only too well how often the early stages of change or cure may mimic deterioration. Cut a chrysalis open, and you will find a rotting caterpillar. What you will never find is that mythical creature, half caterpillar, half butterfly, a fit emblem of the human soul, for those whose cast of mind leads them to seek such emblems. No, the process of transformation consists almost entirely of decay.”
The truth is many organisations start with a small, insignificant project so that, should things go wrong, the consequences will be small and the bizarre experiment can be swept under the carpet.
Start with something that has consequences, that people already have an emotional commitment to finishing – preferably to do something great or amazing.
A bit like NASA testing the first space-suit design in the space race of the 1960s. Some of my favourite video footage ever made is of Joe Kittinger, in many ways the first ‘astronaut’ – the man who in August 1960 volunteered to test a space-suit design by heading to the edge of the atmosphere in an era when no rocket could get you there. So they sent him up hanging from a weather balloon.
How does he get back to earth? He makes the biggest leap of faith ever.
Apocryphal or not, this story from the science of space-flight does provide us with one of my favourite quotes, from German rocket scientist turned US space program engineer Wernher von Braun.
Wikipedia does a fine job summarising his life and work, and his contribution to the history of manned space flight through rocket design. I’m more interested in a view he held that aligns strongly with a lesson learned from Lean thinking.
As technologists we often feel compelled to automate things. The great promise of robots and computers was that they would take the drudgery out of daily tasks, leaving humans to think of higher matters. Like watching cats on the internet I suppose. But we usually automate the wrong things.
The natural temptation is to build software or machines that automate the complex, multi-variate, mind-boggling tasks to take out the variation and pain that would occur if a person did them. The resulting code or machine is naturally complex, perhaps incredibly clever, but equally unsuited to varying that solution by even a jot.
Lean teaches us to automate the simple, repetitive tasks and leave the complex, decision-heavy, multi-stranded, multi-variate tasks to a person (or people). When eventually they have innovated the task to be simple – you might automate it. Wernher had a lovely take on this issue:
“Man is the best computer we can put aboard a spacecraft – and the only one that can be mass-produced using unskilled labour.”
This thought resonated with me today as we heard of the battles our smartest software developers have endured over the last 24 months writing a robot to interrogate text files and break down the content into different types. Every time the original document is improved for the product to be more customer-friendly (which is often), our robot breaks. Stimulated by the intellectual pursuit, we have coded our hearts out to solve this, often resolving that if the upstream folk could just stop changing the frickin’ format to please customers we’d be fine!
Then our own rocket scientist had a thought. What if we got people to do the job instead, giving them some simple tools to process, post-process and augment the content along the way? Ugh. Not sexy. Not ‘elegant’. But several times more efficient and producing output at the 99% quality level. Lunokhod thinking!
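To make the Lean split concrete, here is a minimal sketch of that shape of pipeline: the machine does the simple, repetitive work (splitting and tidying the text), and the hard, judgement-heavy call – what type of content each chunk actually is – is packaged up for a person. Every name here is illustrative; this is not our actual tooling, just the pattern.

```python
def normalise(chunk: str) -> str:
    """Simple, repetitive work: safe to automate. Collapse stray whitespace."""
    return " ".join(chunk.split())


def split_into_chunks(document: str) -> list[str]:
    """Another mechanical step: break the file on blank lines."""
    return [c for c in document.split("\n\n") if c.strip()]


def queue_for_human(chunk: str) -> dict:
    """The complex, multi-variate decision – classifying the content –
    is deliberately NOT automated; we just hand a reviewer a tidy task."""
    return {"text": chunk, "content_type": None, "reviewer": "unassigned"}


def process(document: str) -> list[dict]:
    """Automate the plumbing, queue the judgement calls."""
    return [queue_for_human(normalise(c)) for c in split_into_chunks(document)]


tasks = process("Getting there\n\nFlights leave daily.\n\nSleeping\n\nTry the old hotel.")
print(len(tasks))  # 4 tidy chunks queued for a human to classify
```

When the upstream format changes to please customers, only the two trivial functions are at risk; the expensive judgement sits with a person who shrugs and carries on.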
I close with another von Braun piece of wisdom, which perhaps comes from his rare reconciliation of both religious and scientific passions in his life:
“You must accept one of two basic premises: Either we are alone in the universe, or we are not alone in the universe. And either way, the implications are staggering.”