Head of Design at BT / Founder & Designer at Real
Here at BT and EE, it can sometimes take longer than expected to ship value to our customers. Whilst we continue to embrace a more agile mindset across our organisation, it's also hard to let go of old habits and more traditional ways of doing things.
Commercially, it's impractical to spend large amounts of time and effort on big bets that might or might not work. Instead, we need to accept that customer behaviour and demand are transient, so we need to move fast and learn as we iterate (nothing new to see here).
To help us move fast and meet that customer demand, we lean on several processes and frameworks (the Build-Measure-Learn canvas, the Five Whys, the problem framing canvas, etc.), but like any other team, we are still learning how best to use these tools to our advantage to help us ship great products.
For a handful of squads in my tribe, shipping any form of value (quick wins or big bets) to our customers regularly can be a long, complicated and sometimes painful process that can take months.
Clearly, this isn't a sustainable way to continue operating, and I wanted to help fix these problems as quickly as possible.
To help me better understand the problems we were dealing with, I conducted several squad interviews to learn first-hand what stopped them from shipping value to our customers more efficiently.
Here are the biggest issues we discussed:
The issues I uncovered worried me.
Some of the issues felt like fairly simple skills gaps that we could fill with the right level of training and support. Others felt more institutionalised and habitual, which would be a lot harder to break down and adapt to a new way of working.
I desperately wanted to understand why some of these issues had become common practice, but there wasn't a very clear answer. When I asked the squads to help me better understand these problems, deadlines and technical limitations were suggested as the main reasons why our squads currently work the way they do.
My hunch was that whilst these reasons were a factor in holding our squads back, they weren't the full picture. Varying levels of experience, self-organisation and misaligned expectations for collaboration in each squad also contributed to some of the issues we had identified, and to help fix this we needed to hit the reset button and go back to basics.
Without wanting to reinvent the wheel, I wanted to return to some of the more foundational ways of working for agile squads, where I'd personally seen success in designing and building great products.
I started by revisiting Jeff Patton's guide to dual-track development, where one part of the squad focuses on predictability and quality (the development track) and the other on fast learning and validation (design and discovery).
This dual-track approach is not to be confused with a 'duel' track, where the two tracks in the squad are separate or competing in some way. In fact, the very opposite is true: whilst there are two tracks in the squad, everybody is involved in discovery, planning and design tasks where possible.
Each discipline lead (product, design and engineering) also suggested daily activities we felt could be taking place, alongside some helpful tips that might encourage the squad to try something different and deliver more efficiently.
We also encouraged each squad to share daily diary studies documenting what went well and what could have been better. This data would help us identify areas for improvement when we come to scale this way of working across the rest of our tribe and beyond.

Objectives
There were two sets of objectives that I'd identified: squad objectives that were measurable through common agile metrics (cycle time, lead time, number of deployments and throughput rate), and behavioural objectives that were measurable through regular feedback, retros and observation after the pilot was completed.

Squad objectives:
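To make the squad metrics concrete, here's a minimal sketch of how cycle time, lead time and throughput can be derived from ticket timestamps. The ticket data and field names are hypothetical, purely for illustration; any real tracking tool would supply its own records.

```python
from datetime import datetime
from statistics import mean

# Hypothetical ticket records: (created, work_started, deployed) timestamps.
tickets = [
    (datetime(2022, 9, 1), datetime(2022, 9, 5), datetime(2022, 9, 9)),
    (datetime(2022, 9, 2), datetime(2022, 9, 6), datetime(2022, 9, 12)),
    (datetime(2022, 9, 3), datetime(2022, 9, 7), datetime(2022, 9, 10)),
]

# Lead time: from the ticket being raised to it reaching customers.
lead_times = [(deployed - created).days for created, _, deployed in tickets]

# Cycle time: from work actually starting to it reaching customers.
cycle_times = [(deployed - started).days for _, started, deployed in tickets]

# Throughput: tickets shipped per week over the observed window.
window_days = (max(t[2] for t in tickets) - min(t[0] for t in tickets)).days
throughput_per_week = len(tickets) / (window_days / 7)

print(f"avg lead time: {mean(lead_times):.1f} days")
print(f"avg cycle time: {mean(cycle_times):.1f} days")
print(f"throughput: {throughput_per_week:.1f} tickets/week")
```

Tracking these before and after a pilot gives a like-for-like baseline, which is how the pre- and post-pilot comparisons later in this piece were framed.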
I identified five key results that would help us track the objectives throughout the pilot.
Our Pay & Control Costs (P&CC) squad were a perfect candidate for this pilot. The squad owns several customer goals, including setting up direct debits, paying for services, managing your billing account and more.
A few members of the squad were still fairly new to BT and EE, so we'd get the benefit of a fresh perspective. They had existing access to our Loop design system, libraries and staging environments, as well as a meaty problem to solve: every month, we receive thousands of calls to our customer service team from people who have forgotten their login details and wish to make a payment, or from customers who wish to pay on behalf of someone else (e.g. an elderly relative).
The squad put forward a hypothesis: providing an online journey for our customers to pay off the debt on their account, or to pay on behalf of someone else, with very little upfront information required, would alleviate the pressure on our call centres and make our customers' lives just that bit easier.
This was our chance to throw the rule book out of the window and change our existing way of working to see what might stick.
Here are a few adjustments we made:
The squad's hypothesis was tested and validated through a rapid single-feature MVP that was designed, built and shipped to a small percentage of our customers over four weeks.
Before the test was shipped, the squad established their measure of success as an increase in conversion rate (of customers successfully making a 'logged out' payment) between 5% and 10%.
After the first day, the conversion rate had increased by a massive 11%!
The results from the pilot were very encouraging. Not only did we drastically reduce both cycle and lead times for the P&CC squad, we also increased throughput rate and completely re-energised their way of working and approach to iterative product development.

Pre-pilot squad metrics:
It was a bit of a mixed bag of results when looking back at our OKRs.
Whilst the squad were able to ship more incremental value to our customers every sprint and use a different type of MVP with clearer success metrics, we weren't able to find an efficient approach to conducting research in an agile environment.
The squad themselves had all the support they needed, when they needed it, but the researchers were quite burnt out in the process, and providing dedicated days of support like that clearly wasn't sustainable.
Our researchers were working hard, not smart.
Despite this observation, the squad's change in behaviour was exciting. Without being asked, the squad took the initiative to completely change our existing deployment cycles from monthly to completely unrestricted. This might sound somewhat unremarkable, but it was a massive step forward for us to be able to ship and learn on a much quicker timescale.

Squad feedback
The squad were generally quite receptive and open-minded to the pilot. They learnt a lot and could see the benefits of what we were trying to achieve.
"Having available user testing resource has been great, alongside a very supportive tribe." Hannah, Scrum Master
"Testing within three days was stressful, and we struggled to get the right kind of participants recruited in time. Having said that, it was great to see the squad organised and focused, with clear learning objectives." Arthur, User Researcher
"Working on specific tasks in a limited timeframe has allowed us to work in a much more focused way. We seem to be faster at unblocking obstacles and making sure the work gets done." Tim, Content Designer
"Overall a great experience; we are working a lot more efficiently as a squad, constantly collaborating and supporting each other." Jordan, Product Owner
"If we were able to conduct unmoderated tests ourselves it would be much easier, and we could test more often." Jon, Product Designer
Since the pilot concluded in mid-October, the P&CC squad have continued to embrace this way of working, to great success.
We've used their story to scale these processes and ways of working to other squads across BT Digital, as well as other CFUs (customer-facing units), by hosting regular lunch-and-learns and drop-in sessions where people can come and learn more about what we achieved and how we did it.
As for a renewed approach to user research in an agile environment, it's back to the drawing board.
Our research ops team and I have since been collaborating on a longer-term vision: providing user testing capability in each squad by giving them the right training and direct access to usertesting.com. Doing so will hopefully provide the flexibility and independence the squads need to deliver efficiently throughout each sprint.
More to come, watch this space!