Designing with data at Songkick

A look into product experimentation at Songkick

A bit of context


I joined Songkick as a product designer in 2017. Each quarter, teams at Songkick are given a specific problem to solve, linked to product metrics. The product, tech, and design leads then work together to define hypotheses and experiments aimed at moving those metrics. When an experiment gets positive results, it is refined and productised, and all of this knowledge helps shape the vision of the product.

Songkick helps millions of fans get tickets to see their favourite artists. Fans simply track the artists they love, or import them from Spotify, Facebook or Last.fm. They then receive alerts when those artists play nearby, and they can set an alarm before tickets go on sale to make sure they don't miss out.

The fans team

As the product designer, I was researching user needs, defining and launching experiments, and working on the vision of the mobile app.

I had been using Songkick since I lived in Madrid, many years ago. Back then, I was getting concert notifications only a couple of times per year. One day, I decided to track London as well (Madrid<>London is an easy trip). After doing that, my experience was completely different: my favourite artists were playing pretty much every single week in London. I got so excited every time a Songkick notification popped up on my screen that it even influenced my decision to move to London.

I wanted to see how many inactive users were having a similar experience, since our main goal was to increase fans' activation.


First, I looked at users outside of London/New York, and I realised many of them were tracking a lot of artists, but had just a few concerts in their calendar. They were tracking a location where their artists would probably never play.

I wanted these people to experience the same enthusiasm I felt when tracking a city where I could actually see the artists I love. The hypothesis was: "If we present people with additional major locations, then they will track them, because they are willing to travel large distances to see their favourite artists play."

To test this quickly, before jumping into designs, we tried sending an in-app notification and tracking users' interactions. Would they tap on the notification? Would they track that city?

We sent a push notification to users who had fewer than 5 concerts in their calendar but were tracking more than 50 artists, recommending a location within a 1,000-mile radius where those artists were actually playing.
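As a rough illustration of that segmentation (a minimal sketch, not Songkick's actual code; the class and field names here are assumptions), the targeting rule boils down to a simple filter:

```python
from dataclasses import dataclass

@dataclass
class User:
    # Hypothetical fields standing in for Songkick's real data model.
    user_id: str
    tracked_artist_count: int
    upcoming_concert_count: int

def in_push_segment(user: User) -> bool:
    """True if the user tracks many artists but has an almost empty calendar."""
    return user.tracked_artist_count > 50 and user.upcoming_concert_count < 5

# Example: narrow a batch of users down to the experiment's audience.
# (Picking the recommended city within 1,000 miles is a separate step, omitted here.)
users = [
    User("a", tracked_artist_count=120, upcoming_concert_count=2),
    User("b", tracked_artist_count=12, upcoming_concert_count=0),
]
audience = [u for u in users if in_push_segment(u)]
print([u.user_id for u in audience])  # -> ['a']
```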

I was super excited when we launched the experiment and was looking forward to the results.

Actual footage of me expecting the results



But... spoiler alert. The experiment wasn't a success. When the results came back a few weeks later, only 0.3% had interacted with the notification. And of those, only 5% had tracked the location.

Overall, the numbers weren't big enough to draw conclusive results or to productise this feature.

Actual footage of me looking at the results

A change of approach

We kept experimenting in different parts of the app that had more traffic. But many times we reached the same conclusion: numbers weren't big enough. We needed bigger volumes to be able to test things quickly and get meaningful results. Good thing Songkick's website has 8M unique visitors per month.

We decided we would test our hypotheses on Songkick.com, and we would use that knowledge, and the discovery work we were doing, to inform the vision of the mobile app.

We launched a similar experiment on the website, shown to 10% of users for 2 weeks: 

The module recommends locations where people's tracked artists play

The results were very positive: 16.7% of users tracked at least one location, and of those who tracked, 68% ended up activating. After this, we productised the feature on both the website and the mobile apps.
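For context on what "10% of users" can look like in practice, here is a minimal sketch of deterministic experiment bucketing; this is an assumption on my part, not a description of Songkick's actual setup:

```python
import hashlib

def in_experiment(user_id: str, experiment_name: str, rollout_pct: float = 10.0) -> bool:
    """Deterministically assign a user to an experiment bucket.

    Hashing the user ID together with the experiment name gives each user a
    stable position in [0, 100); users below the rollout percentage see the
    new module, everyone else sees the control experience.
    """
    digest = hashlib.sha256(f"{experiment_name}:{user_id}".encode()).hexdigest()
    bucket = (int(digest, 16) % 10000) / 100  # stable value in [0, 100)
    return bucket < rollout_pct

# Example: roughly 10% of visitors are shown the location-recommendation module.
print(in_experiment("user-123", "location-recommendation-module"))
```

Hashing on the user ID keeps each visitor in the same variant across sessions, which is what lets you measure something like activation over the full two weeks.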

Gimme gimme more

It was taking us a couple of days to build and launch experiments on the website, and a couple of weeks to get meaningful results, so we experimented across many different areas of the product. The great thing was that I could also use Hotjar to see exactly how people were interacting with these experiments, so on top of the numbers, I also started spotting some behavioural patterns.

Artist recommendation module that shows similar artists when users are searching

A/B test during onboarding, allowing users to search for other artists to track or see recommendations (only if they didn't import from Spotify)


The vision

After these experiments, I started translating the learnings into the mobile experience. We needed the app to be more proactive in recommending relevant content to people: suggesting the right locations and the right artists, rather than showing everything at the same level.

In terms of architecture, I could see that people were only looking at two main sections: the calendar of concerts and the alerts. In many cases they would have an empty experience in both, so that was the first thing to change: ensure they would never face an empty screen. Otherwise, mobile users were losing all the discovery experience that website users had access to.

In the current app, many users would stay on the first screen without interacting

I summarised all our learnings to put together principles for the mobile app: 

1. Never miss out on concert tickets for an artist you love. This is the core of Songkick. Users need to get notified when tickets for their favourite artists are announced, if they are playing in a location they can travel to.

2. Relevant content only. Quality over quantity: for users tracking a lot of artists, we needed a way of deciding which artists mattered most to them.

3. Going to concerts is a social experience, so we need to design for that. We discovered a big proportion of users were missing out on concerts because they didn't know if their friends liked the band they wanted to see. We needed to change that.

The main areas of the product reimagined: Discovery, Alerts, and a Social profile.


Conclusion

During my time at Songkick, the most valuable learnings I took away were: 

- Be creative and find ways to get the learnings you need in a short time.

- Don't get too attached to experiments, but spend time making sure they are well designed and will give you results you can trust.

- If they fail, try to understand why, then decide whether to iterate or move on.

- If you're A/B testing, change one thing at a time, or you won't be able to understand your results.

- Keep working on the vision, and see how it changes with every new learning.

Thanks for reading! 

