When it comes to getting more downloads, the first thing everyone thinks about is promotion: getting the app in front of more people. But that's only half of the equation.
The other half, which is just as important but often neglected, is what percentage of the people who see the app actually download it. That's called the app store conversion rate, and it's calculated by dividing the number of downloads your app had in a given time period by the number of impressions it had.
Here's where this becomes critical. Getting more downloads for an app with a low conversion rate is going to require a lot more effort than for an app that has a high conversion rate.
For example, say you want to get 1,000 new downloads. If your app's conversion rate is 10%, you'll need 10,000 impressions. If your conversion rate is 25%, however, you'll only need 4,000 impressions to get the same number of downloads. While these numbers are simple and unrealistic, they make it very clear that conversion rates matter.
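If you want to play with those numbers yourself, here's a minimal sketch in Python (with made-up figures) of the two calculations above: conversion rate from downloads and impressions, and the impressions needed to hit a download target at a given conversion rate.

```python
def conversion_rate(downloads, impressions):
    """App store conversion rate: downloads divided by impressions."""
    return downloads / impressions

def impressions_needed(target_downloads, rate):
    """Impressions required to reach a download target at a given conversion rate."""
    return target_downloads / rate

# The example from above: 1,000 new downloads at a 10% vs. a 25% conversion rate.
print(impressions_needed(1_000, 0.10))  # 10000.0
print(impressions_needed(1_000, 0.25))  # 4000.0
```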
You can find your app's conversion rate in the User Acquisition reports:
Someone who sees your app for the first time decides whether to download it or not very quickly. If they're not already familiar with the app or brand, they'll use everything on the app store page to make that decision, which means elements such as the icon, text, screenshots, and videos are critical in turning an impression into a download.
To improve your conversion rate, you'll need to tweak those elements to fit your target audience as much as you can. This means both applying best practices and tailoring elements for your audience. But how do you know what works best for your users? You can use your intuition, but that's rarely enough (and in some cases can be flat-out wrong).
That’s where A/B testing comes in.
A/B testing is a formal method of measuring the likelihood that a change to an element will improve your conversion rate. It's done by splitting impressions into two groups, showing each group a different version of the element being tested, and measuring the conversion rate of each separately. The version with the better conversion rate wins.
Using A/B testing we can easily find out which screenshots, colors, tone, and so on give the best conversion rate and the most downloads without having to guess.
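To make that concrete, here's a minimal sketch (plain Python, with hypothetical numbers) of what an A/B test boils down to: impressions are split between two versions, the conversion rate of each group is measured separately, and the higher rate wins.

```python
# Hypothetical results from an A/B test of two icon variants.
variant_a = {"impressions": 5_000, "downloads": 600}  # current icon
variant_b = {"impressions": 5_000, "downloads": 710}  # new icon

for name, data in (("A (current)", variant_a), ("B (new)", variant_b)):
    rate = data["downloads"] / data["impressions"]
    print(f"Variant {name}: {rate:.1%} conversion rate")

# Variant A (current): 12.0% conversion rate
# Variant B (new): 14.2% conversion rate  -> B wins on raw conversion rate
```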
Google Play has A/B testing built right into the platform, providing developers with all the tools they need to run tests and measure their impact. Apple doesn't, so we'll focus on A/B testing with Google Play in this guide.
Using Google Play you can test the icon, screenshots, video, short description, and long description.
Each test covers one element at a time, and you can run multiple tests concurrently, but we highly recommend that you don't and instead focus on a single element. That way, when downloads go up (or down) you can pinpoint the reason with certainty.
For most developers, the biggest gains will come from experimenting with screenshots, video, and sometimes the short description. Screenshots are the most popular, both because they're the first thing new users look at and because they offer the most flexibility for getting creative.
Once you've decided which element to test, think about what kind of change you'd like to try. There's a good list of ideas below to get you started.
You're now ready to start testing:
Experiment name - Give your test a descriptive and unique name. A simple way to do that is to include the element being tested and the start date of the test (e.g. "yellow-icon-20200101")
Store listing - Select the default "Main"
Experiment type - This part is unnecessarily tricky, but here's the straightforward version: if you're testing graphics and aren't localizing, select "Default graphics"; otherwise, select "Localized".
Language - If you selected a localized experiment, use this to choose which language the experiment will target.
Audience - Select the percentage of impressions that will see the new version of the element you're testing. We recommend letting as many users as possible see it so you can get results faster, so leave this at the default value (which should be 50%).
Your test is live 🎉 Now comes the hard part — waiting for the test to gather data.
While you wait you can (and should) monitor how things are going. Google will show you how many downloads your A and B groups are generating by day and also how many of those stick around, which means they didn’t just download the app, they’re also using it.
If a test isn't generating the results you expected right away, keep it going and give it more time; but if after a few weeks you're still not getting results, you should probably go back to the drawing board. Also, if your test is generating negative results, you'll want to end it and find a new change to test.
The simplest way to determine whether a change helps or not is to calculate the conversion rate of each version and pick the higher one, right? Yes, but... Because A/B tests run for a short time, usually just a few weeks or less, we need a way to determine whether a change that's performing well now will also perform well in the future.
For that we turn to statistics, and specifically to statistical significance. Google automatically calculates statistical significance for you, so we'll spare you the math, but if you really want to know, check out this article.
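If you're curious what that check looks like under the hood, here's a rough sketch of one common approach: a two-proportion z-test on the conversion rates of the two groups. To be clear, this is a standard statistical test, not necessarily the exact calculation Google uses, and the numbers are hypothetical.

```python
import math

def two_proportion_z_test(downloads_a, impressions_a, downloads_b, impressions_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = downloads_a / impressions_a
    p_b = downloads_b / impressions_b
    # Pooled rate under the assumption that both versions convert equally well.
    pooled = (downloads_a + downloads_b) / (impressions_a + impressions_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: 600 downloads from 5,000 impressions vs. 710 from 5,000.
z, p = two_proportion_z_test(600, 5_000, 710, 5_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value (below 0.05) suggests the difference isn't just noise
```

Again, Google handles this for you in the console; the point of the sketch is only to show what "statistically significant" is actually measuring.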
We consider a test to be done when we have statistically significant results. Google will do that automatically as well and let you know when your test has enough data to make a determination.
With the test completed, it's time to choose a winner: the version with the better conversion rate. If it's the original version, just go ahead and end the experiment by clicking the Stop Experiment button at the top of the page. If the winner was your new change, you'll want to make it permanent. Google lets you do that right from within the experiment.
What's important is to apply the change as-is and not make any last-minute additions to it, which is something we all tend to do once we see what works. If you have another idea for a change, go ahead and create an experiment for it as well.
Things like colors and tone have trends just like everything else, so it's important to always plan your next test while also keeping a close watch on download trends. An easy way to do that is with email/Slack reports. If downloads don't continue to grow as you'd like, take the time to set up a new A/B test.
Here are a few ideas you can experiment with:
There are many more things you can test, but a simple test is better than no test at all. Industry experts recommend always having a test running.