May 31, 2019

Core5: “We Are Not Afraid to Kill Projects, It’s the Only Way to Survive in a Highly Competitive Market”

Ex-Director of Global Advertising at Wargaming and Co-Founder of Core5 Pavel Lando shares his vision on setting benchmarks for mobile products and developing competitive UA, engagement, and monetization strategies for mobile games.

As the former Director of Global Advertising at Wargaming, Pavel Lando oversaw advertising strategy for the company's desktop titles and their mobile incarnation, World of Tanks Blitz. Since then, he has co-founded mobile gaming company Core5, which is currently soft launching and testing 9 titles that range from hypercasual to mid-core genres. In our interview, Pavel shares how his varied experience in digital and as an entrepreneur has helped him and his team develop products for a highly competitive market. We also talked about setting benchmarks for their products and constantly testing and adapting in order to stay competitive. If you are looking for a better way to enhance your UA, engagement, and monetization strategy, keep reading.

You have a number of products in soft launch. How do you test your games? What volume of traffic is needed for testing?

First, you need to understand the goal of a particular test and what you’re testing. If you’re testing really early engagement for the original core gameplay, and you don’t know how users are going to react, even the first 200-300 installs are enough to get a first impression of whether your Day 1 retention rate is 10%, 50%, or 30%. You definitely need more if you want to be sure about the way the core gameplay works; about 1,000 installs is more or less OK.

It’s always a matter of how much accuracy you need. If you are OK with a 5% delta in your retention rate, then even 300 installs are enough. If according to the test you get a 40% Day 1 RR, in reality you will have between 35% and 45%. That’s essentially sufficient if you want to know whether it works or not. If you need a smaller delta, 1,000 installs will bring it down to about 3%, enough to understand whether you need to dig deeper or not.
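To make those deltas concrete, here is a minimal sketch (Python, not from the interview) that treats Day 1 retention as a binomial proportion and computes the 95% margin of error at an assumed 40% baseline:

```python
# Rough 95% margin of error for a Day 1 retention estimate, treating
# retention as a binomial proportion. Numbers are illustrative.
import math

def retention_margin(p: float, installs: int, z: float = 1.96) -> float:
    """Half-width of a normal-approximation confidence interval."""
    return z * math.sqrt(p * (1 - p) / installs)

for n in (300, 1_000, 10_000):
    delta = retention_margin(0.40, n)
    print(f"{n:>6} installs: 40% D1 retention measured to ±{delta:.1%}")
# ~±5.5% at 300 installs and ~±3.0% at 1,000, in line with the deltas above.
```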

As for later engagement, like Day 7 or Day 30 RR, you will need at least 200, 300, 500, or better 1,000 active users to get the same deltas. For example, if your conversion from an install to a user who has reached a deeper in-app event is 5%, and you need at least 300 such users, you’ll need to purchase 15,000 installs.
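The same funnel arithmetic can be written down directly. This sketch assumes install-to-event conversion rates in the 2-5% range quoted in the interview and returns the bare-minimum install volume, before any buffer for accuracy:

```python
# Minimum installs to purchase so that roughly `target_users` reach a deep
# in-app event, given an assumed install-to-event conversion rate.
import math

def min_installs(target_users: int, conversion_rate: float) -> int:
    return math.ceil(target_users / conversion_rate)

# Assumed rates for illustration only.
for rate in (0.02, 0.05):
    print(f"at {rate:.0%} conversion, {min_installs(300, rate):,} installs "
          f"give ~300 deep-funnel users")
```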

The situation with monetization is pretty similar. The conversion rate from an install to a paying player is not that high: 5% is a very good number, and normally it ranges between 2 and 4%. So you won’t be able to tell anything based on 200 installs. You will need 10,000-15,000 installs in order to draw early conclusions about the way the game monetizes. If you want to dig deeper and perform a certain number of A/B tests, you’ll need more: 30-50 thousand installs. Industry whales like King or Supercell can keep their games in soft launch for more than 18 months. The amount of traffic they drive towards these prototypes is enormous, and sometimes they still aren’t ready to release. I understand why: they’re not happy with the metrics and are trying to tune them. Still, in order to be 100% sure about your monetization you need to purchase at least 10,000 installs.
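As a rough sanity check on those 30-50 thousand installs, here is a standard two-proportion sample-size calculation for a monetization A/B test. It is a sketch, not the team's actual method, and the 3% baseline payer rate and 20% relative lift are assumptions:

```python
# Rough per-variant sample size for detecting a lift in payer conversion
# between two variants (two-proportion z-test, alpha = 0.05, power = 0.8).
import math

def sample_size_per_variant(p1: float, p2: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    p_bar = (p1 + p2) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

n = sample_size_per_variant(0.03, 0.036)   # 3% payer rate, +20% relative lift
print(f"~{n:,} installs per variant, ~{2 * n:,} total")
# Comes out around 14,000 per variant, i.e. roughly 28,000 installs in total.
```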

Which geos do you use for soft launches? When you launch new games, what difficulties do you face in terms of scaling a UA campaign?

In terms of geos for a soft launch, it’s a standard list. We try to avoid saturated markets, like Canada or Australia, where CPI is high. We use something more obvious and less expensive: the Netherlands and the classic Asian countries. Normally you track the overall audience size, the percentage of English speakers, GDP, GDP per capita, and CPI rates; Chartboost and Facebook provide this data.
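As an illustration only, a soft-launch geo shortlist could be ranked with a simple weighted score over those signals. Every figure and weight below is a placeholder, not real market data:

```python
# Hypothetical ranking of soft-launch geos by the signals mentioned above
# (audience size, share of English speakers, GDP per capita, CPI).
GEOS = {
    #             audience_m  english  gdp_per_capita  cpi_usd
    "Netherlands":    (17,      0.90,       57_000,      0.60),
    "Philippines":    (110,     0.60,        3_500,      0.25),
    "Malaysia":       (33,      0.60,       11_000,      0.35),
}

def score(audience_m, english, gdp_pc, cpi):
    # Cheap, large, English-friendly markets score higher; weights are arbitrary.
    return (audience_m * english) * (gdp_pc / 10_000) / cpi

for geo, stats in sorted(GEOS.items(), key=lambda kv: -score(*kv[1])):
    print(f"{geo:<12} score={score(*stats):8.1f}")
```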

As for scaling, at Core5 we’ve only had 2 relatively small releases, which did make us rethink our benchmarks. We thought that a 40% Day 1 retention rate was OK for hypercasual; now I understand that it should be at least 50%. Unfortunately, the titles we released had lower metrics, so we couldn’t purchase traffic to scale them.

Ride the gun! gameplay

For now, all I can say about launching a mobile product is based on my experience at Wargaming and with World of Tanks Blitz. But the situation with Blitz was different: it’s part of a huge franchise, and on release we already had a huge amount of organic traffic, especially in the CIS and Eastern Europe. Our BizDev guys also did a great job; on both Apple and Google we had good featuring, which was very important to our initial push. We purchased traffic on a huge scale, mostly focused on unit economics. Our analytics team provided us with the LTV of the project, and we tried to keep our CPIs in line with that LTV.

Basically, scaling is about building a system where you constantly get new ad creatives and keep testing them. My benchmark for the number of ad sets for a project is much higher than it used to be; for a big title it’s more than 150 ad sets. And by ad sets I mean completely different creatives and messages targeted towards different audiences, not counting localizations and resizes.
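To picture where a 150-plus ad set benchmark comes from, a hypothetical grid of creative concepts, messages, and audiences (none of these names are from the interview) already multiplies out to a similar order of magnitude:

```python
# Hypothetical creative grid: concepts x messages x audiences.
from itertools import product

concepts  = ["gameplay_capture", "cinematic", "meme_style", "ugc_style"]
messages  = ["compete", "collect", "relax", "progress", "social"]
audiences = ["broad", "lookalike_payers", "casual_players",
             "midcore_players", "competitor_fans", "retargeting"]

ad_sets = [f"{c}|{m}|{a}" for c, m, a in product(concepts, messages, audiences)]
print(len(ad_sets))   # 4 * 5 * 6 = 120 distinct ad sets before localizations/resizes
```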

Another part is developing a continuous process of iteration and testing: targeting, messaging, regions, audiences, etc. With testing you gain a certain amount of experience, and even before the tests you might project what’s going to work best. But sometimes the results can be really unexpected. This is my core attitude to scaling.

You also need to be fast with optimization, so you don’t purchase traffic you don’t need, not only because of fraud but also because of the relevance of the traffic.

The third part is definitely analytics. It’s good to use external systems to prevent purchasing fraud, but when you’re working with video networks and affiliate networks, it’s impossible to have 100% clean traffic. You’re going to be cheated. Just focus on the amount of fraudulent traffic, your reaction time, and your tracking capability. As for anti-fraud software, each time I spoke to their representatives and asked, “Can you tell us in detail how you tracked this?”, the answer in most cases was, “It’s our patented technology, we can’t share any details, just trust us.” So it’s mostly about analytics on your side, both in terms of predictions and fraud. The holy grail of prediction is LTV based on early user behaviour: Day 1, Day 2, Day 3 maximum on mobile.
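One common in-house heuristic of this kind, not necessarily the one used here, is to watch click-to-install time (CTIT) per traffic source, since an unnaturally high share of near-instant installs often points to click injection. A minimal sketch, with assumed thresholds:

```python
# Flag traffic sources whose click-to-install times look anomalous.
# Thresholds and sample data are assumptions for illustration.
from statistics import median

def flag_suspicious(ctit_seconds_by_source: dict,
                    short_ctit_s: float = 10.0,
                    max_short_share: float = 0.05) -> list:
    flagged = []
    for source, ctits in ctit_seconds_by_source.items():
        short_share = sum(t < short_ctit_s for t in ctits) / len(ctits)
        if short_share > max_short_share:
            flagged.append(f"{source}: {short_share:.0%} of installs under "
                           f"{short_ctit_s:.0f}s (median CTIT {median(ctits):.0f}s)")
    return flagged

print(flag_suspicious({
    "network_a": [45, 120, 600, 30, 90, 15, 300, 240],
    "network_b": [3, 4, 2, 8, 150, 5, 6, 2],   # suspiciously fast
}))
```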

What about organic user acquisition? How does a game get discovered? How important is ASO?

Today it’s not a question of whether to use ASO or not; it’s like brushing your teeth.

You need to have an appropriate title and description, and you need to understand the basics of creating your promo video, screenshots, etc. The next stage is to A/B test all of this. Google provides an internal tool, and there are tools like SplitMetrics that can help you with testing on iOS. But I think you should do it after you’ve tested your core gameplay and very early app engagement. Also, if your conversion rate from clicks to installs is higher than 50% before any A/B tests, it seems like you already know how to purchase traffic and create titles, descriptions, and screenshots. After that, you inevitably have to keep iterating and A/B testing to track all the trends regarding the keywords, etc.
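For the A/B tests themselves, a basic two-proportion z-test is enough to tell whether one set of store assets really converts better from clicks to installs. The counts below are made up for illustration:

```python
# Did variant B's click-to-install conversion really beat variant A's?
import math

def two_proportion_z(conv_a, clicks_a, conv_b, clicks_b):
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=520, clicks_a=1000, conv_b=570, clicks_b=1000)
print(f"z = {z:.2f}")   # |z| > 1.96 -> significant at the 5% level
```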

What do you think is important to show in the video on your main page? How closely should it reflect the gameplay?

According to my experience and what I see across the market, you should be very cautious with creativity in video. You need to answer the question of what this game is about, precisely. Normally, it’s something in the style of a Let’s Play: neat and appealing from a visual point of view, but without extras like text overlays or $10K CG. That’s more than enough for our very early engagement tests.

With creatives, generally it’s a good idea to see what your colleagues are doing. That being said, you shouldn’t take it for granted, you need to double-check everything. At Wargaming we were developing our new landing page, and according to the metrics it wasn’t super successful. We had much more efficient landing pages in the past. But our competitors started to blindly copy us, simply because this landing page was developed by Wargaming. With all due respect, we are not gods, and it’s normal when you iterate. You should not copy blindly.

When should one start considering monetization strategy? During the prototype stage or is it better to add later on and test? How do you approach it?

Well, I would say you definitely need to think about it from the very beginning: the moment you commit to a genre is the moment you should start planning a monetization model. There was a time when I thought I was the smartest guy in the room and could invent something new and groundbreaking in terms of monetization. I quickly realized you don’t need to perform too many tests at the same time. If you’re testing original core gameplay, don’t test a unique approach to monetization at the same time. Just test core gameplay, get normal or perfect engagement metrics, and then switch to monetization. Use things that are well known across the market, don’t invent, run the standard program first, and after that move to improvisation.

So, really, we aren’t trying to reinvent the wheel in terms of monetization. For hypercasual, in-app advertising is obviously the focus, plus a certain amount of IAPs. But we don’t bet big on them; we understand that they can bring you only a very, very small portion of your revenue. We also tried subscriptions, which can add 20-30%, sometimes even 40%, of the revenue. As for mid-core, we are mostly focused on IAPs, but a certain amount of rewarded video is also useful. We also favor an honest approach to subscriptions: try to avoid misleading users by making them subscribe without an immediate opportunity to unsubscribe.
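As a purely illustrative breakdown of that revenue mix (the ARPDAU figures are placeholders, not Core5 data), ads carry most of the blended number, subscriptions add the 20-40% increment, and IAPs stay marginal:

```python
# Hypothetical hypercasual revenue mix per daily active user.
dau = 50_000
arpdau_by_source = {       # assumed average revenue per DAU, in USD
    "in_app_ads":    0.080,
    "subscriptions": 0.025,
    "iap":           0.003,
}

total = sum(arpdau_by_source.values())
print(f"blended ARPDAU: ${total:.3f}  (daily revenue ~${total * dau:,.0f})")
for source, value in arpdau_by_source.items():
    print(f"  {source:<13} {value / total:5.1%} of revenue")
```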

Rollapong gameplay

You have a game that has subscriptions, IAPs, ads monetization, altogether. Why?

Frankly speaking, these days it’s kind of a classic setup for hypercasual. Maybe except for the IAPs, but they didn’t require too many resources to implement. It was also an additional test to show that you cannot make too much money from IAPs in a hypercasual title.

Did the IAPs work?

No. It wasn’t a surprise; it was kind of obvious. Sometimes you don’t expect results, you don’t get them, and you see how clever you are, that your projections are working. In terms of subscriptions, it was an experiment as well. A couple of companies across the market started working on subscriptions for hypercasual, and a significant amount of revenue was generated with their help.

So we did the same and realized that, yes, subscriptions work. Not to the same extent in our application, since the majority of the revenue is still generated by advertising. But if you have 20%, up to 40%, of incremental revenue coming from subscriptions, why not? The major point is that it’s not a subscription for the sake of a subscription. You have to provide a certain amount of content for it.

What do you provide for the subscription?

It’s a standard list. As far as I remember we offer unique skins, a certain amount of in-game currency daily, and 3 revives. So it’s more or less the same cost and value as buying the perks individually with IAPs.

What metrics do you use to track performance: retention, of course, but what else?

Well, sometimes retention is a vanity metric.

It’s not always about retention; it is important to compare apples with apples. Besides retention, we track conversion across the overall funnel. We track average time spent per user, average session length, the number of launches per user, the number of launches by the top 10 users, etc. If we have some kind of monetization at a very early stage, we track the number of banner impressions per install, overall impressions, and definitely payments.

But at the stage we are at right now, especially if we are speaking about the hypercasual genre, where the majority of revenue is generated by in-app advertising, you don’t pay too much attention to IAPs. If we’re talking about mid-core, we definitely pay much more attention to them. At this point we’re more focused on in-game engagement and the marketability of the game. Here you don’t need to iterate too much on creative materials; it’s more about the product side and the feedback and metrics from the top of the funnel, including CTR, CR, and CPMs. They can say a lot about the product overall. All in all, we have a chart of 25-30 metrics. I won’t name all of them, but I’ve already covered the core ones.
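A minimal sketch of how a few of these engagement metrics could be computed from a flat event log; the schema and the toy data are assumptions, not the team's actual pipeline:

```python
# Compute launches per user, average session length, and top users by launches
# from a hypothetical (user_id, event, session_id, timestamp) event table.
import pandas as pd

events = pd.DataFrame({
    "user_id":    [1, 1, 1, 2, 2, 3],
    "event":      ["launch", "level_complete", "launch", "launch", "ad_impression", "launch"],
    "session_id": ["a", "a", "b", "c", "c", "d"],
    "ts":         pd.to_datetime(["2019-05-01 10:00", "2019-05-01 10:05",
                                  "2019-05-02 09:00", "2019-05-01 12:00",
                                  "2019-05-01 12:03", "2019-05-01 15:00"]),
})

launches = events[events["event"] == "launch"]
launches_per_user = launches.groupby("user_id").size()
session_length_min = (events.groupby("session_id")["ts"]
                            .agg(lambda ts: (ts.max() - ts.min()).total_seconds() / 60))

print("launches per user:", launches_per_user.mean())
print("avg session length (min):", session_length_min.mean())
print("top users by launches:\n", launches_per_user.nlargest(10))
```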

Which analytical tools do you use?

We mostly use AppsFlyer and Facebook Analytics, plus Google Firebase for servers and additional analytics. We tested a couple of other analytics tools: GameAnalytics, Amplitude. But we are a startup and try to save our money. In order to build segments in Amplitude you need to pay, and we perform a huge amount of A/B tests. Facebook Analytics provides this functionality. It’s not focused on gaming, but with a little bit of creativity and imagination you can create additional in-app events that help you track what is going on with your economy and currency flow. Facebook Analytics is more than enough.

There is a problem with data accuracy, however. Sometimes Facebook Analytics is super accurate and sometimes it lies, not in terms of in-app events but mostly in terms of the number of installs, so we constantly compare it with AppsFlyer, native Unity analytics, and Firebase data. You need to create additional custom install events, use more than one analytics source, make comparisons, and you will find the truth somewhere in between.
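A simple way to automate that comparison is to pull daily install counts from each source and flag days where they diverge beyond a tolerance. The figures and the 10% threshold below are illustrative:

```python
# Cross-check daily install counts exported from several analytics sources.
installs_by_source = {
    "appsflyer": {"2019-05-01": 980,   "2019-05-02": 1_150},
    "firebase":  {"2019-05-01": 1_010, "2019-05-02": 1_130},
    "facebook":  {"2019-05-01": 1_240, "2019-05-02": 1_160},  # suspicious spike
}

TOLERANCE = 0.10
for date in sorted(next(iter(installs_by_source.values()))):
    counts = {source: series[date] for source, series in installs_by_source.items()}
    lo, hi = min(counts.values()), max(counts.values())
    status = "OK" if (hi - lo) / lo <= TOLERANCE else "CHECK"
    print(date, counts, status)
```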

Speaking about predictions — do you do LTV predictions? At which point of development does it become important?

You can make LTV predictions only after you have a certain amount of traffic, at least 10,000 installs. The more, the better. We haven’t reached this stage yet; we’re more focused on optimizing our core and meta gameplay, and monetization optimization is our third stage. But to get a first sense of your LTV, you don’t need super sophisticated models. One of the businesses I tested after leaving Wargaming was building a machine learning model that could provide LTV projections based on early engagement data. I am a fan of analytics, so when we reach a certain number of installs and have enough data for our core product, I will start digging into this. We’ll try to build a sophisticated model, using machine learning algorithms, to deeply understand how users will behave over the next 6 months, hopefully 12 months, or maybe more.
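A sketch of the kind of model described here: predict a long-horizon revenue target from Day 1-3 behaviour. The features, the synthetic data, and the gradient-boosting choice are assumptions for illustration, not Core5's actual pipeline:

```python
# Train a regressor that maps early-engagement features to long-horizon revenue.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
# Hypothetical per-user features from the first 3 days.
X = np.column_stack([
    rng.poisson(3, n),          # sessions
    rng.gamma(2.0, 8.0, n),     # minutes played
    rng.poisson(10, n),         # ad views
    rng.exponential(0.05, n),   # USD spent
])
# Synthetic "true" 180-day revenue, just to make the sketch runnable end to end.
y = 0.02 * X[:, 1] + 0.01 * X[:, 2] + 8.0 * X[:, 3] + rng.exponential(0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print("R^2 on held-out users:", round(model.score(X_te, y_te), 3))
print("predicted LTV, first 5 users:", model.predict(X_te[:5]).round(2))
```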

How many iterations do you generally go through while developing a game?

In terms of the prototypes we’ve developed, as far as I can remember, around 25. Not all of them were public: if we play a game ourselves and it doesn’t feel fun or addictive, we won’t spend time and money on purchasing traffic. All in all we had 9 products in soft launch and around 30 iterations. One product might go through 4-5 iterations of the soft launch, being A/B tested each time to see how it influences our metrics. If we see a positive trend, we keep working on the project. If we see a negative or flat trend 2 or 3 iterations in a row, it becomes a question of whether we need to continue working on that project or not. We are not afraid to kill projects, and I suppose this is the only way to survive in this highly competitive market.

Has your audience had any unexpected reactions to any of your games? Anything that surprised you?

One game we developed was a mix of match3 and 2048; we started working on it about a year ago. It seemed OK, and then the Day 1 retention rate came in at 10%, which was awful. But the worst situation is when the metrics are mediocre: you don’t know what to do next, because they are not bad enough to close the project and not good enough to keep investing in it. In this case it was an easy decision: OK, 10% Day 1 RR, let’s make one iteration, and if we don’t see any changes, we will close the project. We made an additional iteration, it didn’t help, so let’s forget about it. We’ve invested in our education, let’s move on.

I really like this approach. We’ve educated ourselves. It’s not like we failed, we did something and now we know more than we knew before.

It’s always good when you get feedback. Then it’s a matter of patience and resources. How many tasks can you handle? Because you don’t die when you fail at your tasks; you die when you don’t have enough resources to continue.

What’s your most valuable lesson from Core5?

Don’t think that you’re the smartest guy in the room. The reality can always surprise you.

Especially when you’re making something that hasn’t been done before. Core5 reminded me of a very easy rule: do what you need to do and everything’s gonna be fine. I had an experience of a successful business, I had an experience of a less successful business. If you love what you do, and believe in it, everything is going to be OK. Just do it step by step, use an iterative approach, try to look as far as possible, but don’t forget that you are walking and sometimes you need to look down. Just have fun.


Marc Llobet
Product Marketing & Growth @ Appodeal