QA automation from scratch for a mobile app leader

Overview

Wooga is the leader in hidden object games, a very popular category on mobile. The company surpassed $1 billion in lifetime revenue with its top title.

I joined June's Journey, the company's flagship title, to build its test automation infrastructure from scratch and to train engineers to write tests themselves.

  • Framework: Unity Test Framework
  • Language: C#
  • CI/CD: Jenkins
  • Type: End-to-end, Integration, Automation tools

The Problem

Test automation is not a widespread practice in the game industry, and the knowledge around it is scarce. Wooga wanted to ramp up its testing velocity but, like many game companies, didn't know where to start.

No test automation infrastructure

My initial mission was to build the test infrastructure for the game. Only a few tools were available for Wooga's use cases, and the infrastructure was expected to fulfill several important criteria:

  • Be easily adopted by engineers
  • Help test releases faster
  • Work with Unity, Wooga's game engine of choice

Shifting Left

With the test infrastructure in place, the engineers were expected to:

  • Get feedback quickly when introducing a change
  • Write tests themselves to increase coverage and velocity

So beyond the choice of tooling, the challenge was to build an infrastructure the engineers could contribute to.

My approach

Managing multiple stakeholders

The June's Journey team was composed of multiple stakeholders, so the test infrastructure had to serve not only the QA team but also the engineering and content teams.

To fit the needs of these stakeholders, I met with them regularly and made sure our priorities stayed aligned.

I also made a point of taking part in every release test run, to make sure I understood the pain points of the existing QA testers.

A test infrastructure fitting the needs of the team

The very first step was to pick a tool. I evaluated three tools with proofs of concept, made the decision in collaboration with the engineering team, and recorded it in a decision record for future reference.

I decided to use the Unity Test Framework because it runs inside the same development environment the engineers were already familiar with. This lowered friction and raised the chance that they would write and fix tests themselves.

Once the tool was chosen, I built the first tests covering the game's most critical paths. In the process I established best practices in the form of design patterns and wrote documentation to explain how to use them. A sketch of what such a test looked like is shown below.
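
To make this concrete, here is a minimal, illustrative sketch rather than the production code: the "Boot" scene name, the MainMenuScreen wrapper, and the object names it looks up are all hypothetical. It shows the two ideas the documentation promoted: running end-to-end flows as Unity Test Framework coroutine tests, and hiding scene-graph lookups behind a "screen object" so tests read as player intentions.

    using System.Collections;
    using NUnit.Framework;
    using UnityEngine;
    using UnityEngine.SceneManagement;
    using UnityEngine.TestTools;
    using UnityEngine.UI;

    // Hypothetical "screen object": wraps UI lookups so tests read as
    // player intentions instead of scene-graph queries.
    public class MainMenuScreen
    {
        public bool IsVisible => GameObject.Find("MainMenu") != null;

        public Button PlayButton =>
            GameObject.Find("PlayButton")?.GetComponent<Button>();
    }

    public class CriticalPathTests
    {
        // [UnityTest] runs the test as a coroutine, so it can span
        // many frames, just like real gameplay.
        [UnityTest]
        public IEnumerator Boot_ReachesMainMenu_AndPlayIsClickable()
        {
            // "Boot" stands in for the game's actual entry scene.
            yield return SceneManager.LoadSceneAsync("Boot", LoadSceneMode.Single);

            // Give initialization a few frames to settle.
            for (var i = 0; i < 30; i++) yield return null;

            var menu = new MainMenuScreen();
            Assert.IsTrue(menu.IsVisible, "Main menu never appeared after boot");
            Assert.IsNotNull(menu.PlayButton, "Play button missing from main menu");

            // Simulate the player's tap and let the next frame process it.
            menu.PlayButton.onClick.Invoke();
            yield return null;
        }
    }

Tests in this style can be run from the Unity Test Runner in the editor or on a device, which keeps the whole workflow inside the tools engineers already use.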

Finally, I trained engineers to write tests themselves, using pair programming and mob programming sessions.

A tool for content designers

One of the most work-intensive parts of designing content for the game was checking it in-game for possible defects.

This meant building a new version of the game, installing it on a phone, setting up an account to enable the right (not yet released) content, and then manually playing through the whole unit. The entire process had to be repeated for every change.

One of my efforts was to provide the content team with a tool that automatically plays through a specific area of the game, so they can review it via a video recording. The sketch below illustrates the idea.
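
This is a heavily simplified sketch, not the actual tool: the AreaAutoPlayer class, the area id, and the StepThroughArea stub are hypothetical stand-ins. Frame capture is shown with Unity's built-in ScreenCapture API, assuming the captured frames are assembled into a video in a separate pass.

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    // Hypothetical driver: plays through one content area step by step
    // and captures a frame after each scripted action.
    public class AreaAutoPlayer : MonoBehaviour
    {
        [SerializeField] private string areaId = "area_42"; // placeholder id

        private IEnumerator Start()
        {
            var frame = 0;
            foreach (var step in StepThroughArea(areaId))
            {
                yield return step; // perform one scripted player action
                yield return new WaitForEndOfFrame(); // let the frame render
                ScreenCapture.CaptureScreenshot(
                    $"Recordings/{areaId}_{frame++:D4}.png");
            }
            Debug.Log($"Finished auto-playing {areaId}");
        }

        // Stub: a real implementation would read the area's content
        // definition and emit one scripted action per play step.
        private IEnumerable<IEnumerator> StepThroughArea(string id)
        {
            yield break;
        }
    }

A designer can then review the resulting recording instead of replaying the whole unit by hand after every change.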

Results & Outcomes

30% of the release testing automated

The automated critical tests were largely inspired by manual test cases written by QA testers. The results from automated and manual tests were therefore merged inside TestRail, the test management software.

This allowed QA testers to test only the left-over cases, reducing manual testing for releases by 30%. The sketch below shows how automated results can be reported into TestRail.
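
For illustration, here is a minimal sketch of pushing one automated result through TestRail's REST API (the add_result_for_case endpoint); the host, credentials, and run/case ids are placeholders, and a real integration would also handle batching and errors.

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    // Minimal sketch: push one automated test result into TestRail.
    public static class TestRailReporter
    {
        private static readonly HttpClient Client = new HttpClient();

        public static async Task ReportAsync(
            int runId, int caseId, bool passed, string comment)
        {
            const string baseUrl = "https://example.testrail.io"; // placeholder
            var auth = Convert.ToBase64String(
                Encoding.UTF8.GetBytes("bot@example.com:API_KEY")); // placeholder
            Client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", auth);

            // TestRail status ids: 1 = passed, 5 = failed.
            var statusId = passed ? 1 : 5;
            var body = $"{{\"status_id\": {statusId}, \"comment\": \"{comment}\"}}";

            var url = $"{baseUrl}/index.php?/api/v2/add_result_for_case/{runId}/{caseId}";
            using var content = new StringContent(body, Encoding.UTF8, "application/json");
            var response = await Client.PostAsync(url, content);
            response.EnsureSuccessStatusCode();
        }
    }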

Engineers enabled to write tests

The training and the documentation proved effective, as the engineering team started to develop tests for their own use cases.

Engineers were also able to fix issues uncovered by failing tests before their changes were merged into the main branch.

Hours of work saved

The recording tool I provided to the content designers saved them hours of work. While the tool was not fully automated, because of the nature of the content being checked, it was one of the most successful projects inside the QA team.