At DoorDash, mobile is an integral part of our end user experience. Consumers, Dashers, and merchants rely on our mobile apps every day for delivery. And our Android team moves fast to ship impactful features that improve the user experience and the efficiency of our platform.
In the past, we verified the functionality of each release by manually running through a set of regression tests. To scale our development and release process with a growing team, we have tried to automate as much of the testing process as possible. We have taken two major steps to achieve this:
- Adopt a testable design pattern
- Automate end-to-end integration tests to replace manual testing before releases
In order to make code testable, it is important to decouple application logic from Android components. Model View Presenter (MVP) and Model View ViewModel (MVVM) are the two most popular design patterns used in Android apps. MVP decouples logic from the view, which makes that logic easy to test; MVVM accomplishes the same through data binding. We chose MVP because it was easy to learn and integrated well with the rest of the architecture in our apps. Using this pattern, we have been able to achieve close to 100% test coverage of view and application logic.
Let’s take a look at the login functionality in our app using MVP:
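Sketched in plain Java, the pieces fit together roughly like this (class names follow the components described below, but method signatures and the fakes are illustrative, not our production code):

```java
// Illustrative MVP sketch for login; names and signatures are hypothetical.

// LoginContract: the two interfaces between view and presenter.
interface LoginContract {
    interface View {
        void showProgress();
        void hideProgress();
        void showLoginError(String message);
        void goToDashboard();
    }

    interface Presenter {
        void onLoginClicked(String email, String password);
    }
}

// Callback used by the manager to report the result of the auth call.
interface AuthCallback {
    void onSuccess();
    void onError(String message);
}

// Stand-in for AuthenticationManager: the real class would make the API
// call via Retrofit and save the token to SharedPreferences.
interface AuthenticationManager {
    void login(String email, String password, AuthCallback callback);
}

// The presenter is plain Java, so its logic can be unit tested on the JVM
// with a fake view and a fake manager.
class LoginPresenter implements LoginContract.Presenter {
    private final LoginContract.View view;
    private final AuthenticationManager authManager;

    LoginPresenter(LoginContract.View view, AuthenticationManager authManager) {
        this.view = view;
        this.authManager = authManager;
    }

    @Override
    public void onLoginClicked(String email, String password) {
        view.showProgress();
        authManager.login(email, password, new AuthCallback() {
            @Override
            public void onSuccess() {
                view.hideProgress();
                view.goToDashboard();
            }

            @Override
            public void onError(String message) {
                view.hideProgress();
                view.showLoginError(message);
            }
        });
    }
}

// A recording fake view, handy for unit tests and the demo below.
class RecordingView implements LoginContract.View {
    boolean progressShown;
    boolean wentToDashboard;
    String lastError;

    @Override public void showProgress() { progressShown = true; }
    @Override public void hideProgress() { progressShown = false; }
    @Override public void showLoginError(String message) { lastError = message; }
    @Override public void goToDashboard() { wentToDashboard = true; }
}

public class LoginMvpSketch {
    public static void main(String[] args) {
        RecordingView view = new RecordingView();
        // Fake manager that always succeeds.
        LoginPresenter presenter =
                new LoginPresenter(view, (email, password, cb) -> cb.onSuccess());
        presenter.onLoginClicked("user@example.com", "hunter2");
        System.out.println("wentToDashboard = " + view.wentToDashboard);
    }
}
```

Because the presenter depends only on interfaces, a plain JVM unit test can drive it with fakes like `RecordingView`, with no emulator or Android framework involved.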
This architecture gives us a clear separation of view, business, and application logic, which makes it very easy to write tests. The code is broken down into the following components:
- LoginContract.java defines the contract between the view and the presenter via two interfaces, one for each.
- LoginPresenter.java handles user interactions and contains logic to update the UI based on data received from the manager.
- LoginActivity.java displays UI as directed by the presenter.
- AuthenticationManager.java makes the API call to fetch the auth token and saves it to SharedPreferences. This class gives us the ability to decouple application logic from MVP. Other responsibilities of manager classes that are not covered in this example include caching and transforming model data.
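As a standalone sketch of that manager layer, the fetch-and-save flow could look like the following, where `AuthApi` and `TokenStore` are hypothetical stand-ins for the Retrofit service and SharedPreferences, injected so the manager is testable on the JVM:

```java
// Hypothetical sketch of the manager layer; all names are illustrative.

interface AuthApi {
    void requestToken(String email, String password, TokenCallback cb);
}

interface TokenCallback {
    void onToken(String token);
    void onFailure(String message);
}

// Stand-in for SharedPreferences-backed token storage.
interface TokenStore {
    void saveToken(String token);
    String getToken();
}

interface LoginCallback {
    void onSuccess();
    void onError(String message);
}

class AuthenticationManager {
    private final AuthApi api;
    private final TokenStore store;

    AuthenticationManager(AuthApi api, TokenStore store) {
        this.api = api;
        this.store = store;
    }

    boolean isLoggedIn() {
        return store.getToken() != null;
    }

    // Fetches the auth token and caches it before notifying the caller.
    void login(String email, String password, final LoginCallback callback) {
        api.requestToken(email, password, new TokenCallback() {
            @Override
            public void onToken(String token) {
                store.saveToken(token);
                callback.onSuccess();
            }

            @Override
            public void onFailure(String message) {
                callback.onError(message);
            }
        });
    }
}

public class AuthManagerSketch {
    public static void main(String[] args) {
        TokenStore store = new TokenStore() {
            private String token;
            @Override public void saveToken(String t) { token = t; }
            @Override public String getToken() { return token; }
        };
        // Fake API that immediately returns a token.
        AuthenticationManager manager =
                new AuthenticationManager((e, p, cb) -> cb.onToken("tok-123"), store);
        manager.login("user@example.com", "hunter2", new LoginCallback() {
            @Override public void onSuccess() { System.out.println("logged in"); }
            @Override public void onError(String m) { System.out.println("error: " + m); }
        });
        System.out.println("isLoggedIn = " + manager.isLoggedIn());
    }
}
```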
Oftentimes when implementing a new architecture, the tendency is to rewrite the entire app from scratch. In a fast-moving startup such as ours, this approach is not feasible since we need our engineering resources to work on product improvements in our evolving business. Instead, as we make major changes in areas of the codebase, we take the opportunity to refactor to MVP and add unit tests.
Automate end-to-end integration tests to replace manual testing before releases
MVP with unit tests alone isn’t enough to guarantee stable releases. Integration testing helps us make sure our apps work with our backend. To accomplish this, we write end-to-end integration tests for critical flows in our apps. These are black-box tests that verify whether our apps are compatible with our backend APIs. A simple example is the login test. In this test, we launch the LoginActivity, enter an email and password, and verify that the user can log in successfully:
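Such a test might look like the sketch below, using the support-library Espresso APIs of the time; the view IDs, credentials, and dashboard check are placeholders, not our actual test code:

```java
import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.action.ViewActions.closeSoftKeyboard;
import static android.support.test.espresso.action.ViewActions.typeText;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.ViewMatchers.isDisplayed;
import static android.support.test.espresso.matcher.ViewMatchers.withId;

import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(AndroidJUnit4.class)
public class LoginTest {

    @Rule
    public ActivityTestRule<LoginActivity> activityRule =
            new ActivityTestRule<>(LoginActivity.class);

    @Test
    public void userCanLogIn() {
        // Type the test account's credentials and tap the login button.
        onView(withId(R.id.email)).perform(typeText("test@example.com"), closeSoftKeyboard());
        onView(withId(R.id.password)).perform(typeText("password"), closeSoftKeyboard());
        onView(withId(R.id.login_button)).perform(click());

        // A successful login should land the user on the dashboard.
        onView(withId(R.id.dashboard_root)).check(matches(isDisplayed()));
    }
}
```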
We use Espresso to write this test. Since we are making a real API call, we have to wait for it to finish before moving on to the next test operation. Espresso provides IdlingResource for exactly this. As defined in its isIdleNow method, LoginIdlingResource tells Espresso to wait until the login call finishes and the app moves to the DashboardActivity.
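A sketch of such an idling resource is below; the foreground-activity lookup via the lifecycle monitor is one possible approach, not necessarily the one used in our apps:

```java
import android.app.Activity;
import android.support.test.espresso.IdlingResource;
import android.support.test.runner.lifecycle.ActivityLifecycleMonitorRegistry;
import android.support.test.runner.lifecycle.Stage;

import java.util.Collection;

public class LoginIdlingResource implements IdlingResource {
    private ResourceCallback callback;

    @Override
    public String getName() {
        return LoginIdlingResource.class.getName();
    }

    @Override
    public boolean isIdleNow() {
        // Idle once the login call has finished and the app has moved on
        // to the DashboardActivity. Espresso invokes this on the main
        // thread, so it is safe to query the lifecycle monitor here.
        Collection<Activity> resumed = ActivityLifecycleMonitorRegistry
                .getInstance().getActivitiesInStage(Stage.RESUMED);
        boolean idle = !resumed.isEmpty()
                && resumed.iterator().next() instanceof DashboardActivity;
        if (idle && callback != null) {
            callback.onTransitionToIdle();
        }
        return idle;
    }

    @Override
    public void registerIdleTransitionCallback(ResourceCallback callback) {
        this.callback = callback;
    }
}
```

The test registers the resource (via Espresso's register call for idling resources) before clicking login and unregisters it when done, so subsequent assertions only run once the app is on the dashboard.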
We took the following steps to build out the infrastructure for running acceptance tests:
- Configured a database with sample data and a local web server to serve API requests on a machine in our office
- Targeted this web server in our tests; for us, this meant creating a new build variant with a different base URL for Retrofit
- Set up continuous integration with Jenkins to run these tests; the same machine hosts the Jenkins CI server, which runs the tests every time code is merged into our main development branch
- Configured a GitHub webhook so Jenkins reports build status on every pull request
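As an illustration of the build-variant step, the Gradle side can be as small as the snippet below; the variant name and URL are hypothetical (10.0.2.2 is the emulator's alias for the host machine):

```groovy
android {
    buildTypes {
        // Hypothetical variant that points the app at the local test server.
        integrationTest {
            initWith debug
            buildConfigField "String", "BASE_URL", '"http://10.0.2.2:8080/"'
        }
    }
}
```

Retrofit's base URL is then read from `BuildConfig.BASE_URL` rather than being hard coded, so the same test code runs against the local server in this variant and against production endpoints elsewhere.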
We write and maintain acceptance tests for critical flows in our apps. Continuous integration using Jenkins has helped us catch bugs very early in the release cycle. It has also helped us scale our development and testing efforts as we don’t need a lot of manual testing before our releases.
These testing strategies have helped us immensely in making our releases smoother and more reliable. While the automated acceptance testing strategy outlined in this article has some initial setup cost, it scales very well once you build the infrastructure. In addition, MVP makes sure that new features are tested at the unit and functional level. We hope that these steps will help you reduce the amount of manual testing and gain increased confidence in your releases.
Some resources that helped along the way:
- Setting up MVP: http://engineering.remind.com/android-code-that-scales/
- Setting up build variants: https://developer.android.com/studio/build/build-variants.html#build-types
- Setting up an emulator to target a local web server: https://medium.com/@crysfel/updating-the-hosts-file-on-android-emulator-d61a533a76cf
- Setting up continuous integration using Jenkins: https://github.com/codepath/android_guides/wiki/Building-Gradle-Projects-with-Jenkins-CI#create-the-build-job