    Basil Pastukhow, Head of QA

    Banking App Testing: Strategies, Tools, and Best Practices

    Banking and financial services form a domain where security and performance are the highest priorities. Firstly, mobile banking is growing in popularity among users, and many new companies are joining the market. Statista reports that the number of individuals actively using online and mobile banking services reached 1.9 billion worldwide as of 2020 and is forecast to increase to 2.5 billion by 2024.

    Secondly, banking apps deal with sensitive data and therefore become a number one target for cybercriminals. According to VMware Carbon Black threat data, over three months of 2020 (February, March, and April), attacks targeting the financial sector increased by 238%.

    With this in mind, no one will doubt that thorough testing plays a critical role in the banking app development cycle, where no gaps or omissions are acceptable. In this article, we explain how we perform mobile banking application testing at Surf and how best testing practices help us create great products.

    How we test banking apps at Surf

    Challenges in testing banking applications

    Banking software testing must result in smooth, failure-safe operation and complete security. To avoid bugs, we run automated and manual tests for every new release and cover all user scenarios, checking that data is updated and received from the server without delays, so that users can run any transaction smoothly and get prompt information about new bank products.

    In the case of banking apps, we are often in charge of the front end, while the back end is handled by other teams. We have no access to the tools available to the bank, so to meet this challenge we work in close cooperation with the client's side and rely on the tooling available to us when the front-end team is delivering faster than the back end.

    The key challenges that we at Surf face while testing banking apps are:

    • a lot of third-party integrations

    Banking apps are characterized by numerous integrations with third-party systems, such as bill payments or trading accounts. This can make testing more complicated, as the QA engineer needs a clear understanding of how the system functions to find out what exactly may fail and what effect it can have on other elements and components of the complete system.

    • many features with complex logic

    The specific nature of banking app features lies in the complex internal logic that has to be taken into account during testing. For example, a simple user action, such as entering a couple of digits in the finance app, may trigger a complicated internal calculation process.

    • growing banking app functionality

    The financial industry is a highly competitive and ever-changing world. To withstand the competition and meet users' growing needs and demands, banking apps cover broad functionality and are regularly updated and extended in line with new regulations and new technology-driven opportunities. It is this combination of broad scope and frequent updates that makes mobile banking app development and testing especially challenging.

    Surf best practices for mobile banking app testing 

    The testing flow we apply at Surf is universal at its core and has been refined over more than 10 years of experience on various projects, native and cross-platform alike. Our portfolio covers projects for many industries, including the banking and fintech sector. Along with standard, widely adopted rules and procedures, we have some practices that help us not only ensure app quality at each stage of development but also save precious time.

    • early review of requirements and design

    When the Surf team receives tasks for app development, our QA team starts with a review of the requirements for completeness, correctness, and consistency. In parallel with the requirements review, we also go through the design to make sure it covers all app states and to avoid any inconsistencies between platforms. We also check that the requirements and design do not contradict each other.

    The earlier a bug is detected, the less it costs to find and fix it

    • a common structure for test cases

    At Surf, we have our own format for test cases and checklists, and it is used on all projects with only a few exceptions.

    We have created a single common structure for test cases: they are all written the same way according to that structure, with room for a creative touch inside each test case. This approach improves the testing process: it accelerates test reviews and makes test maintenance and project onboarding easier and more transparent. When our experts shift between projects, every team member understands what is written and how it should be implemented.
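
    To give a rough idea of what such a unified structure can look like, here is a hypothetical, simplified test case in Gherkin notation (the notation that the Dart+Gherkin and Cucumber tooling listed later in this article works with). The feature, steps, and values are invented purely for illustration:

        Feature: Card-to-card transfer
          # Every test case follows the same order: feature, precondition, steps, expected result.

          Background:
            Given the user is logged in to the mobile bank
            And the user has an active debit card with a sufficient balance

          Scenario: Successful transfer between the user's own cards
            When the user opens the "Transfers" screen
            And enters the amount "100.00"
            And confirms the transfer with a one-time code
            Then a success screen is shown
            And the card balance is reduced by "100.00"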

    • component tests for new features

    When developers are ready with some part of the app, the QA team's task is to test it. If it is a new feature, we run detailed component tests. These are much like model-based tests: a model is defined (for example, a screen or a button) and tested according to the previously written test cases and checklist.

    A checklist for a complete feature covers its entry points and the interaction of the tested feature with other features. In this way, we perform all the required tests. If any bugs are found, we create bug reports, prioritize them, and send them to be fixed. When a build contains fixed bugs, they are retested and either closed or reopened.

    • test automation

    As the project grows in functionality, manual tests require more and more effort, and over time the throughput of the QA team becomes insufficient.

    After about six months of active development, it becomes impossible to perform full testing manually. Expanding the QA staff to meet the increasing demand can be very costly. As time goes on, the project arrives at a situation where only a smaller part of it is tested, the part where most of the bugs are concentrated (the 80/20 rule). Therefore, test automation is strongly recommended for long-term projects, and banking apps are mostly of that type.

    Another reason to implement automated tests in banking apps is that these apps include a lot of complicated logic with repeated actions, which is a good fit for automation: for example, recurring payment flows with numerous fields.

    Test automation can keep up the quality and velocity of testing for regular releases at a pace that manual testing cannot sustain. Automated tests do not depend on human resources: they run quickly and on a regular basis. This means that adding a single person to the team to write automated tests results in increased coverage. It is quite possible that after 5 years of development, test automation will be responsible for 90% of test coverage, while the rest will be tested manually.

    In long-term projects, test maintenance requires about 30% per year of the time originally spent writing the tests. For example, if automated test coverage for a feature takes 100 hours to write, maintaining those tests will then take about 30 hours per year, or roughly 2.5 hours per month. Note, however, that this applies only to features that stay unchanged. In the case of serious changes or a redesign/refactoring, we speak of updating the tests rather than maintaining them.

    • Flutter test automation

    Surf has been one of the pioneers of Flutter development. When we started developing with Flutter, the question arose of how to automate its testing.

    The first Flutter banking app in Europe, developed by Surf for Rosbank

    Flutter makes it possible to write native automated tests in Dart and cover all required test types. Flutter offers the following test types:

    – Unit tests are designed to check a specific module of the system, for example, whether a component controller sets the required state. They do not touch the interface, so there is no need to emulate the app. At Surf, unit tests are written by developers, as they are more involved in the internal app logic, while test automation engineers write the next two types, widget and E2E tests, since both of those are most closely connected with the finished app and the test cases.
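
    As a minimal sketch, a unit test of this kind can be written with the standard Dart test package. The BalanceController class below is hypothetical and exists only for the example; the point is that the controller's state is checked directly, without rendering any UI:

        import 'package:test/test.dart';

        // Hypothetical controller: loads a balance and exposes a simple state.
        class BalanceController {
          String state = 'idle';
          double balance = 0;

          Future<void> load(Future<double> Function() fetchBalance) async {
            state = 'loading';
            balance = await fetchBalance();
            state = 'loaded';
          }
        }

        void main() {
          test('controller sets the required state after loading', () async {
            final controller = BalanceController();

            // The back end is replaced with a stub that returns a fixed value.
            await controller.load(() async => 1250.50);

            expect(controller.state, 'loaded');
            expect(controller.balance, 1250.50);
          });
        }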

    – Widget tests are one level higher than unit tests. They allow emulating widgets and performing the required testing on them, which is great news when talking about Flutter, because in Flutter everything is a widget. For widget tests, we instantiate widgets of different types and check their functionality. They can be both complete screens and separate elements: fields, buttons, checkboxes, and so on. In this context, our unified structure for test cases is a great advantage: mapping widget tests to our test cases is easy, because the test cases describe each element and screen in detail.

    Widget tests allow early detection of bugs that would usually surface only during manual testing, and developers can fix them promptly. This approach has a positive impact on development velocity and code quality.
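
    A minimal sketch of a widget test with Flutter's flutter_test package is shown below. The payment form and the widget keys are hypothetical; the test pumps a single piece of UI, interacts with it, and checks the result without launching the full app:

        import 'package:flutter/material.dart';
        import 'package:flutter_test/flutter_test.dart';

        void main() {
          testWidgets('amount field accepts input on the payment form',
              (WidgetTester tester) async {
            // Pump a single (hypothetical) payment form instead of the whole app.
            await tester.pumpWidget(MaterialApp(
              home: Scaffold(
                body: Column(
                  children: [
                    TextField(key: const Key('amountField')),
                    ElevatedButton(
                      key: const Key('payButton'),
                      onPressed: () {},
                      child: const Text('Pay'),
                    ),
                  ],
                ),
              ),
            ));

            // Simulate the user typing an amount.
            await tester.enterText(find.byKey(const Key('amountField')), '100.00');
            await tester.pump();

            expect(find.text('100.00'), findsOneWidget);
            expect(find.byKey(const Key('payButton')), findsOneWidget);
          });
        }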

    – E2E (end-to-end) tests imply fully loading the app and simulating user actions in a real infrastructure with real services, APIs, and so on. In Flutter such tests are called integration tests, but we use the E2E term for clarity.

    Our portfolio includes a Flutter-powered B2C banking app project where we implemented test automation and the required infrastructure so that the bank can run the tests themselves. About 65% of the app code is covered with end-to-end automated tests, which emulate the user environment in detail. This means simulating various user activities: tapping and clicking, pressing buttons, filling in forms, switching between screens, and so on. Such testing runs before releasing builds: it takes time, but it allows a detailed check of the app's quality.
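
    For reference, Flutter integration (E2E) tests are typically built on the integration_test package and look roughly like the sketch below. The banking_app import, the widget keys, and the credentials are hypothetical; unlike the widget test above, the whole app is started and a user journey is simulated end to end:

        import 'package:flutter/material.dart';
        import 'package:flutter_test/flutter_test.dart';
        import 'package:integration_test/integration_test.dart';

        // Hypothetical import of the real application entry point.
        import 'package:banking_app/main.dart' as app;

        void main() {
          IntegrationTestWidgetsFlutterBinding.ensureInitialized();

          testWidgets('user logs in and sees the accounts screen',
              (WidgetTester tester) async {
            // Launch the real app rather than an isolated widget.
            app.main();
            await tester.pumpAndSettle();

            // Simulate the user journey: fill in the login form and move on.
            await tester.enterText(find.byKey(const Key('loginField')), 'demo_user');
            await tester.enterText(find.byKey(const Key('passwordField')), 'secret');
            await tester.tap(find.byKey(const Key('loginButton')));
            await tester.pumpAndSettle();

            expect(find.byKey(const Key('accountsScreen')), findsOneWidget);
          });
        }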

    • testing before release

    The final test run is performed on the release version of the app before it is handed over to the client. Its purpose is to check that the important features function smoothly, prove the stability of new functionality, and confirm that the latest code changes have not negatively affected existing features. At this stage there is no time for detailed testing of every element, so we use scenario tests that completely cover the user journey.

    After successful test completion, we perform a migration test to check that the previous version published in the app stores is correctly updated to the new version.

    • testing tools 

    We have a certain set of testing tools that our project teams use in most cases: 

    Dart+Gherkin — tooling for Flutter native automated tests;

    Calabash (Ruby+Cucumber), RubyMine — tooling for cross-platform automated tests;

    Jira — a system for bug tracking and project management;

    Xray — a full-featured test management tool, Jira plugin;

    Charles — an HTTP proxy for testers to view all of the HTTP and SSL/HTTPS traffic between their machine and the Internet;

    Figma — a vector graphics editor and prototyping tool;

    Various analytics systems (Firebase among others);

    Android Debug Bridge (ADB) — a versatile command-line tool that lets you communicate with a device;

    Xcode — a tool to build iOS apps for testing purposes;

    Android Studio / Visual Studio Code — code editors optimized for building and debugging modern apps;

    Jenkins — our CI/CD tool of choice to deliver and test our apps; it builds and tests software projects continuously, making it easier for developers to integrate changes into the project;

    Mocker — an internal service for mocking back-end responses when the front-end team is delivering tasks faster than the back end (a minimal sketch of the idea is shown below).
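
    Mocker itself is an internal tool, but the underlying idea can be sketched with the standard MockClient from the Dart http package: the front end talks to a stand-in that returns predefined responses, so features can be tested before the real endpoint is ready. The endpoint path and JSON payload below are invented for the example:

        import 'dart:convert';

        import 'package:http/http.dart' as http;
        import 'package:http/testing.dart';

        void main() async {
          // Stand-in for the back end: requests get canned JSON responses.
          final client = MockClient((http.Request request) async {
            if (request.url.path == '/api/v1/accounts') {
              return http.Response(
                jsonEncode({
                  'accounts': [
                    {'id': '1', 'title': 'Main account', 'balance': 1250.50},
                  ],
                }),
                200,
                headers: {'content-type': 'application/json'},
              );
            }
            return http.Response('Not found', 404);
          });

          // Front-end code under test uses the client as if it were the real API.
          final response =
              await client.get(Uri.parse('https://example.com/api/v1/accounts'));
          final accounts = jsonDecode(response.body)['accounts'] as List;
          print('Loaded ${accounts.length} account(s)');
        }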

    Still, if the client prefers a different testing workflow or other tools or services, we are ready to adapt. We are always ready to include additional checks and tests based on the client's feedback: for example, investigating the causes of newly detected bugs, writing API tests, reviewing the Swagger documentation, and so on.

    Summing up

    To sum things up, we would like to emphasize once again the factors that allow the Surf team to ensure the quality and stable functioning of the banking products we have been glad to develop for our clients:

    • our QA workflow that covers the entire app development process from project requirements and UX design to the final release;
    • good knowledge of the banking and finance domain, as we have many industry-specific projects under our belt, in both the B2B and B2C sectors, including pioneering Flutter-powered banking apps;
    • a single common structure for test cases and checklists that is clear, transparent, uniform, and easy to maintain;
    • proven experience in writing and implementing automated tests that increase test coverage and save time.

    If you need more information about our team and approaches, do not hesitate to contact us.

    Discuss your project