When you’re looking at complex translation workflow systems with a high degree of process automation, testing becomes an essential part of the implementation process. A well-structured test procedure with predefined test scenarios and test cases is indispensable for meticulous documentation of test results. It can be helpful to use a test management system, especially if many people are involved in the testing procedure.
However, this does not replace the test plan and structure, which are always needed, regardless of the system or form of documentation. Here is a rough outline of a test project for a large translation workflow system, laid out to explain the possibilities and problems and to give you some ideas on the main aspects of translation workflow testing.
Not only do large translation workflow systems often come with a substantial number of interfaces — to web portals, authoring systems, publishing systems, terminology management systems — they are also internally customized to the customers’ specifications. The implementation of a translation workflow system often happens in an IT project setting. An important segment of project time and resources should be planned for testing and approval of the finalized system, including all interfaces and customized features.
Standard features of the new system, if any, do not need to be tested by the customers, since they are buying standard software and rightfully expect to receive properly functioning software in all standard aspects. However, all other functions and features that are customer-specified need to be thoroughly tested and approved by your team of specialists. This is because only the people who specified the system’s behavior and features really know what the system should and should not do. The worst thing that could happen is that you go live with the system and the essential process steps don’t work properly. Then you start losing data, time and money.
Test planning time and resources
A translation workflow system is usually tested on several levels: installation tests, performance tests, smoke tests, functional tests and so on. It is common for the technical test categories to be organized by the IT department of your unit or company. Often the IT department will manage the test project or subproject and monitor the whole test process. It is very important, however, that functional tests are given a substantial share of the allocated time and resources in the test project. The task of creating functional test scenarios and cases, as well as the functional testing itself, should ideally be given to people familiar with the new translation workflow system and the customized processes and functions, namely the specialists who specified them.
The classical test procedure consists of the creation of a test handbook as well as test concepts, scenarios and cases. A well-written test handbook is worth its weight in gold. It defines things such as the acceptance criteria for approval of the tested translation workflow software, that is, the error margin at which the software will still be released or accepted. It also specifies the number of test cases necessary per feature or function and defines the applied error classes:
Type 1: Fatal error, no system approval: Errors that cannot be avoided or fixed and which make the system’s use impossible.
Type 2: Critical error, no system approval: Errors that can only be avoided or fixed at unreasonably high cost to the company, such as by excessive additional manual work.
Type 3: Severe error, system approval possible: Errors that make system use impossible, but for which economically reasonable alternatives exist.
Type 4: Light error, system approval possible: Errors that make the use of the system only slightly more difficult or don’t affect the system at all.
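To make the approval logic implied by these error classes concrete, here is a minimal sketch in Python. The names ErrorClass and approval_possible are purely illustrative and not part of any particular test management system; your test handbook may well add further acceptance criteria, such as a maximum number of open errors per class.

```python
from enum import Enum
from typing import Iterable


class ErrorClass(Enum):
    """The four error classes from the test handbook."""
    FATAL = 1     # Type 1: no system approval
    CRITICAL = 2  # Type 2: no system approval
    SEVERE = 3    # Type 3: system approval possible
    LIGHT = 4     # Type 4: system approval possible


def approval_possible(found_errors: Iterable[ErrorClass]) -> bool:
    """Approval is blocked by any fatal or critical error; severe and light errors allow it."""
    return all(e not in (ErrorClass.FATAL, ErrorClass.CRITICAL) for e in found_errors)


# One severe and two light errors still permit approval; a single critical error does not.
print(approval_possible([ErrorClass.SEVERE, ErrorClass.LIGHT, ErrorClass.LIGHT]))  # True
print(approval_possible([ErrorClass.CRITICAL]))  # False
```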
The test handbook should ideally be created and handed over to the software supplier of your system before the project starts. It helps to clarify the expected quality levels and test targets and specifies approval criteria for the testing and approval phase of your project. When standardized accordingly, the test handbook can be used for all software test procedures of your company or unit.
In a perfect world, a test concept is created as soon as possible, ideally before the start of the project, since its contents directly influence the supplier’s effort. The test concept defines the test scope, procedures, resources and time planning of the testing and approval phase. Unfortunately, it is often not created at all or is created too late, when all the conditions with the supplier of the software have already been negotiated. Without a test concept accepted by your supplier, it will be very difficult to discuss test results later if they don’t meet your expectations and requirements.
Test scenarios are both logical clusters of test cases for the functions and features of your system that need to be tested and a logical story for your testers to follow, as seen in Figure 1. They help organize your test cases into a logical order and into function groups, which can then be tested in sequence by one tester or at the same time by several testers, depending on the number of different test scenarios that are needed.
Test cases and their corresponding test steps depend on the test scenario in which they are located. Test scenarios and test cases are either documented in a test management system or, for smaller volumes, in Excel or Word. A complete test case usually consists of a unique identifier; a description of the expected system behavior; the test steps in the order in which they are performed; requirements; category; author; and the resulting status, such as pass or fail.
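If you document test cases yourself rather than in a dedicated tool, the structure described above maps naturally onto a simple record per case, grouped into scenarios. The following Python sketch is only an illustration of that structure; the field names, example identifiers and the example scenario are assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestCase:
    """A single test case with the fields described above."""
    identifier: str          # unique identifier, e.g. "TC-001" (hypothetical scheme)
    description: str         # expected system behavior
    steps: List[str]         # test steps in the order they are performed
    requirements: List[str]  # requirements covered by this case
    category: str            # e.g. "functional"
    author: str
    status: str = "not run"  # resulting status, e.g. "pass" or "fail"


@dataclass
class TestScenario:
    """A logical cluster of test cases that one tester can work through in sequence."""
    name: str
    cases: List[TestCase] = field(default_factory=list)


# Hypothetical example of one scenario with a single test case.
scenario = TestScenario(
    name="Order-to-delivery of a standard translation job",
    cases=[
        TestCase(
            identifier="TC-001",
            description="A job submitted via the web portal appears in the workflow system",
            steps=["Submit a job via the portal", "Open the job list in the workflow system"],
            requirements=["REQ-PORTAL-01"],
            category="functional",
            author="J. Tester",
        )
    ],
)
```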
Testing, reporting and error tracking
Last but not least, your test cases need to be executed. It is advisable to have an adequate number of capable testers available, depending on the quantity of test cases and the time available for testing. Usually, testing time shrinks as the project proceeds because the system supplier often takes longer for system implementation than planned. It is therefore good to have a pool of testers available as needed, and tests that can be run in parallel. For this, the test environment needs to allow for parallel testing.
It is advisable to plan for a minimum of three testing phases. In the first testing phase you may expect a larger quantity of failed test cases which need to be reported to your system supplier for bug fixing. After a predefined bug fixing time, you will start retesting, and depending on the number of errors found in your retesting phase, you might want to think about having a second retesting round before approval. Your approval testing phase should be started when only a small number of insignificant or easy-to-fix errors are found, and you are confident that your supplier will have everything fixed and ready for final approval.
Reporting of test results will be easy if you use a test management system that can document and report errors directly to the supplier. This way you will also be able to get a quick overview of your test status, as can be seen in Figure 2.
If you use Excel or Word, it is advisable to create an easy-to-read chart that gives your management the gist of test progress without going into detailed descriptions. It’s best to ask your management what level of information they want. Do they need test scenarios, cases or even steps; do they need detailed bug descriptions; or do they just want statistics?
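One way to produce such a quick overview, assuming you can export your test results as a flat list of scenario and status pairs (for example from an Excel sheet), is sketched below; the data layout and scenario names are made up for illustration.

```python
from collections import Counter

# Hypothetical export: one (scenario, status) pair per test case.
results = [
    ("Job creation", "pass"), ("Job creation", "fail"), ("Job creation", "pass"),
    ("Terminology lookup", "pass"), ("Terminology lookup", "not run"),
]

# Count statuses per scenario.
overview = {}
for scenario, status in results:
    overview.setdefault(scenario, Counter())[status] += 1

# Print a management-friendly one-line summary per scenario.
for scenario, counts in overview.items():
    total = sum(counts.values())
    print(f"{scenario}: {counts['pass']}/{total} passed, "
          f"{counts['fail']} failed, {counts['not run']} not run")
```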
An error tracking system is needed when implementing a large translation management or workflow system. It can become tedious and confusing to track errors and bug fixing progress in Excel tables. A web-based error tracking system is extremely helpful here, because everybody working with the system can enter and track errors no matter where they are located. As another plus, the system supplier receives all errors directly and can react quickly. Progress is always up to date and everybody is kept informed, so there is no need to write emails about the status of bug fixing or to track progress manually. If you have an error tracking system at your company, use it. Otherwise, clarify with your system supplier, ideally before starting the project, whether they have a usable error tracking system available.
Ideally, your IT department plans testing, integration and production environments for the new translation workflow system. The development environment will be at the system supplier’s site if you are not developing your own software. The initial expense of implementing three environments will save you time and money overall. Not only will you be able to test your new system safely before going live, but you will also have the means to test all future changes properly before rolling them out to your production system, thereby reducing the risk of system failure to a minimum. At the beginning of the project you might only be starting with integration and production environments. A testing environment will hopefully be added later, when updating your system becomes a necessary and regular task.
Well begun is half done
Whatever you want to achieve with your translation workflow system, thoroughly planned test environments and processes will definitely help you get smooth translation workflows up and running. If you want to save money in the long run, you might want to think about investing more time and resources in your test management processes. In the end, system failure and data loss during an important localization project are not only a nuisance but can be very costly indeed.