Developers, follow these 10 steps to test your manufacturing code

Testing is an underrated step on the journey to manufacturing excellence. It often lacks formal procedures, and it can be rushed when projects fall behind.

Testing needs to be an integral component of your project plan, executed with the respect it deserves. Thorough, disciplined testing delivers:

  • Fewer go-live issues and corrections, and fewer problems in later use
  • Greater operator uptake and use → higher ROI for the project
  • Systems that best deliver on the original strategic initiative

Ten steps for testing your manufacturing system code

There are as many ways to test as there are to create code. Still, I find it handy to have a guide that I work by.

  1. Unit testing. Test the code step by step, piece by piece. Does each piece do what it is supposed to? This step is surprisingly difficult! As you focus on the role of each piece of code – checking, for instance, that it not only ‘works’ but does what it was designed to do – you must also keep in mind the big picture of how the pieces work as a whole.
  2. Data set testing. Create a number of realistic test scenarios (often best done as a team effort). Here’s where you see that testing is an art as well as a science. As you become more expert, you’ll be able to come up with more efficient, faster testing scenarios that exercise several components together rather than each tiny step on its own. As a whole, your scenarios must cover every aspect of the code.
  3. Integration testing. Now test how all the pieces come together – the complete system. For example, you may have tested one machine in a solution; now you’re testing how the code works across all affected machinery, perhaps simulating the whole factory in action. Note that this is when troubleshooting becomes more difficult – which is why diligence in steps 1 and 2 pays off. The earlier you catch a problem, the better.
  4. Regression testing. We’ve now tested the complete solution, piece by piece and as a whole. Your solution may be a brand-new, standalone entity, but more likely it interfaces with software developed in the past – so you need to verify that your new code does not affect the functionality or performance of past solutions, and vice versa. A common method is regression testing: re-performing tests completed in the past to check whether any faults have surfaced.
  5. Performance testing. Now we’re getting beyond the basics, and closer to what the customer will focus on. Often, the complaint will be that the code needs to execute faster. Is your code executing quickly enough for the application it serves?
  6. Defect tracking. When you’re debugging, you may be able to ‘fix’ a problem without finding its root cause. Train yourself to look for root causes. Testing can be highly complicated, but over time you’ll learn the best starting point and how to make the most of the tools available to you, such as the SQL Profiler.
  7. User acceptance. Continuing with the customer viewpoint, consider UX, the user experience. Is your program a pain to work with? If so, it’s time to tweak or redesign. The classic ugly scenario for a data-entry screen is an over-busy design. Keep it simple, and you’ll reduce data-entry errors for years to come.
  8. Iteration. Say you’ve now made a change to fix a bug. You’ll need to understand what else you must go back and re-test (I know, you don’t want to; I feel your pain). Understanding the big picture – which will be reinforced by your work in steps 1 and 2 above – will help reveal what needs to be retested.
  9. Group testing. When working with different teams, communication is critical.
    • Be ready to learn and speak the ‘language’ of other integrators.
      Example: Recently our Factora team was working with a website development team. During testing, it was harder to understand each other than if we’d been working with a supplier in the same industry; in questions back and forth, we had to avoid assuming the listener understood. Often it was better to explain the what – what was needed – than the why. For the plant MES, the information was stored in the Proficy database, but for the web developers, working on the warehouse system, the reference point was an application accessed by a URL. Considerable human communication skills were called upon to get Proficy and the web app to converse!
    • You’ll need a central, documented game plan that everyone can reference. In the above example, the more centralized documentation we had – starting with the design, updated later to track testing – the better. This applies even if the two teams are just you and the customer!
  10. Not the time to cut corners. Last but not least, testing can be a place where a PM tries to cut corners, in estimating, execution or both. Beware! Testing requires hard thinking, discipline and creativity; it does not take well to being rushed. Good testing is worth the time it takes, even when the deadline is coming at you like a tidal wave.
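To make step 1 concrete, here is a minimal sketch of unit testing in Python. The helper `good_part_ratio` is hypothetical – a stand-in for one small piece of a manufacturing solution, not code from any particular MES – and the point is that the tests check design intent and edge cases, not just the happy path.

```python
def good_part_ratio(good_count: int, total_count: int) -> float:
    """Fraction of good parts in a production run (hypothetical helper)."""
    if total_count <= 0:
        raise ValueError("total_count must be positive")
    if good_count < 0 or good_count > total_count:
        raise ValueError("good_count must be between 0 and total_count")
    return good_count / total_count

# Unit tests: does each piece do what it was designed to do?
assert good_part_ratio(95, 100) == 0.95
assert good_part_ratio(0, 10) == 0.0
try:
    good_part_ratio(5, 0)  # a zero total must be rejected, not divide by zero
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```

Each assertion exercises one designed behavior, including the failure modes – the "not only ‘works’ but does what it was designed to do" check.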
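Step 2's scenario-based testing often boils down to a table of realistic inputs and agreed-upon expected outcomes driven through the code. A sketch, using a hypothetical `classify_batch` disposition rule whose thresholds are purely illustrative:

```python
def classify_batch(yield_pct: float, scrap_pct: float) -> str:
    """Hypothetical batch disposition rule; thresholds are illustrative only."""
    if yield_pct >= 98.0 and scrap_pct <= 1.0:
        return "release"
    if yield_pct >= 90.0:
        return "review"
    return "reject"

# Each scenario is a realistic case the team agreed on; taken together,
# the scenarios should cover every branch of the rule.
scenarios = [
    {"yield_pct": 99.2, "scrap_pct": 0.5, "expected": "release"},
    {"yield_pct": 98.0, "scrap_pct": 1.0, "expected": "release"},  # boundary case
    {"yield_pct": 94.0, "scrap_pct": 3.0, "expected": "review"},
    {"yield_pct": 85.0, "scrap_pct": 9.0, "expected": "reject"},
]

for s in scenarios:
    got = classify_batch(s["yield_pct"], s["scrap_pct"])
    assert got == s["expected"], f"scenario {s} produced {got}"
```

Keeping the scenarios as data makes it cheap to add the cases your operators and customer suggest during review.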
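The regression testing in step 4 works best when past test cases are recorded somewhere they can be re-run mechanically after every change. A minimal sketch, assuming a hypothetical unit-of-measure conversion that older solutions also depend on (the function and the recorded cases are made up for illustration):

```python
# Hypothetical function under test, shared with past solutions.
def kg_to_lb(kg: float) -> float:
    return round(kg * 2.20462, 3)

# Regression suite: past cases recorded as (inputs, expected output).
REGRESSION_CASES = [
    ((1.0,), 2.205),
    ((10.0,), 22.046),
    ((0.0,), 0.0),
]

def run_regression_suite():
    """Re-run every recorded case; return the list of failures."""
    failures = []
    for args, expected in REGRESSION_CASES:
        got = kg_to_lb(*args)
        if got != expected:
            failures.append((args, expected, got))
    return failures

# After any change, an empty failure list means nothing has regressed.
assert run_regression_suite() == []
```

The discipline is in the recording: every test you perform once is a candidate for the suite, so tomorrow's change can't silently break today's behavior.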
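For step 5, even a rough timing harness answers the question "is the code fast enough for the application it serves?". A sketch using Python's standard `time.perf_counter`; the workload and the one-second budget are stand-ins for your real code and cycle-time requirement:

```python
import time

def process_readings(readings):
    """Stand-in for the real work; replace with the code under test."""
    return sum(r * 1.05 for r in readings)  # e.g. apply a calibration factor

BUDGET_SECONDS = 1.0             # illustrative cycle-time requirement
readings = list(range(100_000))  # illustrative data volume

start = time.perf_counter()
result = process_readings(readings)
elapsed = time.perf_counter() - start

# Fail loudly if the code cannot keep up with the line.
assert elapsed < BUDGET_SECONDS, f"too slow: {elapsed:.3f}s"
print(f"processed {len(readings)} readings in {elapsed:.4f}s")
```

Run it with production-sized data volumes: performance complaints usually surface only at realistic scale, not on the small sets used during unit testing.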

Conclusion

Testing is a science, an art, and a developable skill. It pays off handsomely, saving many more hours, days or weeks than it consumes. High standards in testing separate the average supplier from the excellent one.

Think about the last system you implemented – could the testing have been improved upon?