Anyone who has implemented a data integration solution, such as those provided by our host here, knows there are so many details to deal with that testing often falls by the wayside.
However, testing is the smartest way to determine whether your data integration solution will meet expectations, and whether it will be secure and functional as well. Here are a few data integration testing concepts to keep in mind:
First, test each component by itself.
If you leverage an adapter to consume data from a database or application, then create a process where you can consume data through the adapter at differing rates of speed. Crank the load on the adapter up and down, and observe what occurs. It's good to know the upper limit, because the application or database you integrate with will rarely throttle the adapter for you.
Repeat this for the other components as well, including the data transformation and translation layer and the integration flows, and exercise each with different types of security. Component testing may seem odd for a solution whose whole point is integrating systems, but it's the best way to find out what works, and how well.
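The ramp-up idea can be sketched in a few lines. Everything here is hypothetical: `MockAdapter` stands in for a real database or application adapter, and its hard-coded limit simulates the point where a real source would start rejecting requests.

```python
class MockAdapter:
    """Stand-in for a real database or application adapter (hypothetical).
    Simulates a source that starts failing above a fixed throughput."""
    MAX_RECORDS_PER_SEC = 500

    def fetch(self, batch_size, elapsed):
        # Reject any batch whose implied rate exceeds the simulated limit.
        rate = batch_size / max(elapsed, 1e-9)
        if rate > self.MAX_RECORDS_PER_SEC:
            raise RuntimeError("adapter overloaded")
        return [{"id": i} for i in range(batch_size)]

def find_upper_limit(adapter, start=50, step=50, limit=2000):
    """Crank the requested rate up until the adapter starts failing,
    and report the last rate that succeeded."""
    last_good = 0
    for batch_size in range(start, limit, step):
        try:
            adapter.fetch(batch_size, elapsed=1.0)  # one-second window
            last_good = batch_size
        except RuntimeError:
            break
    return last_good
```

Against a real adapter you would also crank the load back down and watch recovery behavior, but even a crude ramp like this tells you where the ceiling is before production traffic finds it for you.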
Second, test the data integration systems using a staging system.
Don’t use live data when testing data integration. You could leave your data in an inconsistent and unstable state. Make sure you test your data integration solution using staging systems that leverage testing data. The testing data should be as close to the production data as you can get, but don’t touch real data until you know that your data integration solution is sound.
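One cheap safeguard is to make the test harness itself refuse to point at production. The connection strings and the `INTEGRATION_ENV` variable below are illustrative assumptions; in practice these would come from a secrets manager or config file, not source code.

```python
import os

# Hypothetical connection strings for illustration only.
CONNECTIONS = {
    "staging": "postgresql://staging-db.internal/integration_test",
    "production": "postgresql://prod-db.internal/orders",
}

def connection_for_tests():
    """Return the connection string tests should use.
    Defaults to staging and refuses to run against production."""
    env = os.environ.get("INTEGRATION_ENV", "staging")
    if env == "production":
        raise RuntimeError("Refusing to run integration tests against production data")
    return CONNECTIONS[env]
```

A guard like this costs a few lines and makes the "don't touch real data" rule enforceable rather than a matter of discipline.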
Third, make sure to observe the data integration by auditing the test data.
If the data integration solution is moving a figure with the value 56.78483 from a source system to a target system that only accepts money formats (XXX.XX), so the value must be rounded up or down, then you need to make sure the data arriving is 56.78 and not 567.80. The same goes for text data, where you may be splitting a full name into first and last names so they land in two fields on the target, not one.
The idea is to make sure that the data is processed correctly, and that it's consistent with the logic defined in the data integration tool. I've seen huge problems in the past, such as overstated inventory requirements that led to losses of millions of dollars.
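Both of those audits are easy to express as assertions. This is a minimal sketch using Python's standard `decimal` module; the helper names and the naive last-space name split are my own illustrations, not any particular tool's logic.

```python
from decimal import Decimal, ROUND_HALF_UP

def to_money(value):
    """Round a source figure to a two-decimal money format (XXX.XX)."""
    return Decimal(str(value)).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

def split_name(full_name):
    """Split a full name into first and last fields for the target system.
    Naive split on the last space; real-world names need more care."""
    first, _, last = full_name.rpartition(" ")
    return first, last

# Audit checks: verify the transformed data matches what the target expects.
assert to_money(56.78483) == Decimal("56.78")            # not 567.80
assert split_name("Ada Lovelace") == ("Ada", "Lovelace")  # two fields, not one
```

Running checks like these against a sample of the test data is what "auditing" means in practice: you compare what arrived with what the transformation logic says should have arrived.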
Finally, monitor and govern.
Part of testing a data integration solution is passive, such as the tasks I gave you above, and part of the testing is proactive. Proactively monitoring the system for issues and errors means you find and correct problems before they become serious. Moreover, you place data governance systems on the data itself, to set policies as to how data in the source and target systems will be managed.
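Proactive monitoring can start as simply as a threshold check on the integration's counters. This is a sketch; the counter names and the one-percent threshold are assumptions you would tune to your own solution.

```python
def check_error_rate(processed, errors, threshold=0.01):
    """Return an alert message if the integration's error rate exceeds
    the threshold, otherwise None. Threshold is illustrative."""
    if processed == 0:
        return "alert: no records processed"
    rate = errors / processed
    if rate > threshold:
        return f"alert: error rate {rate:.2%} exceeds {threshold:.2%}"
    return None
```

In a real deployment this kind of check would feed a monitoring or alerting system on a schedule, so a creeping error rate gets noticed before it becomes an outage.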
Testing is important to most system development and operations, but many leave data integration out of the testing mix. Do so at your peril.