Sometimes processing on a DAP is critically important.
All DAPL system processing is modular, whether pre-programmed with the system or custom-developed.
All processing in the DAPL system is naturally multitasking, allowing data processing and critical real-time response requirements to be addressed independently.
In large systems, distributing the low level processing related to signal quality and information extraction yields a clean and efficient system design.
The DAPL system
The DAPL system isolates the mechanics of data acquisition so that application software can concentrate on what needs to be done, not how to do it. The DAPL system provides pipe system management, memory management, configuration management and task scheduling.
All processing is modular
All processing is downloaded to the DAP; even the commands considered built into the system are downloaded in the same way as custom ones. You can apply a processing command to as many signals as you need to process.
All processing is multitasking
For high-performance sampling and updating, multitasking is mandatory. You won't need complicated synchronization mechanisms, because synchronization is built into the data resources.
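The idea of synchronization living in the data path rather than in the tasks can be sketched in ordinary host-side code. This is plain Python for illustration only, not DAPL code: a bounded blocking queue stands in for a DAPL pipe, and neither task contains any explicit locking.

```python
from queue import Queue
from threading import Thread

pipe = Queue(maxsize=64)          # a bounded "pipe": put/get block as needed

def sampler():
    for sample in range(10):      # stand-in for hardware sampling
        pipe.put(sample)          # blocks if the processing task falls behind
    pipe.put(None)                # end-of-data marker

def processor(results):
    while True:
        sample = pipe.get()       # blocks until data arrives
        if sample is None:
            break
        results.append(sample * 2)  # stand-in for real signal processing

results = []
t1 = Thread(target=sampler)
t2 = Thread(target=processor, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)                    # → [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Because the blocking behavior belongs to the pipe, each task is written as straight-line code; the two run independently and still hand off every sample exactly once, in order.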
Embedded software can be difficult, but not when it is already done and ready to run. For most applications you do not need to write any new processing commands — just specify the existing commands you need and route the data to them.
Custom tasks and DTD programming
Preparing specialized or proprietary processing software is no problem. Using the "hooks" provided by the Developer's Toolkit for DAPL (DTD), you can wrap your processing in a custom processing command that runs onboard under the DAPL system. Your functions receive services from the DAPL system through a DLL-like calling interface.
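As a rough analogy only — this is not the Toolkit's actual API, and all names here are hypothetical — a custom command can be pictured as a processing function registered with the system by name, which the system then invokes on your data just as it invokes its built-in commands:

```python
# Hypothetical sketch of a command registry; names are illustrative only.
COMMANDS = {}

def register(name):
    """Make a processing function available by name, like a downloaded command."""
    def wrap(fn):
        COMMANDS[name] = fn
        return fn
    return wrap

@register("SCALE")
def scale(samples, gain):
    # The real Toolkit works with pipe handles; a plain list stands in here.
    return [gain * s for s in samples]

# The "system" dispatches by command name, not by direct linkage:
print(COMMANDS["SCALE"]([1, 2, 3], gain=10))   # → [10, 20, 30]
```

The point of the sketch is the indirection: once registered, a custom command is addressed the same way as a pre-programmed one, so application configurations do not need to distinguish between the two.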
Intelligent data processing
DAPL processing excels at reducing large volumes of data to smaller volumes of essential information. DAPL processing can rapidly analyze data to identify the parts that are relevant and throw out the rest. Digital filters? Not a problem. Transforms, spectrum analysis, correlations... whatever you need to identify useful information in your data.
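A minimal illustration of that reduction idea (plain Python, not DAPL syntax; the function and its parameters are invented for this sketch): keep only the samples near a threshold crossing, with a little surrounding context, and discard the rest.

```python
def reduce_to_events(samples, threshold, context=1):
    """Return only samples near threshold crossings, plus `context` neighbors."""
    keep = set()
    for i, s in enumerate(samples):
        if s > threshold:
            for j in range(max(0, i - context),
                           min(len(samples), i + context + 1)):
                keep.add(j)
    return [samples[i] for i in sorted(keep)]

data = [0, 0, 0, 0, 9, 8, 0, 0, 0, 0, 0, 0, 7, 0, 0, 0]
print(reduce_to_events(data, threshold=5))   # → [0, 9, 8, 0, 0, 7, 0]
```

Sixteen samples reduce to seven; on a real channel running at tens or hundreds of kilohertz, this kind of onboard triage is what keeps the host from drowning in raw data.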
While not optimized to push individual values through processing one at a time for minimum latency, DAPL processing is consistent: the amount of time that other processing can delay a response is tightly bounded. For many applications, an assured response within a fraction of a millisecond is fast enough.
Easier just to do it in the host?
Whether doing all of your processing in the host is easier or harder depends on the application problem and the operating environment. GUI data analysis packages for host systems can be overwhelmed by massive amounts of data crunching on many channels. GUI packages often don't "play well" in automated systems alongside other supervisory control software, and they are pretty much out of the question for a remote station where nobody is available to operate a GUI interface. DAP processing can serve in place of these sophisticated software systems, or work in combination with them to distribute the computational load.