Explaining Blocks and Block Size in DASYLab

Last Modified: 11th Jan 2013
Category: Data Acquisition > DASYLab
Platform: All
Version: All
Article Ref.: 13C96

DASYLab works in a block-oriented mode. Each block contains measurement data, called samples, and a block header. The header contains information about the block, including the data type, the start time of the block and the sample interval. From these, the program calculates a time stamp for each sample. Many modules verify the time stamp and data type to ensure that only samples with the same characteristics are processed together.

DASYLab processes the data block-by-block in order to enable measurements with high sampling rates (resolution). If you work with too small a block size, for example 1 (i.e. one sample per block), the measurement can be interrupted with the error "The sampling rate is too high!" and DASYLab stops. It is more efficient for a module to process a complete block before passing the data to its output buffer than it would be to process single samples. Using a larger block size therefore increases the maximum rate at which data can be processed (see NOTE below).

The choice of block size has other implications. Many modules perform a calculation on the data in each block, or in a whole number of blocks. For example, the "Statistical Values" module offers functions such as Mean, RMS, Maximum and Minimum, which can be calculated over an integer number of blocks, or over all data acquired since the worksheet started running.

The response rate and update rate of display modules are primarily determined by the sampling rate/block size ratio (S/B ratio). The Chart Recorder and the Y/t Chart in "Fast Recorder" mode only update when a complete block is received; similarly, the Digital Meter only updates once per block. Many output modules that control software-polled hardware also only update once per block. Therefore, if you are running any kind of output, one of the limiting factors on the response rate will be the S/B ratio.

NOTE: To explain why this is, we must go somewhat further back. Every module in the DASYLab worksheet is entered into a module administration list. While DASYLab runs, a dispatcher assigns CPU time to each module in this list for data processing. Here we find a physical speed boundary that depends on the size of your worksheet and the capability of your computer, because the dispatcher must process all modules on the list within a specific reaction time that is defined by the overall sampling rate and block size. Consider 100 modules and an S/B ratio of 1000: a new block arrives every 1 msec (1/1000 s), so all of the data must be processed within that time. Divided between 100 modules, that leaves only 10 µsec for each module to process its data. That is simply too little; the program will not be able to keep up with the data stream!

The resolution (degree of precision) that you require determines the sampling rate of your measurement. The block size determines the cycle time for the dispatcher, and with it the time available to process the collected data. The block size also determines other constraints, such as the refresh rate of the visualization modules, the width of an FFT, etc. From our experience, we recommend that you use an S/B ratio of somewhere between 1 and 10.
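As a rough illustration of the block concept described above, a block can be thought of as a header plus an array of samples, with each sample's time stamp derived from the header. This is a minimal sketch only; the field names (data_type, start_time, sample_interval) follow the article's description and are not DASYLab's actual internal block format.

from dataclasses import dataclass

@dataclass
class Block:
    """Simplified model of a data block: a header plus samples."""
    data_type: str          # e.g. "float64"; modules check this matches
    start_time: float       # start time of the block, in seconds
    sample_interval: float  # 1 / sampling rate, in seconds
    samples: list           # the measurement data

    def timestamp(self, i):
        """Time stamp of the i-th sample, calculated from the header."""
        return self.start_time + i * self.sample_interval

# A block of 8 samples acquired at 1000 Hz, starting at t = 0.0 s
block = Block("float64", 0.0, 0.001, [0.1, 0.3, 0.2, 0.4, 0.5, 0.4, 0.3, 0.2])
print(block.timestamp(7))   # 0.007 s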
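The arithmetic in the NOTE can be checked with a few lines. This is only a back-of-the-envelope model, assuming the per-block time budget is shared evenly between all modules, which is a simplification of the real dispatcher behaviour.

def time_per_module(sampling_rate, block_size, n_modules):
    """Seconds of CPU time available per module per block."""
    sb_ratio = sampling_rate / block_size   # blocks arriving per second
    block_period = 1.0 / sb_ratio           # time budget per block (s)
    return block_period / n_modules

# The article's example: an S/B ratio of 1000 with 100 modules
print(time_per_module(sampling_rate=100000, block_size=100, n_modules=100))
# -> 1e-05 s, i.e. 10 microseconds per module: too little.

# The recommended S/B ratio of 1-10, e.g. 100 kHz with block size 10000:
print(time_per_module(sampling_rate=100000, block_size=10000, n_modules=100))
# -> 0.001 s, i.e. 1 millisecond per module.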

If you can't find a solution on the Knowledge Base, then please contact us via the Technical Support Request Form, by email, or by telephone on +44 (0) 203 695 7810.
