I'm inputting 1 - 5 VDC on the DX1000/DX2000 and using scaling of 0.0 - 200.0%. How do I calculate the overall accuracy?

The measurement accuracy of the 1-5 V range is ±(0.05% of rdg + 3 digits), and the maximum digital display resolution is 1 mV. The formula is: measurement accuracy during scaling (digits) = measurement accuracy (digits) × scaling span (digits) ÷ measurement span (digits) + 2 digits.

First, calculate the accuracy of the 1-5 V voltage measurement prior to scaling ("rdg" stands for "reading").
With a 1 V input: 0.05% × 1 V + 3 mV = 3.5 mV, which rounds up to the 1 mV resolution as 4 mV, in other words 4 digits. With a 5 V input: 0.05% × 5 V + 3 mV = 5.5 mV, which rounds up to 6 mV, in other words 6 digits.
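As a quick check of this step, here is a minimal Python sketch; the function name is illustrative, the rounding-up-to-resolution behaviour follows the worked example above, and the numeric values are the ones quoted for the 1-5 V range:

```python
import math

def voltage_accuracy_digits(reading_v, rdg_pct=0.05, digits_error=3, resolution_v=0.001):
    """Accuracy of the 1-5 V range in digits, before scaling.

    rdg_pct      : percentage-of-reading term from the spec (0.05 %)
    digits_error : fixed error term in digits (3 digits = 3 mV here)
    resolution_v : maximum display resolution (1 mV)
    """
    error_v = rdg_pct / 100.0 * reading_v + digits_error * resolution_v
    # Round up to the display resolution, as in the worked example
    return math.ceil(error_v / resolution_v)

print(voltage_accuracy_digits(1.0))  # -> 4 digits at 1 V input
print(voltage_accuracy_digits(5.0))  # -> 6 digits at 5 V input
```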

Next, calculate the measurement accuracy when scaling. The scaling span is 200.0 − 0.0 = 200.0, which is 2000 digits (0.1% resolution). The measurement span is 5.000 − 1.000 = 4.000, which is 4000 digits (1 mV resolution).
With a 1 V input, the measurement accuracy during scaling is: 4 digits × 2000 ÷ 4000 + 2 digits = 4 digits, in other words ±0.4% at a reading of 0.0%.
With a 5 V input, it is: 6 digits × 2000 ÷ 4000 + 2 digits = 5 digits, in other words ±0.5% at a reading of 200.0%.
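The same conversion can be sketched in Python. The helper below simply applies the scaling formula quoted earlier and converts the result back to percent using the 0.1% display resolution; the function and variable names are illustrative, not part of the recorder's specification:

```python
def scaled_accuracy_digits(meas_digits, scale_span_digits, meas_span_digits, extra_digits=2):
    """Measurement accuracy during scaling (digits), per the formula above."""
    return meas_digits * scale_span_digits / meas_span_digits + extra_digits

scale_span = 2000   # 0.0 to 200.0 % displayed with 0.1 % resolution
meas_span = 4000    # 1.000 to 5.000 V displayed with 1 mV resolution

for label, digits in (("1 V input (reading 0.0 %)", 4), ("5 V input (reading 200.0 %)", 6)):
    d = scaled_accuracy_digits(digits, scale_span, meas_span)
    print(f"{label}: {d:.0f} digits = ±{d * 0.1:.1f} %")
# 1 V input (reading 0.0 %): 4 digits = ±0.4 %
# 5 V input (reading 200.0 %): 5 digits = ±0.5 %
```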
