Sensor type and location, as well as transmitter characteristics, noise, and sampled-data issues, can also affect loop performance. Most continuous measurement sensors and transmitters have relatively fast dynamics and a noise filter, which can be approximated by a first-order lag with a one- to two-second time constant. Temperature sensors are somewhat slower because the sensor sits in a thermowell, giving these measurements a larger time constant of 15–30 seconds.
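The first-order lag described above can be sketched as a simple discrete exponential filter. This is an illustrative approximation only; the function name and the time-constant values in the example are assumptions, not from any particular vendor's implementation.

```python
def first_order_lag(samples, tau, dt):
    """Apply a first-order lag (exponential filter) with time constant
    tau (seconds) to readings taken every dt seconds."""
    alpha = dt / (tau + dt)   # filter coefficient from tau and sample period
    out = []
    y = samples[0]            # initialize at the first reading
    for x in samples:
        y += alpha * (x - y)  # y(k) = y(k-1) + alpha * (x(k) - y(k-1))
        out.append(y)
    return out

# A unit step through a 2 s lag, sampled every 1 s, rises smoothly toward 1.0.
step = [0.0] + [1.0] * 10
filtered = first_order_lag(step, tau=2.0, dt=1.0)
```

For the slower thermowell-mounted temperature sensor, the same sketch applies with tau in the 15–30 second range; the step response simply takes correspondingly longer to settle.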
Noise is often a problem in flow, pressure, and level measurements. Because flow is a very fast loop, controller tuning can be set to ignore noise by using low gain and relying on a large amount of reset, so that the controller takes significant action only on sustained deviations. On slower, non-self-regulating loops such as level, measurement noise can degrade achievable control performance by preventing the use of higher gains and/or derivative action in the controller.
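The low-gain, heavy-reset strategy for fast noisy loops can be illustrated with a minimal discrete PI calculation. The function below is a hedged sketch (names and tuning values are assumptions for illustration): alternating noise barely moves the output, while a sustained deviation is integrated into significant action.

```python
def pi_controller(errors, gain, reset_time, dt):
    """Discrete PI output for a sequence of error samples taken every
    dt seconds. Low gain keeps the response to any single noisy sample
    small; the integral (reset) term accumulates only sustained error."""
    integral = 0.0
    out = []
    for e in errors:
        integral += e * dt / reset_time   # reset action accumulates error
        out.append(gain * (e + integral)) # PI output in deviation form
    return out

# Alternating +/-1 noise: the integral cancels and the output stays small.
noise_out = pi_controller([1.0, -1.0] * 5, gain=0.3, reset_time=10.0, dt=1.0)

# A sustained +1 deviation: reset action ramps the output steadily upward.
sustained_out = pi_controller([1.0] * 10, gain=0.3, reset_time=10.0, dt=1.0)
```

The design point is that the proportional contribution is deliberately small, so it is the slow integration, not the instantaneous error, that drives most of the corrective action.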
Excessive filtering of a signal to reduce noise adds effective deadtime to the loop, degrading loop performance. One technique for reducing high-amplitude, high-frequency noise without introducing an excessive lag is to rate-limit the signal to a rate comparable to the largest physically realizable upset. This approach chops off peak noise and allows a filter with a smaller time constant to effectively reduce the remaining lower-amplitude, high-frequency noise.
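The rate-limiting step can be sketched as follows. This is a minimal illustration, assuming a hypothetical `rate_limit` function and an example limit of one engineering unit per second; the appropriate limit in practice comes from the largest physically realizable rate of change of the process.

```python
def rate_limit(samples, max_rate, dt):
    """Clamp sample-to-sample changes to max_rate (units per second),
    so spikes faster than any real upset are chopped off while
    physically plausible changes pass through unchanged."""
    limited = [samples[0]]
    max_step = max_rate * dt            # largest allowed change per sample
    for x in samples[1:]:
        prev = limited[-1]
        # clip the change to +/- max_step before accepting the new value
        delta = max(-max_step, min(max_step, x - prev))
        limited.append(prev + delta)
    return limited

# A one-sample noise spike of +10 units is clipped to a +1 unit excursion.
spiky = [5.0, 5.0, 15.0, 5.0, 5.0]
clipped = rate_limit(spiky, max_rate=1.0, dt=1.0)
```

The clipped signal can then be passed through a first-order lag with a small time constant to attenuate the remaining low-amplitude noise, without the large lag a filter would need to suppress the raw spikes on its own.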
Non-continuous measurements, such as those produced by the sample-and-hold circuitry of a chromatograph, can introduce significant deadtime into a loop. The periodic step change in the measured value also prevents the use of derivative action in the controller.
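The stair-step character of such a measurement can be sketched with a simple hold model. The function name and the three-second update period below are illustrative assumptions; a real chromatograph cycle is typically much longer. The sketch shows why derivative action fails here: between updates the held value is flat, then jumps, so any rate-of-change calculation sees either zero or a spike.

```python
def sample_and_hold(signal, dt, period):
    """Hold each analyzer result constant until the next update arrives
    every `period` seconds, given an underlying signal sampled every dt
    seconds. Produces the periodic step changes described above."""
    held = []
    last = signal[0]
    for i, x in enumerate(signal):
        if (i * dt) % period == 0.0:  # a new analysis result arrives
            last = x
        held.append(last)
    return held

# A smooth ramp becomes a staircase when held for 3 s between updates.
ramp = [float(i) for i in range(10)]
stepped = sample_and_hold(ramp, dt=1.0, period=3.0)
```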
Distributed Control Systems often sample the transmitted signal at a one-second interval, sometimes faster or slower depending on the characteristics of the process response. One concern with sampled-data measurement is aliasing of the signal, which can shift the observed frequency. At a one-second sample interval, however, this is seldom a problem except for the fastest process responses. A general rule for good performance is to make the period between scans less than one-tenth of the deadtime, or one-twentieth of the lag, in the process response.
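The scan-period rule of thumb is easy to state as a calculation. The helper below is a sketch under one conservative reading of the rule, taking the stricter of the two bounds; the function name is an assumption for illustration.

```python
def max_scan_period(deadtime, lag):
    """Largest scan period (seconds) satisfying the rule of thumb:
    less than one-tenth of the deadtime and less than one-twentieth
    of the lag in the process response (stricter bound governs)."""
    return min(deadtime / 10.0, lag / 20.0)

# Example: a loop with 5 s of deadtime and a 60 s lag should be
# scanned at least every 0.5 s under this reading of the rule.
limit = max_scan_period(deadtime=5.0, lag=60.0)
```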