Research Paper on Sensors
A sensor is a device that measures a physical quantity and converts it into a signal which can be read by an observer or by an instrument. For example, a mercury-in-glass thermometer converts the measured temperature into expansion and contraction of a liquid which can be read on a calibrated glass tube. A thermocouple converts temperature to an output voltage which can be read by a voltmeter. For accuracy, most sensors are calibrated against known standards.
Contents
1 Use
2 Classification of measurement errors
2.1 Sensor deviations
2.2 Resolution
3 Types
4 Sensors in Nature
5 Biosensor
6 See also
7 References
8 External links
Use
Sensors are used in everyday objects such as touch-sensitive elevator buttons (tactile sensor) and lamps which dim or brighten by touching the base. There are also innumerable applications for sensors of which most people are never aware. Applications include cars, machines, aerospace, medicine, manufacturing and robotics.
A sensor is a device which receives and responds to a signal. A sensor's sensitivity indicates how much the sensor's output changes when the measured quantity changes. For instance, if the mercury in a thermometer moves 1 cm when the temperature changes by 1 °C, the sensitivity is 1 cm/°C (it is basically the slope Δy/Δx assuming a linear characteristic). Sensors that measure very small changes must have very high sensitivities. Sensors also have an impact on what they measure; for instance, a room temperature thermometer inserted into a hot cup of liquid cools the liquid while the liquid heats the thermometer. Sensors need to be designed to have a small effect on what is measured; making the sensor smaller often improves this and may introduce other advantages. Technological progress allows more and more sensors to be manufactured on a microscopic scale as microsensors using MEMS technology. In most cases, a microsensor reaches a significantly higher speed and sensitivity compared with macroscopic approaches.
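As a minimal illustration of sensitivity as a slope, the short Python sketch below estimates the slope of an assumed linear characteristic from calibration data; the function name, the calibration points and the 1 cm/°C figure they reproduce are illustrative assumptions, not measurements from any real thermometer.

def sensitivity(inputs, outputs):
    # Least-squares slope of output versus input (units: output unit per input unit).
    n = len(inputs)
    mean_x = sum(inputs) / n
    mean_y = sum(outputs) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs))
    den = sum((x - mean_x) ** 2 for x in inputs)
    return num / den

# Hypothetical mercury-thermometer calibration: temperature (°C) vs. column position (cm)
temps_c = [0.0, 10.0, 20.0, 30.0, 40.0]
positions_cm = [0.0, 10.0, 20.0, 30.0, 40.0]

print(sensitivity(temps_c, positions_cm))  # prints 1.0, i.e. 1 cm/°C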
Classification of measurement errors
A good sensor obeys the following rules:
Is sensitive to the measured property
Is insensitive to any other property likely to be encountered in its application
Does not influence the measured property
Ideal sensors are designed to be linear, or linear with respect to some simple mathematical function of the measurement, typically logarithmic. The output signal of such a sensor is linearly proportional to the value, or to that simple function, of the measured property. The sensitivity is then defined as the ratio between the output signal and the measured property. For example, if a sensor measures temperature and has a voltage output, the sensitivity is a constant with the unit [V/K]; this sensor is linear because the ratio is constant at all points of measurement.
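To make the linearity statement concrete, here is a minimal Python sketch of a hypothetical voltage-output temperature sensor; the 10 mV/K sensitivity, the function name output_voltage and the absence of any offset are assumptions chosen purely for illustration.

SENSITIVITY_V_PER_K = 0.010  # constant sensitivity, unit V/K

def output_voltage(temperature_k):
    # Ideal linear characteristic: output is proportional to the measured property.
    return SENSITIVITY_V_PER_K * temperature_k

# Because the sensor is linear and has no offset, the ratio output/input
# is the same constant at every measurement point:
for t in (250.0, 300.0, 350.0):
    v = output_voltage(t)
    print(f"T = {t} K  ->  V = {v:.3f} V   ratio = {v / t:.4f} V/K")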
Sensor deviations
If the sensor is not ideal, several types of deviations can be observed (a short numerical sketch illustrating some of them follows this list):
The sensitivity may in practice differ from the value specified. This is called a sensitivity error, but the sensor is still linear.
Since the range of the output signal is always limited, the output signal will eventually reach a minimum or maximum when the measured property exceeds the limits. The full scale range defines the maximum and minimum values of the measured property.
If the output signal is not zero when the measured property is zero, the sensor has an offset or bias. This is defined as the output of the sensor at zero input.
If the sensitivity is not constant over the range of the sensor, this is called nonlinearity. Usually this is defined by the amount the output differs from ideal behavior over the full range of the sensor, often noted as a percentage of the full range.
If the deviation is caused by a rapid change of the measured property over time, there is a dynamic error. Often, this behaviour is described with a Bode plot showing sensitivity error and phase shift as a function of the frequency of a periodic input signal.
If the output signal slowly changes independent of the measured property, this is defined as drift.
Long term drift usually indicates a slow degradation of sensor properties over a long period of time.
Noise is a random deviation of the signal that varies in time.
Hysteresis is an error that occurs when the measured property reverses direction but the sensor responds with some finite lag, creating a different offset error in one direction than in the other.
If the sensor has a digital output, the output is essentially an approximation of the measured property. The approximation error is also called digitization error.
If the signal is monitored digitally, the limited sampling frequency can also cause a dynamic error; and if the measured variable or added noise changes periodically at a frequency near a multiple of the sampling rate, aliasing errors may occur.
The sensor may to some extent be sensitive to properties other than the property being measured.
...
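As a rough numerical sketch of several of the deviations listed above (sensitivity error, offset, nonlinearity, noise, a limited output range and digitization), the Python fragment below models a hypothetical temperature sensor; every number in it (the 10 mV/K nominal sensitivity, 2 % gain error, 5 mV offset, noise level, 10-bit resolution and full scale) is an invented assumption, not data for any real device.

import random

NOMINAL_SENSITIVITY = 0.010   # V/K, the ideal characteristic assumed by the user
GAIN_ERROR = 1.02             # sensitivity error: actual slope is 2 % high
OFFSET_V = 0.005              # output is not zero at zero input (offset or bias)
NONLIN_COEFF = 1e-7           # small quadratic term -> nonlinearity
NOISE_STD_V = 0.001           # random deviation varying in time (noise)
FULL_SCALE_V = 5.0            # output saturates at the full scale range
LSB_V = FULL_SCALE_V / 2**10  # 10-bit digital output -> quantization step

def real_sensor_output(temperature_k):
    # Non-ideal analog output: gain error, offset, nonlinearity and noise, clipped to full scale.
    v = GAIN_ERROR * NOMINAL_SENSITIVITY * temperature_k
    v += OFFSET_V
    v += NONLIN_COEFF * temperature_k ** 2
    v += random.gauss(0.0, NOISE_STD_V)
    return min(max(v, 0.0), FULL_SCALE_V)

def digitize(voltage):
    # Digital output: round to the nearest quantization step (digitization error).
    return round(voltage / LSB_V) * LSB_V

for t in (100.0, 250.0, 400.0):
    ideal = NOMINAL_SENSITIVITY * t
    measured = digitize(real_sensor_output(t))
    print(f"T = {t:5.1f} K  ideal = {ideal:.3f} V  measured = {measured:.3f} V  error = {measured - ideal:+.3f} V")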