Robots can now perform an impressive range of tasks, many of which can be traced directly to improvements in sensor technology over the past several years. Individual sensors have benefited from miniaturization and from advances in calibration and sensitivity, but equally important is the joining of diverse sensors, known as sensor fusion, which has given robots multi-tasking capabilities and allowed them to expand well beyond the single, repetitive functions to which they were relegated for decades.
Non-traditional robotic applications today include character recognition, RFID tracking, e-nose, communications activities and many more. Today’s robots have more dexterity and are more flexible, adaptive and intelligent.
Advances in vision, touch, smell, taste, and hearing are bringing down the price tag for new robotic applications while increasing how accurately and reliably they perform. Add sensors to measure distance, light, rotation, magnetism, temperature, pressure, altitude, and inclination, and the tools are there for a machine to respond to environmental cues around it. Not only can a robot’s sensors enable detection, vision, touch, and more, but the robot can now also physically interact with its surroundings (as well as monitor itself using additional sensors).
Importance of calibration
For all of these applications the key to successfully advancing robot usage is the ability to calibrate on-board devices. For robot activities involving sensor fusion or vision-based manipulation, both the sensors themselves and the manipulators must be calibrated (for example, vision relative to the robot’s arm length).
The challenge becomes clear when considering that each sensor in a sensor-fusion environment has different error characteristics. Today there is no general-purpose calibrating framework for each robot sensor and actuator that can account for these different error characteristics. Research and development is now focusing on this challenge of creating a flexible framework from which a sensor-fusion-laden robot can be automatically calibrated as each sensor is added.
To the extent that they can be, individual sensors are now calibrated either before or after deployment. An advantage of pre-deployment calibration is reduced cost, since the manufacturer can calibrate a large number of sensors at one time rather than individually. In-place calibration, once a sensor is installed, is much more time consuming, and the sensor is exposed to environmental factors that may affect calibration.
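Whether performed before or after deployment, calibration often reduces to correcting a sensor's gain and offset against known references. As a simple, device-independent sketch (the sensor and reference values below are hypothetical), a two-point calibration might look like this in Python:

```python
def two_point_calibration(raw_low, raw_high, ref_low, ref_high):
    """Derive gain and offset from readings taken at two known references."""
    gain = (ref_high - ref_low) / (raw_high - raw_low)
    offset = ref_low - gain * raw_low
    return gain, offset

def apply_calibration(raw, gain, offset):
    """Convert a raw reading to a calibrated value."""
    return gain * raw + offset

# Hypothetical example: a temperature sensor reads 510 counts at 0 °C
# and 920 counts at 100 °C. Calibrate, then convert a mid-scale reading.
gain, offset = two_point_calibration(510, 920, 0.0, 100.0)
print(apply_calibration(715, gain, offset))  # midpoint of the raw range, about 50 °C
```

A manufacturer can run this correction against a whole batch of sensors at once, which is where the cost advantage of pre-deployment calibration comes from.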
There is, however, help arriving for engineers tasked with calibrating sensors. For example, the OCB100 Series Auto-Calibrating Boards from OPTEK Technology and the OCB100-KIT design kit (Figure 1) give designers an opportunity to become familiar with the capabilities of a variety of basic optoelectronic sensor types, including transmissive, reflective, and fluid sensors.
The kit includes an automatic calibration circuit card that interfaces with each sensor to make evaluation easier. The calibration PCB can be connected to almost any optoelectronic sensor via the onboard Molex 70553-0038 four-pin header. Because the circuit performs calibration automatically, it offers an easy way to use the full production range of devices for any sensor type, letting the engineer compensate for the manufacturing variations, temperature changes, and device aging present in optoelectronic systems.
Figure 1: The OCB100-Kit by OPTEK Technology.
The OCB100 series is designed to minimize changes in optical device behavior due to manufacturing variance, temperature change, and device aging. With the OCB100, the design engineer can reduce the sensor-to-sensor variation present in many systems, and by providing a pre-calibrated sensor to the system design can enhance reliability and consistency. Degradation of the LED or phototransistor is compensated for each time the system is calibrated, allowing the system to provide a known, consistent output level and resulting in years of consistent quality. The OCB100 series is also designed to maintain the calibrated setting even if power is lost, allowing faster startup without the need for recalibration every time the device is powered up.
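The general idea behind this kind of closed-loop compensation can be illustrated in the abstract: an auto-calibration routine adjusts a gain setting until the sensor output returns to a stored target level. The sketch below is hypothetical Python, not OPTEK firmware; `read_output` stands in for whatever the real hardware measures when the LED drive is adjusted:

```python
def auto_calibrate(read_output, target, tolerance=0.01, max_steps=100):
    """Step a gain setting until the measured output matches a stored target.

    read_output: callable returning the sensor output at a given gain;
    it stands in for driving the LED and sampling the phototransistor.
    """
    gain = 1.0
    for _ in range(max_steps):
        out = read_output(gain)
        if abs(out - target) <= tolerance:
            break
        gain *= target / out  # proportional correction toward the target
    return gain

# Simulated aged LED that delivers only 70% of its nominal output:
aged_sensor = lambda gain: 0.7 * gain
new_gain = auto_calibrate(aged_sensor, target=1.0)  # gain rises to about 1.43
```

Storing `new_gain` in non-volatile memory, as the OCB100 series does with its calibrated setting, is what allows the system to restart without recalibrating.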
Sensors for robotics
Let’s look at a few popular sensors used in robotics applications and how they are evolving to include advanced features that make them easier to use. The SCC1300-D02 from VTI Technologies (Figure 2), a combined gyroscope and 3-axis accelerometer with digital SPI interfaces, is used in guidance systems, navigation, motion analysis and control, and robotic control systems.
Figure 2: SCC1300 component block diagram (Courtesy of VTI Technologies).
The sensor is based on capacitive 3D MEMS technology. The component integrates angular rate and acceleration sensing with flexible, separate digital SPI interfaces. The angular rate sensor’s bias stability is exceptionally insensitive to mechanical vibration and shock, and the component has several advanced self-diagnostic and error-conditioning features.
SCC1300 sensors are factory calibrated, so no separate calibration is required in the application. Parameters trimmed during production include sensitivities, offsets, and frequency responses. The calibration parameters are stored in non-volatile memory during manufacturing and are read automatically from that memory at start-up.
STMicroelectronics’ STEVAL-MKI062V2 iNEMO inertial module V2 demonstration board (Figure 3) is based on MEMS sensors and the STM32F103RE, an ARM-based 32-bit MCU with 256 to 512 kB of Flash, USB, CAN, 11 timers, 3 ADCs, and 13 communication interfaces. Combining accelerometers, gyroscopes, and magnetometers with pressure and temperature sensors, the MEMS sensors provide 3-axis sensing of linear, angular, and magnetic motion, plus temperature and barometric/altitude readings. This solution combines ST’s advances in miniaturization and sensor integration. The 10-DOF (degrees of freedom) inertial system represents a complete hardware platform that can be used in numerous applications, including human-machine interfaces and robotics.
To aid in user development and analysis, the STEVAL-MKI062V2 demonstration kit includes a PC GUI for sensor output display and a firmware library to facilitate the use of the demonstration board features.
Figure 3: The STEVAL-MKI062V2 inertial module V2 demo board.
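A common way to fuse the accelerometer and gyroscope outputs of an inertial module like this is a complementary filter: the integrated gyro angle tracks fast motion, while the accelerometer's gravity reference slowly corrects the gyro's drift. The following is a generic single-axis sketch in Python, not ST's firmware library, and the weighting constant `alpha` is an illustrative choice:

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyro and accelerometer data into one pitch estimate (degrees).

    The integrated gyro angle follows fast motion; the accelerometer's
    gravity-based tilt estimate slowly pulls drift back to zero.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt                  # integrate deg/s
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))  # tilt from gravity
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# One 10 ms update: previous pitch 10 deg, gyro reading 5 deg/s,
# accelerometer level (0 g on x, 1 g on z).
pitch = complementary_filter(10.0, 5.0, 0.0, 1.0, 0.01)
```

Run at each sample interval, this kind of filter is one of the simplest ways to turn raw 10-DOF readings into a usable orientation estimate.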
For anyone wanting a low-cost, easy method of distance measurement, the Parallax PING)))™ Ultrasonic Sensor measures the distance to moving and stationary objects. The unit emits an ultrasonic pulse and determines distance by timing the interval from transmission to echo return.
In setting up the PING sensor, be aware that temperature has a measurable effect on the speed of sound in air. If the temperature Tc (in °C) is known, the speed of sound c (in m/s) is:

c = 331.5 + (0.6 × Tc)
The resulting error over the sensor’s operating range of 0 to 70°C is significant (on the order of 11 to 12 percent, according to the datasheet). Conversion constants that account for air temperature can be incorporated into the application program (the datasheet provides an example).
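Putting the formula above together with the sensor's round-trip timing, a temperature-compensated distance conversion can be sketched as follows (Python for illustration only; the echo-time value is a made-up example):

```python
def speed_of_sound(temp_c):
    """Speed of sound in air (m/s) using c = 331.5 + 0.6 * Tc."""
    return 331.5 + 0.6 * temp_c

def ping_distance_cm(echo_time_us, temp_c=20.0):
    """Convert a round-trip echo time (microseconds) to distance (cm).

    The pulse travels out and back, so the round-trip time is halved.
    """
    cm_per_us = speed_of_sound(temp_c) * 100.0 / 1_000_000  # m/s -> cm/us
    return echo_time_us * cm_per_us / 2.0

# A hypothetical 11,600 us round trip at 22 °C works out to roughly 2 m.
distance = ping_distance_cm(11600, 22.0)
```

Recomputing `cm_per_us` whenever the air temperature changes is exactly the compensation the datasheet's example implements with conversion constants.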
Figure 4: The Parallax PING))) Ultrasonic Sensor.
Sensor calibration is more important than ever given the rapidly expanding sensor-fusion landscape. It is vital that every sensor device in an application is consistent with regard to how variables are measured. If not, sensor data may be at best unreliable and at worst useless. As advances continue to take place, well-calibrated sensors will permit robots to become more flexible, reliable, mobile, and ideal for an ever-growing number of applications.