The capacitive sensor stack contains several layers: a top layer of glass or plastic, followed by an optically clear adhesive (OCA) layer, then the touch sensor, then the LCD. The touch sensor is a grid of sensor elements, each typically about 5 mm × 5 mm, built from indium tin oxide (ITO). ITO has properties that make it a great material for touchscreen construction: it's over 90 percent transparent, yet conductive. Some designs use a diamond pattern, which is visually pleasing since it doesn't align with the LCD's pixel grid. Others use a simpler "bars and stripes" pattern. If you examine your device at the right angle under good lighting with the LCD turned off, you may be able to see the ITO sensor lines.
Sensing mutual capacitance is fundamentally different from sensing self capacitance. To sense self capacitance, we typically measure the time constant of an RC circuit that includes the sensor. To sense mutual capacitance, we measure the interaction between an X and a Y sensor: a signal is driven on each X line in turn, and each Y line is sensed to detect the level of coupling between the two. Interestingly, a finger touch decreases the mutual-capacitance coupling, while it increases the self-capacitance value.
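As a rough illustration of the self-capacitance approach (a toy model, not any particular controller's firmware), the time an RC circuit takes to charge to a fixed threshold grows with C, so a finger that adds capacitance lengthens the measured time constant. The resistor and capacitance values below are made-up example numbers:

```python
import math

def charge_time(r_ohms, c_farads, v_frac=0.63):
    """Time for an RC circuit to charge from 0 to a fraction v_frac
    of the supply voltage: t = -R * C * ln(1 - v_frac)."""
    return -r_ohms * c_farads * math.log(1.0 - v_frac)

R = 100e3          # example drive resistor: 100 kOhm (assumed value)
C_BASE = 10e-12    # example sensor self capacitance: 10 pF (assumed)
C_FINGER = 2e-12   # example capacitance added by a finger: 2 pF (assumed)

t_idle = charge_time(R, C_BASE)
t_touch = charge_time(R, C_BASE + C_FINGER)

# A touch adds capacitance, so the measured charge time increases;
# the firmware detects a touch when the time exceeds a calibrated baseline.
print(t_touch > t_idle)   # True
```

The ratio of the two times tracks (C_BASE + C_FINGER) / C_BASE, which is why the measurement is a usable proxy for the added finger capacitance.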
Various methods are used to determine the finger position from this information. One of the simplest is a centroid (center-of-mass) calculation: a weighted average of the sensor values in one or two dimensions. For example, given sensor readings of 5, 15, 25, and 10 on columns 1 through 4, the 1-D centroid gives an X coordinate of (5*1 + 15*2 + 25*3 + 10*4) / (5 + 15 + 25 + 10) = 150/55 = 2.73. We then scale this position to match the LCD resolution. If the ITO sensor pattern extends beyond the LCD's sides, some translation is performed for this as well.
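The 1-D centroid above can be sketched in a few lines; the readings are the example values from the text:

```python
def centroid(values, positions=None):
    """1-D centroid (weighted average) of sensor readings."""
    if positions is None:
        positions = range(1, len(values) + 1)   # columns 1, 2, 3, ...
    return sum(v * p for v, p in zip(values, positions)) / sum(values)

readings = [5, 15, 25, 10]        # example sensor values on columns 1-4
x = centroid(readings)            # (5*1 + 15*2 + 25*3 + 10*4) / 55
print(round(x, 2))                # 2.73
```

A real controller would run the same calculation in fixed-point arithmetic and in both axes, but the weighting idea is identical.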
Edges complicate the finger-location problem. Consider the same array as the finger approaches an edge and the columns on one side fall off the panel: the simple centroid starts to "pull" to the right as the terms on the left drop off. To counter this issue, we must use special edge-processing techniques that examine the shape of the remaining signal and estimate the portion of the finger that's off the screen.
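To see the pull numerically (a toy illustration, not a production edge algorithm), drop the leftmost reading from the earlier example as if that column had fallen off the panel:

```python
def centroid(values, positions):
    """1-D centroid (weighted average) of sensor readings."""
    return sum(v * p for v, p in zip(values, positions)) / sum(values)

# Full signal: columns 1-4 with readings 5, 15, 25, 10.
full = centroid([5, 15, 25, 10], [1, 2, 3, 4])    # 150/55, about 2.73

# Same finger near the edge: column 1 is off the panel, so its
# reading is lost and the centroid shifts ("pulls") to the right.
clipped = centroid([15, 25, 10], [2, 3, 4])        # 145/50 = 2.90

print(clipped > full)   # True: the position estimate pulled right
```

Edge-correction firmware would recognize this truncated profile and extrapolate the missing tail of the signal before computing the position.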
Communication to the host processor
Once a valid touch signal is present and the X/Y coordinates of the touch are known, it's time to get the data to the host CPU for processing. Embedded touchscreen devices communicate using the venerable I2C interface or SPI. Larger touchscreens typically use USB, since Windows, macOS, and Linux all have built-in support for HID (Human Interface Device) class devices over USB.
Although several different interfaces are employed, the OS drivers end up doing similar work with each one. We'll use the Android driver as our example; since Android and MeeGo are both built on Linux, all three use similar drivers.
The touchscreen device's interrupt triggers an interrupt service routine (ISR) that schedules a worker thread. No work is done in the ISR itself, which keeps interrupt latency low and prevents priority inversions. When the OS runs the worker thread, it starts a communication transaction to read the data from the device and goes to sleep. When the transaction completes, the host driver has the data it needs to proceed.
The host driver translates the proprietary data format used by the device manufacturer into a standard format. In Linux, the driver populates an event’s fields with a series of subroutine calls, then it sends the event with a final call. For example, creating a single-touch Linux input event looks like this:
input_report_abs(ts->input, ABS_X, t->st_x1); // Set X location
input_report_abs(ts->input, ABS_Y, t->st_y1); // Set Y location
input_report_abs(ts->input, ABS_PRESSURE, t->st_z1); // Set Pressure
input_report_key(ts->input, BTN_TOUCH, CY_TCH); // Finger is pressed
input_report_abs(ts->input, ABS_TOOL_WIDTH, t->tool_width); // Set width
input_sync(ts->input); // Send event
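On the receiving side, each input_report_* call above becomes a struct input_event record delivered through a /dev/input/eventN node. As a sketch of what those records look like, the snippet below decodes a buffer of such events, assuming the 64-bit Linux layout (a timestamp of two longs, then 16-bit type and code and a 32-bit value); the packed bytes here are synthetic, not captured from a real device:

```python
import struct

# struct input_event layout (64-bit Linux, native alignment):
# struct timeval (two longs), then type (u16), code (u16), value (s32).
EVENT_FMT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FMT)

# Event type/code constants from linux/input-event-codes.h.
EV_SYN, EV_KEY, EV_ABS = 0x00, 0x01, 0x03
ABS_X, ABS_Y = 0x00, 0x01

def decode(buf):
    """Yield (type, code, value) tuples from a raw evdev buffer."""
    for off in range(0, len(buf), EVENT_SIZE):
        _, _, etype, code, value = struct.unpack_from(EVENT_FMT, buf, off)
        yield etype, code, value

# Synthetic buffer: the ABS_X report from the driver snippet above,
# with an example X coordinate of 150 and a zeroed timestamp.
raw = struct.pack(EVENT_FMT, 0, 0, EV_ABS, ABS_X, 150)
print(list(decode(raw)))   # [(3, 0, 150)]
```

A real reader would loop on the device node and collect events until it sees the EV_SYN record produced by input_sync(), which marks the end of one complete touch report.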
This touch event then goes into the OS. Android saves the event's history in the gesture-processing buffer and passes the event up to the View class. Several touchscreen devices (such as the Cypress TrueTouch™ products) support hardware gesture processing, which relieves the host OS of that burden and, in many cases, eliminates the processing of touch data entirely until a gesture is seen. For example, if you're in your photo viewer, the host doesn't have to process dozens or hundreds of touch packets to see that you want to flick to the next photo; no interrupts take place until you actually flick over to the next photo.