
Touch of class

Most people now own, or have at least used, a touch-screen device of some description. Many, for that matter, have tried Microsoft’s Kinect sensor, in which ‘you are the controller’. In the world of consumer electronics, ‘interactive’ has become a bit of a buzzword, and manufacturers are dreaming up increasingly clever, interactive ways to control mobile phones, laptops and games consoles. One of the latest gizmos, set to be released at the beginning of next year, is Leap Motion’s gesture recognition device, a USB peripheral that plugs into a laptop or desktop computer and creates a 3D interaction space in which applications on the screen can be controlled with hand movements. The device, priced at $70, uses image sensors to track movements within an eight cubic foot space, allowing the user to navigate through a virtual 3D space akin to Tom Cruise in Minority Report, or to browse webpages and control applications, all without touching a keyboard or mouse.

This kind of Minority Report-style interaction is already available, albeit in the form of larger display systems for corporate presentations and the like, rather than an everyday device for the masses. One such system is beMerlin from Swiss company Atracsys, which the user controls with gestures. It operates via stereo vision, using cameras from Sony to detect the 3D position of the user’s hand in front of the screen. The interaction space is illuminated with near-infrared light, to which Sony’s CCD sensors are sensitive up to around 900nm.
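Atracsys doesn’t publish the internals of its stereo vision pipeline, but the underlying principle can be sketched briefly: once the same hand feature has been located in two rectified camera views, its depth follows from the disparity between the two pixel positions. The minimal sketch below illustrates that step; the focal length, baseline and pixel coordinates are placeholder values, not Atracsys figures.

```python
import numpy as np

def triangulate_point(xl, yl, xr, focal_px, baseline_m, cx, cy):
    """Recover a 3D point (in metres, camera frame) from a matched feature
    seen in two rectified stereo images.

    xl, yl     : pixel coordinates of the feature in the left image
    xr         : x pixel coordinate of the same feature in the right image
    focal_px   : focal length in pixels (assumed identical for both cameras)
    baseline_m : horizontal distance between the two cameras in metres
    cx, cy     : principal point (image centre) in pixels
    """
    disparity = xl - xr                       # shift between the two views, in pixels
    if disparity <= 0:
        raise ValueError("feature must appear further left in the right image")
    z = focal_px * baseline_m / disparity     # depth from disparity
    x = (xl - cx) * z / focal_px              # back-project to metres
    y = (yl - cy) * z / focal_px
    return np.array([x, y, z])

# Illustrative values only: a 12cm baseline, 800-pixel focal length, and a
# hand feature seen at (720, 400) in the left image and (680, 400) in the right.
print(triangulate_point(720, 400, 680, focal_px=800, baseline_m=0.12, cx=640, cy=360))
```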

‘We developed the beMerlin system back in 2007,’ recalls Dr Gaëtan Marti, CEO of Atracsys. ‘It is real 3D interaction, but the biggest difficulty was to explain to people how to use it. Those who have tested Microsoft’s Kinect have a good idea how to interact with the system by moving their hands to manipulate virtual objects. Kinect was maybe the first 3D interaction system available to the mass market; before this though it was very difficult to educate customers in operating the system. We had actors inviting people to use the system and video tutorials to explain how it worked.’

Other systems available from Atracsys include its atracTable, a pool table-sized touch-screen table on which several users can interact, and the ICU, which responds to head movements and other gestures. The ICU is designed for retail environments, such as shop windows where the shopper can browse the store in a virtual world, trying on glasses, for instance, via the interface. The atracTable is more for advertising and product presentations.

All three systems incorporate industrial GigE Vision or FireWire cameras from Sony, cameras that were selected, according to Marti, due to the quality of the image, their longevity in terms of interface, and their ability to cope with relatively harsh imaging environments. An ICU device positioned in a shop window would have to deal with huge variations in light intensity, for example, while cameras within the atracTable have to be able to handle the heat generated within the system. ‘Sony’s industrial cameras have the necessary performance regarding the shuttering, speed and resolution, which standard webcams or other lower-cost models don’t provide,’ Marti comments.

Marti adds that industrial cameras have come down in price to a point where they can now be integrated into these sorts of interactive systems, while still providing the necessary performance.

Tracking finger points

A device like the atracTable lends itself very well to a group of people using the table together at the same time, which is one of the big benefits of large, interactive displays. UK company Pufferfish manufactures spherical displays, whose shape makes them ideal for a number of people standing around the device and interacting with it simultaneously. Pufferfish, like Atracsys, bases its touch-screen capability on camera technology rather than on the embedded capacitive mesh used in smaller tablet and mobile touch-screen devices, an approach that, according to Will Cavendish, technical director at Pufferfish, is vital for providing this multi-user, multi-touch functionality.

The Muse Resistance Tour, Wembley concert, 2010, recorded with 360° cameras positioned at different points on the stage. Credit: Mativision and Brontone (Muse management)

A tablet computer typically employs an embedded capacitive mesh, an invisible grid of wires that registers the user’s touch and tells the system exactly where that touch point falls on the grid. For one thing, says Cavendish, a mesh system simply isn’t applicable to a non-planar surface, which is why Pufferfish’s displays use an infrared camera tracking system instead. The added bonus of using imaging, though, is that the system can register an effectively unlimited number of touches, something that would be limited with an integrated capacitive mesh.

‘That shared experience when interacting with the sphere is important,’ states Cavendish. ‘So being able to have an infinite number of touches and forms of interaction with our display is critical, and that is enabled by the camera technology.’ As camera resolution increases and tracking capabilities improve, the systems will be able to handle even finer point touch.

‘Everyone nowadays has some sort of touch-screen mobile device and people’s expectations of the multi-touch user interface are defined by the likes of Apple,’ says Cavendish. ‘Trying to emulate that technology, which is for small-scale devices, and applying that to a large non-planar display is very challenging.’

Pufferfish’s displays, which incorporate cameras from Allied Vision Technologies (AVT) to track an individual’s finger points, have been used for a range of applications, from interactive displays on trade show booths to educational devices in museums. The Escher museum in The Hague, for example, has installed the company’s PufferSphere M (the product comes in 60cm and 90cm screen versions), both as a self-contained application, in which visitors interact with Escher-based imagery on the screen, and as a giant trackball for controlling content projected onto the surrounding walls. Pufferfish has also recently announced a partnership with the systems control company Crestron to develop the sphere as a command portal for buildings, controlling lighting, the intercom, blinds, the internet and so on through the display. Pufferfish also supplies its PufferSphere XL series of larger, 2m inflatable screens.

The spheres work by flooding the inside of the display with near-infrared (NIR) light, which leaks out through the screen. Any radiation reflected from a person touching the screen is detected by an AVT Guppy Pro camera with a 180° fisheye lens and infrared filter, positioned next to the projector and IR array. A PC collates the coordinate data, translating it from the fisheye image into traditional Cartesian coordinates. The data is passed to separate applications, whose output is then mapped back onto the sphere.
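Pufferfish’s calibration details aren’t given, but the fisheye-to-Cartesian translation can be illustrated with a minimal sketch that assumes an ideal equidistant (f-theta) lens model, in which the angle from the optical axis grows linearly with distance from the image centre; the image dimensions and pixel values below are placeholders, not Pufferfish parameters.

```python
import numpy as np

def fisheye_to_sphere(u, v, cx, cy, r_max):
    """Map a touch blob found at pixel (u, v) in a 180-degree equidistant
    fisheye image to Cartesian coordinates on a unit sphere.

    cx, cy : pixel coordinates of the image centre (the optical axis)
    r_max  : pixel radius corresponding to the 90-degree edge of the view
    """
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    theta = (r / r_max) * (np.pi / 2)   # polar angle from the optical axis
    phi = np.arctan2(dy, dx)            # azimuth around the optical axis
    # Cartesian point on the unit sphere, with the optical axis along +z
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# Illustrative call: a touch detected at pixel (820, 512) in a 1024 x 1024 frame
print(fisheye_to_sphere(820, 512, cx=512, cy=512, r_max=512))
```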

The camera footprint had to be particularly small, according to Cavendish, because the mechanical space between the screen, the projector and the other components within the display is minimal. ‘We needed a small-format, industrial camera that was robust and could be relied upon for its longevity for a fixed installation,’ he notes. ‘We also needed a fast frame rate and satisfactory vision capabilities in the near infrared bandwidth, which the AVT camera provides.

‘AVT has a good reputation for product quality and longevity of service, and as we’re looking at fixed installations, that was an attractive prospect,’ Cavendish continues. ‘We also wanted performance to remain as expected at the higher end of the camera’s temperature range. Because we’re using a lot of infrared to light the sphere, there can be a lot of heat gain, especially for a fixed installation running continuously. The peak temperature might be in the upper part of the camera’s specified operational range and we needed to be assured it works correctly at these temperatures over long periods of time.’

The camera is synced with strobed IR LEDs. An image is captured with and without the strobed IR light in order to subtract ambient infrared from the final image, which gives much more accurate tracking. By removing the influence of background IR radiation, the display can be installed in a daylight-filled room and still operate correctly, Cavendish says.
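The subtraction step itself is straightforward to sketch: difference a frame captured with the IR strobe on against one captured with it off, and threshold the result so that only the strobe-lit touch reflections remain. The threshold value and the assumption of 8-bit greyscale frames below are illustrative, not Pufferfish parameters.

```python
import numpy as np

def ambient_subtract(frame_ir_on, frame_ir_off, threshold=30):
    """Isolate touch reflections by differencing a frame captured with the
    IR strobe on against one captured with it off, then thresholding.

    Both frames are single-channel uint8 arrays taken in quick succession,
    so the ambient infrared is essentially identical in the pair and
    cancels out in the difference.
    """
    on = frame_ir_on.astype(np.int16)
    off = frame_ir_off.astype(np.int16)
    diff = np.clip(on - off, 0, 255).astype(np.uint8)  # strobe-lit content only
    mask = diff > threshold                            # candidate touch pixels
    return diff, mask
```

The bright blobs that survive the threshold can then be centroided and passed to a fisheye-to-Cartesian translation of the kind sketched earlier.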

‘One of the greatest challenges with a camera-based approach [to interactive displays] is ambient lighting,’ comments Cavendish, a challenge that applies to both the Pufferfish and Atracsys systems. The devices therefore need to be positioned carefully, as bright sunlight, for instance, can blind the cameras and interfere with the touch-screen capabilities. With a spherical system, ambient lighting can be even more problematic, as it influences the display from 360°, compared with a wall display, where this window is only around 45-60°.

Pufferfish is currently developing software for hand and gesture recognition, as well as doing more with camera tracking and using computer vision to track crowds for interactive devices. Higher camera resolution and smaller formats, Cavendish notes, are two attributes that could aid Pufferfish’s design process. ‘If we’re able to move to smaller camera formats, we can look to develop multiple camera systems within multi-touch displays, which will give significantly higher resolution touch-screens,’ he comments, adding that with higher camera resolution the screens will get closer to absolute finger point touch and gesture recognition.

Concerts in 360°

Lighting was also an issue for post-production video company Mativision when developing its latest video system, and is one of the reasons why the company uses industrial cameras. The system synchronises up to 10 cameras, each with a spherical 360° field of view, which record an event or live performance from different positions throughout the show. The company has developed its own interactive video player that allows the viewer to switch between different camera scenes while audio and video remain in sync. ‘It’s a simple idea,’ says Anthony Karydis, founder of the company, ‘but took a lot of development in terms of both software and hardware to reach a point where we can position up to 10 cameras and record an event from different viewpoints.’
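Mativision hasn’t described how its player keeps the streams aligned, but the general idea of switching views against a single shared timeline can be modelled with a toy sketch; the stream objects, with their seek and frame_at methods, are stand-ins rather than any real Mativision interface.

```python
import time

class SyncedMultiCamPlayer:
    """Toy model of switching between several time-aligned recordings.

    A single master clock drives playback; the continuous audio track and
    every camera recording are indexed against the same timeline, so
    changing the active camera never disturbs audio/video sync.
    'streams' maps camera names to stand-in objects exposing seek(t) and
    frame_at(t) methods.
    """
    def __init__(self, streams):
        self.streams = streams
        self.active = next(iter(streams))   # start on the first camera
        self.start = time.monotonic()       # master clock begins at playback

    def master_time(self):
        return time.monotonic() - self.start

    def switch_to(self, camera):
        t = self.master_time()
        self.streams[camera].seek(t)        # align the new view to the master clock
        self.active = camera                # audio continues uninterrupted

    def current_frame(self):
        return self.streams[self.active].frame_at(self.master_time())
```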

Pufferfish’s displays use camera technology to track a person’s fingers touching the screen. Credit: Pufferfish  

The system has filmed the rock band Muse performing at Wembley Arena in 360° footage, as well as recording performances and music videos from Slash and Pendulum. ‘The lighting in a rock concert, say, will make imaging difficult, as there will be very dark moments followed by very bright instances, with altering colour and intensity,’ explains Karydis. Mativision uses Ladybug2 spherical cameras from Point Grey. ‘Point Grey cameras handle these illumination conditions very well; we are able to alter the imaging parameters of the camera to get a good dynamic range,’ he says.

Ladybug cameras have six sensors producing six individual images, which together give a 360° field of view. The latest version is the Ladybug3, offering six 2.0 Megapixel sensors. Karydis notes the Point Grey cameras are small and lightweight, which means they can be transported easily. They’re small enough to be positioned so that they don’t interfere with the performance itself, and they can also be placed away from the control station. ‘We are not restricted by cable length,’ he states. ‘This is one of the major factors that made Mativision a success, because it allows us to position cameras 100m away from the stage and control them remotely.

‘The cameras have proved very reliable,’ Karydis continues. ‘They have been dropped and still work, as well as operating in the rain in specially developed housings.’

The cameras can stream live images directly from the event. The post-produced content, however, undergoes processing to correct stitching where necessary, colour-correct the footage and ensure audio is synced with video, in order to obtain high-quality video.

Point Grey’s software provided with the camera stitches the six different images in real time, which works well for live streaming. Stitching errors can occur, however, such as when stitching between distant and nearby objects, or when an object straddles two sensors, which see it from different perspectives. ‘Distortions in the image can create very strange effects,’ Karydis explains. ‘A guitar neck might appear broken or bent, or a face might look split. These effects can be ironed out during post-processing, although many of the stitching issues we initially experienced have subsequently been resolved by Point Grey in firmware updates.’
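The reason nearby objects break at the seams is parallax: adjacent lenses view a close object from noticeably different angles, while a distant object looks nearly identical to both, so the stitcher cannot align near and far content at the same time. The short sketch below puts rough numbers on this, assuming an illustrative spacing of about 4cm between adjacent lenses rather than an actual Ladybug dimension.

```python
import math

def parallax_deg(baseline_m, distance_m):
    """Approximate angular shift (in degrees) of an object seen from two
    lenses separated by baseline_m, for an object distance_m away."""
    return math.degrees(math.atan2(baseline_m, distance_m))

# Illustrative ~4cm spacing between adjacent lenses (not a Ladybug spec):
# a guitar neck half a metre away shifts by several degrees between views,
# while the back of the arena barely moves at all.
for d in (0.5, 2.0, 20.0):
    print(f"object at {d:>4} m -> ~{parallax_deg(0.04, d):.2f} degrees of parallax")
```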

Karydis explains that, for post-production, the best recording settings for the camera might not be the most obvious ones. ‘The image recorded might not look that good when we record it, but it’s the best starting point to reach the best possible image after processing,’ he says.

Mativision is developing applications for the viewer to be able to use the software more creatively. ‘The system currently allows the viewer to choose what they want to watch and move around inside the image to adjust their viewing position from camera to camera,’ Karydis comments. ‘While this is interactive viewing, it’s not a creative process, i.e. the viewer only controls what they watch. We’ve developed software, which is in an advanced development stage, which will allow the viewer not only to have this interactivity but also to create their own view of the concert or rerecord sections.’


