Drone lighting gives photographers perfect picture

Computer scientists at MIT and Cornell University have developed a prototype autonomous helicopter that provides lighting for photographers, readjusting its position as the subject moves to maintain a constant lighting effect.

The researchers concentrated on producing an effect called rim lighting, in which only the edge of the photographer’s subject is strongly lit. They will present the work at the International Symposium on Computational Aesthetics in Graphics, Visualisation, and Imaging in August.

Rim lighting was chosen for the initial experiments because it’s a difficult effect to produce. ‘It’s very sensitive to the position of the light,’ said Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and is now a senior researcher at Nokia. ‘If you move the light, say, by a foot, your appearance changes dramatically.’

Based only on a specification of the rim width – the desired width, from the camera’s perspective, of the subject’s illuminated border – the helicopter not only assumes the right initial position but also readjusts in real time as the subject moves, enabling delicate rim lighting of action shots.

The purpose of the tests was to evaluate the control algorithm, explained Srikanth. Algorithms that gauge robots’ location based only on measurements from onboard sensors are a major area of research in robotics, and the new system could work with any of them.

With the system, the photographer indicates the direction from which the rim light should come, and the miniature helicopter flies to that side of the subject. The photographer then specifies the width of the rim as a percentage of its initial value, repeating that process until the desired effect is achieved.
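As a rough illustration of that workflow – not the researchers’ actual software – a setup step might look like the following Python sketch. The drone and camera interfaces here (fly_to_side, set_target_width, grab) are hypothetical stand-ins, and estimate_rim_width is the rim-measurement sketch given further below.

```python
def setup_rim_light(drone, camera, direction_deg, width_fraction):
    """Hypothetical setup mirroring the workflow above: send the drone
    to the chosen side of the subject, measure the rim width in the
    first frame, then request that width scaled by the photographer's
    chosen percentage."""
    drone.fly_to_side(direction_deg)                  # e.g. 135 for back-left rim light
    initial = estimate_rim_width(camera.grab())       # rim width in the first frame, in pixels
    drone.set_target_width(width_fraction * initial)  # e.g. 0.6 requests a 40% thinner rim
```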

Thereafter, the robot automatically maintains the specified rim width. ‘If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he’s looking 90 degrees away from you, then he’s exposing his chest to the light, which means that you’ll see a much thicker rim light,’ Srikanth explained. ‘So in order to compensate for the change in the body, the light has to change its position quite dramatically.’

In the same way, the system can compensate for the photographer’s movements. In both cases, the camera itself supplies the control signal. Roughly 20 times a second, the camera produces an image that is not stored on its own memory card but transmitted to a computer running the researchers’ control algorithm. The algorithm evaluates the rim width and adjusts the robot’s position accordingly.
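A minimal sketch of that closed loop, assuming a grab_frame() function that returns the latest streamed image and a move_light(step) command that shifts the UAV around the subject (both hypothetical interfaces, not the researchers’ code), could look like this; estimate_rim_width is sketched after the next paragraph:

```python
import time

TARGET_HZ = 20  # the camera streams roughly 20 frames per second

def lighting_control_loop(grab_frame, move_light, target_width,
                          gain=0.05, duration_s=30.0):
    """Hypothetical closed loop: the measured rim width is the feedback
    signal, and a simple proportional step corrects towards the target.
    The real controller must also handle the UAV's flight dynamics."""
    t_end = time.time() + duration_s
    while time.time() < t_end:
        frame = grab_frame()               # image goes to the PC, not the memory card
        width = estimate_rim_width(frame)  # current rim width, in pixels
        if width is not None:
            error = width - target_width   # rim too thick -> positive error
            move_light(-gain * error)      # nudge the drone to correct the rim
        time.sleep(1.0 / TARGET_HZ)        # hold roughly the 20 Hz camera rate
```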

‘The challenge was the manipulation of the very difficult dynamics of the UAV [unmanned aerial vehicle] and the feedback from the lighting estimation,’ said Frédo Durand, MIT professor of computer science and engineering and co-author of the paper. ‘That’s where we put a lot of our efforts, to make sure that the control of the drone could work at the very high speed that’s needed just to keep the thing flying and deal with the information from the lidar [the UAV’s laser rangefinder] and the rim-lighting estimation.’

The algorithm looks for the most dramatic gradations in light intensity across the whole image and measures their width. With a rim-lit subject, most of those measurements will congregate around the same value, which the algorithm takes to be the width of the rim.
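The paper does not spell out the implementation, but one plausible reading of that description is the following sketch: scan each image row for a sharp rise in intensity followed by a sharp fall, treat the distance between them as the width of a lit band, and take the mode of all such widths as the rim width. All thresholds here are illustrative assumptions.

```python
import numpy as np

def estimate_rim_width(image, grad_thresh=0.25, bright_thresh=0.6):
    """Sketch of rim-width estimation (an assumed reading of the
    description above, not the authors' exact method). `image` is a
    2-D greyscale array with values in [0, 1]."""
    widths = []
    for row in image:
        grad = np.diff(row)                         # intensity change between neighbours
        rising = np.where(grad > grad_thresh)[0]    # sharp dark-to-bright transitions
        falling = np.where(grad < -grad_thresh)[0]  # sharp bright-to-dark transitions
        for r in rising:
            later = falling[falling > r]
            if later.size == 0:
                continue
            f = later[0]                            # first falling edge after the rise
            band = row[r + 1:f + 1]
            if band.size and band.mean() > bright_thresh:
                widths.append(int(f - r))           # width of the lit band, in pixels
    if not widths:
        return None
    # Most measurements cluster around the true rim width: take the mode.
    values, counts = np.unique(widths, return_counts=True)
    return int(values[np.argmax(counts)])

# Synthetic check: a dark frame with a 6-pixel bright vertical band.
img = np.zeros((100, 100))
img[:, 40:46] = 1.0
print(estimate_rim_width(img))  # -> 6
```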

In experiments, this quick approximation was able to keep up with the motions of both the subject and the photographer while maintaining a consistent rim width.

The researchers tested their prototype in a motion-capture studio, which uses a bank of high-speed cameras to measure the position of specially designed light-reflecting tags with millimetre accuracy; several such tags were affixed to the helicopter.

Further information: Massachusetts Institute of Technology
