MIT and Cornell University researchers are exploring how to use unmanned aerial vehicles to help provide optimal lighting

A team of researchers from MIT and Cornell University has developed a prototype system that uses small aerial robots with built-in lights to help photographers create the perfect lighting for their work.

The aerial robots move around to create lighting effects that are specified through a camera-mounted interface.

In the team’s prototype system, the autonomous aerial robots position themselves around the subject to create a lighting effect called “rim lighting,” in which the subject of the photograph is strongly lit only around the edges. The team chose rim lighting as its first test case precisely because it is demanding. According to team member Manohar Srikanth, the effect is difficult to achieve because the position of the light is critical: even a slight movement of the light can change the entire appearance of the subject.

In the prototype system, the photographer indicates the direction the rim lighting should come from, and the aerial robot flies to that position so it can light the subject from that direction. The photographer then specifies the desired rim width as a percentage of its initial value, adjusting it until satisfied with the resulting effect. The robot then holds that rim width to maintain the effect it creates.
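
To make the interaction concrete, here is a minimal sketch in Python of the kind of specification the camera-mounted interface might produce: a light direction plus a rim width expressed as a percentage of its initially measured value. The names and structure are hypothetical and do not reflect the team’s actual code; the sketch only illustrates the two parameters described above.

```python
from dataclasses import dataclass

@dataclass
class RimLightSpec:
    """Hypothetical photographer-facing parameters, as described above."""
    light_azimuth_deg: float   # direction the rim light should come from, relative to the camera
    rim_width_pct: float       # desired rim width, as a percentage of the initially measured width

    def target_width_px(self, initial_width_px: float) -> float:
        """Convert the photographer's percentage into a pixel target for the controller."""
        return initial_width_px * self.rim_width_pct / 100.0
```

In this sketch, a specification like RimLightSpec(light_azimuth_deg=120.0, rim_width_pct=80.0) would ask the system to thin the rim to 80 percent of its initially measured width.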

The photographer’s camera is connected to the system and sends images to a computer running the researchers’ algorithm, which processes each image and sends commands to the drone so it can adjust its position and maintain the specified rim width.
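
As a rough illustration of that feedback loop, the sketch below reads frames, estimates the rim width, and nudges the drone until the measured width matches the target. The camera and drone interfaces, the gain, the update rate, and the sign of the correction are all assumptions made for illustration, not details from the study.

```python
import time

def lighting_control_loop(camera, drone, estimate_rim_width, target_width_px,
                          gain=0.01, period_s=0.05):
    """Hypothetical closed-loop sketch of the camera -> computer -> drone pipeline."""
    while True:
        frame = camera.read_frame()              # image streamed from the photographer's camera
        measured = estimate_rim_width(frame)     # current rim width in pixels (see sketch below)
        error = target_width_px - measured
        # Nudge the drone's position in proportion to the error; the direction and
        # magnitude of the correction are placeholders for illustration only.
        drone.adjust_position(radial_step_m=gain * error)
        time.sleep(period_s)                     # pace the loop to the camera's frame rate
```

The point the researchers emphasize is speed: a loop like this is only as responsive as the rim-width estimate it relies on.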

According to the researchers, the challenge was making the system fast enough while also handling the difficulty of controlling the drone and keeping the rim-width estimate up to date. Their algorithm tracks the strongest gradations in light intensity across the image and measures their width. In a rim-lit subject, these measurements cluster around a common value, which the algorithm takes as the width of the rim. This approximation lets the system and the drone keep up with the photographer and hold the specified rim width, maintaining the rim-lighting effect.
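
The paragraph above describes the estimator only at a high level, so the following is a loose interpretation rather than the authors’ published method: it normalizes a grayscale frame, scans each row for contiguous runs of bright pixels (the strongly lit rim), and takes the median run length as the rim width. The brightness threshold and the row-wise scan are assumptions.

```python
import numpy as np

def estimate_rim_width(gray_image: np.ndarray, bright_thresh: float = 0.8) -> float:
    """Loose sketch of the rim-width estimate described above (not the authors' code)."""
    img = gray_image.astype(np.float64)
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)  # normalize to [0, 1]
    widths = []
    for row in img:
        bright = row > bright_thresh                # pixels in strongly lit regions
        # Measure the length of each contiguous bright run in this scanline.
        steps = np.diff(bright.astype(np.int8))
        starts = np.flatnonzero(steps == 1) + 1
        ends = np.flatnonzero(steps == -1) + 1
        if bright[0]:
            starts = np.r_[0, starts]
        if bright[-1]:
            ends = np.r_[ends, bright.size]
        widths.extend(ends - starts)
    if not widths:
        return 0.0
    # In a rim-lit image most runs cluster around one value; the median is a
    # cheap, robust stand-in for that common width.
    return float(np.median(widths))
```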

The researchers tested their prototype system in a motion-capture studio to make sure the algorithm works.

The research team is composed of Srikanth, who did his graduate and postdoctoral research at MIT and is now a senior researcher at Nokia; Frédo Durand, an MIT professor of computer science and engineering; and Kavita Bala, who did her PhD at MIT and is currently at Cornell University. They will present their study, titled Computational Rim Illumination with Aerial Robots, at the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging, to be held in Vancouver, Canada, from August 8 to 10.

The project shows how drones can take on tasks that are usually performed by humans. More information and documentation on the study can be found at this link.

Source: MIT News

Images: Manohar Srikanth
