US researchers develop 50-gigapixel camera

June 21, 2012 // By Nick Flaherty
By synchronizing 98 tiny cameras in a single device, electrical engineers from Duke University and the University of Arizona have developed a prototype 50-gigapixel camera.

The new camera can capture 50 gigapixels of data (50,000 megapixels) with resolution five times better than 20/20 human vision over a 120-degree horizontal field of view.
The researchers believe that within five years, as the electronic components of the cameras become miniaturized and more efficient, the next generation of gigapixel cameras should be available to the general public.
"Each one of the microcameras captures information from a specific area of the field of view," said David Brady, Michael J. Fitzpatrick Professor of Electric Engineering at Duke's Pratt School of Engineering. "A computer processor essentially stitches all this information into a single highly detailed image. In many instances, the camera can capture images of things that photographers cannot see themselves but can then detect when the image is viewed later. The development of high-performance and low-cost microcamera optics and components has been the main challenge in our efforts to develop gigapixel cameras. While novel multiscale lens designs are essential, the primary barrier to ubiquitous high-pixel imaging turns out to be lower power and more compact integrated circuits, not the optics."
The software that combines the input from the microcameras was developed by an Arizona team led by Michael Gehm, assistant professor of electrical and computer engineering at the University of Arizona, with support from US defense agency DARPA.
"Traditionally, one way of making better optics has been to add more glass elements, which increases complexity," said Gehm. "This isn't a problem just for imaging experts. Supercomputers face the same problem, with their ever more complicated processors, but at some point the complexity just saturates, and becomes cost-prohibitive. Our current approach, instead of making increasingly complex optics, is to come up with a massively parallel array of electronic elements. A shared objective lens gathers light and routes it to the microcameras that surround it, just like a network computer hands out pieces to the individual work stations. Each gets