Animated characters brought to life by 3-D printing
August 03, 2012 // R Colin Johnson
The virtual world is being brought to life by reverse engineering the rendering operation that draws on-screen characters in video games and other software animations. At next week's Siggraph 2012 show in Los Angeles, Harvard University researchers will describe a patent-pending algorithm that uses three-dimensional printers to create personalized action figures from animations.
Software animations create both realistic and fanciful characters, but their makeup and capabilities need not match what is possible in the real world. Harvard's software, however, translates the primary characteristics of an on-screen character into articulated components that together form a figurine a 3-D printer can produce.
By observing the on-screen appearance and actions performed by the character, the Harvard algorithm determines the ideal locations for the character's joints—either ball-and-socket or hinged—then optimizes their size and placement using the physics of the real 3-D world. Once the reverse rendering operation is complete, a detailed file is sent to a 3-D printer, which creates a completely assembled version of the action figure.
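The joint-selection step described above can be illustrated with a toy sketch. Everything here—the function names, the clustering test for hinge versus ball-and-socket joints, and the printability thresholds—is an illustrative assumption, not the Harvard researchers' actual algorithm:

```python
# Illustrative sketch only: classify a joint from the relative-rotation
# axes observed between two linked parts across animation frames, then
# size the ball so the printed socket wall stays thick enough to survive.
# Thresholds and the algorithm itself are assumptions for illustration.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify_joint(rotation_axes, hinge_threshold=0.95):
    """If every observed rotation axis clusters around one direction
    (all nearly parallel or antiparallel to the first), a simple hinge
    suffices; otherwise a ball-and-socket joint is needed."""
    ref = rotation_axes[0]  # assumes unit-length axes, at least one frame
    if all(abs(dot(ref, axis)) >= hinge_threshold for axis in rotation_axes):
        return "hinge"
    return "ball-and-socket"

def printable_radius(limb_radius, min_wall=0.8, socket_ratio=0.6):
    """Pick a ball radius (same units as limb_radius, e.g. mm) that
    leaves at least min_wall of socket material—an assumed printer
    limit—around the joint."""
    ball = limb_radius * socket_ratio
    if limb_radius - ball < min_wall:
        ball = limb_radius - min_wall
    return max(ball, 0.0)
```

For example, an elbow whose frame-to-frame rotation axes all point roughly the same way would classify as a hinge, while a shoulder rotating about varying axes would get a ball-and-socket joint; the real system additionally optimizes all joints jointly against the observed motion.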
The Harvard researchers expect their invention to be useful not only for personalized action figures for consumers, but also for professional animators, who today build mannequins by hand. With the reverse rendering algorithm, animators will be able to quickly create action figures that they can use to experiment with different stances and motions in real-world re-creations of virtual worlds.
Harvard's Office of Technology Development has filed a patent application, which it aims to license to a cloud-based service that will use 3-D printers to create customized, user-generated figurines from existing animation software.
Funding for the project was supplied by the National Science Foundation, Pixar, and the John Simon Guggenheim Memorial Foundation.