The human-machine interactive experience has been up for a rethink at the Cambridge, MA-based Fluid Interfaces group at MIT's Media Lab. The group champions enhanced interactions and systems that respond more closely to people's actions. One such project is the "Flying Pantograph." The simplest description is a flying drawing machine, but that hardly scratches the surface.
In the team's own words, the project "transposes human-scale drawing acts to a physically remote output canvas in different scales and aesthetics. A drone becomes an 'expression agent' - modified to carry a pen and be controlled by human motions, then carries out the actual process of drawing on a vertical wall."
Yes, they are using a drone to draw. Writing in BGR, Yoni Heisler said the MIT students "affixed a camera to a drone capable of capturing, in real time, the pen strokes of a user. That information is then used by the drone itself to effectively copy everything the user is drawing."
Beyond that, something special is also going on. The authors said, "we engage audience with a drone-based drawing system that applies a person's pen drawing at different scales in different styles. The unrestricted and programmable motion of the proxy can institute various artistic distortions in real-time, creating a new dynamic medium of creative expression."
Note their choice of the word "pantograph." The project's name, Fast Company noted, comes from a device first created in the 1600s: "Pantographs use mechanical linkages to duplicate (or scale up) a drawing, mimicking the motions of a human artist to copy their lines on a different piece of paper."
(One discussion of an old pantograph notes how by moving the pointer over a diagram, a copy of the diagram was drawn on another piece of paper.)
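The core of what a pantograph does, mechanically, is a geometric scaling of traced points. A minimal sketch of that idea in code, assuming an illustrative scale factor and pivot point (the function name and parameters are this sketch's own, not the MIT team's):

```python
def scale_stroke(points, scale=2.0, pivot=(0.0, 0.0)):
    """Scale a list of (x, y) pen positions about a pivot point,
    mimicking how a pantograph's linkage enlarges a traced line."""
    px, py = pivot
    return [(px + (x - px) * scale, py + (y - py) * scale)
            for x, y in points]

# A short diagonal stroke doubled in size about the origin:
stroke = [(0.0, 0.0), (1.0, 1.0), (2.0, 1.5)]
print(scale_stroke(stroke))  # → [(0.0, 0.0), (2.0, 2.0), (4.0, 3.0)]
```

A scale below 1.0 would shrink the drawing instead, which is the same duplicate-at-a-different-scale behaviour the 17th-century device offered.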
"The seemingly straightforward technical realisation is in fact a combination of non-trivial mechanical and algorithmic solutions," said the authors, Sang-won Leigh, Harshit Agrawal and Pattie Maes in their paper, "A Flying Pantograph: Interleaving Expressivity of Human and Machine," which was presented last month in Eindhoven, Netherlands.
They said in their paper that they used the OptiTrack motion capture system to allow for fine-grained tracking of hand drawing motions. The desk and the canvas are also tracked by the system so that their software can automatically adapt to different spatial configurations of the installation.
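The adaptation step described above amounts to mapping a pen position tracked in the desk's coordinate frame into a target position on the wall canvas. A hedged sketch, assuming simple rectangular frames; the function and its parameters are illustrative, not the paper's actual OptiTrack pipeline:

```python
def desk_to_canvas(pen_xy, desk_origin, desk_size, canvas_origin, canvas_size):
    """Normalize a tracked pen position within the desk rectangle,
    then map it into the canvas rectangle, so the software can adapt
    to whatever spatial configuration the tracking system reports."""
    u = (pen_xy[0] - desk_origin[0]) / desk_size[0]
    v = (pen_xy[1] - desk_origin[1]) / desk_size[1]
    return (canvas_origin[0] + u * canvas_size[0],
            canvas_origin[1] + v * canvas_size[1])

# A pen at the centre of a 0.6 m x 0.4 m desk maps to the centre
# of a 3 m x 2 m wall canvas:
print(desk_to_canvas((0.3, 0.2), (0.0, 0.0), (0.6, 0.4), (0.0, 0.0), (3.0, 2.0)))
# → (1.5, 1.0)
```

Because both rectangles are parameters, re-measuring the desk and canvas when the installation moves is enough to retarget the drone, which matches the automatic adaptation the authors describe.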
"Through such exploration," said the team, "we seek to understand how a person meanders through a space of expressive intentions, when a machine-driven 'pantograph' adds its own artistic intentions to it. We aim at creating deeper levels of engagement, from both human and machine ends, so as to enable us to internalise a more synergistic art form."
John Brownlee in Fast Company wrote more about where this concept could go, suggesting it has "many more applications than just street art."
Brownlee said, "Using more than one Flying Pantograph, multiple artists could collaborate on a piece in real time without even being in the same room.
With some tweaking, the Flying Pantograph could even have applications for the disabled, too: For example, someone who doesn't have the use of their arms could control a Flying Pantograph to write something on a whiteboard using eye-tracking."