eDavid (robot)

| Manufacturer | University of Konstanz |
| --- | --- |
| Year of creation | 2009 |
| Type | Articulated robot |
| Purpose | Paint |
| Website | https://graphics.uni-konstanz.de/eDavid/ |
e-David is a painting robot developed at the Universität Konstanz (University of Konstanz) that calculates brushstrokes from an input image and paints the image on a canvas. The project began in 2009 with a one-armed welding robot modified to paint. The robot's arm can switch between different brushes and pens and is equipped with a distance sensor that measures exactly how far the arm is from the canvas.[1] e-David features two main painting methods: predefined stroke candidates and dynamically generated strokes.
Development
e-David was developed by Prof. Dr. Oliver Deussen, Thomas Lindemeier, Mark Tautzenberger, and Dr. Sören Pirk of the Department of Computer and Information Science. The goal of the project was to build a robot that mimics the manual painting process: painting in iterations until the desired image is achieved. This differs from earlier painting robots because instead of computing the perfect set of strokes and painting once, the robot paints in iterative steps, moving the optimization process from the computer to the canvas. e-David implements features from earlier works, including AARON, a program developed by Harold Cohen to paint abstract art. Unlike AARON, however, e-David focuses on representing an input image accurately rather than generating abstract art.[2]
More recent developments include the ability of e-David to reproduce single strokes or writing from a reference painted by a human. The robot is also able to iteratively improve on its reproduction attempts: because the brush is a flexible tool, it introduces a certain positioning error in the brushstroke on the canvas. e-David now compares the difference between the stroke it paints and the prototype stroke and computes an updated brush trajectory that minimizes the error. Results are stored, and the machine builds a database of movement trajectories and resulting brushstrokes.[3]
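The correction step can be pictured as a simple feedback update on the commanded brush path. The following Python sketch is only an illustration of that idea, not the project's controller: the function name, the fixed gain of 0.5, and the assumption that the commanded path, the painted stroke, and the reference stroke are sampled at corresponding points are all hypothetical.

```python
import numpy as np

def correct_trajectory(trajectory, painted, prototype, gain=0.5):
    """One illustrative correction step for brushstroke replication.

    `trajectory`, `painted` and `prototype` are (N, 2) arrays of sampled
    points: the commanded brush path, the stroke observed on the canvas,
    and the human-painted reference. The positional error between the
    observed and reference strokes is fed back into the next command;
    the gain is a made-up tuning parameter, not a value from e-David.
    """
    error = prototype - painted        # where the painted stroke landed wrong
    return trajectory + gain * error   # nudge the next attempt toward the reference

if __name__ == "__main__":
    # toy data: a reference curve, a commanded path, and a noisy painted result
    t = np.linspace(0, 3, 20)
    prototype = np.column_stack([np.linspace(0, 10, 20), np.sin(t)])
    trajectory = prototype.copy()
    painted = trajectory + np.random.default_rng(1).normal(0, 0.1, trajectory.shape)
    updated = correct_trajectory(trajectory, painted, prototype)
    print("mean adjustment per point:", np.abs(updated - trajectory).mean())
```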
Painting process
e-David takes an input photograph and generates a set of brushstrokes which vary in length and width. The algorithm implements a visual feedback loop.[4] The loop follows these steps:
- paints a set of computer-generated strokes
- checks whether the painted strokes accurately represent the input image
- generates the next set of strokes to more accurately match the input image
This process mimics the human painting process: just as a painter lightens and darkens a painting until satisfied rather than producing a perfect picture in one attempt, the robot refines the canvas pass by pass. It also allows the robot to autonomously correct mistakes caused by dripping paint, deformed brushes, and similar defects, since such a defect shows up when the input image is compared with the current canvas state.
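A toy version of this loop can be written in a few lines. The Python sketch below only simulates the idea on a grayscale array with values in [0, 1]: the pass count, the 0.01 stopping threshold, and the way "strokes" are placed are assumptions made for illustration, not e-David's actual procedure; only the figure of 300 strokes per pass is taken from the description of predefined strokes below.

```python
import numpy as np

def paint_feedback_loop(target, max_passes=30, strokes_per_pass=300):
    """Minimal sketch of a visual feedback loop (not e-David's actual code).

    `target` is a grayscale image with values in [0, 1]; the canvas starts
    white (1.0). Each pass darkens the pixels that still differ most from
    the target, standing in for painting a batch of strokes and then
    photographing the canvas to compare it with the input image.
    """
    canvas = np.ones_like(target, dtype=float)
    for _ in range(max_passes):
        error = canvas - target                  # positive where canvas is too light
        if np.abs(error).mean() < 0.01:          # close enough: stop iterating
            break
        # pick the worst-matching locations (stand-in for stroke placement)
        worst = np.argsort(error, axis=None)[::-1][:strokes_per_pass]
        ys, xs = np.unravel_index(worst, target.shape)
        canvas[ys, xs] = target[ys, xs]          # "paint" those spots
    return canvas

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random((64, 64))                # toy stand-in for an input photo
    result = paint_feedback_loop(target)
    print("mean remaining error:", np.abs(result - target).mean())
```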
Stroke generation methods
e-David utilizes two different stroke generation methods: predefined strokes and dynamically generated strokes.
Predefined strokes
This method chooses new strokes from a set of 3 different lengths, 3 different widths, and 60 different directions, resulting in 540 possible strokes. At each location in the image all 540 stroke candidates are "tried out", and e-David chooses the stroke that matches the image at that point with the highest quality. During each iteration of the feedback loop, e-David paints 300 predefined strokes. The resulting brushstrokes are short in comparison to dynamically generated strokes.[5]
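The exhaustive candidate search can be sketched as a brute-force loop over all 540 combinations. In the snippet below only the 3 × 3 × 60 candidate count comes from the description above; the concrete length and width values, the grayscale assumption, and the scoring function are illustrative stand-ins for whatever e-David actually uses.

```python
import numpy as np
from itertools import product

# Hypothetical candidate set: 3 lengths x 3 widths x 60 directions = 540 strokes.
LENGTHS = (10, 20, 30)                        # in pixels (made-up values)
WIDTHS = (2, 4, 6)
ANGLES = np.linspace(0, np.pi, 60, endpoint=False)

def best_candidate(target, canvas, y, x):
    """Try all 540 predefined strokes at (y, x) and keep the best-scoring one.

    The score is a toy stand-in: how much painting the target's local grey
    value along the stroke's centre line would reduce the difference between
    canvas and target, weighted by stroke width.
    """
    h, w = target.shape
    best_score, best_stroke = -np.inf, None
    paint_value = target[y, x]                # colour the stroke would deposit
    for length, width, angle in product(LENGTHS, WIDTHS, ANGLES):
        dy, dx = np.sin(angle), np.cos(angle)
        ts = np.arange(length)
        ys = np.clip((y + ts * dy).astype(int), 0, h - 1)
        xs = np.clip((x + ts * dx).astype(int), 0, w - 1)
        before = np.abs(canvas[ys, xs] - target[ys, xs]).mean()
        after = np.abs(paint_value - target[ys, xs]).mean()
        score = (before - after) * width       # wider strokes cover more canvas
        if score > best_score:
            best_score, best_stroke = score, (length, width, angle)
    return best_stroke, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random((64, 64))
    canvas = np.ones((64, 64))                 # blank white canvas
    stroke, score = best_candidate(target, canvas, 32, 32)
    print("best (length, width, angle):", stroke, "score:", round(float(score), 3))
```

A real implementation would also have to render each candidate with a brush model before scoring it; the centre-line comparison here is only meant to show the "try every candidate, keep the best" structure.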
Dynamically generated strokes
This method uses the image gradient to create longer brushstroke paths rather than the many shorter strokes used in the predefined strokes method. The image gradient describes how the color or intensity of an image changes at a particular location. For example, if the image transitions from a lighter color to a darker color at a particular pixel, the gradient at that location indicates in which direction the image is brightening or darkening and how quickly.[6] e-David generates a collection of brushstrokes perpendicular to the gradient's direction, resulting in lines along paths of similar color intensity.[5]
e-David takes all of these dynamically generated strokes and chooses the paths that do not overlap. The chosen brushstrokes, usually longer and smoother than predefined strokes, result in dark and expressive paintings.
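The core of this method, stepping along the direction orthogonal to the gradient, fits into a short path-tracing routine. The sketch below uses NumPy's np.gradient as a stand-in for whatever gradient operator e-David uses; the unit step size, the path-length limit, and the flat-region termination rule are assumptions.

```python
import numpy as np

def trace_isophote_stroke(image, y, x, max_len=50):
    """Trace a stroke path perpendicular to the image gradient.

    Following the direction orthogonal to the gradient keeps the path on a
    line of roughly constant intensity, which is what yields the longer,
    smoother strokes described above. All parameters are illustrative.
    """
    gy, gx = np.gradient(image.astype(float))
    h, w = image.shape
    path = [(y, x)]
    py, px = float(y), float(x)
    for _ in range(max_len):
        iy, ix = int(round(py)), int(round(px))
        g = np.array([gy[iy, ix], gx[iy, ix]])
        norm = np.linalg.norm(g)
        if norm < 1e-6:                        # flat region: no clear direction
            break
        step = np.array([-g[1], g[0]]) / norm  # rotate the gradient by 90 degrees
        py, px = py + step[0], px + step[1]
        if not (0 <= py < h - 1 and 0 <= px < w - 1):
            break                              # left the canvas
        path.append((int(round(py)), int(round(px))))
    return path

if __name__ == "__main__":
    y, x = np.mgrid[0:64, 0:64]
    image = np.sin(x / 10.0)                   # vertical bands of constant intensity
    path = trace_isophote_stroke(image, 32, 32)
    print("traced", len(path), "points along an iso-intensity line")
```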
Comparison of styles
Predefined strokes result in very clean paintings; however, the shorter preset brushstrokes appear more robotic than those produced by dynamically generated strokes. One way to improve upon this painting style is to include longer and more varied brushstrokes, but this could slow down the process because of the larger number of predefined brushstrokes to try out.
With dynamically generated strokes, areas of the painting with high detail are often too dark. However, dynamically generated strokes look more complex and artistic than predefined strokes because of their unique brushstrokes. One way to improve dynamically generated strokes is to prevent the creation of dark spots by filtering out fine lines in high-detail areas.[5]
Public appearances
The artist Liat Grayver uses e-David to create contemporary art, which has been displayed at several venues in Germany and Israel. A mobile demonstrator of the e-David system has been exhibited in Leipzig and Zurich.
References
- ^ Falconer, Jason (18 July 2013). "eDavid the robot painter excels in numerous styles". Gizmag. Retrieved 11 November 2014.
- ^ Anderson, Mark K. (12 May 2001). "'Aaron': Art From the Machine". Wired. Retrieved 11 November 2014.
- ^ Gülzow, Jörg Marvin; Liat Grayver; Oliver Deussen (21 November 2018). "Self-Improving Robotic Brushstroke Replication". Arts. 7 (4): 84. doi:10.3390/arts7040084.
- ^ Owano, Nancy. "Calculating art: Meet e-David, the painting machine". Phys. Retrieved 11 November 2014.
- ^ a b c Deussen, Oliver; Lindemeier, Thomas; Pirk, Sören; Tautzenberger, Mark. "Feedback-guided Stroke Placement for a Painting Machine" (PDF). University of Konstanz. Retrieved 11 November 2014.
- ^ Jacobs, David. "Image Gradients" (PDF). umd.edu. Retrieved 11 November 2014.