
Is this the future of creativity? AKQA and Google team up to create machine learning art

At Semi Permanent Sydney last week, one of the installations that drew a crowd of curious creatives was digital agency AKQA and Google’s Somesthetic Transfer display.

The AI system created by the two companies uses machine learning to take the style and texture of a pre-existing, human-made artwork and apply those textures and depth to another image. The result is then 3D printed in many layers on a high-definition UV printer to achieve the right texture and depth.
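That description matches the broad shape of neural style transfer, in which a pretrained convolutional network supplies “style” statistics from one image and “content” features from another. The sketch below is a minimal, generic illustration of that technique in PyTorch, not AKQA and Google’s actual pipeline; the model choice (VGG19), layer indices, loss weights and file names (painting.jpg, photo.jpg) are all assumptions made for the example.

```python
# Minimal neural style transfer sketch (after Gatys et al.), for illustration only.
# Assumptions: PyTorch + torchvision installed; "painting.jpg" stands in for the
# human-made artwork and "photo.jpg" for the photograph described above.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

def load_image(path, size=512):
    tf = transforms.Compose([
        transforms.Resize((size, size)),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])
    return tf(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

# A pretrained VGG19 acts as a fixed feature extractor.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}   # early/mid conv layers: texture and "style"
CONTENT_LAYER = 21                  # deeper conv layer: scene structure

def features(x):
    style_feats, content_feat = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            style_feats.append(x)
        if i == CONTENT_LAYER:
            content_feat = x
    return style_feats, content_feat

def gram(feat):
    # Gram matrix: summarises the texture statistics of a feature map.
    _, c, h, w = feat.shape
    f = feat.view(c, h * w)
    return f @ f.t() / (c * h * w)

style_img = load_image("painting.jpg")    # hypothetical reference artwork
content_img = load_image("photo.jpg")     # hypothetical input photograph

style_targets = [gram(f).detach() for f in features(style_img)[0]]
content_target = features(content_img)[1].detach()

# Start from the photograph and optimise its pixels directly.
output = content_img.clone().requires_grad_(True)
opt = torch.optim.Adam([output], lr=0.02)

for step in range(300):
    opt.zero_grad()
    style_feats, content_feat = features(output)
    style_loss = sum(F.mse_loss(gram(f), t) for f, t in zip(style_feats, style_targets))
    content_loss = F.mse_loss(content_feat, content_target)
    (1e6 * style_loss + content_loss).backward()
    opt.step()
# "output" now holds the stylised image (still ImageNet-normalised); undo the
# normalisation before saving it or passing it on to a printing step.
```

The physical side of Somesthetic Transfer, estimating depth so the image can be built up in many printed layers, is not captured in this sketch.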

There’s no shortage of robots making moves in the creative sector: IBM Watson helped produce a song that reached the Billboard charts, Molly is an artificially intelligent system that makes people’s digital dreams a reality by designing websites, and Berenson is programmed with ‘artificial taste’ to judge works of art. So what does Somesthetic Transfer bring to the table that’s new?

AKQA Australia executive R&D director Tim Devine says the idea for Somesthetic Transfer came from trying to capture the experience of a physical object, digitise it, and then “re-physicalise” it.

For the experiment on show at Semi Permanent, the artificially intelligent tool was shown a human-painted picture of a mountain and lake.

It absorbed the textures and depth used, and this was all it was allowed to understand about the scene. It was then fed a photograph and told to draw its interpretation of the photo using only the knowledge it had gleaned from the pre-existing work.

Devine says using this technology creates a more sensory experience of art, as Somesthetic Transfer can capture in its data the depth of the painting’s layers, as well as the way light hits certain points and fractures.

Devine says that with this process, a machine is also helping a human gain capabilities they couldn’t previously have mastered.

“It’s this idea that we’ll collaborate with machines – AI is going to augment us,” he says.

“I could never paint something like this, but we created this artwork, so tools like this allow people to do more than what they might otherwise have been able to do and have a broader creativity set.”


Pictured: Tim Devine and Elly Strang

Devine says it’s important to note that the art piece is a back-and-forth collaboration between person and machine – the machine doesn’t have a “soul” to draw inspiration from and produce the work on its own (yet).

He says that until creativity can be reduced to a succinct definition, artificially intelligent art will always need a human involved in the process.

“We’ve been using robots in factories for years because they’re good at repetition, they’re good at accuracy, they never tire. People do, right? It has abilities humans don’t, but it also lacks abilities. Until we can reduce creativity down to a single definition, then I don’t know. I think AI is good for divergent thinking, so you can do lots of rapid iterations. We’re better together than apart at this stage.”

AKQA is exploring other ways it can collaborate with AI and machine learning. Devine says that, for now, it is a tool to create things with rather than a machine that creates on its own.

“What can you do with a pencil? You can write a story, draw a picture, so it’s the same thing – it’s a tool, a collaborator, we can do lots of things to understand its nuance. The next iteration will be something along the lines of a larger-scale version; the printer we use will be a larger format. We also want to recreate the emotion of the piece, and we could go even further with this.”

Elly is Idealog's editor and resident dog enthusiast. She enjoys travelling, tea, good books, and writing about exciting ideas and cool entrepreneurs.
