Anyone who loves a good use of data will no doubt appreciate the latest music 'video' for Pacific Heights, which Sam Peacocke produced without using a single camera.
Peacocke was approached to create the video for Buried by Burden, the new track from Pacific Heights (a solo project of Shapeshifter member Devin Abram).
The song (and video) features Louis Baker, a talented Wellington singer and celebrated musician in his own right, and is about memory. Peacocke decided to riff on this concept using a mix of new and old technology.
“I’d always been interested in the form that filmmaking is going to take in the future and technically the most interesting thing that I’ve seen and been thinking about is volumetric capturing,” he says. “Instead of filming from a particular angle with a particular lens you capture the data of the whole scene in three dimensions ...”
He says he chose this technology because the effect it creates closely resembles the way memory works.
“You remember the kind of vague feeling and nature of things, but not specifics, and they change and get degraded over time as your own agenda mucks up your memories. So it was quite existential in the end, what it ended up being like. I wanted to visualise decay.”
He says his use of the technology is rudimentary, but it will look different in the future.
“If you look at the Lumière brothers, their stuff looked pretty different to what a full 4K movie looks like today. So, in 50 years it will look pretty amazing, what you can capture in 3D, and people might look at volumetric as a crazy, grainy, black-and-white kind of thing.”
Peacocke says that to create the video he intentionally used the wrong tools for the job to get the result he wanted.
“So, it’s basically captured in two systems. The environment was all captured with an industrial laser scanner, the kind they use for scanning building sites and factories. It’s a super expensive thing that you set up on a tripod and leave for five minutes while it sends out invisible laser beams and measures distances. So you end up with 50 million points per scan,” he says.
“So we got that data and managed to import it into a visual effects program. Then we got a Microsoft Xbox Kinect, which has a very low-resolution laser scanner in it, but it scans in motion, taking a new reading 30 times a second. That’s what we scanned Louis with. Then we found other ways, through lots of mucking around, programming apps and some custom code, to get them both into the same visual effects software.”
So, essentially, he’s placed scanned moving people into a scanned three-dimensional environment.
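Conceptually, the pipeline combines one dense static point cloud (the environment scan) with a stream of lower-resolution per-frame point clouds (the Kinect capture of the performer). A minimal sketch of that idea, using NumPy arrays to stand in for point clouds — the function names and sizes are illustrative assumptions, not Peacocke's actual tooling:

```python
import numpy as np

# Hypothetical sketch: merging a static environment scan with a
# sequence of moving subject scans. Names/shapes are assumptions.

def merge_frame(environment: np.ndarray, subject_frame: np.ndarray) -> np.ndarray:
    """Stack the static environment points (N, 3) with one frame of
    moving subject points (M, 3) into a single combined cloud."""
    return np.vstack([environment, subject_frame])

def build_sequence(environment: np.ndarray, subject_frames: list) -> list:
    """Produce one merged cloud per Kinect frame (roughly 30 per second),
    reusing the same environment scan every time."""
    return [merge_frame(environment, f) for f in subject_frames]

# Toy data: a dense static scan and three low-resolution moving frames.
environment = np.random.rand(50_000, 3)                       # laser-scanner points
subject_frames = [np.random.rand(500, 3) for _ in range(3)]   # Kinect frames

sequence = build_sequence(environment, subject_frames)
print(len(sequence), sequence[0].shape)  # 3 (50500, 3)
```

Each merged frame can then be rendered from any virtual camera position, which is what makes the "no camera" approach possible.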
He says in the future he would love to do something live with the technology.
“With the systems we have built, you could interact with it, like you’re in a game, but it would be happening live. Or you could wear a VR headset, but instead of being locked in one position you get to fly around.”
The video, in part, was influenced by another video he did for Neil Finn a couple of years ago.
“We did some scanning with the Kinect and you could actually interact with it within a web browser, but it was a bit ahead of its time and no one was really using Google Chrome then. It’s still pretty cool, though. I wanted to explore that a bit deeper, and on my own too.”
We look forward to seeing what comes next from Peacocke, who at the moment is also busying himself with his own drone-based film company.