Tuesday, April 26, 2016

Google's Deep Dreaming, your favourite artists and YOU.

What if your favourite dead artists were still painting fresh works? Fresh works containing themes *you* specifically desired? Are you still sad that Francis Bacon perished? Are you gutted that H. R. Giger fell down some stairs and died? Isn't it sad that we don't have more of Gustav Klimt's stunning paintings from his gold phase? I think so.

But here are some paintings they never made:




What is this sorcery? We've entered a new age. To explain a little...

Google's Deep Dreaming neural net is completely melting my brain. First, there's what Google's brain-in-a-jar-with-eyeballs makes of images you feed it. Google researchers employed layers of artificial neurons, each working on a different level of an image's structure, and let the network amplify whatever it *thinks* it sees. The results seem to invariably involve dog-lizards, fish and bird life where previously there may have only been spaghetti:
Exhibit: A.
You can experiment with this marvellous craziness yourself here: deepdreamgenerator.com
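
For the technically curious, the core trick is surprisingly small: run an image through a trained network, pick a layer, and nudge the image's pixels to make that layer fire *harder*. Here's a minimal sketch in PyTorch [my own reconstruction, not Google's code - the network choice, layer index and step size are guesses]:

    import torch
    import torchvision.models as models

    # Deep Dream in miniature: gradient *ascent* on one layer's activations,
    # so the network exaggerates whatever it thinks it sees in the image.
    model = models.vgg16(pretrained=True).features.eval()
    for p in model.parameters():
        p.requires_grad_(False)

    def deep_dream(img, layer_index=20, steps=30, lr=0.05):
        x = img.clone().requires_grad_(True)
        for _ in range(steps):
            act = x
            for i, module in enumerate(model):
                act = module(act)
                if i == layer_index:
                    break
            act.norm().backward()  # 'how hard does this layer fire?'
            with torch.no_grad():
                x += lr * x.grad / (x.grad.abs().mean() + 1e-8)  # normalised ascent step
                x.grad.zero_()
        return x.detach()

    # e.g. dreamed = deep_dream(photo)  # photo: (1, 3, H, W) tensor, ImageNet-normalised

Run that at a few image scales and blend the results, and you're in dog-lizard territory pretty quickly.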

This alone is worth toying with. For example, this portrait of me holding a baking dish becomes something of a Dr Seuss trip, complete with fish-lizards, mutant turtle-birds and shirt monkeys. Click the images below for larger versions:



close up weirdness
This is obviously fantastic. Like, really? Are we at the point where a computation can spontaneously add human-meaningful elements to an image? I... I guess we are. For the longest time, computer vision and image synthesis have been perfunctory at best, suited perhaps to robotically picking objects off a conveyor belt or extending tileable textures from photos. We've all witnessed and read about the arrival of face-tracking and matching technology, however, and now it's approaching an exciting tipping point. Computers can not only recognise faces, they can replace them believably in realtime. But I digress.

Extending on Google's research, other parties have created online tools where you guide what the deep dreaming algorithm sees by supplying a second source image for it to pick recognisable elements from. This is like saying 'Make me a new image from this photo, in the style of this image' [a technique known as neural style transfer - there's a rough sketch of how it works after the examples below]. For example:

Who doesn't like Van Gogh's Starry Night? 

Brian painted by Van Gogh?
I know what you're thinking. What if The Great Wave off Kanagawa was really about skulls instead of tsunamis? Well:

Me in the style of Duchamp's Nude Descending a Staircase? Yes.


Naum Gabo yeah yeah!
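
For anyone wondering what's happening under the hood of these 'this photo, that style' tools: they generally follow the neural style transfer approach of Gatys et al. (2015), which optimises a brand-new image to match the content features of your photo and the texture statistics [Gram matrices] of the artwork. A minimal PyTorch sketch, with layer choices and weights as rough guesses on my part:

    import torch
    import torch.nn.functional as F
    import torchvision.models as models

    # Neural style transfer in the spirit of Gatys et al. (2015).
    vgg = models.vgg16(pretrained=True).features.eval()
    for p in vgg.parameters():
        p.requires_grad_(False)
    for m in vgg.modules():  # disable in-place ReLU so saved activations survive
        if isinstance(m, torch.nn.ReLU):
            m.inplace = False

    CONTENT_LAYER = 15             # mid-level: preserves the photo's structure
    STYLE_LAYERS = [1, 6, 11, 20]  # several depths: capture the painting's textures

    def features(x, layers):
        out, last = {}, max(layers)
        for i, module in enumerate(vgg):
            x = module(x)
            if i in layers:
                out[i] = x
            if i == last:
                break
        return out

    def gram(f):
        # Gram matrix: channel-to-channel correlations, i.e. 'style'
        _, c, h, w = f.shape
        f = f.view(c, h * w)
        return (f @ f.t()) / (c * h * w)

    def stylize(content, style, steps=200, style_weight=1e5):
        x = content.clone().requires_grad_(True)
        opt = torch.optim.Adam([x], lr=0.02)
        target_content = features(content, [CONTENT_LAYER])[CONTENT_LAYER].detach()
        target_grams = {i: gram(f).detach() for i, f in features(style, STYLE_LAYERS).items()}
        for _ in range(steps):
            opt.zero_grad()
            f = features(x, [CONTENT_LAYER] + STYLE_LAYERS)
            loss = F.mse_loss(f[CONTENT_LAYER], target_content)
            for i in STYLE_LAYERS:
                loss = loss + style_weight * F.mse_loss(gram(f[i]), target_grams[i])
            loss.backward()
            opt.step()
        return x.detach()

The style_weight knob is essentially the 'how much Van Gogh' dial: crank it up and the brushwork starts to swallow the content.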
The main tool I'm currently using is an Android/iOS app called Pikazo: www.pikazoapp.com
This app lets you upload your chosen combinations to the cloud, where the computation is performed. It's intensive, so only a limited resolution is permitted [somewhere in the realm of 500px on the longest side], and each result takes roughly ten minutes to produce. You can currently upload up to three combos at a time, an obvious compute-load and bandwidth constraint.

I got a little carried away with this. There just seem to be so many cool new possibilities! To see my whole gallery of experiments, click here: www.julianbutler.com/Art/Deep-Dreaming/


I'm not sure what this means for art and originality. Obviously the combinations I've produced could never be passed off as legitimate works by the original artists. But then, is the new work now 50% my contribution? According to copyright law and the internets, this may be the case. Everything is a remix, huh.

However, I think the strength of a successful image still lies equally in the concept behind the image and in its execution, and currently the computer isn't coming up with too many great artistic concepts on its own.

Yet.

-j

Stanley Donwood, I probably owe you a beer.


Wednesday, April 6, 2016

Vive VR launch video

With both launch campaigns from Oculus and Valve/HTC in full swing, this Vive launch video really grabbed my attention. It's commonly acknowledged in the fledgling VR community that it's tough to convey what VR is like in words, and that experiencing it first-hand is the best way to explain it to newcomers. This low-key mix of everyday people experiencing VR for the first time sells what virtual reality is like in a way that needs no words. Check it out:


Congrats to Valve and HTC for setting the tone. Onwards!

-j

Friday, March 25, 2016

It's Virtual Reality Christmas Eve!

It's been a long wait for people interested in getting their hands on the consumer VR hardware Oculus founder Palmer Luckey first dreamt up on the MTBS3D forums four years or so ago. It's been an even longer wait for people [myself included] who were expecting it to take off in the mid-90s, when The Lawnmower Man had us all thinking the shape of things to come was suspended leather couches and skin-tight glowing bodysuits. Thankfully that is not to be the case.

But, it's now finally time for this:


And the gif below single-handedly sums up the mood of myself and the VR community:


Looks like I'll be waiting till late April/mid-May for the delivery of my CV1, darn it. I was a day late to the Kickstarter for the DK1. Had I been a little snappier, I'd hopefully be receiving my free CV1 this Monday, as shipping to the original Kickstarter backers has commenced!

Bring it on.

Sunday, March 8, 2015

Unity 5 realtime global illumination in VR

Well, after a pretty great response to the Sunshine demo [big thanks to Oculus for sharing it on the front page], I'm ready to start looking at the next project.

I'm a big fan of American artist James Turrell, and I wanted to make an experience reminiscent of his work where you can just experience the effect of light. That's kind of simplistic, I know, but his colours, spaces and projections look terrific to me, and I'd love to go and see a show of his work, or even better, go see his Roden Crater when it's finished.

Not Unity, but James Turrell's 'Blue Planet Sky'.
So, the first thing to try with Unity 5 is the built-in realtime global illumination system from Geomerics, called Enlighten. I wanted to illuminate a scene completely with only one light. Here's a clip showing my current demo:


I recorded this clip in VR mode, looking around using the Oculus Rift, so my apologies to those expecting a standard single-eye view. Regardless, you can see the effect of the sunlight bouncing around during the daytime phase, and then the same effects during the moonlight phase.



Effects like this used to be possible only in offline rendering, with the indirect bounce-light calculations sometimes taking hours. With a little up-front pre-computation [in this case about 45 seconds on my computer], this lighting can play back at realtime speeds.
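
The idea behind that trade-off, as I understand it [this is a toy illustration, not how Enlighten actually works]: how light bounces between surfaces depends only on the static geometry, so that part can be baked once into a transfer matrix, and relighting at runtime then boils down to cheap matrix multiplies - which is why the sun can move in realtime:

    import numpy as np

    # Toy 'precompute once, relight every frame' radiosity sketch.
    N = 1024  # number of surface patches in the scene

    def precompute_transfer(n):
        # Stand-in for the slow baking step [the '45 seconds']: a real system
        # would trace rays between patches to estimate form factors/visibility.
        rng = np.random.default_rng(0)
        return rng.random((n, n)) * 0.001

    def relight(transfer, direct, bounces=3):
        # Runtime step: a few indirect bounces are just matrix multiplies.
        radiosity, bounce = direct.copy(), direct
        for _ in range(bounces):
            bounce = transfer @ bounce
            radiosity += bounce
        return radiosity

    T = precompute_transfer(N)         # slow, done once
    sun = np.zeros(N); sun[:32] = 1.0  # direct light from one movable sun
    lit = relight(T, sun)              # fast, re-run every frame as the sun moves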

This image, from a mental ray render in Maya in 2003, took about an hour to render, from memory.
So it's a really exciting time to be making interactive 3D content - realtime GI is a major stepping stone towards realistic lighting that can immerse the viewer.

-j

Friday, February 20, 2015

Sunshine Observation Deck now on Oculus Share

My demo was approved and went live on Oculus Share today, and it made the front page - woohoo! It's an honour to be selected alongside others such as the Apollo11 Experience in a long list of new additions and updates.


Oculus host the versions you can download now, so I probably won't kill the free Dropbox bandwidth I'm currently using. But you'll need an Oculus account to download from there, so I'll keep these links live for anyone who doesn't have one:

OSX:
https://www.dropbox.com/sh/ptsph0fkn0r3yob/AABQfls0b3SY2s92KeP9nFQma?dl=0

Win7/8: Note: this demo runs great in Win7; I've not tested it in Win8, however.
https://www.dropbox.com/sh/s9gpcjt67phby5z/AACt8otcG3wwwxg2kFC1g5Rya?dl=0

Thanks to Oculus for providing a curated hosting solution.

-j

Wednesday, February 11, 2015

Sunshine Observation Deck VR demo released!

You can now download the demo and try it for yourself [requires an Oculus Rift DK2]. Here's a YouTube clip to give you a taste:


There's more info in the Oculus Dev Forum post. And here are download links to the builds if you want to jump in:

OSX:
https://www.dropbox.com/sh/ptsph0fkn0r3yob/AABQfls0b3SY2s92KeP9nFQma?dl=0

Win7/8: Note: this demo runs great in Win7; I've not tested it in Win8, however.
https://www.dropbox.com/sh/s9gpcjt67phby5z/AACt8otcG3wwwxg2kFC1g5Rya?dl=0

Time to start thinking about the next project: this time, a Unity 5 VR test of the realtime global illumination capabilities of Enlighten from Geomerics.

-j