Wednesday, December 3, 2014

Physically Based Rendering, cubemap reflections, parallax correction, light-mapping and more!

So, I feel I've tumbled down some sort of rabbit hole in the last few weeks. What started as an innocent attempt to model a room from a movie has transmogrified into a headrush of new learning about the current state of the art in real-time rendering, and thus a fair amount of dissatisfaction with my current abilities. Only to be expected really!

The excellent Marmoset Toolbag in action
Last post I felt like all I needed was some realistic floor reflection cubemaps. Google got me off the starting line with some Christmas ornament reflection maps that sorta worked:


Which, when used as a cubic reflection map, produced this sort of look on my floor tiles:
yeah... ish
I discovered that Unity can make these rather easily internally too. By choosing a transform near where the viewer would be situated, I can render six 90° camera projections that form a box holding the images that any shiny thing inside that box would see. This is great. Now my normal-mapped floor can reflect the sun! Now my own cubic reflection map looks more like this:



Which produces reflections like these:
The actual sun reflecting on the actual floor! I'm done! ... NOT.
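For my own notes, here's roughly what that six-camera capture amounts to - a little Python sketch of the idea rather than Unity's actual API. The render_view function is just a stand-in for whatever renders a single camera view to a texture, and up-vector conventions differ between engines.

# Six 90-degree FOV renders from one point, one per cube face;
# stitched together they become the reflection cubemap.
CUBEMAP_FACES = {
    "+X": {"forward": (1, 0, 0),  "up": (0, 1, 0)},
    "-X": {"forward": (-1, 0, 0), "up": (0, 1, 0)},
    "+Y": {"forward": (0, 1, 0),  "up": (0, 0, -1)},
    "-Y": {"forward": (0, -1, 0), "up": (0, 0, 1)},
    "+Z": {"forward": (0, 0, 1),  "up": (0, 1, 0)},
    "-Z": {"forward": (0, 0, -1), "up": (0, 1, 0)},
}

def capture_cubemap(render_view, position, resolution=256):
    """Render the six faces of a reflection cubemap from 'position'.

    render_view(position, forward, up, fov, resolution) is a placeholder for
    whatever your engine uses to render one camera view to a texture.
    """
    return {
        name: render_view(position, face["forward"], face["up"],
                          fov=90.0, resolution=resolution)
        for name, face in CUBEMAP_FACES.items()
    }
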
But... well... this is all well and good if the floor is perfectly reflective. Which it's not. It should be covered in micro-abrasions that scatter the incoming light rays and make the reflections blurry. How can I create this in Unity? There's no 'roughness' slider in the default shaders and my metal floor is a long way from looking anything like the quality camera-lens renders above.
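The usual answer, as I understand it [and what I gather tools like Skyshop do under the hood], is to pre-blur the cubemap into progressively softer mip levels and let the shader pick a mip based on roughness. A toy Python sketch of the idea, nothing more:

def reflection_mip(roughness, mip_count):
    """Map a roughness value in [0, 1] to a cubemap mip level.

    roughness = 0 -> mip 0 [sharp, mirror-like reflection]
    roughness = 1 -> last mip [heavily blurred reflection]
    Real engines use fancier, non-linear mappings, but this is the gist.
    """
    roughness = min(max(roughness, 0.0), 1.0)
    return roughness * (mip_count - 1)

# A lightly scratched metal floor at roughness 0.25 with an 8-mip cubemap
# samples around mip 1.75 - slightly soft, no longer a perfect mirror.
print(reflection_mip(0.25, 8))   # -> 1.75
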

http://www.marmoset.co/toolbag/learn/pbr-theory
This began my investigation into quality reflection map creation, which led me to Marmoset Skyshop for Unity. Marmoset make a fantastic set of shaders for Unity 4.5 and up that aim to mimic the energy-conserving properties of a surface, and they also provide a really good introduction to physically based rendering. I highly recommend reading their Toolbag2 Learning pages if you don't know where to start. Turns out, realtime rendering and offline rendering have a much larger overlap than I assumed, with next-gen game engines requiring an understanding of BRDFs and forcing artists to reconsider albedo [diffuse] and specular maps entirely differently from the way they've been treated for the past 20 years.

A little learning goes a long way in improving material appearance.
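The idea that stuck with me most is energy conservation: whatever fraction of incoming light a surface reflects specularly can't also scatter diffusely, so the two never add up to more than what arrived. A toy Python sketch of that principle - my own illustration, not Marmoset's shader code:

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation to Fresnel reflectance.

    f0 is the reflectance at normal incidence [~0.04 for non-metals, much
    higher and tinted for metals]; cos_theta is the clamped N.V term.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def split_energy(albedo, cos_theta, f0=0.04):
    """Split incoming light into diffuse and specular without inventing energy."""
    specular = fresnel_schlick(cos_theta, f0)
    diffuse = albedo * (1.0 - specular)   # what reflects can't also scatter
    return diffuse, specular

# Head-on vs. grazing: at grazing angles specular climbs towards 1.0 and
# diffuse falls away, which is why even dull surfaces go mirror-like
# near the horizon.
print(split_energy(albedo=0.5, cos_theta=1.0))    # mostly diffuse
print(split_energy(albedo=0.5, cos_theta=0.05))   # mostly specular
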
It seems each time I want to improve the appearance of my scene with Unity's rendering capabilities, I end up trawling the Unity Asset Store looking for 3rd party solutions. And there are a few real gems. However, with the imminent release of Unity5 and its reflection probes, realtime GI, and new super shaders, is it worth spending any money to gain these abilities now? Perhaps not.

So my choices at this point are to begin transitioning my project to Unity5's beta [which I have access to], or to pay for solutions that give me these effects now in 4.5 but may or may not be supported or required going forward...

For now I chose to apply what I've learnt with what I have. PBR theory has really helped me get better results from the current shader controls - for example, the leather texture below has baked-in cracks and grain, breaking up the reflections and giving the couch top a much better feel:


Complete with hand sculpted bum impressions.
I'll make the jump to a proper PBR version of this scene as Unity5's tech releases. In the meantime I've started light-mapping the room's illumination to get better ambient occlusion and mood. The viewing couch is modelled and in place and some new shaders created.


A fun model to make with curved metals and a low-ish polycount.
Where O where art thine couch reflections in the floor???
With my current reflection mapping approach, things don't really line up properly, as the virtual surfaces the reflections are mapped to are effectively situated an infinite distance away. Thus you can see the sun reflection on the floor screen-left is actually present where the wall reflections should be appearing. This requires a technique known as parallax correction - something that Marmoset has an excellent solution for, and that the fantastic Sebastien Lagarde documents on his blog here. I'm not sure I'll implement a solution for that as there are other things I'd like to move on to in this project.
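For reference, the gist of the parallax-correction [box projection] trick as I understand it from Lagarde's write-up: intersect the reflection ray with a proxy box that roughly matches the room, then aim the cubemap lookup from the probe's capture position at that intersection point. A rough Python sketch - the names and room dimensions are mine, purely illustrative:

import numpy as np

def parallax_corrected_dir(surface_pos, refl_dir, box_min, box_max, probe_pos):
    surface_pos = np.asarray(surface_pos, float)
    refl_dir = np.asarray(refl_dir, float)
    refl_dir = np.where(np.abs(refl_dir) < 1e-8, 1e-8, refl_dir)  # avoid /0
    box_min, box_max = np.asarray(box_min, float), np.asarray(box_max, float)

    # Distance along the ray to each pair of box planes; the nearest exit is
    # where the reflection ray 'hits' the walls/ceiling/floor of the proxy box.
    t1 = (box_max - surface_pos) / refl_dir
    t2 = (box_min - surface_pos) / refl_dir
    t_exit = np.min(np.maximum(t1, t2))

    hit = surface_pos + t_exit * refl_dir
    return hit - np.asarray(probe_pos, float)  # corrected lookup direction

# A point on the floor near the couch, reflection heading up towards a wall,
# probe captured in the middle of a 6m x 3m x 6m room:
print(parallax_corrected_dir(surface_pos=[2.0, 0.0, 1.0],
                             refl_dir=[0.3, 0.8, 0.5],
                             box_min=[-3, 0, -3], box_max=[3, 3, 3],
                             probe_pos=[0.0, 1.5, 0.0]))

Skip the correction and you're sampling with the raw reflection vector, which is exactly why the sun and walls land in the wrong places on my floor.
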

Optically in the Rift view I've got a new starfield in the background [I realise exposure-wise that stars would likely be well under the sun's brightness and thus invisible, but you know, VR!], some sun shafts creating beams of light, some dust motes floating in the room for a little atmosphere [after playing Alien:Isolation in VR I couldn't help it - the modelling and lighting in that game is a masterpiece!] and a few other tweaks in store.


It's a place you can go and just sit and look at the sun.
Still hitting 75fps no problem at all in OSX and Win7, so I'm not really near the limits of my GeForce 680MX yet. Speaking of which, the current OVR 4.3.1 runtime and Unity integration produce rock-solid head-tracking in Windows and it's kinda stunning. OSX is still a little swimmy. Getting close to producing a YouTube clip [maybe 60fps?] for people to try out too.

I'll wrap this up now, but next post I want to detail what I've found out about timewarping, prediction, alpha transparency, forward vs. deferred rendering, the Unity Pro Profiler, and of course some scene updates.

Looking for where this began? Click here to visit the beginning.

So long for now!

-j

Tuesday, November 11, 2014

Game dev, realtime rendering and the Oculus Rift in my fan tribute to the movie Sunshine

I've been experimenting with the Oculus Rift DK-2 in Unity for some months now. It's incredibly fun to make a place from an idea you have and then go and visit it virtually. It literally keeps me awake at night when I have a moment of inspiration about something new I could try and then get all excited about how to bring it to life.

Most recently I've started creating a bit of a tribute to the movie Sunshine. I really love the solar observation deck depicted near the start of the film:

Sometimes it's small, sometimes it's big.
Cliff Curtis asking the sun just how many characters of different ethnicities he can get away with playing.
Characters spend time in this portal just gazing at the sun as their ship heads towards it [to blow it up of course. I'm not kidding BTW]. And for the first two thirds of the film it's as though the sun is a character in the movie - the main protagonist almost. As for the last third, well you'll have to watch it to see.

I thought this might be a suitably contained idea to learn some realtime rendering skills and game dev abilities. I'm choosing not to focus on interaction so much as art direction. I'm aiming to make a place where you can just go and sit quietly for a while and watch the sun like the characters in the film did.

Along the way I've had the rude awakening of just what realtime rendering means, and of how unconstrained and, frankly, a bit lazy we are in the film VFX world. In VFX, if something takes even 24 hours to render, well, that's kind of ok because the end result should be amazing, right? Well, in games the goal is often to pull off stunning visual complexity 60, 75 and up towards 120 times per second. And often on hardware that lags behind the sort of computing power available to me at work.
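To put some rough numbers on that [purely my own back-of-the-envelope arithmetic]:

# The whole frame - culling, transforms, lighting, post effects - has to fit
# inside one of these time slices, every single frame.
for fps in (60, 75, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:.2f} ms per frame")

# 60 fps -> 16.67 ms, 75 fps -> 13.33 ms, 120 fps -> 8.33 ms per frame,
# versus a film VFX frame that's allowed hours.
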

Another cool moment where the crew gather to watch Mercury transit across in front of the sun.
I'm developing on OSX on a 27" 2013 Apple iMac with an nVidia 680MX GPU with 2GB RAM. I do not have a GTX 980, sadly. Although I'll be testing builds in Boot Camp in Win7, I'm creating and compiling in OSX because my main DCC toolsets are there at the moment.

To do this, every aspect of scene creation needs to be efficient. Absolutely NOTHING that does not need to be computed should be:

Extra faces/verts/edges on your model that are not contributing? Get rid of them.
Small modelled-in details that could be represented more efficiently in a map? Map it.
Extra lines in your shader doing nothing? Get rid of them.
Extra geometry being transformed around that you can't see? Get rid of it.
Using similar shaders on multiple different objects? Make it one. Make them all one!
Fancy-looking particle collisions that you can't see and that don't really add much? Get rid of them.
Amazing post-effects bringing your framerate down? Do you REALLY need them?

This approach really forces you to consider economies that impact art direction. How fast are procedural textures when you could just paint a map? Do you really need all that modelled curvature on the edge of a seat you'll never get close enough to appreciate, if it's costing you 1 FPS? No. So it's a bit of a change from my day job at Weta.

So right now I've pretty much just got the sun, some flares, some particles, some observation lounge geometry and that's about it. But I also have 75fps on my 680MX on OSX so that's a good sign too. And given the current state of the Oculus SDK and runtime [4.3beta], I should be able to continue to get a decent framerate in the future and definitely a speed boost under Win7. The screenshots below are very work in progress and do not represent the final appearance of this demo.

My sun at the moment, slowly spinning, painted with map data from the Solar Dynamics Observatory.
Just like Minecraft, only hotter.
And of course in the Rift, looking out at my first attempts to paint the solar flare shape textures.
Here's some of the things I'm aiming to include in the final version:

* Have a viewer-controllable exposure facility so you can radiate yourself should you wish.
* Key music from the movie soundtrack - or ambient ship thrum noise.
* Viewer-triggerable Mercury transit.
* Heat-haze effect to depict the atmosphere in the viewing lounge.
* Sun-shaft optical effects.
* A seated figure with mirrorshades on so you can share your epiphanies. [Paging Cliff Curtis to the solar observatory deck]

Like I said, I'm not planning to focus much on interaction with this one - just atmosphere really. But I'm having a blast steadily solving problems one after the other and learning classic game tricks to speed things up.

A couple of tools from the Unity Asset Store I'm using include Sonic Ether's Natural Bloom and Dirty Lens, and also ProFlares [which needs some Oculus compatibility updates].

Next up on my list is HDR cubic environment maps and physically-based rendering.

So much to learn. So many ways to screw up ;-)

-julian

Saturday, March 8, 2014

Night Vision Goggles


He said it was the best thing I ever gave to him.

They're not real military grade, but they work. They came with the Prestige Edition of Call of Duty - Modern Warfare 2 [I did not buy that, I bought these from a friend at work].

-j

Saturday, February 22, 2014

Lego Downhill Derby at the Ribble Street Races.

Jamie and Isobel on the course, getting ready to run.
This year, as part of the Island Bay Festival, the Ribble Street Races held a Lego Downhill division. Jamie and I could not resist taking part, especially as we felt my old Lego from 1980 had wheels in it that would trounce the competition.

Click here to jump straight to the photo gallery.

I still own most of the parts of this Lego Technic 8860 kit from 1980 - thanks to Mum and Dad for keeping hold of this for me!



So a week ahead of schedule we got down to work. We drew up a cheat sheet of what we felt were the main issues: terrain and hill gradient, wheel choice, centre of gravity and vehicle weight.

Secret plans, secret plans... plot, hatch, scheme.
Here's what we came up with, the main concept being a very low centre of gravity, and being fast and mean [so mean in fact that during the test run we rode roughshod over another vehicle monster-truck style - oops! Oh well, mess with the bull, get the horns I always say]. Check out the gallery for more images.

Say hello to 'Old Skool - 1980'. All your race R belong to US.
There was a huge variation in designs, with everything from boats [on wheels] and tanks to monster trucks and small nimble things that went fast but deviated off course very quickly. Jamie and I quickly found that our creation took its sweet time to get up to speed, but then went relatively straight and became somewhat unstoppable.

The main race stipulations were that your entry had to fit within a 40cm cube, all parts must be actual Lego, no adhesives or extra weight could be used or fitted internally, and your vehicle MUST be able to transport at least one Lego mini-figure. Didn't say nuthin' about having THREE mini-figures.

Helms-man, Navigator and Psy-Ops.
Sweet, forgiving asphalt. Not the usual stone-chip NZ road surface.
For Glory! ... and a smash up at the end
First look at some of the competition.
Heat two for the open class, Jamie gets a last minute tip on direction from a race official.
City Arts Manager Martin Rodgers from the Wellington City Council doing a great job on the mic.
Unorthodox Death Machine on the loose!
We managed 2nd place in the Open Class! [open because if you had grown-up help you couldn't enter in the younger age brackets]. For Great Victory!! Here's Jamie on the podium letting the crowd in on a few secrets of our success:


We won a video rental, dessert pizza from Hell.co.nz and a nice ceramic mug. Congratulations to all the well-deserved winners in the other divisions [you can see more of them in the gallery], and well done Island Bay for hosting Wellington's very first Lego Downhill Derby.

Next year I think I'll take a more back-seat stance and have our kids build and enter their own vehicles now that we know the lay of the land. We've got a head start on next year's design already.

-j


Sunday, February 16, 2014

A post in which I reveal my keyboard-fetish dreams have finally come true.

From this sorry thing...

A while back in an effort to force myself to learn to touch type at work, I blacked out the keys of my keyboard with duct tape. This worked. However, after two hot summers, the duct tape began to slide off and my co-workers said my keyboard was looking a little ghetto. It's true, my fingers were getting sticky and my typing full of errrrrrorsa nd musssstaaaaakes. Time for a replacement:

Enter the Filco Majestouch Tenkeyless Ninja!

To the future and beyond.

Thanks to the fine sales people at The Keyboard Company I am now the proud owner of this masterfully fine thing. And yes, those are metal [zinc to be precise] WASD keys. I've also replaced the 'f' and 'j' with radially etched Line2R green keys from wasdkeyboards.com and replaced the escape key with a red one. I'm sporting Cherry MX Brown switches with silencing rings under the main letter keys and the return key.

It is well nice to type on. I see many customisations in my keyboard-fetish-future.

Onwards!

-j

Wednesday, February 12, 2014

Net-neutrality, AppleTV and just what did Steve Jobs 'crack' then?

But you have cable TV, right?
We have two main competing ISPs [Telecom and Vodafone] offering set-top boxes that record and time-shift content from a single cable-TV style provider - SkyTV. The hardware they provide represents varying levels of competence and reliability, and requires extra fees to enable the viewing of hi-def channels, all in the context of constant ad breaks and promos for other content. But enough moaning; it's clear that business model is coming to an end, albeit all too slowly.

I'm just about ready to be a cord-cutter. But here in NZ, data access and bandwidth caps produce hesitation in anyone who is not interested in paying additional $$$ in monthly bills to their ISP. Apple and other content providers still don't legally make available anything near the range of TV series that's available in Australia, due to distribution deals with SkyTV. And I don't fancy juggling multiple iTunes accounts, VPN access, and paying a stranger on eBay a fee to buy a US iTunes voucher for me, scratch off the number and email it to me, just so I can buy content semi-legally in a timely fashion.

Seriously, when can I give money to the people who make the content and have them make it available when it's ready? It's 2014!! I'm moaning again. Sorry. The Oatmeal sums it up fantastically if you haven't already seen it.



So we began renting movies through iTunes instead of physical media [DVDs, Blu-ray discs] from our local movie rental shop. At first, the iTunes delivery was stable and great. We have a 40Mbps cable modem connection, which is more than enough to stream a 720p movie [even with a 15min wait at the start]. But about a year ago, quality of service from iTunes began to drop, with movies we rented pausing two-thirds of the way through and demanding we wait 20 mins for buffering. Why? Don't we have a fast enough internet connection for this?

Enter the Net-Neutrality debate.

From Wikipedia: Net neutrality (also network neutrality or Internet neutrality) is the principle that Internet service providers and governments should treat all data on the Internet equally, not discriminating or charging differentially by user, content, site, platform, application, type of attached equipment, and modes of communication.

As the internet is increasingly commercialised, we've all long suspected ISPs of 'shaping' traffic volumes where they perhaps shouldn't. Recently this has become a hotly debated topic in the US, and the forces for and against net-neutrality are slugging it out, with consumers wearing any fallout.

On January 14, 2014, the DC Circuit Court determined that the FCC has no authority to enforce Network Neutrality rules, as service providers are not identified as "common carriers".

I don't know how this is going to play out, but I'm on the side of legislation that protects the internet from too much commercial influence and preserves the ability of anyone to use it fairly as a communication medium, from freedom-fighters to Facebook, Twitter to TradeMe and back. You wouldn't want to have to pay your ISP for top-tier access to your favourite sites on top of monthly access fees and bandwidth caps, would you? Me neither. We've all had enough of 'over-the-top' services from cell providers, huh? Dumb pipes await.

A few days ago, BoingBoing covered the discovery by an independent blogger that Verizon in the US are aggressively throttling Netflix traffic. This blogger, Dave Raphael, manages to capture a discussion with a Verizon tech representative in which it's admitted that this is in fact what is going on:


So what's this got to do with AppleTV then?

Well back when I bought AppleTV [2nd gen], I was high on the hope Steve Jobs was about to unveil an app store for it and we'd be able to do some of the things we do on the phone/iPad on the TV. That never came to pass, but Steve did, and all we were left with was the notion he'd 'cracked' it and we'd soon be blessed by something much better. And it's been just that, a notion.

Slowly, Apple have been adding channels to AppleTV over the last two years. My impression was that this was the thin end of the wedge and that Apple were collecting content makers together one by one to quietly build up much more varied and higher quality offerings than our traditional providers. Yet nothing has really materialised in terms of hardware, despite rumours about large screens, 4k displays, bezel-free designs, magical remote control rings etc. Other rumours suggest Apple are hard at work tying up agreements and making deals behind closed doors, getting ready to do to TV what iTunes did to music sales.

Then this report surfaces on MacRumours detailing Apple's progress on building their own content delivery network:

http://www.macrumors.com/2014/02/03/apple-developing-cdn/

"Apple built its retail store chain because Steve Jobs wanted to own Apple's interactions with its customers. With iTunes and iCloud, Apple controls the data and the service, but must outsource the less visible but still incredibly important job of reliably delivering data packets to users. With hundreds of millions of users downloading apps, music, TV shows and movies -- with many of those being streamed in real-time to the Apple TV -- ensuring quality of service for all users will be essential. "

And now I understand. Apple have already made the assumption that net-neutrality is going down the drain and are positioning themselves to be able to guarantee quality of service to their customers with their own content delivery system. And as Apple are a company that prefer to have all their ducks in a row before rolling out a new product [well not always], I don't believe we'll see any large announcements about AppleTV or channels before these infrastructure updates are complete.

Given the timeline for the data-centre completions, the focus on a watch-style product right now, and new iPhone6 rumours, I don't see TV announcements on the horizon for another year at least. Maybe I'm wrong, but I don't think so. Looking forward to ditching the cable box though.

Update: It's 2015 and nothing has changed regarding Apple's approach to TV. It's effectively still a hobby for them. The iPhone6 is here, the watch is about to hit, and there's no TV in sight.

-j

Wouldn't it be great if these services were available in NZ without using a VPN?

Saturday, December 14, 2013

My Fuji x100s Christmas Fauxhemian Transformation

I've been craving the Fuji x100 since it came out. The combination of the rangefinder aesthetic and the sharp, fixed 35mm-equivalent lens has me thinking constantly of how many situations I'd be able to use one in - places where toting the 5DmkII might be overkill and the iPhone5 would be underkill.

Luckily I've waited long enough that the successor to Fuji's upstart digital rangefinder, the x100s, is now available and is going to be my Christmas present this year!

David Hobby's x100s in its Lo-Fi travel duct-tape camouflage.

My usage theory goes like this:

1. Canon 5DmkII for assignments and clients and weddings etc.
2. Fuji x100s for holidays, short trips and places where too much gear is a burden.
3. iPhone5 for everything else.

David Hobby [Strobist] has been using the x100s for a while now and has an excellent series of posts covering its flexibility and appeal to photographers everywhere:


And his YouTube run-through of the features is worth a look if you need a sense of what the camera can do and where it fits into his world. Click on the image below to visit YouTube and watch:


And if you wanted more, a search on Flickr shows plenty of examples by others.

I think my biggest stumbling block is going to be the x100s' excellent in-camera processing and JPEG output. This may mean not shooting RAW files, as the increase in shooting speed and flexibility is pretty impressive. I... *think* I'm going to have trouble committing to this over the Xmas break, when I won't have my computer to compare images on, so maybe RAW+JPEG it will have to be.

I make use of the iPhone5's panoramic shooting mode regularly: 


and so I'm pretty stoked to read that the x100s has a mode for shooting this way too. It should mean higher res and sharper images in this format. 

The way the lens flares out is great. The built-in ND filter is good. The ergonomics are very nice. The shooting modes and film emulation are fantastic. The leaf shutter and wide aperture should mean much more interesting looks outdoors in full sunlight. In short, I cannot wait for Christmas Day this year!!

I hope your Christmas is filled with family, fun, sun and good cheer!

-j