blog.julianbutler.com - photography, VFX, LEDs, electronics, VR, coding and other fun topics - Julian Butler, 2019-02-23
My Digital Lavalamp - or "The MkI Epilepsy Generator" - Part 3<div class="separator" style="clear: both; text-align: center;">
</div>
Hola! You made it to Part 3, wherein I detail some of the discoveries, pitfalls and trials of implementing my standalone lava lamp software running in Raspbian.<br />
<br />
Previously, on My Digital Lavalamp:<br />
<br />
<a href="https://blog.julianbutler.com/2018/12/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">Part 1 - The inspiration and initial setup.</a><br />
<a href="https://blog.julianbutler.com/2019/01/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">Part 2 - The physical build.</a><br />
<br />
So, with the physical build complete [and having made my choices concerning buttons and lights], it was time to delve into how the software driving the lamp works, i.e. how it behaves when you turn it on and switch modes. I won't cover everything in absolute detail as that'd be boring, but I will link to the key places I found what I needed and discuss what I got working.<br />
<br />
<span style="font-weight: normal;"><span style="font-size: x-large;"><a href="http://openprocessing.org/">openprocessing.org</a></span></span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
<div>
<br /></div>
I discovered early in my Processing sketch explorations that <a href="https://www.openprocessing.org/" target="_blank">openprocessing.org</a> was not only a cool site offering live, in-browser sketch programming, but that a large number of the sketches there were compatible with desktop Processing and would run simply by copying and pasting the code into my own sketches. And thankfully the <a href="https://www.openprocessing.org/home/tos" target="_blank">openprocessing.org Terms of Service</a> regarding this are very friendly to creative endeavours:<br />
<br />
<b>"By submitting Content to OpenProcessing for inclusion on your account, you grant anyone Creative Commons license to reproduce, modify, adapt and publish the Content as defined by the license. If you delete Content, Wiredpieces Inc. will use reasonable efforts to remove it from the Website, but you acknowledge that caching or references to the Content may not be made immediately unavailable."</b><br />
<br />
This meant I could openly reuse existing sketches and techniques there and adapt them to run on my lamp. This led to many evenings of fun discoveries like <a href="https://www.openprocessing.org/sketch/447722" target="_blank">this</a> and <a href="https://www.openprocessing.org/sketch/537421" target="_blank">this</a>, which became lamp modes like <a href="https://youtu.be/kiJ81XEa2kA?t=158" target="_blank">such</a> and <a href="https://youtu.be/kiJ81XEa2kA?t=305" target="_blank">so on</a>. I already had a long list of things I wanted the lamp to do, but I was quickly adding new ideas I hadn't originally considered. WIN!<br />
<br />
Some of the best modes remain modified versions of the test sketches that ship with the FadeCandy codebase created by Micah Elizabeth Scott. I've yet to top those in terms of beauty, efficiency and speed.<br />
<br />
Processing also ships with an excellent library browsing and installation tool that gives you instant access to a wide variety of tools to do everything from sound interaction to computational fluid dynamics:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxC7K5WJAKMdXsIhCfus4lOb8BZlJewcE_lycEpufgWSQZkc1_GzTVwb90tbesAMHRYanzsag6DPPJ5nRyFVGLLweXJPEhSY3l4BGek-JMDbB3M2_B7YrJG_pjdFMs0F2u8IIYym5IOOw/s1600/processingLibraryBrowser.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1312" data-original-width="854" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxC7K5WJAKMdXsIhCfus4lOb8BZlJewcE_lycEpufgWSQZkc1_GzTVwb90tbesAMHRYanzsag6DPPJ5nRyFVGLLweXJPEhSY3l4BGek-JMDbB3M2_B7YrJG_pjdFMs0F2u8IIYym5IOOw/s640/processingLibraryBrowser.png" width="416" /></a></div>
<br />
Looking at some of these libraries I was inspired to create lavalamp modes that respond to changes in the weather, simulate hot fluids like a *real lavalamp*, and react to sounds in the environment. There are even standalone voice-recognition libraries I could use to make it switch modes when asked. So many possibilities.<br />
<br />
I'll put all the sketches I have and some other bits and pieces <a href="https://github.com/JRButler/Digital-lava-lamp" target="_blank">up on GitHub</a> for anyone who wants to build something like this or simply see what I did. Now it's time to talk about Raspbian.<br />
<br />
<span style="font-weight: normal;"><span style="font-size: x-large;">Raspbian</span></span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
The Raspberry Pi doesn't come with an operating system; you have to download one and boot it from an SD card yourself. You can <a href="https://www.lifehacker.com.au/2016/05/the-best-operating-systems-for-your-raspberry-pi-projects/" target="_blank">run many different flavours of linux</a> on a Raspberry Pi, but I chose Raspbian because it's the "official" operating system for the Pi and widely used.<br />
<br />
The first thing to do was download a copy of Raspbian and flash it onto my SD card. I used an OSX utility called <a href="https://www.balena.io/etcher/" target="_blank">Etcher</a> for this. In no time I'd booted the Raspberry Pi with everything connected and was welcomed by the Raspbian desktop and config utility.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikM-UPVqLepFmEAGojvr6aH94BThJ8NvXUZqtSKcvopumWMKzvj7R0NkmFEa-DLZevOJgECI3VZUSLGs3edCvVJ9oh2_VXezjqL_ttlTAUNcEQvv7uWmpr8lFZNlXyWlBRBNYgNBmsD_0/s1600/raspbian3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="720" data-original-width="1200" height="384" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikM-UPVqLepFmEAGojvr6aH94BThJ8NvXUZqtSKcvopumWMKzvj7R0NkmFEa-DLZevOJgECI3VZUSLGs3edCvVJ9oh2_VXezjqL_ttlTAUNcEQvv7uWmpr8lFZNlXyWlBRBNYgNBmsD_0/s640/raspbian3.png" width="640" /></a></div>
<br />
From here setup was a breeze, with one exception. My mouse pointer was lagging and jittering all <span style="font-family: inherit;">over the place. A quick search on Reddit revealed that I needed to add a line to the /boot/cmdline.txt config file. This is one of the many places in Linux where you can tweak settings as the computer starts up. This issue will likely be fixed in a later version of Raspbian.</span><br />
<div style="font-stretch: normal; line-height: normal;">
<span style="font-family: inherit; font-kerning: none;"><br /></span>
<span style="font-family: inherit; font-kerning: none;">Add: </span></div>
<div style="font-family: Times; font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
<span style="font-kerning: none;"></span><br /></div>
<span style="font-family: "courier new" , "courier" , monospace;">usbhid.mousepoll=0</span><br />
<div style="font-family: Times; font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
<span style="font-kerning: none;"></span><br /></div>
<div style="font-stretch: normal; line-height: normal;">
to <b>/boot/cmdline.txt</b>.<span style="font-family: inherit; font-size: 16px;"><span style="font-kerning: none;"> Note that cmdline.txt keeps all of its parameters on a single line, so append this to the end of the existing line rather than adding a new one. Apart from this I didn't hit many other issues, with the exception of some wifi glitches that were fixed in a subsequent Raspbian release, so I won't cover them here.</span></span></div>
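If you'd rather script that edit than open an editor, here's a small helper [my own sketch, not from any official guide] that illustrates the key detail: the kernel command line is one line of space-separated parameters, and the new setting is appended to that line.

```python
def merged_cmdline(line, param):
    """Return the kernel command line with param appended, if not already present."""
    params = line.strip().split()
    if param not in params:
        params.append(param)
    return " ".join(params)  # cmdline.txt must stay a single line

def add_kernel_param(param, path="/boot/cmdline.txt"):
    # Read-modify-write the real file; needs sudo on the Pi.
    with open(path) as f:
        line = f.read()
    with open(path, "w") as f:
        f.write(merged_cmdline(line, param) + "\n")
```

Running it a second time is harmless, since the parameter is only appended when missing.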
<div style="font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
<span style="font-kerning: none;"></span><br /></div>
<div style="font-family: Times; font-size: 32px; font-stretch: normal; line-height: normal;">
<span style="font-kerning: none;">Processing and Performance</span></div>
<div style="font-family: Times; font-size: 24px; font-stretch: normal; line-height: normal;">
<span style="font-kerning: none;">----------------------------------------------------------------------------</span></div>
<div style="font-family: Times; font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
<br />
<span style="font-kerning: none;"></span></div>
<span style="font-family: inherit;">With that done, it was time to test Processing's performance by playing through some of the existing sketches I'd created on the Mac. In recent versions of Raspbian, Processing comes pre-installed, which is great. In case it doesn't, here's a link to where to find the ARM-specific version and how to install it: <a href="https://learn.adafruit.com/processing-on-the-raspberry-pi-and-pitft/processing">https://learn.adafruit.com/processing-on-the-raspberry-pi-and-pitft/processing</a><br /><br />At this point, starting my Pi took about 10-15 seconds, after which it secured itself an IP address and appeared on the network. I could connect to it in the OSX Finder, simply drag and drop my sketches into a folder there, and run them directly on the Pi by launching the pre-installed Processing. That's when I first became aware of one of the chief differences between the Pi and a Mac Mini: CPU performance.</span><br />
<div style="font-family: Times; font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhpnFqukKAu0SWn1nO-R4-t6E-54QypsROB1k63cYYsiGCv7xIGtP57AN2dObxeZzIIXBjOGHHJ_RuYoxvfyCxUDhtqtPI1RahfXxMuJDB0wUI_MU4nJr0TjlfAse-ncI8oo9Oqsqjmxn0/s1600/MacMiniLate2014.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="792" data-original-width="911" height="556" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhpnFqukKAu0SWn1nO-R4-t6E-54QypsROB1k63cYYsiGCv7xIGtP57AN2dObxeZzIIXBjOGHHJ_RuYoxvfyCxUDhtqtPI1RahfXxMuJDB0wUI_MU4nJr0TjlfAse-ncI8oo9Oqsqjmxn0/s640/MacMiniLate2014.png" width="640" /></a></div>
<br />
<span style="font-kerning: none;"></span>
</div>
My <a href="https://browser.geekbench.com/macs/369">Mac Mini [late 2014] is an Intel Core i5 running 2 CPU cores at 2.8GHz</a>. So it's not exactly bleeding edge, but it's got enough snot to chew through a lot of Processing draw calls, especially at the resolutions I'd been prototyping my sketches at, i.e. 400x800 and so on. Particle simulations run pretty smoothly, computational fluid dynamics solves run well enough to look like fluids, and I was able to use Processing's additional 2D and 3D acceleration frameworks to speed up some effects too. I could afford to instance a transparent circular gradient .png file as <a href="https://youtu.be/kiJ81XEa2kA?t=111">fire particles at high enough sizes and densities to look pretty cool</a>.<br />
<div style="font-stretch: normal; line-height: normal; min-height: 19px;">
<div style="font-family: times; font-size: 16px;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="font-family: times; font-size: 16px; margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgou747JRkeIsvju_GpVRBS2eHd7qU2gNegbuwBPg_-Og5XU8UL6mUvMRor6BxE5N6-LwuLW3aj4vbt76W9_G44h5dmMhyZSz7ATIYUWMXyrHB-99Or4SbCoSaqJcuO3Xe2BDJIRvz4Xbo/s1600/RaspberryPi3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="848" data-original-width="1001" height="542" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgou747JRkeIsvju_GpVRBS2eHd7qU2gNegbuwBPg_-Og5XU8UL6mUvMRor6BxE5N6-LwuLW3aj4vbt76W9_G44h5dmMhyZSz7ATIYUWMXyrHB-99Or4SbCoSaqJcuO3Xe2BDJIRvz4Xbo/s640/RaspberryPi3.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">WAT</td></tr>
</tbody></table>
The <a href="https://www.raspberrypi.org/magpi/raspberry-pi-3b-plus/">Raspberry Pi 3B+</a>, however, has significantly less hardware capability. Although it's got a 1.4GHz quad-core ARM Cortex-A53, it's a very different CPU: it draws a lot less power, creates a lot less heat and, more importantly, simply doesn't crunch the number of Processing draw calls I was used to. I'd been spoilt by prototyping my lavalamp modes on the Mac Mini and was forced to make some economies and trade-offs to achieve the performance I wanted in my lamp.<br />
<br />
I'll briefly go through the choices I made to combat this, and then we'll move on to the FadeCandy installation:</div>
<div style="font-stretch: normal; line-height: normal; min-height: 19px;">
<br /></div>
<div style="font-family: Times; font-size: 32px; font-stretch: normal; line-height: normal;">
<span style="font-kerning: none;">Compromise</span></div>
<div style="font-family: Times; font-size: 24px; font-stretch: normal; line-height: normal;">
<span style="font-kerning: none;">----------------------------------------------------------------------------</span></div>
<br />
Processing on ARM architectures doesn't currently support quite the same levels of 2D and 3D acceleration, so sketches using those features ran incredibly slowly [granted, I was only running one 3D-accelerated sketch, which I then abandoned]. But I was using 2D acceleration for drawing particles in some of the particle-intensive sketches. This meant I needed to make some changes, like:<br />
<ul>
<li>Disable the 2D draw acceleration - depending on the sketch, turning it off gained maybe 25-30% performance over leaving it on. </li>
<li>Lower the overall sketch resolution, which was OK because, let's face it, the sample resolution of the array was low to begin with: 8x15 LEDs is not exactly high-def, and 400x800px is a little rich. Sketches became more like 100x200px, and since pixel count scales with the square of resolution [as we know from the megapixel wars in camera sensors], that's a big saving. </li>
<li>Lower the number of particles I was drawing. </li>
<li>Change the particle itself from a transparent gradient .png file to a simple flat transparent circle() in the draw function, and use other techniques to regain the softness I'd previously had. </li>
<li>In other sketches where I needed to draw lines or other shapes, simply lower the number of line() calls, make them fatter, etc. </li>
<li>If I needed to achieve an animated wipe or transition, I considered using a single large image and transforming it somehow, rather than computing its appearance procedurally [in fact, this is how I achieve my <a href="https://youtu.be/kiJ81XEa2kA?t=151">startup array test look</a>]. </li>
</ul>
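To put some numbers on the resolution trade-off [the 8x15 array size comes from the text above; the per-LED arithmetic is my own back-of-envelope sketch]:

```python
def px_per_led(sketch_w, sketch_h, leds_w=8, leds_h=15):
    """Pixels available around each LED sample point, per axis."""
    return sketch_w / leds_w, sketch_h / leds_h

# At 400x800 each LED sample point has roughly a 50x53px region to draw
# into; at 100x200 it still has roughly 12x13px, which is plenty once a
# blur pass smooths things - and the sketch pushes 1/16th of the pixels.
print(px_per_led(400, 800))
print(px_per_led(100, 200))
```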
With most sketches getting one or more of the adjustments listed above, I regained about 50% of the performance I'd had on the Mac Mini, which was good enough. Some of the computational fluid dynamics sketches required deeper digging to optimise for the Pi, as they had more complicated compute functions as well as many particles to draw.<br />
<br />
I also found it helpful to specify a target frameRate() and work within that constraint until I'd achieved the speed and look I needed. By default, Processing sketches will run as fast as they can towards their frame rate target - this may not be a good place to be! The look of your sketch may be affected by its compute, so it's important to know which knobs to tweak to control appearance.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYCVcZcRPm9AyAIPAlbBgcODQtdxCJjO_Squ2Um48gZXFNzaWuGNU2SfSelOHjZJKAEx0RE5c-IPCvSoLmXxPFgWeAQmYIxeo3gRjncZMgg18Cym1Gle8N3nkT8f7L7Y8QBO2o2SNd754/s1600/CPU_usage_after.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="543" data-original-width="984" height="352" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhYCVcZcRPm9AyAIPAlbBgcODQtdxCJjO_Squ2Um48gZXFNzaWuGNU2SfSelOHjZJKAEx0RE5c-IPCvSoLmXxPFgWeAQmYIxeo3gRjncZMgg18Cym1Gle8N3nkT8f7L7Y8QBO2o2SNd754/s640/CPU_usage_after.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Even after shrinking and optimisation, my Simple Fire sketch can consume 83.2% CPU on the Mac Mini at 30fps.</td></tr>
</tbody></table>
Overall this phase was good. It forced me to economise and be efficient. It also made me work harder to achieve the look I was after and consider more difficult changes to some of the sketches I was working with.<br />
<br />
<span style="font-size: x-large;">FadeCandy</span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
FadeCandy has a server/client architecture: the fcserver process talks to one or more FadeCandy boards over USB, while client sketches connect to the server [locally or over the network] to drive LED strips wherever you need them. The server is very lightweight. All you really need to do is run it at startup and forget about it.<br />
<br />
Here's an excellent guide for getting FadeCandy setup [on OSX] running a 8x8 array of addressable LEDs that also covers some wiring and power requirements: <a href="https://cdn-learn.adafruit.com/downloads/pdf/led-art-with-fadecandy.pdf">https://cdn-learn.adafruit.com/downloads/pdf/led-art-with-fadecandy.pdf</a><br />
<br />
But if you want to run standalone on a Pi, you need to download and build a version of FadeCandy that'll run on ARM - this legendary Adafruit guide has all those steps and more, including running the FadeCandy server at startup - <a href="https://learn.adafruit.com/1500-neopixel-led-curtain-with-raspberry-pi-fadecandy/fadecandy-server-setup#">https://learn.adafruit.com/1500-neopixel-led-curtain-with-raspberry-pi-fadecandy/fadecandy-server-setup#</a><br />
<br />
<b>Note:</b> in the guide I just mentioned, I stopped before creating the fcserver.json config file, as I'm only running one FadeCandy board and didn't need to configure addressing for the 3 boards used in that example.<br />
<br />
The FadeCandy server's built-in web page has some useful utilities for testing your LED strips, and will also show you the serial number of your FadeCandy board once it's detected, i.e. plugged in via USB.<br />
<br />
<span style="font-size: x-large;">Processing talks FadeCandy</span><br />
<div style="font-family: Times; font-size: 24px; font-stretch: normal; line-height: normal;">
<span style="font-kerning: none;">----------------------------------------------------------------------------</span></div>
<br />
How does Processing interface with the FadeCandy server though? How do you even know what is being sent to your LED array from your sketch? Good question. This is where the OPC class comes in.<br />
<br />
The FadeCandy installation ships with some example files created by <a href="https://www.misc.name/">Micah</a> that contain her Open Pixel Control Java class for Processing. This is a suite of methods for telling the FadeCandy server where an LED [or a whole run of LEDs] is positioned relative to the sketch window. It lets you mirror your physical LED array inside Processing so you can accurately gauge how to achieve the visual effects you need. You use calls like ledStrip() and ledGrid() to construct a sample array whose points query the colour of the pixels under them each frame and send that information to the FadeCandy board over USB, and on to your actual array.<br />
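Under the hood, the sketch-to-server leg of this uses the Open Pixel Control wire protocol, which is pleasingly simple: a 4-byte header [channel, command 0 for "set pixel colours", 16-bit big-endian length], followed by RGB triples. As an illustration, here's a tiny Python packet builder for that message - a sketch of the wire format, not Micah's Processing class:

```python
import socket
import struct

def opc_packet(channel, pixels):
    """Build one OPC 'set pixel colours' message from (r, g, b) tuples."""
    data = bytes(c for rgb in pixels for c in rgb)
    # Header: channel byte, command byte (0), 16-bit big-endian data length.
    return struct.pack(">BBH", channel, 0, len(data)) + data

def send_frame(pixels, host="127.0.0.1", port=7890, channel=0):
    # 7890 is fcserver's default OPC port, matching the sketch's OPC setup.
    with socket.create_connection((host, port)) as s:
        s.sendall(opc_packet(channel, pixels))
```

Any client that can open a TCP socket and emit this framing can drive the LEDs, which is what makes the server/client split so flexible.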
<br />
Here's what the OPC commands look like for my array:<br />
<div style="font-family: Times; font-size: 16px; font-stretch: normal; line-height: normal; min-height: 19px;">
</div>
<span style="font-family: "courier new" , "courier" , monospace;"> opc = new OPC(this, "127.0.0.1", 7890);<br /> opc.ledGrid(0, 15, 4, width*0.25, height/2, height/15, width/8, 4.712, true);<br /> opc.ledGrid(64, 15, 4, width*0.75, height/2, height/15, width/8, 4.712, true);</span><br />
<span style="font-family: "courier new" , "courier" , monospace;"><br /></span>
<br />
<div>
<div class="separator" style="clear: both; text-align: center;">
</div>
The first ledGrid() call maps a 15x4 block of LEDs into the first FadeCandy output channel [pixel indices starting at 0] and the second call fills the second channel starting at index 64. The '4.712' is the rotation amount in radians [roughly 3π/2, or 270°] required to map the sample array in a vertical orientation to match my LEDs wrapped around a cylinder.<br />
<br />
Here's a couple of visual examples of the OPC dot sample array I'm talking about:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTff7VeSduNECx6UAnArbk234tUHv-4yD7-OGIgZZtig9-fSImLaWUAD-rsBkyUM7_3paHAfIOrDUE_i-7QeKRSyEzENs33-3iOonwkYTC3Q7i9TKTihqfU98iSwCt4IgcJ36xw94MEG4/s1600/FCserverDots.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="647" data-original-width="1133" height="364" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTff7VeSduNECx6UAnArbk234tUHv-4yD7-OGIgZZtig9-fSImLaWUAD-rsBkyUM7_3paHAfIOrDUE_i-7QeKRSyEzENs33-3iOonwkYTC3Q7i9TKTihqfU98iSwCt4IgcJ36xw94MEG4/s640/FCserverDots.png" width="640" /></a></div>
<br />
On the left is Micah's wavefronts example sketch at its original size; on the right, my optimised and shrunk particle fire effect.<br />
<br />
You can see in these two sketches the result of the OPC library calls placing white dots in the sketch window that mimic the physical array of my lavalamp LED layout so I can see just where the colours of the sketch will be sampled and displayed externally. Of course, the white dots are not sampled [otherwise the LEDs would just display white!], just the colours underneath them. <br />
<br />
This means you don't really need to run a sketch in a large window, as a lot of the draw() work falls in the gaps between the dots and is never sampled. So, for an application like mine, I can make the sketch smaller and use a filtering function like filter(BLUR, 2) to smooth the edges of moving shapes.<br />
<br />
<span style="font-size: x-large;">Startup, Shutdown and Sketch Changing</span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
One of the realities of using a full Linux computer [or most others for that matter] as a lighting fixture is that you can't simply unplug it when you want some peace. Computers have file systems that may be busy writing important information when you cut the power, and this can corrupt the file system, potentially stopping the computer from starting up at all. I had little choice but to make a nice way to shut the Pi down when I wanted to turn it off. And without a mouse, keyboard and monitor connected, how can this be done?<br />
<br />
I needed a physical button. Well, two actually. One to tell the Pi when to shut down, and one to tell it when to switch the current Processing sketch. Thankfully the Pi has <a href="https://www.raspberrypi.org/documentation/usage/gpio/">GPIO header pins [general purpose input/output]</a> that let you connect up all manner of things. With a little browsing at your local electronics store, a simple momentary switch can be used to send these signals.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMPQiyt6I8QodaBSVKW0pyWd4DhsKsj5rok7RF8x7hUKbgLHYAaYnz1oISdncitgp0Z4F0BcU8H6aIqy-8ul2t8GuEKveZfheyc5gSBFatgZHLYP8rt20lrUWlz9yEs-nEiSevhCW9YK8/s1600/GPIO_pins.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="597" data-original-width="800" height="476" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMPQiyt6I8QodaBSVKW0pyWd4DhsKsj5rok7RF8x7hUKbgLHYAaYnz1oISdncitgp0Z4F0BcU8H6aIqy-8ul2t8GuEKveZfheyc5gSBFatgZHLYP8rt20lrUWlz9yEs-nEiSevhCW9YK8/s640/GPIO_pins.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="text-align: start;"><span style="font-size: x-small;">GPIO pins on my Pi with two switches and a status LED connected.</span></span></td></tr>
</tbody></table>
After reading a bit I followed the instructions in this link to hook up my startup/shutdown button: <a href="https://howchoo.com/g/mwnlytk3zmm/how-to-add-a-power-button-to-your-raspberry-pi">https://howchoo.com/g/mwnlytk3zmm/how-to-add-a-power-button-to-your-raspberry-pi</a><br />
<br />
This link also includes instructions for getting the small python script to run automatically when the system starts up. Really handy - I figured I could simply reuse this script-launching facility to start the python script that would listen for my second button's presses and change the lavalamp sketch. However, this was not to be - more on that shortly.<br />
<br />
The setup in the link above involves a small python script called listen-for-shutdown.py, which uses a python GPIO library to sense voltage changes on the GPIO pins and turn them into the system command that shuts down the Pi. You also make a shell script that launches this listener when the Pi boots. Both are small and lightweight - nice and simple.<br />
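The gist of that listener, as I understand it - a hedged sketch, not the exact script from the guide, assuming a button wired between GPIO3 [BCM numbering] and ground, with the RPi.GPIO library installed:

```python
import subprocess

def listen_for_shutdown(pin=3):
    # Imported inside the function so this sketch reads fine off the Pi.
    import RPi.GPIO as GPIO
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    GPIO.wait_for_edge(pin, GPIO.FALLING)  # block until the button pulls the pin low
    subprocess.call(["shutdown", "-h", "now"])  # the same command you'd type in a terminal
```

GPIO3 is the usual choice in these guides because shorting it to ground also wakes a halted Pi, which is what lets one momentary switch act as both the startup and the shutdown button.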
<br />
When you do turn off the Raspberry Pi, it takes about 10 seconds to halt all processes and reach a low-power state where it's safe to cut the main power. Before that happens, it flashes the TxD LED soldered onto the main board, along with some other lights [more info on those <a href="https://www.raspberrypi-spy.co.uk/2013/02/raspberry-pi-status-leds-explained/">here</a>]. It flashes that light 10 times quickly, after which it's safe to cut the power.<br />
<br />
I chose to extend that status light to the outside of my Pi enclosure following another guide here:<br />
<a href="https://howchoo.com/g/ytzjyzy4m2e/build-a-simple-raspberry-pi-led-power-status-indicator">https://howchoo.com/g/ytzjyzy4m2e/build-a-simple-raspberry-pi-led-power-status-indicator</a><br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfT0lByyb_1fheUt9fkTb1c_jbYtfDYFBxX2nTIM-l_gDe8SXejI_4Krx4q6b63daIP92wf5gLFtkzIlA9x81A5wp3eccTDvGlSzRx8gwdVPZDz_3nHBDeDIgMqhQFn5cNgKu6Bldibf0/s1600/statusLEDsmall.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="544" data-original-width="800" height="434" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfT0lByyb_1fheUt9fkTb1c_jbYtfDYFBxX2nTIM-l_gDe8SXejI_4Krx4q6b63daIP92wf5gLFtkzIlA9x81A5wp3eccTDvGlSzRx8gwdVPZDz_3nHBDeDIgMqhQFn5cNgKu6Bldibf0/s640/statusLEDsmall.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="text-align: start;"><span style="font-size: x-small;">Here you can see on the rear of my lamp base, the extended status LED and the startup/shutdown momentary switch.</span></span></td></tr>
</tbody></table>
I extended the last switch to the front of the housing as the sketch-change switch. This was another identical momentary switch. Like I said, I thought I could simply use the same script-launching system as the startup/shutdown switch, but no luck. For some reason it wouldn't work, and it took some digging to discover why.<br />
<br />
I wanted another Python script running that would again just listen for voltage changes on another set of GPIO pins. This time, though, the script would kill any running sketch-related processes, iterate through the string array of available sketch paths, select the next one in the list and launch it. But every time I tried it, I couldn't even get it to launch the first sketch in the array. I tried about 4 different schemes for launching Python scripts at startup, assuming each of them was failing. It was a little infuriating.<br />
<br />
What was actually failing was this: none of the script-launch schemes I was using were designed to launch interactive applications like Processing. They were designed to launch small utility scripts at different <a href="https://www.liquidweb.com/kb/linux-runlevels-explained/">linux run-levels</a> beneath the user-space and interactive levels where Processing could be initiated. That's why listen-for-shutdown.py worked: it never tried to start a complicated graphical program, it only invoked the terminal shutdown command.<br />
<br />
Once I understood this, I found the information I needed. Here's where I read about that problem: <a href="https://www.raspberrypi.org/forums/viewtopic.php?f=91&t=219952">https://www.raspberrypi.org/forums/viewtopic.php?f=91&t=219952</a><br />
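For example, one scheme that launches at the right level is adding a line to the desktop session's autostart file. The file path below is the usual Raspbian LXDE location, and the script name is a placeholder of mine, not something from the thread:

```shell
# /etc/xdg/lxsession/LXDE-pi/autostart  (system-wide LXDE autostart file)
# The @ prefix tells lxsession to re-launch the command if it crashes.
@/usr/bin/python3 /home/pi/listen-for-sketch-change.py
```

Because this runs after the graphical session is up, it can happily spawn interactive programs like a Processing sketch.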
<br />
Then I had success. It was time to flesh out the main python script for managing the sketch list and which sketch played automatically on startup.<br />
<br />
<span style="font-size: x-large;">Speed and pre-compiling</span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
<br />
If you've used Processing on a fast computer, launching a sketch from within Processing itself might seem pretty snappy. But trying this on the Pi, I found a much longer delay when launching - more than 5 seconds in some instances, compared to my Mac Mini's 1-2 seconds. This would be a pretty laggy experience if you pushed the button on the front of the lamp and had to wait 5 seconds before something changed. I also didn't have a way to launch a Processing sketch from within the program interface via Python, so I would have had to write some master framework within one main sketch that contained all the sketches. That would become onerous to compile and add new modes to.<br />
<br />
Processing offers a pre-compilation option where you can pack a sketch down into a standalone Unix executable compatible with your platform - in this case the Pi. When I tried this, I discovered that not only does the sketch start up much faster on the Pi, but I can also control launching it from the command line via Python. Success!<br />
<br />
By using the good old 'top' command while a pre-compiled sketch is running, I could see at a glance that the main CPU hog is java, and also that there were no other java processes running. This means that to stop my sketch immediately I can simply execute the 'killall java' command to nuke it. This is a bit like using a hammer to squash a fly, I realise, but as this Pi is devoted solely to being my lavalamp and shouldn't be running any other java processes, it's fine to use this method.<br />
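In Python terms, starting and stopping a sketch boils down to something like this. The function names are mine and the sketch path is a placeholder; 'killall java' is the same blunt instrument described above:

```python
import subprocess

def start_sketch(path):
    """Launch a pre-compiled Processing sketch without blocking."""
    return subprocess.Popen([path])

def stop_all_sketches():
    # Blunt but effective on a single-purpose Pi: every java process
    # belongs to the running sketch, so kill them all at once.
    subprocess.call(["killall", "java"])
```

On a dedicated lamp Pi this is safe; on a shared machine you'd want to keep the `Popen` handle from `start_sketch` and call `terminate()` on it instead.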
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8iOP54Yk7efV6tP2lNDcnf-xZ7AnfCJNcrX5GkIHYrJ_vk9cWZxW76nMqIdKLrQNj-oG3FtLbX9_cExODpqYeHs1cfC3O3J7ewy3-hf4pVxV19Hc0mEtuUsSVeAWu0JnXkF39xm78v3o/s1600/ssh_top.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="412" data-original-width="680" height="386" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8iOP54Yk7efV6tP2lNDcnf-xZ7AnfCJNcrX5GkIHYrJ_vk9cWZxW76nMqIdKLrQNj-oG3FtLbX9_cExODpqYeHs1cfC3O3J7ewy3-hf4pVxV19Hc0mEtuUsSVeAWu0JnXkF39xm78v3o/s640/ssh_top.jpg" width="640" /></a></div>
<br />
So, with the following pieces in place:<br />
<ol>
<li>A method to launch a python script after hardware boot,</li>
<li>A means to detect hardware button input,</li>
<li>A means of starting and stopping Processing sketches via the command line,</li>
<li>A speedy and self-contained sketch format,</li>
</ol>
I was ready to create a master Python script to control switching between sketches:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEibn79gYeXJGUIGNnPS9ERQVhTnjp51Gvt0mhFHe4RxXIL30p-mbE15iET2emB4T8Le3Zr7bUhYqlmJfozzlnCbVXc5vKRS5HkIsvKTQC46Qseq86naerLA-_X5Gl66kf6ziXu_sCsrXTE/s1600/sketchChangeList.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1337" data-original-width="956" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEibn79gYeXJGUIGNnPS9ERQVhTnjp51Gvt0mhFHe4RxXIL30p-mbE15iET2emB4T8Le3Zr7bUhYqlmJfozzlnCbVXc5vKRS5HkIsvKTQC46Qseq86naerLA-_X5Gl66kf6ziXu_sCsrXTE/s640/sketchChangeList.jpg" width="456" /></a></div>
<br />
It's pretty simple, and I'm sure someone reading it will find flaws and ways I could do it better. With most coding things there's always a better way.<br />
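As a rough illustration of the core logic in the script pictured above - the real version in the repo is the authority, and these paths are placeholders:

```python
import subprocess

# Placeholder paths to pre-compiled sketch executables.
SKETCHES = [
    "/home/pi/sketches/noiseClouds/noiseClouds",
    "/home/pi/sketches/plasmaRings/plasmaRings",
    "/home/pi/sketches/campfire/campfire",
]

def next_index(current, sketches=SKETCHES):
    """Advance to the next sketch in the list, wrapping back to the first."""
    return (current + 1) % len(sketches)

def switch_sketch(current):
    """Kill whatever is running, then launch the next sketch in the list."""
    subprocess.call(["killall", "java"])   # stop the current sketch
    new = next_index(current)
    subprocess.Popen([SKETCHES[new]])      # launch without blocking
    return new
```

A button press just calls `switch_sketch` with the stored index and saves the returned one.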
<br />
Should you want to use what I've made, you can find this script and all the sketches in the GitHub repo: <a href="https://github.com/JRButler/Digital-lava-lamp">https://github.com/JRButler/Digital-lava-lamp</a>. You're welcome to take it and modify it to suit your requirements.<br />
<br />
<span style="font-size: x-large;">Stretch Goals</span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
Phew. That's a lot of stuff. Now that it's done, I've got a few stretch goals in mind. I reached <a href="https://blog.julianbutler.com/2019/01/my-digital-lavalamp-or-mki-epilepsy.html">my initial goals</a> of making the lamp completely standalone and having switchable sketches triggered by a hardware button. While getting there, it occurred to me that having a full Linux computer running the lamp and sitting on the local network meant I could do some other tricks with it:<br />
<br />
<ul>
<li>Have it serve a webpage over the local network so guests could control the lamp from their phone via animated gif buttons showing the different modes.</li>
<li>Employ a standalone voice recognition package to control the modes.</li>
<li>Run <a href="https://homebridge.io/">HomeBridge</a> on the Pi to interface with some HomeKit devices via voice control.</li>
<li>Make more sketches that interact or reflect the status of other things ie: <a href="https://ifttt.com/">IFTTT</a> integration.</li>
<li>Use my newfound skills to build something like this - <a href="https://youtu.be/-SNIlKsPiwU">https://youtu.be/-SNIlKsPiwU</a></li>
</ul>
<br />
Anyway, that's about it! Thanks for reading.<br />
<br />
-j<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP6yyQWFw1KabeQ3nldz7g6gjCOP9tHYHqYEm9GYo5niX0xheAMIVosdhyphenhyphenJGBQ7YXuVtoMhw6kszPR0fKqh5UYnzSX_R3G_jaRohwNGWJD_UgxedakIu3RtPfoe_6CBqzmg5kcsFWqvVQ/s1600/bloggingDesk.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1200" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiP6yyQWFw1KabeQ3nldz7g6gjCOP9tHYHqYEm9GYo5niX0xheAMIVosdhyphenhyphenJGBQ7YXuVtoMhw6kszPR0fKqh5UYnzSX_R3G_jaRohwNGWJD_UgxedakIu3RtPfoe_6CBqzmg5kcsFWqvVQ/s640/bloggingDesk.jpg" width="640" /></a></div>
<br />
<br /></div>
Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com11tag:blogger.com,1999:blog-1829332005373102680.post-45132764178174966092019-01-20T17:01:00.001+13:002019-03-10T22:40:14.133+13:00My Digital Lavalamp - or "The MkI Epilepsy Generator" - Part 2Ok, welcome back. When we last left off, I'd managed to get the LED array wired up, correctly oriented in the final vertical grid arrangement, and playing sketches while connected to my Mac. It was time to get it wrapped around the central PVC cylinder and into the upside-down vase on a temporary wooden base while I figured out what the diffuser solution would be.<br />
<br />
If you haven't already seen it, <a href="http://blog.julianbutler.com/2018/12/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">here's Part 1.</a><br />
<br />
I decided to rule evenly spaced lines along the length of the PVC tube as a guide to make sure I placed the LED strips in the right spots. I really wanted them evenly distributed. Due to the fragility of the strips themselves, I also decided to use 3M double-sided tape pads instead of hot glue to stick them to the cylinder. The LEDs have the potential to heat up, which might have caused the glue to fail.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhA0cObenR1pEt-iKt7HHt1pg7lBBxiNP8bHsh6RIARNeWYTaqbjs6ucZ4qCNkk2DG4PXBUC6wsdrXK9XRQRsfEr5-T5VGxeXd4_ZtugHbOMvnXBZ2U88PQKDtMDnRz3uC3Mxu0QwVZIqY/s1600/lavalampWoodBaseNoDiffuser.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="542" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhA0cObenR1pEt-iKt7HHt1pg7lBBxiNP8bHsh6RIARNeWYTaqbjs6ucZ4qCNkk2DG4PXBUC6wsdrXK9XRQRsfEr5-T5VGxeXd4_ZtugHbOMvnXBZ2U88PQKDtMDnRz3uC3Mxu0QwVZIqY/s640/lavalampWoodBaseNoDiffuser.jpg" width="432" /></a></div>
<br />
I carved a temporary channel for the power and data wiring to pass out underneath the edge of the vase, and just sat the glass in place instead of securing it. This allowed me to get a sense of the space I had to work with and what other considerations to make when building the final base.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKKSmWR8yW5EBYEbN5Ke2Y7Ob4EtaFsl5YbVa43c_hf826MCfEOxD1wETFzke9RVt8kwC0rkFtQ15Wq4wQnDtkPlRqQTB8pf6xAZlLDrSBew-zPohaymJ1i_h7xSJ9Ld3aeNMwy9T02fE/s1600/noiseCloudsNoDiffuser.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="281" data-original-width="500" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKKSmWR8yW5EBYEbN5Ke2Y7Ob4EtaFsl5YbVa43c_hf826MCfEOxD1wETFzke9RVt8kwC0rkFtQ15Wq4wQnDtkPlRqQTB8pf6xAZlLDrSBew-zPohaymJ1i_h7xSJ9Ld3aeNMwy9T02fE/s640/noiseCloudsNoDiffuser.gif" width="640" /></a></div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7tpSePW9o6mXq037l-cg0eAjsUKQp2Z1YdjEdAMUQVORj2Qs_7XkSxs073iPK8ueIV9GCdXL8OpZaEN__CGd7rMr94rY3mrFx5xtlv3e-3s2XLI98Ehln_rYw-8fJ_C-mIJWTWiWxxVE/s1600/plasmaRingsNoDiffuser.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="281" data-original-width="500" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7tpSePW9o6mXq037l-cg0eAjsUKQp2Z1YdjEdAMUQVORj2Qs_7XkSxs073iPK8ueIV9GCdXL8OpZaEN__CGd7rMr94rY3mrFx5xtlv3e-3s2XLI98Ehln_rYw-8fJ_C-mIJWTWiWxxVE/s640/plasmaRingsNoDiffuser.gif" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Starting to look like something Tony Stark might have on his desk.</td></tr>
</tbody></table>
So this is a good spot to be in. I can prototype sketches to run and the array is in its final configuration pointed in the right direction. Now to test diffuser materials.<br />
<br />
<span style="font-weight: normal;"><span style="font-size: x-large;">Diffusing</span></span><br />
<span style="font-size: large;">----------------------------------------------------------------------------</span><br />
<br />
The first and simplest is a piece of paper. Which... is too hairy:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiW9S3gzaFYFznNTeJjAkuctBUDLbymubIjvhXMWZZtz-oAPj5Y-IUTTqegRnT3ljDYHLQf-2tNQg8C81gKtZQG0cNve2zrVQtXyosv1WQFBsMwzGXy6oKniyHrAwAG9kUYfCBB1W2EMNs/s1600/paperDiffuser.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiW9S3gzaFYFznNTeJjAkuctBUDLbymubIjvhXMWZZtz-oAPj5Y-IUTTqegRnT3ljDYHLQf-2tNQg8C81gKtZQG0cNve2zrVQtXyosv1WQFBsMwzGXy6oKniyHrAwAG9kUYfCBB1W2EMNs/s640/paperDiffuser.jpg" width="480" /></a></div>
<br />
It does effectively hide the individual LEDs, but it's really not the thing to use. After a visit to the art supplies shop I came back with two different thin, frosted polypropylene materials to test:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy1Pj4mWjd6d-e9yhqll-snF_-ZtY9MPOHldcQcPAfEobvn2yx9YOcllI2U7MmpOYnXUsqbp9di3P2gaN-gtaZWXV5i0F_VCJMsp4-72fIkPf91O_BFjt84vty19uUMq9kfScSlsEVwVk/s1600/arcrylicDiffusersTests.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1200" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy1Pj4mWjd6d-e9yhqll-snF_-ZtY9MPOHldcQcPAfEobvn2yx9YOcllI2U7MmpOYnXUsqbp9di3P2gaN-gtaZWXV5i0F_VCJMsp4-72fIkPf91O_BFjt84vty19uUMq9kfScSlsEVwVk/s640/arcrylicDiffusersTests.jpg" width="480" /></a></div>
<br />
You can see in the top row that the LEDs are far too individually visible with the first material - even two sheets weren't enough to give the right effect. However, the second material worked better, and after doubling it to two sheets thick it gave exactly what I wanted: none of the individual LEDs were really visible on their own, and patterns blended nicely to produce smooth gradations of colour.<br />
<br />
Testing it in the dark with a particle simulation yielded nice soft shadows and a warm campfire effect.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgq-feq7VsshGs3cbzAnAOQXMKYAdax1LOEbfI0mQZJgriN2oVt6zm9yg9C7fY9Fqx7ybzRSwhA9xdBq4Ler9Nj8ZbEo8nx9JmFOuBUxw6LWdD7-LGbfwYGWGjuM-RNYOW6Qy9w9Pvtyps/s1600/campfireMode.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="281" data-original-width="500" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgq-feq7VsshGs3cbzAnAOQXMKYAdax1LOEbfI0mQZJgriN2oVt6zm9yg9C7fY9Fqx7ybzRSwhA9xdBq4Ler9Nj8ZbEo8nx9JmFOuBUxw6LWdD7-LGbfwYGWGjuM-RNYOW6Qy9w9Pvtyps/s640/campfireMode.gif" width="640" /></a></div>
<br />
Shooting these clips with my iPhone, I began to confront the exposure issues that make shooting bright lights in dark places a bit tricky. It's really hard to get a sense of the true colours the lamp is capable of via digital cameras. What seems burnt-out white at the bottom of the fire in the GIF above is actually really nice on the eyes. However, I later realised that shooting under-exposed with a larger sensor would produce far better results. More on that later.<br />
<br />
<span style="font-size: x-large;">Standalone</span><br />
<span style="font-size: large;">------------------------------------------------------------------------------</span><br />
<br />
So, it was about at this point I started considering a couple of things. 1: What was the final form of the base going to be, and 2: I wanted to put this somewhere else in the house and NOT have to have it connected to a computer via USB the whole time. This meant I needed to choose a small form-factor standalone computing platform that would fit in the base I designed, and could run the Fadecandy server plus compute the Processing sketches that power the lamp.<br />
<br />
Now, having chosen the Fadecandy board to control the lights, the choice of platform is reasonably broad [OSX, Windows, Raspberry Pi and Linux] but does NOT include Arduino. In fact, one of the reasons Fadecandy exists is to do serial computation at a rate that most Arduino boards cannot. So the only readily available small form-factor platform is the Raspberry Pi. It's also one I have a shot at doing more complicated coding and setup on, due to its embedded Python support.<br />
<br />
There ARE more high-end small form-factor platforms out there - take a look at https://www.hardkernel.com for example. AND those platforms potentially offer much more CPU power for computing sketches at a rate similar to my Mac Mini. But they cost more, need to be shipped internationally, require more cooling etc. Plus I could go buy a Raspberry Pi locally and start messing with it.<br />
<br />
One more thing: the now-famous and useful page I linked to in Part 1 - <a href="https://learn.adafruit.com/1500-neopixel-led-curtain-with-raspberry-pi-fadecandy/dry-run" target="_blank">1500 NeoPixel LED Curtain with Raspberry Pi and Fadecandy</a> - concludes with the following statement:<br />
<br />
<span style="font-family: "arial" , "helvetica" , sans-serif;"><b><i>"Python and C both perform quite well on the Raspberry Pi. An expert programmer could make this a self-contained system, not reliant on a networked computer to drive the animation."</i></b></span><br />
Now, that's what I needed to do. I wanted a fully standalone lamp with a button on the front to change sketches, and maybe a knob to adjust a parameter in the currently playing mode. And with the Raspberry Pi available on the local network via wifi, what about serving a webpage with animated GIF buttons for the different modes so guests could change it? Whoah, slow down there dude, you don't even have a Pi yet.<br />
<br />
Ok, time to fix that.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhy8k900xtXZjpBQply-iYy6Mo6eX_-_zv_iRC750r6uaGFWD0adcCUjeOyUs6h9c9HdUhaC8na1cdRiI-varL13Pxv-45pFNQXZRaJPgtw5fRtpvFHWYX6TvcGCjdAU4K16z16GRH7r6c/s1600/RaspberryPi.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhy8k900xtXZjpBQply-iYy6Mo6eX_-_zv_iRC750r6uaGFWD0adcCUjeOyUs6h9c9HdUhaC8na1cdRiI-varL13Pxv-45pFNQXZRaJPgtw5fRtpvFHWYX6TvcGCjdAU4K16z16GRH7r6c/s640/RaspberryPi.jpg" width="480" /></a></div>
<br />
Raspberry Pi Model 3 B+. NZ$60. A full quad-core Linux computer you can fit in a pack of playing cards ... well, almost. It was pretty cool to unbox this, plug it into my monitor, give it a mouse and keyboard and an operating system [Raspbian] on an SD card, and have a computer up and running that comes with Minecraft pre-installed!? It certainly gave Jamie some ideas about building a lunchbox laptop with Retropie to take to school.<br />
<br />
A couple of evenings tinkering around and I had the Raspberry Pi running my own lavalamp sketches completely standalone.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCUtz8x6hgqU1VvG4EPpNLbRNtE23pYUmy5aKaBUpkb3SepURJq-BvHv-eMqKNeAjrcbe-5dFa_80cb9-TMu8BuTV1awcUIGdBM3lRyNe0WfVZOSyqvhRjMOZkXOyxyZb97F4fIZSt6g4/s1600/standalone.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="564" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCUtz8x6hgqU1VvG4EPpNLbRNtE23pYUmy5aKaBUpkb3SepURJq-BvHv-eMqKNeAjrcbe-5dFa_80cb9-TMu8BuTV1awcUIGdBM3lRyNe0WfVZOSyqvhRjMOZkXOyxyZb97F4fIZSt6g4/s640/standalone.jpg" width="450" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">You can see here also a hardware button and a green status LED connected to the GPIO pins. More on that later. You can also see what a cluttered mess my desk is.</td></tr>
</tbody></table>
I'll cover the software side of this in the next post or two. But now that I had the Pi set up and knew how big it was I could start thinking about the base for the lamp.<br />
<br />
<span style="font-size: x-large;">Style choices</span><br />
<span style="font-size: large;">---------------------------------------------------------------------------------</span><br />
The original ION lamp that inspired this whole thing was superbly minimalist. With my time and access to tools & materials, I was unable to mimic their aluminium touch-sensitive casing - I'd used an upside-down $14 vase, after all! Still, I didn't want to do something too out of keeping. Looking around, I saw a variety of things that could work, maybe with some modifications:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhS68jHOWGXnbpLGL5o2yM-gqu36lKq5KeFhn3_RnDarD5AoLIRIFaVTYIYoH6FkqcMmOWQOCiKMKoWrTB-H2gG7gGTRMXI29tjzSEgoLxZAaSwp-yCIP1tVJKGbrzEfS7qDr6vFAz8NU0/s1600/baseInspirationIdeas.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="1582" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhS68jHOWGXnbpLGL5o2yM-gqu36lKq5KeFhn3_RnDarD5AoLIRIFaVTYIYoH6FkqcMmOWQOCiKMKoWrTB-H2gG7gGTRMXI29tjzSEgoLxZAaSwp-yCIP1tVJKGbrzEfS7qDr6vFAz8NU0/s640/baseInspirationIdeas.jpg" width="632" /></a></div>
<br />
I wanted to keep a natural material feel to it, so maybe some wood? I really liked the white mesh of the light cover in the bottom right hand corner of the image above. That would provide a suitable amount of air flow for cooling as well as look relatively tidy? I kept looking and stumbled across this lamp in a local hardware store:<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJ0-67p0_k0ri45iwn4OmV4XH57PDHWDzqlShYdEpsHX_9ou-E7FRztaU2Od5UjxH5pFYU1FF49DNYNNVobX8XU8A6pZqBjxaQ5FFF8f063GXXxQsovZ_NUE2qArYas1sNjkagWvbM7bQ/s1600/copperLamp.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJ0-67p0_k0ri45iwn4OmV4XH57PDHWDzqlShYdEpsHX_9ou-E7FRztaU2Od5UjxH5pFYU1FF49DNYNNVobX8XU8A6pZqBjxaQ5FFF8f063GXXxQsovZ_NUE2qArYas1sNjkagWvbM7bQ/s640/copperLamp.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Oh hells yeah. A little steampunk maybe but, meh.</td></tr>
</tbody></table>
Plus it had a patch of solid material at the bottom that could provide a nice two-panel appearance. I brought with me the measurement of the diameter required to ensure the Raspberry Pi would fit inside lying flat, and it gave a little extra room, which I'd need to accommodate the USB and power connections. Just how close that would come, I was soon to find out.<br />
<br />
I was planning to run the Raspberry Pi in a headless fashion, so the HDMI and keyboard and mouse cables would not be required. I was connecting via wifi also, so no ethernet to consider. Just power, a USB-A or two for the Fadecandy and a mic perhaps?<br />
<span style="font-size: x-large;">Construction</span><br />
<span style="font-size: large;">---------------------------------------------------------------------------------</span><br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCK1hHZ62Z08qCsr2Z7ZAJpz9sZZcI2C3HTzg_jZqpJ9aQj7apfms5WJWHNiqXsLFlUHQNtz3rB8p7O4TqNQc1Olx9lHQ25BWZnCumEQuJc6mR2Jp4w_7wcaaF_LQzet2dMqEnJ35zVnY/s1600/meshCutsNice.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="600" data-original-width="1155" height="332" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCK1hHZ62Z08qCsr2Z7ZAJpz9sZZcI2C3HTzg_jZqpJ9aQj7apfms5WJWHNiqXsLFlUHQNtz3rB8p7O4TqNQc1Olx9lHQ25BWZnCumEQuJc6mR2Jp4w_7wcaaF_LQzet2dMqEnJ35zVnY/s640/meshCutsNice.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The cutting disc on the dremel made short work of the copper cylinder</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgaNPdSpUSr3I0o78IoQ6EyFk4Xsu0SbWVFdYZiit4xU9Qu8tgtt9fLe5jt7cONH-rRUe-qqDE91Yn2MeifIGtSMWXOdUsqjKqACutGNKOBFPuLltFs2Fa4iC2yRj2BS9U2f5ZXPt67gIg/s1600/itFits.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="600" data-original-width="800" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgaNPdSpUSr3I0o78IoQ6EyFk4Xsu0SbWVFdYZiit4xU9Qu8tgtt9fLe5jt7cONH-rRUe-qqDE91Yn2MeifIGtSMWXOdUsqjKqACutGNKOBFPuLltFs2Fa4iC2yRj2BS9U2f5ZXPt67gIg/s640/itFits.jpg" width="640" /></a></div>
I found a cheap cheese platter board at a local homewares shop; its nice dark hardwood made an excellent base to mount the Pi onto, and also provided the wood effect I wanted for the glass vase/cover to stand on. I cut two circles out and sanded the edges to provide a snug fit into the mesh cylinder.<br />
<br />
As it turned out, I underestimated the amount of room the USB-A plugs would require to fit inside the mesh cylinder, and had to embark upon some shroud-stripping and smallification.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGuP-l4xKdoXR0nP8v9_-NjDR16WKqL2pNgt0xPyw7oyiKGl3WIBf6rmscceO07q_z13fTXiBWHkANjw4xUxlfuwMWrLoTQMkpyqmVU8z4BYEe5NWu9COJKxV5CBGLuXj9iH-2WHOmErE/s1600/USBbend.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGuP-l4xKdoXR0nP8v9_-NjDR16WKqL2pNgt0xPyw7oyiKGl3WIBf6rmscceO07q_z13fTXiBWHkANjw4xUxlfuwMWrLoTQMkpyqmVU8z4BYEe5NWu9COJKxV5CBGLuXj9iH-2WHOmErE/s640/USBbend.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This might look a little dodgy but I'd really only exposed the ground wire and planned to heat shrink it anyways.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHwtNkK4pOLDC1KfWgzk-aESPc3XkuGdadzuujXommSoDsIbk3GNa7W4fQQ6gbwAF3Sjbd2F4jk0j_fFLlPEdWgZo7PIPvKZbDPNcMVgRy2B0XMs9vg4BK6lnplfNCrwUn4m88kto98B8/s1600/USBbendCorner.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="522" data-original-width="800" height="416" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHwtNkK4pOLDC1KfWgzk-aESPc3XkuGdadzuujXommSoDsIbk3GNa7W4fQQ6gbwAF3Sjbd2F4jk0j_fFLlPEdWgZo7PIPvKZbDPNcMVgRy2B0XMs9vg4BK6lnplfNCrwUn4m88kto98B8/s640/USBbendCorner.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It fits, and it works. Phew!</td></tr>
</tbody></table>
I mounted the Pi up on some standoffs that'd give me some room to deal with cables and the Fadecandy board too. I countersunk the screws on the bottom so they wouldn't protrude. I planned to put some rubber feet on once it was complete.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgG6G7yyLODonyJcQtuaRIrAB2NAxxNhK1uAFUEBZZfJoL7TlldLSbxgH0Ga_SY3bndfqJqkELw5x2RZ6Ful5Yfmz-UgTGypaKeU5rwpyt3CZk2K79Q5fRS3kTPR1OBcBfBbwc9RrXGe_k/s1600/counterSunk.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="562" data-original-width="800" height="449" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgG6G7yyLODonyJcQtuaRIrAB2NAxxNhK1uAFUEBZZfJoL7TlldLSbxgH0Ga_SY3bndfqJqkELw5x2RZ6Ful5Yfmz-UgTGypaKeU5rwpyt3CZk2K79Q5fRS3kTPR1OBcBfBbwc9RrXGe_k/s640/counterSunk.jpg" width="640" /></a></div>
<br />
While I was working through this phase I also made decisions about buttons and lights. I'll detail more about this in the software section. I needed one button to switch the sketch that was running and another to start up and shut down the Pi. The Raspberry Pi is a complete Linux computer and Linux file systems can be corrupted if they lose power - so you simply cannot pull the plug reliably. You MUST shut them down nicely or deal with losing everything.<br />
<br />
I also wanted a status light that I could use at a glance to know if the lamp was on, or ready to be safely switched off etc. All these buttons and lights connect to the GPIO pins on the Pi but needed to be wired up and placed during this part of the build.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmdQ24mwL6h_6W6ozo791x_yVXzm9pQVNnhBr__EP_iWrmeF3K8xzzgxeC3WLrkz-FELbl5TQuMe2pWAoSlP78XO6cbuSNY5-KUC8xFpAiSBO9UeJmebCg6UcHK9x9acLSZ3XYWjuD1c4/s1600/ButtonsAndLights.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="494" data-original-width="800" height="394" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmdQ24mwL6h_6W6ozo791x_yVXzm9pQVNnhBr__EP_iWrmeF3K8xzzgxeC3WLrkz-FELbl5TQuMe2pWAoSlP78XO6cbuSNY5-KUC8xFpAiSBO9UeJmebCg6UcHK9x9acLSZ3XYWjuD1c4/s640/ButtonsAndLights.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I decided to wire up both blue and green LED choices to see which I preferred. Although I liked the blue it was too piercing and I went with the green for a more old school look.</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmRpvNKYSzwhlU6fukT6RdpbipyD9mN3r1QCiCTZ0NyrKExQvxL5ZiZU0duOCvwA17cGXLWbDFjaVbW1zuunSsONyrTzzeP8n6QEpCBTuJq2ke59RkD5biePctA0l6A-tpei-YzknOK3E/s1600/FrontBackButton.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="1200" height="426" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmRpvNKYSzwhlU6fukT6RdpbipyD9mN3r1QCiCTZ0NyrKExQvxL5ZiZU0duOCvwA17cGXLWbDFjaVbW1zuunSsONyrTzzeP8n6QEpCBTuJq2ke59RkD5biePctA0l6A-tpei-YzknOK3E/s640/FrontBackButton.jpg" width="640" /></a></div>
<br />
Cutting the button holes went easily enough with a combination of cutting and sanding discs on the dremel. One button on the front for switching sketches and one on the back for startup and shutdown.<br />
<br />
I made the decision to go with two power cables to supply the lamp. This wasn't as neat as I'd hoped, but I didn't feel brave enough to drive the Pi from the power supply I'd bought for the LEDs: at full white the strips could draw enough current to brown the Pi out. So I chose to live with running both the LED power cable AND the Pi's 5.1v adapter supply together. They needed to go into the side of the base at the rear, but I didn't want them rubbing on the rough copper cuts I'd made, so I lined the edge of the hole with a hose washer from the hardware store that happened to be the right size.<br />
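If you're curious about the numbers behind that caution, a rough budget looks like this. It assumes the commonly quoted worst case of roughly 60 mA per NeoPixel at full white, and a typical 2.5 A Pi adapter rating.

```python
# Back-of-envelope check for the two-supply decision. ~60 mA per
# NeoPixel at full white is the commonly quoted worst case; the Pi
# adapter rating below is an assumed typical figure.
NUM_LEDS = 120                 # 8 columns x 15 LEDs
MA_PER_LED_FULL_WHITE = 60
PI_ADAPTER_A = 2.5

led_draw_a = NUM_LEDS * MA_PER_LED_FULL_WHITE / 1000.0

print(f"LEDs at full white: {led_draw_a:.1f} A")  # 7.2 A

# A 5 V / 10 A supply covers the strips with headroom; asking the
# Pi's own adapter to feed them too would invite a brown-out.
```

In practice the sketches rarely hit full white, but budgeting for the worst case is what keeps the Pi alive.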
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiTK3SheqgXUlt38FCVwU5QeqBC8bgCj4oPOos5fo6w7pMbwk2S3uW5QOkAeNz-Sg_zz-RjwGb9LDK3lloL_8yM1VMFdD8ekZ4_uvKHBohtMviEnx_46-AFFKyUOit0w-VVcAGDU4wIj1E/s1600/powerIn.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="600" data-original-width="800" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiTK3SheqgXUlt38FCVwU5QeqBC8bgCj4oPOos5fo6w7pMbwk2S3uW5QOkAeNz-Sg_zz-RjwGb9LDK3lloL_8yM1VMFdD8ekZ4_uvKHBohtMviEnx_46-AFFKyUOit0w-VVcAGDU4wIj1E/s640/powerIn.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's not exactly strain-relief but it'll stop any wires from being cut. Also you can see where I slipped with the dremel and scratched the copper surface while cutting the power hole. D'oh!</td></tr>
</tbody></table>
I cut a groove into the top wood disc to give the vase something to sit into and not fall off. I had a plan to secure this even better once it was complete.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwKdQtU8RIPb9qCfsHesUpJAfz-fhyU8ORQ84_ivVSaW85qPM-SmzVLm31DbF_DOLhNR04sK2W22fWtpgPXj-hFmVtamBII6jLH3_4AZ8YZXHBzg5__KXBYMg3rEZjMqnw3x6G9xuzVfE/s1600/groove.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="554" data-original-width="1296" height="272" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwKdQtU8RIPb9qCfsHesUpJAfz-fhyU8ORQ84_ivVSaW85qPM-SmzVLm31DbF_DOLhNR04sK2W22fWtpgPXj-hFmVtamBII6jLH3_4AZ8YZXHBzg5__KXBYMg3rEZjMqnw3x6G9xuzVfE/s640/groove.jpg" width="640" /></a></div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUGDUPx-et1f-m9qv6Kqk1t0kfyys2QFqlfvVVRauVIyWYIxHvN7kutV4QDQWOVW0tudrreEKlegXx8aX-RrEIEv_g0V0KVOKWQvvkkaiLWmrK6VgH8T4HE1Wr5R-_Wv5-KIxN7SG2qDs/s1600/firstLooks-3.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUGDUPx-et1f-m9qv6Kqk1t0kfyys2QFqlfvVVRauVIyWYIxHvN7kutV4QDQWOVW0tudrreEKlegXx8aX-RrEIEv_g0V0KVOKWQvvkkaiLWmrK6VgH8T4HE1Wr5R-_Wv5-KIxN7SG2qDs/s640/firstLooks-3.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My first real look at the aesthetic direction.</td></tr>
</tbody></table>
I moved on to mounting the central PVC cylinder, covered in lights, to the top wooden plate. I wanted something removable in case I ever had to replace an LED strip or fix a busted solder connection. I used some of the remaining copper mesh and cut some bendable strips to make right-angle brackets.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdwnp8XtVofvmgkQ5sygH8JXwYjq2jVa-ZD70YB7xemKjUNr5OGt3QC_UFuYEiZ-1y5x7lq7jFSJH97Gc-_G4eWBw0YezPmDCz8piRRJknJE_7n9CVUpQSUS8gZUWN25ATOYrdxNyjUWE/s1600/brackets.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="890" data-original-width="800" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgdwnp8XtVofvmgkQ5sygH8JXwYjq2jVa-ZD70YB7xemKjUNr5OGt3QC_UFuYEiZ-1y5x7lq7jFSJH97Gc-_G4eWBw0YezPmDCz8piRRJknJE_7n9CVUpQSUS8gZUWN25ATOYrdxNyjUWE/s640/brackets.jpg" width="574" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I drilled into some of the existing holes in the mesh to enlarge them just enough for the screws required.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEict97QKSCD67kGthNWRtEYMrfdPlT4o5Y_zVof5PN0TsfUQexKt1HnuNSDXYUwY5-CpQeSEm3YPi13a-PVpNc8lT1ryo-uyQ9mTCRDk6VxNn8mHEIzUn6vuPVw_zTXBOJT3Kb0E7y4Ly0/s1600/atop.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEict97QKSCD67kGthNWRtEYMrfdPlT4o5Y_zVof5PN0TsfUQexKt1HnuNSDXYUwY5-CpQeSEm3YPi13a-PVpNc8lT1ryo-uyQ9mTCRDk6VxNn8mHEIzUn6vuPVw_zTXBOJT3Kb0E7y4Ly0/s640/atop.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Success. PVC tube attached.</td></tr>
</tbody></table>
Now it was time to get all that wiring below deck. No-one is going to see it thanks to the diffuser material inside the vase cover but heck, I'm a neat-freak. I needed to cut a hole through the wood and get the power and data wiring for the LEDs down to the Pi and Fadecandy etc.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-q0ZEpNP5JNPObQWf_B512iSsyGH33YquIsW_XPDB7BrvecZszu1iEizpfkJQ0h9lRuSP9Li4qEHIT_ZnP4t3kSsXAvYoP2hODfPYe7f-QdH9r1B_manTtUXeS67IAflumFtZTCRoyRU/s1600/HolesThrough.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="491" data-original-width="800" height="392" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-q0ZEpNP5JNPObQWf_B512iSsyGH33YquIsW_XPDB7BrvecZszu1iEizpfkJQ0h9lRuSP9Li4qEHIT_ZnP4t3kSsXAvYoP2hODfPYe7f-QdH9r1B_manTtUXeS67IAflumFtZTCRoyRU/s640/HolesThrough.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's ugly but it'll get cleaned up.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRPM1nBiUKZl0j7ApClOzDEZt_SQXTxb58Xf_ratz8L-nEkyAcn4wqjoPI47S7OUZXnzWq7n82EwiEIzLKqIgTgiO9U60Ul7hzmdhGeMsd_kgjhFepKqVE7m0T-HCFNp4JwrPdAVR9sgQ/s1600/stackingUp.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="600" data-original-width="800" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRPM1nBiUKZl0j7ApClOzDEZt_SQXTxb58Xf_ratz8L-nEkyAcn4wqjoPI47S7OUZXnzWq7n82EwiEIzLKqIgTgiO9U60Ul7hzmdhGeMsd_kgjhFepKqVE7m0T-HCFNp4JwrPdAVR9sgQ/s640/stackingUp.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's all stacking up nicely.</td></tr>
</tbody></table>
Diving back to the rest of the install for a mo, I took time to get the Fadecandy nestled happily underneath the Pi and shielded. I'd reasoned that the Pi would generate the most heat, so I bought two small heatsinks to place on its main chips. I didn't want the Fadecandy board sitting above that heat, so I seated it downstairs.<br />
<br />
During my sketch investigations I found a small USB microphone on Amazon that just plugged in and worked in Linux. This meant that using a Processing library called <a href="http://code.compartmental.net/tools/minim/" target="_blank">Minim</a>, I could modulate any parameter of a running sketch. So I could change the colour of the particles in my campfire sketch for example, just by monitoring sound in the environment. I decided to include this capability and make use of it later during the software phase.<br />
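For anyone wondering how raw audio turns into usable parameters: a Minim-style level reading flickers frame to frame, so you typically run it through an envelope follower before mapping it to colour. A minimal Python sketch of that smoothing follows; the coefficients are illustrative, not taken from my code.

```python
def smooth_level(levels, attack=0.5, release=0.05):
    """Asymmetric envelope follower: jump up quickly on a loud frame,
    decay slowly afterwards, so a colour flares on a beat and then
    fades instead of flickering. Coefficients are illustrative."""
    env = 0.0
    out = []
    for x in levels:
        coeff = attack if x > env else release
        env += coeff * (x - env)
        out.append(env)
    return out
```

The asymmetry is the useful part: fast attack makes the lamp feel responsive, slow release makes it feel organic.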
<br />
The microphone was a little on the large side and had a flexible boom neck that I really didn't need. So I disassembled it and you can see the condenser unit sitting on top of the USB plug stack. It used to look like this:<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVJOqjlx5P0U5ABF54ImKU0mb_nCJrMB1RwZsJfQcIza1uaDJJZ2eW5cuykpkj98PZDDwFGYioGH66dd20HR_dwaBVZKzq11Z3l6QwpDdq78mnJiTCd-bAYXohWy4VysKTkpts-jTVE4g/s1600/amazonMic.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="509" data-original-width="476" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVJOqjlx5P0U5ABF54ImKU0mb_nCJrMB1RwZsJfQcIza1uaDJJZ2eW5cuykpkj98PZDDwFGYioGH66dd20HR_dwaBVZKzq11Z3l6QwpDdq78mnJiTCd-bAYXohWy4VysKTkpts-jTVE4g/s400/amazonMic.jpg" width="373" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I just need the mic and USB stuff really. Not the flexible neck or casing.</td></tr>
</tbody></table>
In taking it apart, I found it had a little logic board that I shrouded in transparent heat-shrink and placed underneath the Pi alongside the Fadecandy board - see the image above.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAtMbbDu2RvThEpd7oQbNvrMNpQUyp_RpH_7RGRPzBTyuGpUPR_jpRH1YC60hpjlEHrVk77HsYbU43OZt0TX_uf_zj76QU_UX7LdCxFQG1TgMJxpMuVbrZGjYOIq06Jr1uiuv0s7b4fP4/s1600/statusLED.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="517" data-original-width="800" height="412" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAtMbbDu2RvThEpd7oQbNvrMNpQUyp_RpH_7RGRPzBTyuGpUPR_jpRH1YC60hpjlEHrVk77HsYbU43OZt0TX_uf_zj76QU_UX7LdCxFQG1TgMJxpMuVbrZGjYOIq06Jr1uiuv0s7b4fP4/s640/statusLED.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Cling to the wall status LED. Just hang in there little buddy.</td></tr>
</tbody></table>
The install was really coming together at this point. I still had to route the LED array's power and data connections from the upper deck to the lower section, but in the meantime I installed the status LED by simply bending its legs and foam-taping it to the internal wall of the mesh cylinder. Not the neatest solution, but I didn't want to use glue or have yet another screw protruding through the mesh.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfCwHYivneAkLZq3QnhrqRjAuLu20qYI2bYVb74XN3hTubYftxQMRmQSbJW2Clh06oI7Pm36hGq_QB2maY9PEqkdUXsBAG8DouJVIO7b4sPX-yUKS2UoaFpV37Rp4auiJH5VNNdxmbj3w/s1600/upstairsDownstairs.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgfCwHYivneAkLZq3QnhrqRjAuLu20qYI2bYVb74XN3hTubYftxQMRmQSbJW2Clh06oI7Pm36hGq_QB2maY9PEqkdUXsBAG8DouJVIO7b4sPX-yUKS2UoaFpV37Rp4auiJH5VNNdxmbj3w/s640/upstairsDownstairs.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Upstairs/Downstairs</td></tr>
</tbody></table>
Ok. I knew at some point I was going to have to disassemble this thing and troubleshoot it for some reason. Who knows why; something will go wrong eventually. I wanted to make that easy by simply being able to disconnect the upper floor from the lower. Having messed about with flying small quadcopters recently, I decided to use XT-60 connectors: they're reliable and they happily carry a LOT of current.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg5wMCHsc8eSoGAlV1qAHinF3xbldBbo2dzCVeoKr5Bt4C_EdRnBQKEEpwiGsbnjT7bISSNiVieODQIZ-GovZj19_bJJqhM4vOJDnOUV7Mmu98JoCWIk9fRpUFwS05PVmItedo6a0g-EbA/s1600/XT60.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="537" data-original-width="800" height="428" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg5wMCHsc8eSoGAlV1qAHinF3xbldBbo2dzCVeoKr5Bt4C_EdRnBQKEEpwiGsbnjT7bISSNiVieODQIZ-GovZj19_bJJqhM4vOJDnOUV7Mmu98JoCWIk9fRpUFwS05PVmItedo6a0g-EbA/s640/XT60.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">XT-60 for the win!</td></tr>
</tbody></table>
You can also see in the image above the capacitor I added to the incoming power connection to prevent any surges from frapping my LEDs.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVGtMrDY4_wbuDSOnM1vWP5E-KeSq338NiGsBu4SqFXT9kzZB-fpg9k-x78MGpDxECJEaYsagffYYpvfNNsrO7FX5pvTEdO-rr4M5tSim_Ab03g8XV6rQRoWkeJdel0gKXNbrNjZpVQrI/s1600/powerLights.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="600" data-original-width="800" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVGtMrDY4_wbuDSOnM1vWP5E-KeSq338NiGsBu4SqFXT9kzZB-fpg9k-x78MGpDxECJEaYsagffYYpvfNNsrO7FX5pvTEdO-rr4M5tSim_Ab03g8XV6rQRoWkeJdel0gKXNbrNjZpVQrI/s640/powerLights.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Powering it up on the bench. You can see the green status LED as well as the Pi board lights through the mesh.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjFVdEsVJf4AEfk7i5hqI-tUnX0Z2c77n8EoKdc9VI0z73_2IX1hSI0L6so13KjUMw_JmIyW-I2PKcfzisgPX0ezQNouSEWWQ7dkNB6p1jfi6i3hA_Oexd3yzWjsi7monEJTwFB9l55ebI/s1600/powerBench.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjFVdEsVJf4AEfk7i5hqI-tUnX0Z2c77n8EoKdc9VI0z73_2IX1hSI0L6so13KjUMw_JmIyW-I2PKcfzisgPX0ezQNouSEWWQ7dkNB6p1jfi6i3hA_Oexd3yzWjsi7monEJTwFB9l55ebI/s640/powerBench.jpg" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's alive Stimpy!</td></tr>
</tbody></table>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgbQe3UaiFq7rVubG9TPU-Lq6rb-IJji2u9mgLBTywX3FrU6dT1X_cgzrntqW2gDj7zYhNtcpi6_eusCFRwR7SCu9YpppxSpwQ8ArKQuJ9Hjhk0upzo1aAzGlAwMXegeZmqjIun-UGhZF0/s1600/coverOnColour.jpg" imageanchor="1"><img border="0" data-original-height="800" data-original-width="1200" height="426" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgbQe3UaiFq7rVubG9TPU-Lq6rb-IJji2u9mgLBTywX3FrU6dT1X_cgzrntqW2gDj7zYhNtcpi6_eusCFRwR7SCu9YpppxSpwQ8ArKQuJ9Hjhk0upzo1aAzGlAwMXegeZmqjIun-UGhZF0/s640/coverOnColour.jpg" width="640" /></a></div>
<br />
And then, with the cover on, all that remained was to drill holes for the screws that keep the top wooden platform in place, just poking out of the mesh. That's it for the build and install. From here on it's mostly software and adding new capabilities.<br />
<br />
The next section will be all about the software side of things. I'll detail the tweaks and discoveries I made during my Raspbian setup, as well as some Python for switching sketches when the front button is pushed, plus some Processing sketch discoveries. See you <a href="https://blog.julianbutler.com/2019/02/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">there</a>.<br />
<br />
<a href="https://blog.julianbutler.com/2019/02/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">Part 3 is here now.</a><br />
<br />
-j<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com25tag:blogger.com,1999:blog-1829332005373102680.post-46045456396039538182018-12-13T23:50:00.001+13:002019-02-25T10:38:38.348+13:00My Digital Lavalamp - or "The MkI Epilepsy Generator" - Part 1<br />A three part series wherein I discuss the problems and solutions I encountered during the production of my own interactive LED lava lamp display:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOV1c4bzQsFPRLVICYt61tDXn9sp-jNzpg72qcFSyRAZe_-JAU4HKgOMLxpj8qlxDKD-8KduddrcliKFY9nKhbgjhDy6SN5R1nq1FwXOQZxhzD3kaHXPDysFrb3PuDu8VlN3IfivFNqoI/s1600/lavalampWarmBlobs.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="621" data-original-width="800" height="496" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOV1c4bzQsFPRLVICYt61tDXn9sp-jNzpg72qcFSyRAZe_-JAU4HKgOMLxpj8qlxDKD-8KduddrcliKFY9nKhbgjhDy6SN5R1nq1FwXOQZxhzD3kaHXPDysFrb3PuDu8VlN3IfivFNqoI/s640/lavalampWarmBlobs.jpg" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<span style="font-family: inherit;"><br /></span></div>
<span style="font-family: inherit;"><b>PHOTOSENSITIVE WARNING: READ BEFORE WATCHING</b> A very small percentage of individuals may experience epileptic seizures when exposed to certain light patterns or flashing lights. Exposure to certain patterns or backgrounds on a computer screen may induce an epileptic seizure in these individuals. Certain conditions may induce previously undetected epileptic symptoms even in persons who have no history of prior seizures or epilepsy.</span><br /><iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i9.ytimg.com/vi/kiJ81XEa2kA/default.jpg?sqp=CLTJ9eIF&rs=AOn4CLBePkAup2Mlwf9HPhFazzNUgl_pNA" frameborder="0" height="266" src="https://www.youtube.com/embed/kiJ81XEa2kA?feature=player_embedded" width="320"></iframe><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">However, to get there from here, we'll have to go back a few steps...</span><br />
<span style="font-size: x-large;"><br /></span>
<span style="font-size: x-large;">Ion</span><br />
<span style="font-size: x-large;">-------------------------------------------------------------</span><br />
I saw this product on a friend's desk at work: <a href="https://www.kickstarter.com/projects/lavallc/ion-a-music-detecting-mood-light-with-bluetooth-lo" target="_blank">Ion - A Music Detecting Mood Light - Kickstarter.com</a> and I wanted one.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQYbookRn2M_pjn6KWgNnTeE-BudDEQDxqtOZNqhasPMTlWk0nRB3vwZp-IyjaXmJcxhcNtpix8UG2YgfQHrMZJesopXZpZuqD1vWEpSeW-HhbUhjBHVl507U8DLRJYTSIoE_QL-dBAO8/s1600/Ion_lamp.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="873" data-original-width="1552" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQYbookRn2M_pjn6KWgNnTeE-BudDEQDxqtOZNqhasPMTlWk0nRB3vwZp-IyjaXmJcxhcNtpix8UG2YgfQHrMZJesopXZpZuqD1vWEpSeW-HhbUhjBHVl507U8DLRJYTSIoE_QL-dBAO8/s640/Ion_lamp.jpg" width="640" /></a></div>
<br />
But the Kickstarter campaign was long over, and the site it spawned to sell them after the campaign had shut down too. There were none on eBay or anywhere else I could find. I've owned lava lamps over the last 25 years and really like them, and this seemed like a really smart advancement. I was looking for a project that would force me to delve into addressable LEDs, so I decided to make my own. What's better than a lava lamp that can be every lava lamp?<br />
<br />
I started a massive Evernote file filled with questions about how to build it, what hardware to choose, what functions it could/should have and so on. I consulted friends and pooled ideas about the different modes they thought might be cool. The original Ion lamp was capable of some pretty sublime colours, and some cool interactive modes too:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhowLAAH2kro78Nf5jgeDyKoPmll54Qy7IfmA3HALvABMmx8xmiiDQB_L0QdmUpEf863J-i3dizAXMWo1oCw4vGCQCw2mf-BurGZ1lr4BwR_IvYnE263GMfRQN5RKGAayFGr_UaXG7beaw/s1600/IonPlasma.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="600" data-original-width="400" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhowLAAH2kro78Nf5jgeDyKoPmll54Qy7IfmA3HALvABMmx8xmiiDQB_L0QdmUpEf863J-i3dizAXMWo1oCw4vGCQCw2mf-BurGZ1lr4BwR_IvYnE263GMfRQN5RKGAayFGr_UaXG7beaw/s400/IonPlasma.png" width="266" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjaMfA4geGtOyOa3bAYg61Fp42G_CQOHRaS2Bge6wTRJTrKfOGcqhxT11BC0_j_GlysF0Ys6wnSfawd5v4CfTwXSfOLA2sq5jKEF-POLweaHVZnuFLH0VsmakFidpiWiWOkMuPLvgOr0hk/s1600/ionModes.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="475" data-original-width="600" height="316" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjaMfA4geGtOyOa3bAYg61Fp42G_CQOHRaS2Bge6wTRJTrKfOGcqhxT11BC0_j_GlysF0Ys6wnSfawd5v4CfTwXSfOLA2sq5jKEF-POLweaHVZnuFLH0VsmakFidpiWiWOkMuPLvgOr0hk/s400/ionModes.gif" width="400" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
I felt like I could see the individual LEDs themselves, and that I could improve on the resolution and offer a wider variety of modes including sound reaction, API connections to favourite services, IFTTT integration, voice recognition and, and ... and... well, I'm still adding some of those now it's built.<br />
<br />
One thing I haven't tried to replicate is the Ion's Bluetooth connection to your phone. That's possible of course, but I had a lot to figure out and had better start with just controlling some lights first, huh.<br />
<br />
<span style="font-size: x-large;">Fadecandy</span><br />
<span style="font-size: x-large;">-------------------------------------------------------------</span><br />
I had to make a decision about how to control the LEDs. There are many ways to do this, but which was going to be the easiest and friendliest to my current coding capabilities? Plus, were there any that improved on the somewhat basic RGB output offered by off-the-shelf Arduino kits?<br />
<br />
The answer is indeed yes, thanks to <a href="https://www.misc.name/" target="_blank">Micah Elizabeth Scott</a> and her <a href="https://www.misc.name/fadecandy" target="_blank">Fadecandy</a> hardware board. She has been crafting displays for annual trips to the Burning Man festival, amongst other art installations and interactive experiments. As she shows on her site, most normal LED controllers fall into a trough of sadness when it comes to blending hues together or displaying correct colours at low light levels. She created the Fadecandy hardware to solve these issues.<br />
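To see why the dark end is so hard, here's the gamma half of the story as a quick Python sketch. Fadecandy's real pipeline adds interpolation and temporal dithering on top of this, and the exponent here is illustrative.

```python
def gamma_table(gamma=2.5):
    """8-bit gamma lookup: spend output codes where the eye can see
    them. A linear drive wastes most of its codes on near-identical
    bright levels, which is why naive controllers look chunky when
    dim. The exponent is an illustrative choice."""
    return [round(255 * (i / 255) ** gamma) for i in range(256)]

lut = gamma_table()
```

Even with the lookup table, 8 bits only gives you a few usable steps near black, which is where the dithering comes in.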
<br />
She partnered with <a href="https://www.adafruit.com/" target="_blank">Adafruit</a> to turn the Fadecandy board into a small and affordable form factor unit that can control a metric crap-tonne of LEDs for really, really big joy-filled displays.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYgzenK0MZWDafoWS2rcduFx7dOen89sluxYW1WpTv9Y7w76yDUYCybvHjwi7mFJjtotTVMZ4i8lIMC51i1PPXUhrDFQReX7dtSLR82HMqkknygHMjZwGiGbYIqhyphenhyphenBIUMY-4i0DpcEIUg/s1600/fadecandy.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="728" data-original-width="970" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYgzenK0MZWDafoWS2rcduFx7dOen89sluxYW1WpTv9Y7w76yDUYCybvHjwi7mFJjtotTVMZ4i8lIMC51i1PPXUhrDFQReX7dtSLR82HMqkknygHMjZwGiGbYIqhyphenhyphenBIUMY-4i0DpcEIUg/s400/fadecandy.jpg" width="400" /></a></div>
<br />
Better yet, it can be controlled via USB from big computers and small, embeddable computers like the Raspberry Pi etc. And it interfaces directly with <a href="https://processing.org/" target="_blank">Processing</a>, which I've already experimented with. Processing is a great platform to program generative art that can accept inputs including music, sound, sensors and other things. It's used by all sorts of creative people for interactive art installations, live music, large scale projections, small embedded pieces etc. Processing is also available for the Raspberry Pi, thus opening the door to my small scale needs.<br />
<br />
<span style="font-size: x-large;">Get some LEDs</span><br />
<span style="font-size: x-large;">-------------------------------------------------------------</span><br />
Where better to get addressable LEDs than Adafruit themselves? Actually... shh, Amazon had the very same lights with waaaay cheaper shipping, so after some rough calculations I ordered two <a href="https://www.amazon.com/gp/product/B00R5CBOWY/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1" target="_blank">1 metre strips of 60 weatherproof NeoPixel RGB LEDs.</a> I also ordered a 5-volt, 10-amp switching power supply [to handle NZ's 240v mains power] and one Fadecandy board.<br />
<br />
I decided that 8 vertical columns of 15 lights wrapped around a cylinder should provide a suitable height and LED density to improve upon the Ion lamp's resolution. I also had to figure out how to reproduce their diffuser solution, as their page discusses the prototypes they designed to make the individual LEDs blend together into a whole. More on that later.<br />
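For the curious, the 8 x 15 cylinder geometry is easy to express in code. This sketch uses made-up radius and spacing values just to show the layout maths, not measurements from the actual build.

```python
import math

def cylinder_led_positions(columns=8, rows=15, radius=40.0, spacing=16.6):
    """(x, y, z) positions in mm for LEDs on a cylinder: `columns`
    vertical strips spaced evenly around the circumference, `rows`
    LEDs up each strip. Radius and spacing are illustrative numbers,
    not measurements from the actual build."""
    positions = []
    for c in range(columns):
        angle = 2 * math.pi * c / columns
        x = radius * math.cos(angle)
        z = radius * math.sin(angle)
        for r in range(rows):
            positions.append((x, r * spacing, z))
    return positions
```

A map like this is also what lets a sketch treat the lamp as a 3D object rather than a flat strip of 120 pixels.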
<br />
The first thing to do was to get the Fadecandy board powered and connected to a computer in order to test my LED strips to make sure there were no dead LEDs. They're relatively hardy but sometimes die during shipping.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihCU1A37Na7Kg6rfAP4IQnMLZU6_pkjU0D3BnnTKsEOH6EtX8kUVHE06CBvg4AA1YbILGoxzI8p9m77ukS8kZS5j-pdRloqYI45n1jzQtYEOCArlLjTFYoIbwhaC9cBSdv50_0jWQ5Sgg/s1600/singleArray.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="800" data-original-width="600" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihCU1A37Na7Kg6rfAP4IQnMLZU6_pkjU0D3BnnTKsEOH6EtX8kUVHE06CBvg4AA1YbILGoxzI8p9m77ukS8kZS5j-pdRloqYI45n1jzQtYEOCArlLjTFYoIbwhaC9cBSdv50_0jWQ5Sgg/s400/singleArray.jpg" width="300" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Success! No dead NeoPixels.</td></tr>
</tbody></table>
I should mention at this point a note of gratitude to all the people on the internet who have written about their crazy projects and shared advice and tips on how to do stuff like this. <a href="https://learn.adafruit.com/1500-neopixel-led-curtain-with-raspberry-pi-fadecandy/overview" target="_blank">This page in particular</a> is excellent and offers a large amount of information for getting started.<br />
<br />
Micah offers <a href="https://github.com/scanlime/fadecandy" target="_blank">many example Processing sketches</a> designed to run directly on the Fadecandy once you have the Fadecandy server up and running. Here's Jamie testing the mouse-driven interaction:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9vjmODslg1i7u7ANtgIloLCVwfhgGA0gKDZk-Uz31ejKeiLIuj-6Z0bMptMDy7QTbG5il4Wxa6qkv0utyhM7Jb5wrtcxmalooPNBAtZKVhGv4wIyKHtzVtz5_NGRnMzF3ZYKabef6rD8/s1600/manualArrayInput.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="281" data-original-width="500" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9vjmODslg1i7u7ANtgIloLCVwfhgGA0gKDZk-Uz31ejKeiLIuj-6Z0bMptMDy7QTbG5il4Wxa6qkv0utyhM7Jb5wrtcxmalooPNBAtZKVhGv4wIyKHtzVtz5_NGRnMzF3ZYKabef6rD8/s640/manualArrayInput.gif" width="640" /></a></div>
<br />
Another example sketch gives you the ability to scroll a bitmap through the sample points that are sent to the Fadecandy board and thus onto the LEDs. Here's what a simple bitmap of fire can look like via that method:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtLNb-7SDoSm0FvpmEnZS1AU60IXy5Nmu2QxK3zT9E03rHkpMpKPtY479tl-ytaYCx5Ct1W0gHUKitmfgjNCvYhtYUbuX0VyGcsh0hniATaBQxSIdjMeyArwyIRc0LASVJQLRXS5QlYNk/s1600/singleLineFire.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="281" data-original-width="500" height="356" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtLNb-7SDoSm0FvpmEnZS1AU60IXy5Nmu2QxK3zT9E03rHkpMpKPtY479tl-ytaYCx5Ct1W0gHUKitmfgjNCvYhtYUbuX0VyGcsh0hniATaBQxSIdjMeyArwyIRc0LASVJQLRXS5QlYNk/s640/singleLineFire.gif" width="640" /></a></div>
<br />
It's quite effective. The colours are excellent and the brightness can be overwhelming at times. It's a powerful way to manipulate the light array: you don't have to be an expert programmer, because you can envision cool effects using nothing more than things you can make in Photoshop.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjClIo4h-5Vq7CutAbp2Grikmj-rscFmhj2bHkZfW_LsaarW4xzJAA9wej-hEZuGkkYHgs4eCMmyAamC9OQ4rnfsYAT-UXaROePYPKGW84y567eH-JLM1TRJK4G3MTTO7eUq1H7h1IJhQ8/s1600/chickenNukesSingleLine.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="281" data-original-width="500" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjClIo4h-5Vq7CutAbp2Grikmj-rscFmhj2bHkZfW_LsaarW4xzJAA9wej-hEZuGkkYHgs4eCMmyAamC9OQ4rnfsYAT-UXaROePYPKGW84y567eH-JLM1TRJK4G3MTTO7eUq1H7h1IJhQ8/s640/chickenNukesSingleLine.gif" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">You can see the weird-ass bitmap I made in this example scrolling through the array sample points. These were meant to look like mini nukes.</td></tr>
</tbody></table>
With these simple examples working, and the creative power clearly accessible, it was time to form the complete array that would end up wrapping around the central cylinder in my lamp.<br />
<br />
As detailed in the <a href="https://learn.adafruit.com/1500-neopixel-led-curtain-with-raspberry-pi-fadecandy/data-topology" target="_blank">Adafruit NeoPixel Curtain example</a>, I spent some time mapping and designing the separation between the power requirements of the array and its data inputs. Each Fadecandy board offers 8 data outputs that can drive 64 NeoPixel LEDs each. I planned to drive 120 LEDs split into two strips. It was easy enough to simply drive each strip with a channel from the Fadecandy and not be too concerned that I wasn't using all the bandwidth of each channel. This did make for some funky OPC mapping that we'll get to soon.<br />
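One consequence of the strip-per-channel choice is how the OPC pixel indices land: each of the Fadecandy's 8 outputs owns a fixed block of 64 indices whether you fill it or not, so a strip on channel 1 starts at index 64 rather than following straight on from the last LED of channel 0. A quick sketch of that arithmetic [assuming the 120 LEDs are split into two 60-LED strips, one per channel]:

```python
# Each Fadecandy output channel owns a fixed block of 64 OPC pixel
# indices, regardless of how many LEDs are actually attached to it.
FADECANDY_LEDS_PER_CHANNEL = 64

def channel_start_index(channel):
    """First OPC pixel index owned by a given Fadecandy output channel."""
    return channel * FADECANDY_LEDS_PER_CHANNEL

# Two 60-LED strips, one per channel: they start at indices 0 and 64,
# leaving 4 unused indices at the end of each channel's block -- the
# "funky OPC mapping" mentioned above.
print([channel_start_index(c) for c in (0, 1)])
# -> [0, 64]
```

Those start indices [0 and 64] are exactly the first argument you end up passing to the OPC mapping calls later on.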
<br />
The next step involved soldering, which I can do but have never been great at. Nothing better than a reason to practise.<br />
<br />
Here's the array layout with my hand-drawn power and data connections. You can see the arrows indicating the direction of the serial data that tells each LED what colour to display. I had to wire and solder all the missing parts at the end to complete the flow:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihIB8lo83fZsCbH8w372Rij0n8q8b17LYmW5P9RnnSl38RdHV5k8Vjle76DuCRDNfURfs6UdAxp0-Usq3JLFVtzLP6TUmWK3zKbzRnSG389pptPKyT1TpF4-KFIZN9ZMCZlgDHSfx6fdg/s1600/arrayLayout.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="489" data-original-width="800" height="388" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihIB8lo83fZsCbH8w372Rij0n8q8b17LYmW5P9RnnSl38RdHV5k8Vjle76DuCRDNfURfs6UdAxp0-Usq3JLFVtzLP6TUmWK3zKbzRnSG389pptPKyT1TpF4-KFIZN9ZMCZlgDHSfx6fdg/s640/arrayLayout.jpg" width="640" /></a></div>
<br />
<span style="font-size: x-large;">Enter the Vase</span><br />
<span style="font-size: x-large;">-------------------------------------------------------------</span><br />
I should mention at this point that I'd decided on the length of the strips according to the final form I intended to deploy the strips into. I wanted to mimic the Ion lamp styling and presumptuously assumed a local homeware shop [Briscoes] would simply have cylindrical glass vases that might suit the task. They did! Here's a pic of the vase, upside down on a temporary wooden plate with a central core of PVC tube from the hardware shop:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_LLY1-2JntrJSzPWxWCGALxdZLROkaWFVlQ6xcHglwa4KSVhazUWyz4XlZ3rEWt9lKdNZjICP8YBZiYNbIu05ZZpYgP9j6kXjKkU__U_n2G7kN1gIvBCLLtYYvYRGHkF0J4YuyF3VbSI/s1600/BriscoesVase.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="483" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_LLY1-2JntrJSzPWxWCGALxdZLROkaWFVlQ6xcHglwa4KSVhazUWyz4XlZ3rEWt9lKdNZjICP8YBZiYNbIu05ZZpYgP9j6kXjKkU__U_n2G7kN1gIvBCLLtYYvYRGHkF0J4YuyF3VbSI/s640/BriscoesVase.jpg" width="386" /></a></div>
<br />
My plan was to complete the wiring with the array laid out flat, then transfer it to the PVC tube and resolve the rest of the wiring loom issues as I went. I also gambled on being able to solve the diffuser issue down the line, as I could simply remove the vase and line it with some acrylic later, once the array was working correctly. I still had no concept of the final installed form at this point, much less what small single-board computer to run it off. I thought: who cares if I only ever have it running connected to my computer while I'm dorking about?<br />
<br />
After a busy few hours of soldering and insulating, I had the strips connected and ready to test. Although a few of my solder joins failed [embarrassedFace.jpg], thankfully I'd not made any short circuits, and my array successfully lit up.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8LYXnWhZT6SbO3zqwsKyxG_ehdY34Ip0jSOPqWPVU0Db5xOSlzV0hpTqRC-03KI-lgFIBpibalg5cglXjtfPbuguNVhJ_v_qX7-jZzHohpPkZWhfXGRogGJJnurauy-PDftmGNS1wmp8/s1600/arrayLit.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="525" data-original-width="800" height="420" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8LYXnWhZT6SbO3zqwsKyxG_ehdY34Ip0jSOPqWPVU0Db5xOSlzV0hpTqRC-03KI-lgFIBpibalg5cglXjtfPbuguNVhJ_v_qX7-jZzHohpPkZWhfXGRogGJJnurauy-PDftmGNS1wmp8/s640/arrayLit.jpg" width="640" /></a></div>
<br />
Having got this far, I could not resist testing some more complicated array graphics and modes.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1GZFon9wv02Jj7yuQVWDmOQAcBHSDDoFkCM30AA-tIxRtlnMtu-ieKRYKAFnCniXw9jRhnO4Glas8IGJNocQPGlrQm-rh-5rwB7SUEZv5xcoQ641Z_z4oIC54om6a22t9Oyr5yZ-duu4/s1600/rainbowTest.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="543" data-original-width="800" height="434" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1GZFon9wv02Jj7yuQVWDmOQAcBHSDDoFkCM30AA-tIxRtlnMtu-ieKRYKAFnCniXw9jRhnO4Glas8IGJNocQPGlrQm-rh-5rwB7SUEZv5xcoQ641Z_z4oIC54om6a22t9Oyr5yZ-duu4/s640/rainbowTest.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Yeah I know, my desk is a mess.</td></tr>
</tbody></table>
I quickly ran into some issues to do with how I'd chosen to lay out my array versus how the Fadecandy board and Processing expected it to be arranged.<br />
<br />
<span style="font-size: x-large;">Radians, and assumptions</span><br />
<span style="font-size: x-large;">-------------------------------------------------------------</span><br />
As any Fadecandy enthusiast will know, a visit to the <a href="https://groups.google.com/forum/#!forum/fadecandy" target="_blank">Fadecandy Google group page</a> will show that multiple people want to lay out their LED arrays in different and sometimes challenging ways.<br />
<br />
I'd made the assumption during my wiring stage that my horizontal layout could simply be rotated in the Processing sketch or OPC layer to be vertically oriented and wrapped around a cylinder. Here's an example of the indexing of the zigzag array I'd followed [this example matches an 8x8 NeoPixel grid but the result is similar for a 15x8 grid - just longer on one side]:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3oqK-XLBJb1i-qxrdDmX5OTA7Zi2CGy-GuTBzBpDz5jKAq8V_fI1NA4TkyjgdMmvqnjgdn6kl_D4VKOi3uQQMU5WOFVM_YHsEGbYgxvEav_ChkrDJsqpWIhqEHM5_gN0k7I8qwwsGSyw/s1600/leds_fabprintHorizontalArray.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1013" data-original-width="1141" height="568" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3oqK-XLBJb1i-qxrdDmX5OTA7Zi2CGy-GuTBzBpDz5jKAq8V_fI1NA4TkyjgdMmvqnjgdn6kl_D4VKOi3uQQMU5WOFVM_YHsEGbYgxvEav_ChkrDJsqpWIhqEHM5_gN0k7I8qwwsGSyw/s640/leds_fabprintHorizontalArray.jpg" width="640" /></a></div>
<br />
You can see that the data input for the array enters at the top left at index 0, continues along the top row to the right-hand end where it zigs [or zags?] down onto the next row, runs back in reverse order to the left-hand end, then zigs again onto the next row, this time in the original order, and so on until the end of the layout.<br />
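That serpentine ordering boils down to a tiny mapping from grid position to LED index. Here it is sketched in Python [not the actual Fadecandy or OPC code, just the arithmetic, using the 8-wide grid from the diagram as the example]:

```python
def zigzag_index(row, col, width):
    """Map a (row, col) grid position to its LED index in a serpentine
    (zigzag) wiring layout: row 0 runs left-to-right, row 1 right-to-left,
    and so on, matching the diagram above."""
    if row % 2 == 0:
        return row * width + col                 # even rows run forwards
    return row * width + (width - 1 - col)       # odd rows are reversed

# In an 8-wide grid, the first LED of row 1 sits at the right-hand end,
# so (1, 7) is index 8 and (1, 0) is index 15.
print(zigzag_index(0, 0, 8), zigzag_index(1, 7, 8), zigzag_index(1, 0, 8))
# -> 0 8 15
```

The OPC library's zigzag flag applies exactly this reversal on every odd row for you.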
<br />
It's vitally important that the Fadecandy understands the intention of this layout, so that it knows how to take the sample points in the Processing sketch and convert the resulting pixel colour information into the correct spatial information when it's sent serially down the data pin of the LED strip.<br />
<br />
My problem was that I actually required a layout more like the following image, and [being the VFX artist I am] I'd assumed I could simply specify a 90-degree rotation to achieve the correct result:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjumwVbbMzAJ3-dZsqvDaMFebFTWxruWIEU-UbO75JNdp4E_avfN3e4tGOR2mjKu55P9k1lLGqqReFZOgBDEBk3D2iTAaV-K7OcqX9ZNlAkASDE8Dg8aq0j-HXLoynedvuttbXnI30ziZ8/s1600/leds_fabprintVerticalArray.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1013" data-original-width="1141" height="568" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjumwVbbMzAJ3-dZsqvDaMFebFTWxruWIEU-UbO75JNdp4E_avfN3e4tGOR2mjKu55P9k1lLGqqReFZOgBDEBk3D2iTAaV-K7OcqX9ZNlAkASDE8Dg8aq0j-HXLoynedvuttbXnI30ziZ8/s640/leds_fabprintVerticalArray.jpg" width="640" /></a></div>
<br />
I needed this layout so that all my input power and data wiring would be near the bottom of the layout and that the longest dimension [not represented properly here] would be vertical, to match the height of my lava lamp physical orientation once the array was wrapped around the PVC cylinder.<br />
<br />
Although I'd had success running sketches on the array as I'd wired it horizontally, simply rotating my LEDs 90 degrees meant that a graphic element running from top left to top right in the Processing sketch would run from bottom left to top left on the array. Gadzooks. This throws a spanner in the works of some of my physics-based ideas, like sketches that use a gravity direction meant to mimic the real world.<br />
<br />
There was no clear place in the Processing sketch to rotate the output, and it was not immediately obvious where else I could effect this.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVfeVLmLNOS87YP87Yu4qNtIFyG9KeWJPk48Z0RAvF2Jb5FteVcGQbya10Eu-ytQbEMzuYtwCy3rsyn6gZ2oovq5brna8R6-o7csyX8dphImuquwY0OGoFEvWoXsah5sRaCRFqlOzE4wY/s1600/plasmaModeArray.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="281" data-original-width="500" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVfeVLmLNOS87YP87Yu4qNtIFyG9KeWJPk48Z0RAvF2Jb5FteVcGQbya10Eu-ytQbEMzuYtwCy3rsyn6gZ2oovq5brna8R6-o7csyX8dphImuquwY0OGoFEvWoXsah5sRaCRFqlOzE4wY/s640/plasmaModeArray.gif" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's beautiful. But it's horizontal. This will not do.</td></tr>
</tbody></table>
After some forum diving and Fadecandy Google group spelunking, I hit on <a href="https://groups.google.com/forum/#!topic/fadecandy/OcKnPXjuXUQ" target="_blank">this thread in particular</a>, from someone with a non-standard NeoPixel layout who needed to perform a similar rotation-based remapping operation. They used a rotation value specified in radians, in conjunction with some other value-swapping kung-fu, to get the correct orientation.<br />
<br />
If you get on down to building a Fadecandy project yourself you're going to run into the problem of which OPC library call to use to map your Processing sketch out to your array. I settled on using two <b>opc.ledGrid</b> calls, the syntax for which looks like the following:<br />
<br />
<span style="font-family: inherit;"><span style="box-sizing: border-box; font-variant-ligatures: normal; font-weight: bold; orphans: 2; widows: 2;">opc.ledGrid</span>( index, stripLength, numStrips, x, y, ledSpacing, stripSpacing, angle, zigzag, flip )</span><br />
<div style="orphans: auto; widows: auto;">
<span style="font-family: inherit;"><span style="font-variant-ligatures: normal; orphans: 2; widows: 2;">After some head scratching and monkeying around, I </span>successfully remapped my sketch layouts out to my array via the two Fadecandy channels I was using with the following commands:</span></div>
<div style="orphans: auto; widows: auto;">
<span style="font-family: inherit;"><br /></span></div>
<div style="orphans: auto; widows: auto;">
<span style="font-family: inherit;"><b>opc.ledGrid</b>( 0, 15, 4, width*0.25, height/2, height/15, width/8, <b>4.712</b>, true )<br /><b> opc.ledGrid</b>( 64, 15, 4, width*0.75, height/2, height/15, width/8, <b>4.712</b>, true )<br /><br /><span style="orphans: 2; widows: 2;">I've </span><span style="caret-color: rgb(36, 41, 46); orphans: 2; widows: 2;">highlighted</span><span style="orphans: 2; widows: 2;"> in the lines above the radian specification for the rotation required. This worked! And it now gave me sketch output in Processing that was oriented correctly for the cylinder arrangement.</span></span></div>
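In case that magic number looks arbitrary: the angle argument is in radians, and 4.712 is simply a 270-degree rotation [3&#960;/2]. A quick sanity check [in Python rather than Processing, where you'd use <b>radians(270)</b>]:

```python
import math

# The ledGrid angle argument is in radians; 4.712 is the 270-degree
# rotation (3 * pi / 2) that turns the horizontal layout vertical.
angle = math.radians(270)
print(round(angle, 3))                       # -> 4.712
print(math.isclose(angle, 3 * math.pi / 2))  # -> True
```

Writing the angle as <b>3 * PI / 2</b> in the sketch, instead of the truncated literal, keeps the intent obvious and the value exact.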
<div style="orphans: auto; widows: auto;">
<span style="color: #24292e; font-family: "helvetica neue"; orphans: 2; widows: 2;"><br /></span></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVh3Gf5cjQ9O7Hx1g5aN0Nuj917U06GzwpaZi5bJKDiqQ7mvom_NiYAqftImv_ysJapHwlphV8v5kqKZkKNPBKJdc6dWiugIlJ4_kb_XoaKyh-fO2nFpPdwV8wrM574eUBOa2ZeRa-sNo/s1600/noiseCloudsArray.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><span style="font-family: inherit;"><img border="0" data-original-height="281" data-original-width="500" height="358" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVh3Gf5cjQ9O7Hx1g5aN0Nuj917U06GzwpaZi5bJKDiqQ7mvom_NiYAqftImv_ysJapHwlphV8v5kqKZkKNPBKJdc6dWiugIlJ4_kb_XoaKyh-fO2nFpPdwV8wrM574eUBOa2ZeRa-sNo/s640/noiseCloudsArray.gif" width="640" /></span></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="font-family: inherit; font-size: small;">Success! Now I can move onto the physical installation, knowing that gravity points down. Duh.</span></td></tr>
</tbody></table>
<span style="font-family: inherit;">With this problem solved, I was pretty certain I could progress onto the next stages of the design - the physical form factor - and start considering some other issues, like: how could I add a button to the front of the lamp to change the sketch? And what small computer could it run on?<br /><br />While I considered those things, I also spent some time making sketches in Processing to run on the array and experimented with designs from <a href="https://www.openprocessing.org/">https://www.openprocessing.org</a>, where a great many people share their ideas with the world. The terms at OpenProcessing.org specify that any work uploaded or created on their site falls under a Creative Commons license unless specified otherwise. I've found many sketches there that run quite well locally in Processing. They require tuning and optimising to run on the lava lamp array, but this is quite fun.<br /><br /> So, that's it for Part 1. In Part 2 I'll discuss the computer platform choice and show the housing construction. In Part 3 I'll detail the steps and software I created to deploy what I'd built into a standalone unit with wifi where I can add new modes wirelessly. And a stretch goal...</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;"><a href="http://blog.julianbutler.com/2019/01/my-digital-lavalamp-or-mki-epilepsy.html" target="_blank">Part 2 this way...</a></span><br />
<span style="font-family: inherit;"><br /></span>-j<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com20tag:blogger.com,1999:blog-1829332005373102680.post-68126707699276945362018-12-08T17:20:00.000+13:002018-12-08T17:24:15.665+13:00FujiFilm X100F updateSo after shooting happily with the sublime FujiFilm X100s for the last 5 years or so, it's time for an update. Thankfully, in the interim, Fuji have not sat idle: they have iterated steadily, producing the X100T and, in 2017, the X100F, representing a significant jump in specs and capabilities.<br />
<br />
I love the way the X100s produces images, from the combination of its controls and lens to the friendly portability and styling that impart a relaxed mood to capturing scenes. It's just so much fun to have around. Then in Lightroom, the grading possibilities and the way the X-Trans sensor sharpens make me look forward to processing images from trips abroad.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXa_4zPmWlYwdXXVuulUKOnHcYr_3ZnQqcHCGBevSCdCmbhNTJKP5p9fZREOtsP9JBu0AoYnZM5DuSHYlNZOx_FRpQNzS5fqwcMBQlph7bkkqXdoX250Sk9nZ-tjhaRxnE9THvokAlRhI/s1600/Nara_deer_Japan.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="787" data-original-width="1279" height="392" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXa_4zPmWlYwdXXVuulUKOnHcYr_3ZnQqcHCGBevSCdCmbhNTJKP5p9fZREOtsP9JBu0AoYnZM5DuSHYlNZOx_FRpQNzS5fqwcMBQlph7bkkqXdoX250Sk9nZ-tjhaRxnE9THvokAlRhI/s640/Nara_deer_Japan.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Deer - Nara Park - Japan</td></tr>
</tbody></table>
The X100F delivers a huge improvement in speed, battery life, ISO and file resolution, marking a significant commitment from Fuji to keep delivering on what makes the X100 series so delightful. I couldn't resist dressing up my new X100f in all the latest hipster accoutrements:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgoXcUlXMUx-48S9Q0qUV-BqMF8AvcljIkQBvn6HnpE0lM_oBYinO4GtYdlC9uHm6nhviHFPrOCHBVGcsrEVap3QHKLi3TeF0ECQuyZWwoVXFTySGSeKYuQ9mAzBrGvaSArToLA58-FLfA/s1600/FujiFilmX100F.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1600" data-original-width="1600" height="638" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgoXcUlXMUx-48S9Q0qUV-BqMF8AvcljIkQBvn6HnpE0lM_oBYinO4GtYdlC9uHm6nhviHFPrOCHBVGcsrEVap3QHKLi3TeF0ECQuyZWwoVXFTySGSeKYuQ9mAzBrGvaSArToLA58-FLfA/s640/FujiFilmX100F.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Thanks in particular to <a href="https://gordyscamerastraps.com/" target="_blank">Gordy's Camera Straps</a> for the custom leather and binding.</td></tr>
</tbody></table>
The only fauxhemian element left to arrive is a leather Gariz half-case, to replace the Fuji brown leather travel case I had on the X100s. Also, my amazing wife is getting me the TCL-X100 tele-conversion lens for Christmas, which will afford me a 50mm-equivalent focal length. This will be a great travel and portrait option where the 35mm-equivalent fixed lens is a little wide.<br />
<br />
Merry Christmas and happy shooting!<br />
<br />
-julianJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-44495290727633298702018-11-03T12:26:00.002+13:002018-11-03T12:26:24.423+13:00A timely and important reminderFrom 1973:<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/nbvzbj4Nhtk/0.jpg" src="https://www.youtube.com/embed/nbvzbj4Nhtk?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<br />
That is all.<br />
<br />
-jJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-47718854508148379892016-10-27T21:39:00.001+13:002016-10-27T21:40:01.879+13:00Oculus Touch early access<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzlueehnFRdnrNqGyXEaZXMSwU3QliH-lbZd5ydhpI4zdGWv4WRWMplifmnnqKnHmeUsiKAhReIHWUp5p0N2hE15QFL54uxhNFznuc-YklrxrE6OF6XwQ5MWFwqMGoUNNGy2iiHTasy0g/s1600/Touch-1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="379" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgzlueehnFRdnrNqGyXEaZXMSwU3QliH-lbZd5ydhpI4zdGWv4WRWMplifmnnqKnHmeUsiKAhReIHWUp5p0N2hE15QFL54uxhNFznuc-YklrxrE6OF6XwQ5MWFwqMGoUNNGy2iiHTasy0g/s640/Touch-1.jpg" width="640" /></a></div>
<br />
Oculus have very generously sent me Touch kits for use in developing the hand interactions in the <a href="https://www.untouchedforest.com/" target="_blank">Untouched Forest</a>. I'm really stoked to have their support in what is turning out to be a really interesting project. Oculus Touch is not officially available and shipping until December this year; however, pre-orders are available here: <a href="https://www3.oculus.com/en-us/rift/">https://www3.oculus.com/en-us/rift/</a><br />
<br />
In short, Touch is fantastic. Its capacitive sensing and haptic feedback allow for the detection of hand gestures as well as feedback about objects the player interacts with. It also maps pretty much 1:1 with the controls on HTC's Vive hand controllers so, thanks to the generous support of Valve, almost any SteamVR title is natively supported. I find them very comfortable and intuitive to use, and have begun Unity integration and experimentation.<br />
<br />
Google's Tilt Brush is an amazing app to use. I'm stunned by the user interaction, in the form of wrist-mounted controls, and I've begun sculpting the crazy light pieces I dreamed of creating when I was a child:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/n_jS2AOus54/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/n_jS2AOus54?feature=player_embedded" width="320"></iframe></div>
<br />
Massive thanks to Callum Underwood and Andres Hernandez [Cybereality] at Oculus for helping me out and giving me this fantastic set of tools! And thank you to all the engineers and developers there for pushing so hard to get this into the hands of devs all over.<br />
<br />
-julianJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-54008682060491138002016-09-13T17:32:00.002+12:002016-09-13T17:32:23.156+12:00Your foster parents are dead<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfnLBxxBhE04i3PnDFmXfzEiTuCFPJB-2q-JeIyd4A7mXDzZoMfNyx9OJPcx1aCCSysw3d5FAzHI-qbpFfw1m7-0u0BpZmAxBDUtyK0dd7OiESelH0l01sDtck4z2xRACWQGdXUYLEs7U/s1600/FosterParents.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img alt="" border="0" height="270" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfnLBxxBhE04i3PnDFmXfzEiTuCFPJB-2q-JeIyd4A7mXDzZoMfNyx9OJPcx1aCCSysw3d5FAzHI-qbpFfw1m7-0u0BpZmAxBDUtyK0dd7OiESelH0l01sDtck4z2xRACWQGdXUYLEs7U/s640/FosterParents.jpg" title="Woofy's just fine... Woofy's jusssst fiiine... where are you?" width="640" /></a><br />
<br />
Yeah ok so, bad title I know. But seriously, remember this moment above from <a href="http://www.imdb.com/title/tt0103064/?ref_=fn_al_tt_2" target="_blank">Terminator 2: Judgment Day</a>? <a href="https://www.youtube.com/watch?v=MT_u9Rurrqg" rev="en_rl_small" style="font-family: gotham, helvetica, arial, sans-serif; font-size: 14px;">https://www.youtube.com/watch?v=MT_u9Rurrqg</a> click to the left there to watch.<br />
<br />
Well, it looks like the speech synthesis component of that instance has arrived. <a href="https://deepmind.com/blog/wavenet-generative-model-raw-audio/" target="_blank">WaveNet - A generative model for raw audio</a> appears to have massively closed the gap between computer speech synthesis and human speech. I won't attempt to summarise the whole article but, in short, far more natural-sounding computer speech [and in fact almost any audio source, including music] has arrived. The implications are unnerving.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://storage.googleapis.com/deepmind-live-cms.google.com.a.appspot.com/images/mos2.width-1500.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://storage.googleapis.com/deepmind-live-cms.google.com.a.appspot.com/images/mos2.width-1500.png" width="640" /></a></div>
<br />
With the previous technology leader 'Concatenative' in the light pink on the far left in each graph, and human speech in green on the right, you can see where WaveNet now falls. Listen to the results yourself <a href="https://deepmind.com/blog/wavenet-generative-model-raw-audio/" target="_blank">in the midst of the article.</a><br />
<br />
This means that all the devices and smart assistants speaking to you and me today [Siri, Amazon Echo, Cortana, turn-by-turn GPS navigation etc] are not only going to sound ever more convincing, but the potential for mimicry of voice actors, politicians and people no longer around [provided we have enough samples of their speech] will go through the roof.<br />
<br />
<a href="http://blog.julianbutler.com/2016/04/googles-deep-dreaming-your-favourite.html" target="_blank">Mimicking long dead artists' work is one facet of neural-net tech</a>, this is another.<br />
<br />
Incidentally, in that same article are some amazing [and frightening] piano music examples. I think the results may be physically impossible to play. They are interesting in a somewhat schizophrenic fashion.<br />
<br />
-j<br />
<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-79743910712862303222016-08-20T00:09:00.000+12:002016-08-20T00:09:15.748+12:00Welcome to the Untouched ForestI've begun a new VR project entitled <a href="http://www.untouchedforest.com/" target="_blank">Untouched Forest</a>. It's a piece of native New Zealand forest where you can experience flora and fauna in an interactive way. I'll be exploring player/character interactions in a relaxing virtual environment. Click here to take a look: <a href="http://www.untouchedforest.com/">www.untouchedforest.com</a><br />
<br />
from the site:<br />
<br />
Spend some time in a NZ native forest environment as native bird life comes to visit you. Experience a night and day cycle with all the variation and appearance of creatures that has to offer. Use Oculus Touch to let birds come and land on your outstretched hands and enjoy their song. See glow worms and hold a weta in your hand. Sit and relax as day and night pass before you while you escape your normal surroundings.<br />
<br />
More information on the site's blog here: <a href="http://untouchedforest.com/blog/2016/8/19/more-info-leads-here" target="_blank">www.untouchedforest.com/blog/</a><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiv9Kh-m4FssbJbHhGaDvrajrrRhs2uze177NWmghgfYhGpr8qBJw9jiYuWOBJsiLYzpAt6fwI5zhUXpFLmP-PSAXhgjrlMtPanhy2zRRMCHfRaC5pAG8AhkYNT24Qa3dPCYyzmuum1VMY/s1600/toolsPanel.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiv9Kh-m4FssbJbHhGaDvrajrrRhs2uze177NWmghgfYhGpr8qBJw9jiYuWOBJsiLYzpAt6fwI5zhUXpFLmP-PSAXhgjrlMtPanhy2zRRMCHfRaC5pAG8AhkYNT24Qa3dPCYyzmuum1VMY/s640/toolsPanel.jpg" width="580" /></a></div>
-julianJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-59956591209682085732016-04-26T23:29:00.001+12:002016-04-28T11:34:08.459+12:00Google's Deep Dreaming, your favourite artists and YOU.What if your favourite dead artists were still painting fresh works? Fresh works containing themes *you* specifically desired? Are you still sad that <a href="https://en.wikipedia.org/wiki/Francis_Bacon_(artist)" target="_blank">Francis Bacon</a> perished? Are you gutted that H. R. Giger <a href="http://www.theguardian.com/film/2014/may/13/hr-giger-dies-alien-artist" target="_blank">fell down some stairs and died</a>? Isn't it sad that we don't have more of <a href="https://en.wikipedia.org/wiki/Gustav_Klimt" target="_blank">Gustav Klimt's</a> stunning paintings from his gold phase? I think so.<br />
<br />
But here are some paintings they never made:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://photos.smugmug.com/Art/Deep-Dreaming/i-RPLpQNQ/0/O/IMG_3511.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://photos.smugmug.com/Art/Deep-Dreaming/i-RPLpQNQ/0/O/IMG_3511.jpg" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://photos.smugmug.com/Art/Deep-Dreaming/i-CZQbVX4/0/O/IMG_3513.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://photos.smugmug.com/Art/Deep-Dreaming/i-CZQbVX4/0/O/IMG_3513.jpg" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://photos.smugmug.com/Art/Deep-Dreaming/i-jX7sK5q/0/O/IMG_8436.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://photos.smugmug.com/Art/Deep-Dreaming/i-jX7sK5q/0/O/IMG_8436.jpg" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
What is this sorcery? We've entered a new age. To explain a little...<br />
<br />
Google's <a href="http://googleresearch.blogspot.co.nz/2015/06/inceptionism-going-deeper-into-neural.html" target="_blank">Deep Dreaming</a> neural net is completely <a href="http://vignette1.wikia.nocookie.net/bttf/images/e/e1/Silenceearthling.jpg/revision/latest?cb=20091012151454" target="_blank">melting my brain</a>. First, there's what Google's brain-in-a-jar-with-eyeballs makes of images you feed it. Google researchers employed layers of artificial neurons, progressively working on different levels of an image's structure, letting it amplify what it *thinks* it sees. The results seem to invariably involve dog-lizards, fish and bird life where previously there may have only been spaghetti:<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyVVYZBo_Ul2WieUU4_Y39PY8knaGhFrUKFx9O19AhVmUVIYrf6DuRufG8spuStwjxexWiuWHGRrXLr3r9zmBt6ROdQecO6LWFT9Xjs5ERxWuqJA9ntc01ze91YEYrAg56_QUXjjh7CqA/s1600/spaghettiDog.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="427" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyVVYZBo_Ul2WieUU4_Y39PY8knaGhFrUKFx9O19AhVmUVIYrf6DuRufG8spuStwjxexWiuWHGRrXLr3r9zmBt6ROdQecO6LWFT9Xjs5ERxWuqJA9ntc01ze91YEYrAg56_QUXjjh7CqA/s640/spaghettiDog.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Exhibit: A.</td></tr>
</tbody></table>
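Under the hood, the dreaming step is just gradient ascent on the input image: instead of adjusting the network to match a label, you adjust the picture so a chosen layer's activations get stronger. Here's a toy numpy sketch of that idea [a single hand-made 'layer', nothing like Google's actual network, just the core trick]:

```python
import numpy as np

def conv2d(img, k):
    """Valid 2-D convolution - our stand-in for one neural-net layer."""
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def dream_step(img, k, lr=0.5):
    """One step of gradient ascent on the IMAGE: boost the layer's mean
    activation, i.e. amplify whatever this 'layer' responds to."""
    act = conv2d(img, k)
    grad = np.zeros_like(img)
    kh, kw = k.shape
    # d(mean activation)/d(pixel): each window a pixel falls in
    # contributes the matching kernel weight
    for i in range(act.shape[0]):
        for j in range(act.shape[1]):
            grad[i:i + kh, j:j + kw] += k / act.size
    return img + lr * grad, act.mean()

# an edge-detecting kernel: step by step, the image "hallucinates"
# more and more of the pattern the layer detects
kernel = np.array([[1., 0., -1.], [2., 0., -2.], [1., 0., -1.]])
img = np.random.default_rng(0).random((16, 16))
scores = []
for _ in range(5):
    img, score = dream_step(img, kernel)
    scores.append(score)   # the layer's response keeps climbing
```

Swap the hand-made kernel for the deep layers of a trained image-recognition network and you get dogs in your spaghetti instead of stronger edges.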
You can experiment with this marvellous craziness yourself here: <a href="http://deepdreamgenerator.com/">deepdreamgenerator.com</a><br />
<br />
This alone is worth toying with. For example this portrait of me holding a baking dish becomes something of a Dr Seuss trip, complete with fish-lizards, mutant turtle-birds and shirt monkeys. Click the images below for larger versions:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://photos.smugmug.com/Art/Misc/i-QLB38MK/0/L/IMG_1625-L.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://photos.smugmug.com/Art/Misc/i-QLB38MK/0/L/IMG_1625-L.jpg" width="400" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://photos.smugmug.com/Art/Deep-Dreaming/i-QBw489q/0/L/UDDM7471-L.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://photos.smugmug.com/Art/Deep-Dreaming/i-QBw489q/0/L/UDDM7471-L.jpg" width="400" /></a></div>
<br />
<div style="text-align: center;">
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjhTIpzpdtS-t3AnhnIHWU6DlBIRBcSIOEyXsUCKgedcP7KA13E_Brux9HtpEWqlKp-41pd6hTo5CD3_-Du7EJvhZ-wNx66FMF3MbrV8fUbrqScrF2_hRriIpD90lE9G9kTQitOATS2LV8/s1600/seussLarge.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="492" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjhTIpzpdtS-t3AnhnIHWU6DlBIRBcSIOEyXsUCKgedcP7KA13E_Brux9HtpEWqlKp-41pd6hTo5CD3_-Du7EJvhZ-wNx66FMF3MbrV8fUbrqScrF2_hRriIpD90lE9G9kTQitOATS2LV8/s640/seussLarge.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">close up weirdness</td></tr>
</tbody></table>
<div style="text-align: left;">
This is obviously fantastic. Like, really? Are we at the point where a computation can spontaneously add human-meaningful elements to an image? I... I guess we are. For the longest time, computer vision and image synthesis have been perfunctory at best, suited perhaps only to robotically picking objects off a conveyor belt or extending tileable textures from photos. We've all witnessed the arrival of face-tracking and matching technology, however, and now it's approaching an exciting tipping-point. Computers are no longer limited to simply recognising faces; they can <a href="https://www.youtube.com/watch?v=ohmajJTcpNk" target="_blank">replace them believably in realtime</a>. But I digress.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Building on Google's research, other parties have created online tools that let you supply the 'guesses' for what the deep-dreaming algorithm sees: give it a second source image, and it draws the elements it recognises from that image instead. This is like saying 'Make me a new image from this photo, in the style of this image'. For example:</div>
<div style="text-align: left;">
<br /></div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVvs01zVXI4h2reaguP63gdXRllRg4OPgAzG-kIZmlVcGrHTilee81I19p_Sz5blaPykJsVO5HFAc8aJex8JjLEKCeMxmSErP2GOaXteuRm3qD2n7r9RE11_xOjTUdk2X23fFa49IkZ2Q/s1600/VanGoghStarryNight.jpg" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" height="510" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVvs01zVXI4h2reaguP63gdXRllRg4OPgAzG-kIZmlVcGrHTilee81I19p_Sz5blaPykJsVO5HFAc8aJex8JjLEKCeMxmSErP2GOaXteuRm3qD2n7r9RE11_xOjTUdk2X23fFa49IkZ2Q/s640/VanGoghStarryNight.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Who doesn't like Van Gogh's Starry Night? </td></tr>
</tbody></table>
<div style="text-align: left;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmQiFfedwKKvC7DF6Zcjh2nrwbhtZzhBnXnyJVF9UvNlWGWx2xNW2FEAVXOCcvP9Eo1rwMjZssEYxfmhvLJM3uGK8-_p_InSsYB93PM71RF0S-H9Z1MZULlWPRD0Jv3WpHKyQfgYSLRMw/s1600/StarryBrian.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="480" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmQiFfedwKKvC7DF6Zcjh2nrwbhtZzhBnXnyJVF9UvNlWGWx2xNW2FEAVXOCcvP9Eo1rwMjZssEYxfmhvLJM3uGK8-_p_InSsYB93PM71RF0S-H9Z1MZULlWPRD0Jv3WpHKyQfgYSLRMw/s640/StarryBrian.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Brian painted by Van Gogh?</td></tr>
</tbody></table>
I know what you're thinking. What if The Great Wave of Kanagawa was really about skulls instead of tsunamis? Well:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqCsYoT0wYSAdA_NDmXz2unavm1kkHsV555ibCvN8nuKj_6UIuH1d9p252W2FqsjaGngWT3MtD40tMNqdIq4JGc0sACTMP_0b-N1KYMdET4gnX3k2XFP-ulCgb-SNa_48PWUVbstauiuU/s1600/GreatWave.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="430" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqCsYoT0wYSAdA_NDmXz2unavm1kkHsV555ibCvN8nuKj_6UIuH1d9p252W2FqsjaGngWT3MtD40tMNqdIq4JGc0sACTMP_0b-N1KYMdET4gnX3k2XFP-ulCgb-SNa_48PWUVbstauiuU/s640/GreatWave.png" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhe9zVbwxl_QAZcFCu2RU08r5nLFZa0OBuqPQqABWgLC1bvlj4ODuPIeyRNnY0Wdz-Ib3rLSyToNVA6cGDNVXt5CNgEOHr4AZUBLgsOU4xrrY6RV84vMcobO4ZUGnXwmuDttWAJNOrseJY/s1600/GreatWaveSkull.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="408" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhe9zVbwxl_QAZcFCu2RU08r5nLFZa0OBuqPQqABWgLC1bvlj4ODuPIeyRNnY0Wdz-Ib3rLSyToNVA6cGDNVXt5CNgEOHr4AZUBLgsOU4xrrY6RV84vMcobO4ZUGnXwmuDttWAJNOrseJY/s640/GreatWaveSkull.jpg" width="640" /></a></div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Me in the style of Duchamp's Nude Descending a Staircase? Yes.</div>
<div style="text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiy4xQpIrvIlUYoJPLVk64C3UiG_-UOuJ2quHbq51XupfMqJfbD3MgDp4WI5N3SDadjvcjJ_hZ0a3uNLtY-CxvnpqQeV564MAR8uLT6CpxH0svpkn4oNretdLBtiQsctclDKAQ_j103N6c/s1600/DuchampMe.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="476" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiy4xQpIrvIlUYoJPLVk64C3UiG_-UOuJ2quHbq51XupfMqJfbD3MgDp4WI5N3SDadjvcjJ_hZ0a3uNLtY-CxvnpqQeV564MAR8uLT6CpxH0svpkn4oNretdLBtiQsctclDKAQ_j103N6c/s640/DuchampMe.jpg" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkkUfKFxZrwmS5i7CHAx77VarJ_n3jOwnKy92UFsYIB86xQZvE6anNiOIxH5W7iHslzmZEx8BDjD5B_8k9-lRBzn9URpfmzBp2guwerEr2-KyrhRYeeZ_K0x2iGc2ebvUMCsOr9LfGJVk/s1600/constructivistBrian.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="310" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkkUfKFxZrwmS5i7CHAx77VarJ_n3jOwnKy92UFsYIB86xQZvE6anNiOIxH5W7iHslzmZEx8BDjD5B_8k9-lRBzn9URpfmzBp2guwerEr2-KyrhRYeeZ_K0x2iGc2ebvUMCsOr9LfGJVk/s640/constructivistBrian.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Naum Gabo yeah yeah!</td></tr>
</tbody></table>
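The 'paint this photo like that painting' trick works by matching two kinds of statistics from a network's feature maps: 'content' is a layer's raw activations, while 'style' is the correlations between feature channels, the so-called Gram matrix. A toy numpy sketch of just those two losses [random features standing in for a real network's output; the weights and shapes are illustrative]:

```python
import numpy as np

def gram(features):
    """Style signature: channel-vs-channel correlations of feature maps.
    features has shape (channels, pixels)."""
    c, n = features.shape
    return features @ features.T / n

def style_loss(f_image, f_style):
    """How far the image's 'brushwork statistics' are from the style's."""
    return np.mean((gram(f_image) - gram(f_style)) ** 2)

def content_loss(f_image, f_content):
    """How far the image's layout is from the photo's."""
    return np.mean((f_image - f_content) ** 2)

# toy stand-ins for one conv layer's output on each image
rng = np.random.default_rng(1)
f_content = rng.random((8, 64))
f_style = rng.random((8, 64))

# an optimiser would minimise this mix, trading 'looks like my photo'
# against 'painted like the reference' via the two weights
alpha, beta = 1.0, 100.0
f_image = f_content.copy()          # start from the photo itself
loss = alpha * content_loss(f_image, f_content) + beta * style_loss(f_image, f_style)
```

Cranking beta relative to alpha is roughly what makes the result more Van Gogh and less Brian.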
<div style="text-align: left;">
The main tool I'm currently using is an Android/iOS app called <a href="http://www.pikazoapp.com/" target="_blank">Pikazo</a>: <a href="http://www.pikazoapp.com/">www.pikazoapp.com</a></div>
<div style="text-align: left;">
This app uploads your chosen combinations to the cloud, where the computation is performed. The processing is intensive, so only a limited resolution is permitted - somewhere in the realm of 500px on the longest side - and each result takes roughly ten minutes to produce. You can currently upload only three combos at a time, an obvious compute-load and bandwidth constraint.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
I got a little carried away with this. There just seem to be so many cool new possibilities! To see my whole gallery of experiments, click here: <a href="http://www.julianbutler.com/Art/Deep-Dreaming/">www.julianbutler.com/Art/Deep-Dreaming/</a></div>
<div style="text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxgLGnRG-wd9-Hl4zHlQaljZguzsIwqflriWYAclp7ExV6ArsWYVMPN-HziGnOulWpkRkPeUZJ455FZPiiUh6VAxAbmrqNotOeg5yTtb-2x1mGc-iIuhxmgzoIX1g6LyREMDFt0KcxiXs/s1600/dreamingGalleryImage.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="640" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxgLGnRG-wd9-Hl4zHlQaljZguzsIwqflriWYAclp7ExV6ArsWYVMPN-HziGnOulWpkRkPeUZJ455FZPiiUh6VAxAbmrqNotOeg5yTtb-2x1mGc-iIuhxmgzoIX1g6LyREMDFt0KcxiXs/s640/dreamingGalleryImage.jpg" width="331" /></a></div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
I'm not sure what this means for art and originality. Obviously, the combinations I've produced could never be passed off as legitimate works by the original artists. But then, is the new work 50% my contribution? According to copyright law and the internets, this may be the case. Everything is a remix, huh.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
However, I think the strength of a successful image still lies equally in the concept behind it and in its execution, and currently the computer isn't coming up with too many great artistic concepts on its own.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
Yet.</div>
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
-j</div>
<div style="text-align: left;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://photos.smugmug.com/Art/Deep-Dreaming/i-MJcmjzq/0/O/IMG_3475.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="640" src="https://photos.smugmug.com/Art/Deep-Dreaming/i-MJcmjzq/0/O/IMG_3475.jpg" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Stanley Donwood I probably owe you a beer.</td></tr>
</tbody></table>
<div style="text-align: left;">
<br /></div>
</div>
<a href="https://photos.smugmug.com/Art/Misc/i-QLB38MK/0/L/IMG_1625-L.jpg" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: left;"><br /></a>Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-39865548867601557142016-04-06T16:13:00.002+12:002016-04-08T10:54:19.657+12:00Vive VR launch videoWith both launch campaigns from Oculus and Valve/HTC in full swing, this Vive launch video really grabbed my attention. It's commonly acknowledged in the fledgling VR community that it's tough to convey what VR is like in words and that experiencing it is the best way to explain it to newcomers. This low-key mix of everyday people experiencing VR for the first time sells what virtual reality is like in a way that needs no words. Check it out:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/qYfNzhLXYGc/0.jpg" frameborder="0" height="266" src="https://www.youtube.com/embed/qYfNzhLXYGc?feature=player_embedded" width="320"></iframe></div>
<br />
Congrats to Valve and HTC for setting the tone. Onwards!<br />
<br />
-jJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-67732265657494287692016-03-25T14:42:00.000+13:002016-03-25T14:42:14.946+13:00It's Virtual Reality Christmas Eve!It's been a long wait for people interested in getting their hands on the consumer VR hardware <a href="https://www.oculus.com/" target="_blank">Oculus</a>' founder Palmer Luckey first dreamt up on the <a href="http://www.mtbs3d.com/" target="_blank">MTBS3D</a> forums four years or so ago. It's been an even longer wait for people [myself included] who were expecting it to take off in the mid-90s when <a href="http://www.imdb.com/title/tt0104692/" target="_blank">The Lawnmower Man</a> had us all thinking the shape of work to come was <a href="http://www.crazymovielist.com/wp-content/uploads/lawnmowerman.png" target="_blank">suspended leather couches</a> and <a href="http://cdn3-www.shocktillyoudrop.com/assets/uploads/2012/05/file_167551_0_lawnmower_man.jpg" target="_blank">skin-tight glowing bodysuits</a>. Thankfully that is not to be the case.<br />
<br />
But, it's now finally time for this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpEH2O-NadslNfOgTTQB5rZCVH6HctB4BAW_1E_LSDP0NR6QHuItg7FPikKYGFpQlM2wbYAx0NSgcPvchZwM1fZjnOYS8vYwuJIpt_IjD5AgHcRCgchQtMEJXaLCPE5qJAFxgPyaGjaew/s1600/CV1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpEH2O-NadslNfOgTTQB5rZCVH6HctB4BAW_1E_LSDP0NR6QHuItg7FPikKYGFpQlM2wbYAx0NSgcPvchZwM1fZjnOYS8vYwuJIpt_IjD5AgHcRCgchQtMEJXaLCPE5qJAFxgPyaGjaew/s640/CV1.jpg" width="640" /></a></div>
<br />
And the below gif singlehandedly sums up the mood of myself and the VR community:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh40A-rfj8a2jMM75rKRx0RCgRR6rsjG9F00EVo3H2XIzpw0tKIEgBZSV9URjn1_w7c629jRNOFov7TjZbm5J3d8iZifxPKgAGQj4dmMqXlNRe7cal2DjtHpwSpyk5iB7_eUODyQoAo9AA/s1600/dumberer.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh40A-rfj8a2jMM75rKRx0RCgRR6rsjG9F00EVo3H2XIzpw0tKIEgBZSV9URjn1_w7c629jRNOFov7TjZbm5J3d8iZifxPKgAGQj4dmMqXlNRe7cal2DjtHpwSpyk5iB7_eUODyQoAo9AA/s1600/dumberer.gif" /></a></div>
<br />
Looks like I'll be waiting till late April or mid-May for the delivery of my CV1, darnit. I was a day late to the Kickstarter for the DK1. Had I been a little snappier, I'd hopefully be receiving my free CV1 this Monday, as shipping to the original Kickstarter backers has commenced!<br />
<br />
Bring it on.Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com1tag:blogger.com,1999:blog-1829332005373102680.post-77895867240219146312015-03-08T15:21:00.002+13:002015-03-08T15:21:50.808+13:00Unity 5 realtime global illumination in VRWell, after a pretty great response to the Sunshine demo, [<a href="http://blog.julianbutler.com/2015/02/sunshine-observation-deck-now-on-oculus.html" target="_blank">big thanks to Oculus for sharing it on the front page</a>] I'm ready to start looking at the next project.<br />
<br />
I'm a big fan of American artist <a href="http://en.wikipedia.org/wiki/James_Turrell" target="_blank">James Turrell</a>, and I wanted to make an experience reminiscent of his work where you can just experience the effect of light. That's kind of simplistic I know, but his colours, spaces and projections look terrific to me and I'd love to go and see a show of his work, or even better, go see his <a href="http://rodencrater.com/" target="_blank">Roden Crater</a> when it's finished.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrTsCVSvsEaYgCf8Q0LPsz-VJI4XuZSYuis8iZsrgSRu3SpGys-VBX9q4KiBQF0D12ad0Ym6szIh02OTwRyvdhl5J7A3SHPwDeAk6SRN_uQx4KDVrz13ONh82G68Md_vPkxvyZAnguClM/s1600/blue_planet_sky.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrTsCVSvsEaYgCf8Q0LPsz-VJI4XuZSYuis8iZsrgSRu3SpGys-VBX9q4KiBQF0D12ad0Ym6szIh02OTwRyvdhl5J7A3SHPwDeAk6SRN_uQx4KDVrz13ONh82G68Md_vPkxvyZAnguClM/s1600/blue_planet_sky.jpg" height="640" width="506" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Not Unity, but James Turrell's 'Blue Planet Sky'.</td></tr>
</tbody></table>
So, the first thing to try with Unity 5 is the realtime global illumination system built in from Geomerics, called <a href="http://www.geomerics.com/enlighten/" target="_blank">Enlighten</a>. I wanted to illuminate a scene completely with only one light. Here's a clip showing my current demo:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/PYvsbxu9YfE/0.jpg" src="http://www.youtube.com/embed/PYvsbxu9YfE?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<br />
I recorded this clip in VR mode and looked around using the Oculus Rift, so my apologies to those looking for a standard single eye view. Regardless, you can see the effect of the sunlight bouncing around during the daytime phase and then the same effects during the moonlight phase.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghPNPUJBlXhG98C090MHJiDpCVu-wZltSi-Mhbcpe-QBEfGzUdvJ0sJj2TAWiZBU0jfXyX_Tmwd8lBmOcNgDt7nF4T3saEn-UHCw5usldV3BojbSgx5j0NMe0ImVZu5jZG9hf3Ucy6rk4/s1600/sun.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghPNPUJBlXhG98C090MHJiDpCVu-wZltSi-Mhbcpe-QBEfGzUdvJ0sJj2TAWiZBU0jfXyX_Tmwd8lBmOcNgDt7nF4T3saEn-UHCw5usldV3BojbSgx5j0NMe0ImVZu5jZG9hf3Ucy6rk4/s1600/sun.jpg" height="426" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJOvKaDm1blMJldOC38V5WPGCXVNfCG8ur04huXwSKLsXY0RGbBV0fY-rf6RYP_zvCha1izjIhqlVUwFrz_7kb9dkAAUnQupgtf6UpJvKpgTlWih3bdoxkKs7Gd9If8sCyj-QnrUCQRHU/s1600/moon.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJOvKaDm1blMJldOC38V5WPGCXVNfCG8ur04huXwSKLsXY0RGbBV0fY-rf6RYP_zvCha1izjIhqlVUwFrz_7kb9dkAAUnQupgtf6UpJvKpgTlWih3bdoxkKs7Gd9If8sCyj-QnrUCQRHU/s1600/moon.jpg" height="426" width="640" /></a></div>
<br />
Effects like this used to be possible only in an offline rendering situation, with all the indirect bounce-light calculations sometimes taking hours. With a little up-front pre-computation [in this case about 45" on my computer], this lighting can play back at realtime speeds.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkYGxAFjjfSOPdlimxCwklDxQBhsLBdMJxWzH9c6RFdNoDcfSYq0F1RYfHcxURYsnCeB34Z5lG4dbKttUHDux-pwgu_FJFPqXcuJjF_6LlfZQz54pcuVYPLkbqtoaGuKxQcOIxKyQBdwM/s1600/3DdrainChamber2Big.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkYGxAFjjfSOPdlimxCwklDxQBhsLBdMJxWzH9c6RFdNoDcfSYq0F1RYfHcxURYsnCeB34Z5lG4dbKttUHDux-pwgu_FJFPqXcuJjF_6LlfZQz54pcuVYPLkbqtoaGuKxQcOIxKyQBdwM/s1600/3DdrainChamber2Big.jpg" height="320" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">This image, a mental ray render in Maya from 2003, took about an hour to render, from memory.</td></tr>
</tbody></table>
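The reason this can be realtime at all is that the expensive part - how light bounces between static surfaces - depends only on the geometry, not on the light, so it can be baked once and reused every frame. A toy numpy sketch of that precomputed-transport idea [my own simplification of the general technique, not Geomerics' actual algorithm]:

```python
import numpy as np

# 4 surface patches in a toy scene; F[i, j] = fraction of light leaving
# patch j that arrives at patch i (depends only on static geometry)
rng = np.random.default_rng(7)
F = rng.random((4, 4)) * 0.1
np.fill_diagonal(F, 0.0)            # a patch doesn't light itself

# the one-off "bake": total transport over infinitely many bounces
# is the geometric series F + F@F + F@F@F + ... = (I - F)^-1 - I
I = np.eye(4)
T = np.linalg.inv(I - F) - I

def relight(direct):
    """Per-frame step: any direct lighting in, bounced (indirect)
    lighting out. Just a matrix multiply, so the sun can move
    every frame without re-baking anything."""
    return T @ direct

day = relight(np.array([1.0, 0.0, 0.0, 0.0]))     # sun hits patch 0
night = relight(np.array([0.0, 0.2, 0.0, 0.0]))   # dim moon hits patch 1
```

That's why my 45-ish pre-compute buys a day/night cycle that plays back live: the bake captures the bouncing, and the per-frame work is trivial.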
So it's a really exciting time to be making interactive 3D content - realtime GI is a major stepping stone towards realistic lighting that can immerse the viewer.<br />
<br />
-jJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com4tag:blogger.com,1999:blog-1829332005373102680.post-55427085268618661802015-02-20T21:02:00.003+13:002015-02-20T21:02:48.712+13:00Sunshine Observation Deck now on Oculus ShareMy demo was approved and made publicly available on <a href="https://share.oculus.com/" target="_blank">Oculus Share</a> today, and it made the front page - Woohoo! It's an honour to be selected alongside others such as the <i>Apollo11 Experience</i> in a <a href="http://www.reddit.com/r/oculus/comments/2whkbs/oculus_share_content_update_219/" target="_blank">long list of new additions and updates</a>.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6ycgKUsXPUditJ05oo3hHDH_MjplsqpUoF6Ncdus71uQvGLN5RkXADzjl9pKaGgyKlw3D9dt_NrCcOWojgcuQe3KGyxqMc4pxi1XsgKOkFbKgrdZD6T8U0DmxwP0EPLbQQd1uxmiqhr0/s1600/OculusShare.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6ycgKUsXPUditJ05oo3hHDH_MjplsqpUoF6Ncdus71uQvGLN5RkXADzjl9pKaGgyKlw3D9dt_NrCcOWojgcuQe3KGyxqMc4pxi1XsgKOkFbKgrdZD6T8U0DmxwP0EPLbQQd1uxmiqhr0/s1600/OculusShare.jpg" height="640" width="600" /></a></div>
<br />
Oculus host the versions you can download now, so I probably won't kill the free dropbox bandwidth I'm currently using. But you'll need an Oculus account to download these so I'll keep these links live for anyone who doesn't have one:<br />
<br />OSX:<br /><a href="https://www.dropbox.com/sh/ptsph0fkn0r3yob/AABQfls0b3SY2s92KeP9nFQma?dl=0">https://www.dropbox.com/sh/ptsph0fkn0r3yob/AABQfls0b3SY2s92KeP9nFQma?dl=0</a><br /><br />Win7/8: Note: This demo runs great in win7, I've not tested in win8 however.<br /><a href="https://www.dropbox.com/sh/s9gpcjt67phby5z/AACt8otcG3wwwxg2kFC1g5Rya?dl=0">https://www.dropbox.com/sh/s9gpcjt67phby5z/AACt8otcG3wwwxg2kFC1g5Rya?dl=0</a><div>
<br /></div>
<div>
Thanks to Oculus for providing a curated hosting solution.</div>
<div>
<br /></div>
<div>
-j</div>
Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com1tag:blogger.com,1999:blog-1829332005373102680.post-53052611618837577662015-02-11T10:06:00.002+13:002015-02-11T10:06:50.265+13:00Sunshine Observation Deck VR demo released!<div>
You can now download the demo and try it for yourself [Requires Oculus Rift DK-2]. Here's a youtube clip to give you a taste:</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/LP3S-0hU2ZU/0.jpg" src="http://www.youtube.com/embed/LP3S-0hU2ZU?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div>
<div>
<br /></div>
<div>
There's more info here on the <a href="https://forums.oculus.com/viewtopic.php?f=28&t=20072" target="_blank">Oculus Dev Forum post</a>. And here's a download link to the builds if you want to jump in:</div>
<div>
<br /></div>
<div>
<div>
OSX:</div>
<div>
https://www.dropbox.com/sh/ptsph0fkn0r3yob/AABQfls0b3SY2s92KeP9nFQma?dl=0</div>
<div>
<br /></div>
<div>
Win7/8: Note: this demo runs great in Win7; I've not tested it in Win8, however.</div>
<div>
https://www.dropbox.com/sh/s9gpcjt67phby5z/AACt8otcG3wwwxg2kFC1g5Rya?dl=0</div>
</div>
<div>
<br /></div>
<div>
Time to start thinking about the next project. This time a Unity5 VR test of the real time global illumination capabilities of <a href="http://www.geomerics.com/enlighten/" target="_blank">Enlighten from Geomerics.</a></div>
<div>
<br /></div>
<div>
-j</div>
Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-12274233004699250082015-02-04T23:23:00.002+13:002015-02-04T23:36:33.690+13:00Put some pants on and make a cup of tea, the sun is nearly up.Almost finished dressing the sun in flares and magnetic flux elements.<br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGCGJvbXgCeqj0SMyVmhHwDw74iLiMC8xQ8RASA4w7yXq7gHC6Yo2PXijKkztBsP7ofb4ZiSLOGVYNrc0rG_9J5fbyESoUbgHT9xwfd4uPBqwQe3XXnvsQyUlv9RfNxu9AEAO-sVrarqA/s1600/sunDressedUp.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGCGJvbXgCeqj0SMyVmhHwDw74iLiMC8xQ8RASA4w7yXq7gHC6Yo2PXijKkztBsP7ofb4ZiSLOGVYNrc0rG_9J5fbyESoUbgHT9xwfd4uPBqwQe3XXnvsQyUlv9RfNxu9AEAO-sVrarqA/s1600/sunDressedUp.jpg" height="364" width="640" /></a></div>
<br /></div>
<div>
-j</div>
Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-274437620395977142015-01-16T00:07:00.003+13:002015-01-28T10:36:10.064+13:00Transparency and deferred rendering are a bad combo, plus a mysterious lady appears?<div class="separator" style="clear: both; text-align: center;">
</div>
Merry Christmas and a Happy New Year to you. Hope you had some time off and a relaxing break. I certainly did while playing a lot of <a href="http://blazerush.com/" target="_blank">BlazeRush</a> with my kids. BlazeRush has a solid frame rate, excellent effects, great feeling dynamics and it's easy to play. The <a href="http://www.targem.ru/en/" target="_blank">devs</a> just added some of the best VR support I've seen in a commercial title too. It's great value for money on Steam, go get it.<br />
<br />
This post will cover my discoveries about the difference between <a href="http://docs.unity3d.com/Manual/RenderTech-DeferredLighting.html" target="_blank">deferred rendering</a> and <a href="http://docs.unity3d.com/Manual/RenderTech-ForwardRendering.html" target="_blank">forward rendering</a> in Unity, and how alpha transparencies forced me to learn a thing or two, plus some screenshots of where the Sunshine observation deck is at. It's nearly time to release it to the community, as I'm itching to start on another idea I've had. It's a bit technical, but there are nice pictures at the end.<br />
<br />
<a href="http://blog.julianbutler.com/2014/12/physically-based-rendering-cubemap.html">In the last post I talked about physically-based rendering and cubic environment maps</a>, shortly after which I needed to create the transparent, alpha-mapped smoked-glass window you can see in the frame grabs from the movie below:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg3ON7leLi_BOE8cQQpuSewWxcZ-SglH5Y8o0mc6oSrM54Hb8VLpUIhjcCGMEjpo-LERiYJ54izqvj1nAz_Xy_m4s_75r4GIOMPgV1DAxSSpjXUHV9fB11-THUzKyxzi7dSQpNwlqq2vYo/s1600/smokedThree.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg3ON7leLi_BOE8cQQpuSewWxcZ-SglH5Y8o0mc6oSrM54Hb8VLpUIhjcCGMEjpo-LERiYJ54izqvj1nAz_Xy_m4s_75r4GIOMPgV1DAxSSpjXUHV9fB11-THUzKyxzi7dSQpNwlqq2vYo/s1600/smokedThree.jpg" height="264" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Smoked-glass sliding door screen-left.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGXQDMDCTjn84LmozmFjwemfDXW-BDmfADnZX8WlSkeYu405T5x-s-VmvJoXyElO3wTSx0gD6gq4cNO02xrzHnmlg9zio0AiAxc_wCNaTnQ6QnnrHAYp_WV1uiJkrDyNh__BrlXzNVQ2U/s1600/smokedOne.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGXQDMDCTjn84LmozmFjwemfDXW-BDmfADnZX8WlSkeYu405T5x-s-VmvJoXyElO3wTSx0gD6gq4cNO02xrzHnmlg9zio0AiAxc_wCNaTnQ6QnnrHAYp_WV1uiJkrDyNh__BrlXzNVQ2U/s1600/smokedOne.jpg" height="264" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">More smoked-glass, this time from inside the science lab.</td></tr>
</tbody></table>
As a new Unity user [n00b], I had previously switched on deferred rendering in the graphics pipe [because it sounded great], paying attention only to the part of the documentation mentioning that deferred rendering supports a multitude of dynamic lights. I thought 'Oh yeah, dynamic lighting, that's something I'm going to want for sure', and completely ignored the part that mentions transparent objects must be rendered in forward rendering mode.<br />
<br />
I happily proceeded to create the sliding-door geo, apply Unity Pro's refractive shader, and watch my frame rate plunge to sub-60fps as I approached the glass.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0DXl3uB1eOAkdj_-o1s-w02Zduqhv992q_0UNOPW0MnTnV86VIvJXEuOQgHMDYxmMmnAmvJ8SJvDMm6GZXDbh8wio4eRAFKZ4xT6LWPDFBUA8wc0bQ5JxyQy038s5Wu1qpMpusDqHEFE/s1600/FPS.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0DXl3uB1eOAkdj_-o1s-w02Zduqhv992q_0UNOPW0MnTnV86VIvJXEuOQgHMDYxmMmnAmvJ8SJvDMm6GZXDbh8wio4eRAFKZ4xT6LWPDFBUA8wc0bQ5JxyQy038s5Wu1qpMpusDqHEFE/s1600/FPS.jpg" height="416" width="640" /></a></div>
<br />
Oh noes!!! The horror! I resigned myself to the fact that I probably couldn't afford this cool-looking effect with my GPU [nVidia GTX 680MX] and swapped the Pro refraction shader for a plain alpha-mapped transparency shader. So sad. However, I remembered that for achieving a sense of presence, Oculus recommends a high frame rate over visual fidelity.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifQ7n18fj1WW7j_cwLAAYGmktwveAZcNPMoLZJg8mnuwbx_kYFFDxQIXuuhNeCgKoqqeFDbffVY0YlYnDYmtp9lcfl7sxt_O6xwUuVfuXUENGi0cry33ORT-EIMDxfYqT7JgFODQWF20g/s1600/refract.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifQ7n18fj1WW7j_cwLAAYGmktwveAZcNPMoLZJg8mnuwbx_kYFFDxQIXuuhNeCgKoqqeFDbffVY0YlYnDYmtp9lcfl7sxt_O6xwUuVfuXUENGi0cry33ORT-EIMDxfYqT7JgFODQWF20g/s1600/refract.jpg" height="334" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">So money. I must have it at all costs.</td></tr>
</tbody></table>
But when I previewed the alpha-mapped transparency shader in the Rift, the frame rate still took a nosedive as I approached the glass. So the transparency itself was the cause of the problem, not the refraction shader? But why should transparency cause such a hit to the GPU?<br />
<br />
There are two main problems, and the answer lies in the order in which Unity renders the objects I'm asking it to draw. I risk stuffing both feet in my mouth trying to explain this, as I have neither OpenGL coding experience nor a computer science background, but here goes a quick summary. <a href="http://gamedevelopment.tutsplus.com/articles/forward-rendering-vs-deferred-rendering--gamedev-12342" target="_blank">And here's a link to a great page describing both forward and deferred rendering modes</a> if you'd like more info.<br />
<br />
To draw the current frame, the GPU typically draws the scene's objects starting from the furthest visible object and working forward towards whatever is nearest the camera. This is good because objects stack logically and appear in the correct order depth-wise, but it's also bad because things hidden behind other things get drawn unnecessarily, wasting GPU resources. This is the first problem. This unnecessary drawing is called 'overdraw', and Unity in fact has a viewport mode dedicated to showing you what's hiding behind other things:<br />
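To make the overdraw idea concrete, here's a toy Python sketch [my own illustration, not anything from Unity's actual renderer] that paints a 1D "framebuffer" back to front and counts how many pixel writes were wasted on surfaces that ended up hidden:

```python
# Toy illustration of overdraw: a 1D "framebuffer" drawn back-to-front
# (the painter's algorithm). Every write costs the GPU work, but only
# the front-most write per pixel is ever actually seen.

def draw_back_to_front(framebuffer_width, objects):
    """objects: list of (depth, start, end, label); larger depth = farther."""
    framebuffer = [None] * framebuffer_width
    writes = 0
    # Sort farthest-first, then paint each object over what's behind it.
    for depth, start, end, label in sorted(objects, key=lambda o: -o[0]):
        for x in range(start, end):
            framebuffer[x] = label  # overwrites whatever was behind
            writes += 1
    visible = sum(1 for pixel in framebuffer if pixel is not None)
    return writes, visible

# A far wall fully hidden behind a nearer wall: half the writes are wasted.
writes, visible = draw_back_to_front(
    8, [(10, 0, 8, "far wall"), (1, 0, 8, "near wall")])
print(writes, visible)  # 16 writes for only 8 visible pixels
```

Occlusion culling attacks exactly that wasted half: if the far wall is never submitted, the writes drop to match the visible pixels.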
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9FNXXBHuFufFtSWQ9tKXFipclXGdPyIjx2zpWte4VdEiJRyuttbQ68OWMTnh7pyAaWGzB-mbm-CtBo_iBe4IxXJdob24ZJOsZyMZgyh9TUctDqO9JrwLFSqmaNbdDIZ5pGs3uWauj3rw/s1600/overdraw.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9FNXXBHuFufFtSWQ9tKXFipclXGdPyIjx2zpWte4VdEiJRyuttbQ68OWMTnh7pyAaWGzB-mbm-CtBo_iBe4IxXJdob24ZJOsZyMZgyh9TUctDqO9JrwLFSqmaNbdDIZ5pGs3uWauj3rw/s1600/overdraw.jpg" height="338" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Unity's Overdraw viewport mode - How to know what is transparent and what is not though?</td></tr>
</tbody></table>
Smart video game design minimises this with occlusion culling: removing objects that are hidden behind others so the GPU never needs to draw them. I'm not being that efficient yet, however, and am just relying on not having too much stuff to draw. We're in space, after all. This doesn't really impact the base rendering speed *too* much, but it's part of the overall problem.<br />
<br />
The second part of the problem is that I'd asked Unity to draw in deferred rendering mode, where the geometry is processed in multiple passes that separate the jobs of drawing, lighting and texturing the scene's objects. In that lighting pass - and this is the main advantage of deferred rendering - many, many lights can contribute to the scene's illumination cheaply, as no texturing or other work is done at that time; hence you can have lots of dynamic lights. But during the remaining texturing and compositing phase of the draw, the transparent portions of foreground objects must be considered when calculating the pixel sample values of the objects behind them, and it's this continuous checking and sampling that destroys any speed gains made. The glass, and the way its transparency contributes to the appearance of the objects behind it, has to be considered at every step. In fact, the closer you get to the glass, or the more of your view it covers, the slower the GPU goes.<br />
<br />
Forward rendering, however, also draws objects from back to front, but it draws, lights and textures the scene's objects as it goes. Each object is rendered in its entirety, then the next closest, and so on, until right at the front, at the last millisecond, the transparent glass is drawn over the top and BLAM, that frame is done. It's this brute-force approach that makes drawing transparent foreground objects feasible.<br />
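The back-to-front blending that forward rendering performs for the glass can be sketched in a few lines of Python. This is just the standard alpha-compositing "over" formula, not Unity code; the point is that each transparent layer needs the already-composited colour behind it, whereas a classic deferred G-buffer keeps attributes for only one surface per pixel and so can't do this per-layer blend:

```python
# Forward-style back-to-front alpha compositing for one colour channel.

def blend_over(dst, src, alpha):
    """Standard 'over' operator: src composited over dst."""
    return src * alpha + dst * (1.0 - alpha)

def composite_back_to_front(background, layers):
    """layers: list of (colour, alpha), farthest layer first."""
    colour = background
    for layer_colour, alpha in layers:
        colour = blend_over(colour, layer_colour, alpha)
    return colour

# A bright sun (1.0) seen through two panes of 50%-opaque dark glass (0.0):
print(composite_back_to_front(1.0, [(0.0, 0.5), (0.0, 0.5)]))  # 0.25
```

Every pixel the glass covers pays this read-modify-write cost per layer, which is why the frame rate falls as more of the view fills with glass.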
<br />
Sure enough, switching Unity to forward rendering instantly restored my frame rate and allowed me to use the refractive shader, which looks cool. It's a little over the top, as the glass on a spaceship this modern probably has no imperfections at all through which the background would be distorted, but I think it adds to the ambience. And since my scene is mostly static, I was able to bake any shadow-casting lights and the extra stuff afforded by deferred rendering into the light-mapping. That's my problem solved!<br />
<br />
Phew. If you're still here, congrats. Hope that wasn't too painful. And if I got this wrong, don't hesitate to tell me in the comments. Now for some screen shots of where the observation deck is at.<br />
<br />
EDIT: <a href="http://www.reddit.com/r/gaming/comments/12j1jn/" target="_blank">Here's a great read/rant from a graphics programmer</a> about the different rendering styles, as well as a discussion on why it's hard to make mirrored surfaces in games.<br />
<br />
Please bear in mind that the following images lack the optical effects present in the Rift's view, which assist dramatically in creating the atmosphere that's otherwise missing. But the double imaging of the Rift screen makes it tough to see what's there, so this is how you can see it for now:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhF-jRc3x2LP1zSJ0Ea4hKCsCiZYOAqROy-C3S_V9REeLNrTb0ajbEqO_XekfhxDsCGIbqcrHH682WXwKgx5xWWf4kBxFE1CsuA4qRiXGznKZ7gW1fPJyYIs86cpALBvD85OT_lsbjLHjQ/s1600/company.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhF-jRc3x2LP1zSJ0Ea4hKCsCiZYOAqROy-C3S_V9REeLNrTb0ajbEqO_XekfhxDsCGIbqcrHH682WXwKgx5xWWf4kBxFE1CsuA4qRiXGznKZ7gW1fPJyYIs86cpALBvD85OT_lsbjLHjQ/s1600/company.jpg" height="306" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I found a free, high-quality model of a seated woman online to share the viewing couch.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirthZFR2_OlcBhJ9hggwICkpDZ2HxpDvvclz3javB-4f7IzlDZxxXYCnzWmXAC2RB-cg8i-jng8P0zFPw5ivDMDgMj13rZITWOEO9rorlYZs5BqyJXKlRSIXjvkhitrH_G4HNl9Jwk7jU/s1600/exposure.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirthZFR2_OlcBhJ9hggwICkpDZ2HxpDvvclz3javB-4f7IzlDZxxXYCnzWmXAC2RB-cg8i-jng8P0zFPw5ivDMDgMj13rZITWOEO9rorlYZs5BqyJXKlRSIXjvkhitrH_G4HNl9Jwk7jU/s1600/exposure.jpg" height="300" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Viewing deck exposure controls are present. </td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiay902E___c3hKMBMvAuZtI5hvv4BdmQY6Y9ojRZEKlqEnVBH8jh93GyAhraH-Yp6iW903SaHzLpN104PpqQh_8syP2kNeVIAqfcHZLkNwf4NfwLbiTfTLpIGW3cKOT5wsy6LDJLrom5s/s1600/rearCorridorHatch.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiay902E___c3hKMBMvAuZtI5hvv4BdmQY6Y9ojRZEKlqEnVBH8jh93GyAhraH-Yp6iW903SaHzLpN104PpqQh_8syP2kNeVIAqfcHZLkNwf4NfwLbiTfTLpIGW3cKOT5wsy6LDJLrom5s/s1600/rearCorridorHatch.jpg" height="328" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Rear corridor hatch and controls.</td></tr>
</tbody></table>
<div style="text-align: left;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqwLlVhOXivaafuo-uceDUO-jiJIxT6UQSm3Xb_YiypRktYpH8hwIUBaaxpyCnscMcBkDmQx-mOVZ55fdg809Xrr9PPvoWPEhkQuk86EXaOmcmjIzRu4nNqoRRU2ysblxDKmTU-pWCCYc/s1600/rearWallPanel.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjqwLlVhOXivaafuo-uceDUO-jiJIxT6UQSm3Xb_YiypRktYpH8hwIUBaaxpyCnscMcBkDmQx-mOVZ55fdg809Xrr9PPvoWPEhkQuk86EXaOmcmjIzRu4nNqoRRU2ysblxDKmTU-pWCCYc/s1600/rearWallPanel.jpg" height="330" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Rear wall normal-mapped panelling.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTs8bfmkUbrHx-nTahS8pXz0pm5EkmSWuciaqlj58iakwYHi9I7SjNi5ANbqfKqyR60MQod-PO3hMBqI6HfBSx-UqwaXGDQV-SoA46j9jaIiYZgjg3Y_NHuXEX6qHy5-TB2stNNoe-0AQ/s1600/scienceLab.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTs8bfmkUbrHx-nTahS8pXz0pm5EkmSWuciaqlj58iakwYHi9I7SjNi5ANbqfKqyR60MQod-PO3hMBqI6HfBSx-UqwaXGDQV-SoA46j9jaIiYZgjg3Y_NHuXEX6qHy5-TB2stNNoe-0AQ/s1600/scienceLab.jpg" height="326" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Science lab doorway.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGkIHoZolv-LVVdyV1BkOQz5qk6287EPnjrIq0-QlOxt-mws7h-bfaa3Bx1cLnIjcXgP5lLO5bVYX358BMpg7yzArnRDVBwrOj68wjD1p1CO7OgzsTT75-kxv5keytv2OQDJXwSOpPKRg/s1600/labRear.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGkIHoZolv-LVVdyV1BkOQz5qk6287EPnjrIq0-QlOxt-mws7h-bfaa3Bx1cLnIjcXgP5lLO5bVYX358BMpg7yzArnRDVBwrOj68wjD1p1CO7OgzsTT75-kxv5keytv2OQDJXwSOpPKRg/s1600/labRear.jpg" height="326" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Ergonomic chair. On a spaceship. Of *course* it's ergonomic. Who'd send an un-ergonomic chair into space!?</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj14ErC7X87KkBgr3crJT5AEUrdXaThyphenhyphenYaFx0KD9DnXmZn72AHoXdr5ZWHtJHMcJFEoT3QfT8F_UBmlorNTxQk4facBX0uMbxl74GkD4YzrSpmWcsmgmY8GeQC83belGwhD5KUJffB5vTI/s1600/SDOmonitors.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj14ErC7X87KkBgr3crJT5AEUrdXaThyphenhyphenYaFx0KD9DnXmZn72AHoXdr5ZWHtJHMcJFEoT3QfT8F_UBmlorNTxQk4facBX0uMbxl74GkD4YzrSpmWcsmgmY8GeQC83belGwhD5KUJffB5vTI/s1600/SDOmonitors.jpg" height="322" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Monitor bank, and [how exciting] a server rack. Blinking lights.</td></tr>
</tbody></table>
And at this point my goal is to return to detailing and texturing the magnetic flares around the sun and then release the demo to the community. The aim of this demo is to recreate a location from the film as well as offer a relaxing Oculus demo where you can get a tan in VR. I'll probably be spending a chunk of the Wellington winter months in there attempting to offset seasonal affective disorder.<br />
<br />
Till next time!<br />
<br />
-j<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com4tag:blogger.com,1999:blog-1829332005373102680.post-53004011893788278152014-12-03T22:43:00.005+13:002014-12-12T12:02:45.680+13:00Physically Based Rendering, cubemap reflections, parallax correction, light-mapping and more!So, I feel I've tumbled down some sort of rabbit hole in the last few weeks. What <a href="http://blog.julianbutler.com/2014/11/game-dev-realtime-rendering-and-oculus.html">started as an innocent attempt to model a room from a movie</a> has transmogrified into a headrush of new learning about the current state of the art in real-time rendering, and thus a fair amount of dissatisfaction with my current abilities. Only to be expected, really!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.marmoset.co/wp-content/uploads/header01.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.marmoset.co/wp-content/uploads/header01.jpg" height="360" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The excellent <a href="http://www.marmoset.co/" target="_blank">Marmoset Toolbag</a> in action</td></tr>
</tbody></table>
Last post I felt like all I needed was some realistic floor-reflection cubemaps. Google got me off the starting line with some Christmas-ornament reflection maps that sorta worked:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuHbCOAMVvIKELaCUKslGvPCdT_ZyzcM_C1fQk47KbqrUcljJwEp-FcYZE6n5z6irAW2z3amvjqDD2oUq_4WdAmkpsij7anMYqJJcn59C2v1fCG83OCJnujLhQt6RIKJ9OBG9WzvQNxHKw/s1600/Preview+0.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjuHbCOAMVvIKELaCUKslGvPCdT_ZyzcM_C1fQk47KbqrUcljJwEp-FcYZE6n5z6irAW2z3amvjqDD2oUq_4WdAmkpsij7anMYqJJcn59C2v1fCG83OCJnujLhQt6RIKJ9OBG9WzvQNxHKw/s1600/Preview+0.png" height="356" width="640" /></a></div>
<br />
Which, when used as a cubic reflection map, produced this sort of look on my floor tiles:<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4qsM-Gtbt4KOoC2Szo8hN4Gs7zaHkk_CMlrFwjrXOS41HUX5ksprrq8YWW-7zMnfweARsm7xliX5hpTWC4Kyza2vOT4T-PbJhx1ztWI06YNAx8tYd1teWAk5UW8H8ViYmVo8JFWvBgnw/s1600/christmasReflect.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4qsM-Gtbt4KOoC2Szo8hN4Gs7zaHkk_CMlrFwjrXOS41HUX5ksprrq8YWW-7zMnfweARsm7xliX5hpTWC4Kyza2vOT4T-PbJhx1ztWI06YNAx8tYd1teWAk5UW8H8ViYmVo8JFWvBgnw/s1600/christmasReflect.png" height="334" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">yeah... ish</td></tr>
</tbody></table>
<div style="text-align: start;">
<div style="text-align: start;">
<span style="text-align: center;">I discovered that <a href="http://docs.unity3d.com/ScriptReference/Camera.RenderToCubemap.html" target="_blank">Unity can render these rather easily internally</a> too. By placing a transform near where the viewer will be situated, I can render six 90° camera projections that form a box containing the images that anything shiny inside that box would see reflected. This is great. Now my normal-mapped floor can reflect the sun! My own cubic reflection map now looks more like this:</span></div>
</div>
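For the curious, picking which of those six faces a given lookup direction lands on is just a dominant-axis test. Here's a quick Python sketch [my own illustration of the standard cubemap convention, not Unity's API]:

```python
def cubemap_face(direction):
    """Pick which of the six 90-degree cubemap faces a lookup direction
    hits, by finding its dominant axis (the standard cubemap convention)."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"

# A mostly-upward reflection vector samples the ceiling (+Y) face:
print(cubemap_face((0.2, 0.9, 0.1)))  # +Y
```

The GPU does the same selection in hardware for every cubemap texture fetch, then samples within the chosen face.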
<span style="text-align: center;"><br /></span>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRJRq_m65LGG4ybdNMnmPNzOiqED7xasX5zjwshDWMk8VCU70F6pP70ZzcUhaLeSQ672tlRCHT9aGPcv55gpZZvoK_UFa8F4grs4FAFIIPLwU0HRgF2Wp0J5BPnQLxDax3kfupLtF8okM/s1600/ReflectDeckCubeMap.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRJRq_m65LGG4ybdNMnmPNzOiqED7xasX5zjwshDWMk8VCU70F6pP70ZzcUhaLeSQ672tlRCHT9aGPcv55gpZZvoK_UFa8F4grs4FAFIIPLwU0HRgF2Wp0J5BPnQLxDax3kfupLtF8okM/s1600/ReflectDeckCubeMap.jpg" height="426" width="640" /></a></div>
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">Which produces reflections like these:</span><br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhozj0sNkY0cOMI7UOuqtuHYYCmGcXRne6IbLwt2iW1yhwUcvjbpxGfvhiAGoTLjUyEXch8ygtjiykxHcev267Cbi6-M9UY0hhSNvKpas4NQFR50NM0N2CzOuBNRvJAMzwxhDWdSJT_yQ8/s1600/actualSunReflect.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhozj0sNkY0cOMI7UOuqtuHYYCmGcXRne6IbLwt2iW1yhwUcvjbpxGfvhiAGoTLjUyEXch8ygtjiykxHcev267Cbi6-M9UY0hhSNvKpas4NQFR50NM0N2CzOuBNRvJAMzwxhDWdSJT_yQ8/s1600/actualSunReflect.jpg" height="338" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The actual sun reflecting on the actual floor! I'm done! ... NOT.</td></tr>
</tbody></table>
But... well... this is all well and good if the floor is perfectly reflective. Which it's not. It should be covered in micro-abrasions that scatter the incoming light rays and make the reflections blurry. How can I create this in Unity? There's no 'roughness' slider in the default shaders, and my metal floor is a long way from looking anything like the quality camera-lens renders above.<br />
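One common engine trick [and an assumption on my part about how Skyshop-style shaders approach it, rather than anything from Unity's docs] is to pre-blur the cubemap's mip chain and then pick a mip level based on roughness at lookup time. A Python sketch of that mapping:

```python
import math

def reflection_mip_level(roughness, base_resolution):
    """Map a 0..1 roughness value to a cubemap mip level. Rougher surfaces
    sample smaller, pre-blurred mips. The linear mapping here is a toy
    approximation; real engines often use perceptual remapping curves."""
    # A base_resolution-sized texture has log2(res) + 1 mip levels.
    num_mips = int(math.log2(base_resolution)) + 1
    return roughness * (num_mips - 1)

print(reflection_mip_level(0.0, 256))  # 0.0 -> sharp mirror, full-res mip
print(reflection_mip_level(0.5, 256))  # 4.0 -> noticeably blurred mip
```

The pre-blurring happens once when the cubemap is built, so per-pixel "roughness" costs little more than a normal texture fetch.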
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><span style="margin-left: auto; margin-right: auto;"><a href="https://www.blogger.com/goog_1176580967"><img border="0" src="http://www.marmoset.co/wp-content/uploads/pbr_theory_microsurf.png" height="246" width="400" /></a></span></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><a href="http://www.marmoset.co/toolbag/learn/pbr-theory">http://www.marmoset.co/toolbag/learn/pbr-theory</a></td></tr>
</tbody></table>
This began my investigation into quality reflection-map creation, which led me to <a href="http://www.marmoset.co/skyshop" target="_blank">Marmoset Skyshop</a> for Unity. Marmoset make a fantastic set of shaders for Unity 4.5 and up that aim to mimic the energy-conserving properties of a surface, and they also provide a really good introduction to physically based rendering. I highly recommend reading their <a href="http://www.marmoset.co/toolbag/learn" target="_blank">Toolbag2 Learning pages</a> if you don't know where to start. Turns out, realtime rendering and offline rendering have a much larger overlap than I assumed, with next-gen game engines requiring an understanding of BRDFs and forcing artists to reconsider albedo [diffuse] and specular maps entirely differently than in the past 20 years.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.marmoset.co/wp-content/uploads/microcompare05.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.marmoset.co/wp-content/uploads/microcompare05.jpg" height="244" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A little learning goes a long way in improving material appearance.</td></tr>
</tbody></table>
It seems each time I want to improve the appearance of my scene with Unity's rendering capabilities, I end up trawling the Unity Asset Store looking for 3rd-party solutions. And there are a few real gems. However, with the imminent release of Unity5 and its reflection probes, realtime GI, and new super shaders, is it worth spending any money to gain these abilities now? Perhaps not.<br />
<br />
So my choices at this point are to begin transitioning my project to Unity5's beta [which I have access to], or to pay money for solutions that give me these effects now in 4.5 but may or may not be supported or required going forward...<br />
<br />
For now I chose to apply what I've learnt with what I have. PBR theory has really helped me get better results from the current shader controls - for example, the leather texture below has baked-in cracks and grain, breaking up the reflections and giving the couch top a much better feel:<br />
<span style="text-align: center;"><br /></span>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnwPINHyXrU35M5oFGCzK3O9rOD6b8WPZIz-hhCCM6sTV2UtxuAun3Zp9h8OCyMQuvsXf-igzK4SEIsQk60cggQnmjRwwRD2CULj6ri3N5uzoQ0hC_vRIVwb0IqMAJqWT1tDqDzyCpxBU/s1600/couchClose.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnwPINHyXrU35M5oFGCzK3O9rOD6b8WPZIz-hhCCM6sTV2UtxuAun3Zp9h8OCyMQuvsXf-igzK4SEIsQk60cggQnmjRwwRD2CULj6ri3N5uzoQ0hC_vRIVwb0IqMAJqWT1tDqDzyCpxBU/s1600/couchClose.jpg" height="344" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Complete with hand sculpted bum impressions.</td></tr>
</tbody></table>
<span style="text-align: center;">I'll make the jump to a proper PBR version of this scene as Unity5's tech releases. In the meantime I've started light-mapping the room's illumination to get better ambient occlusion and mood. The viewing couch is modelled and in place and some new shaders created.</span><br />
<span style="text-align: center;"><br /></span>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3Ab3A5edyv4BkpcrnXUp5P56XfQtkFaYyRBcDyL-O5FHaBNlTXzA4VsavruJF4arx8pzAlH8KadFU6Y5B8UoukmdIF28YP9l0LeaReD9LlvVtDGClDO9MI3mMmyk_lDmrv_flPfhnvuU/s1600/couchMaya.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3Ab3A5edyv4BkpcrnXUp5P56XfQtkFaYyRBcDyL-O5FHaBNlTXzA4VsavruJF4arx8pzAlH8KadFU6Y5B8UoukmdIF28YP9l0LeaReD9LlvVtDGClDO9MI3mMmyk_lDmrv_flPfhnvuU/s1600/couchMaya.jpg" height="384" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A fun model to make with curved metals and a low-ish polycount.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3fvwxKE5KD0zCf63ckkSdWadamahXUo-8B1W787KFT1RFEJFIwp7CcGvPUBgh-gzW3SRKw2EeSprPH2xFhYpZeKSLlX-oVSfBKmYMm_KxEiDyx_7aO8-r1CRTzwqIsKQ46IaMEFZvtaw/s1600/couchWide.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3fvwxKE5KD0zCf63ckkSdWadamahXUo-8B1W787KFT1RFEJFIwp7CcGvPUBgh-gzW3SRKw2EeSprPH2xFhYpZeKSLlX-oVSfBKmYMm_KxEiDyx_7aO8-r1CRTzwqIsKQ46IaMEFZvtaw/s1600/couchWide.jpg" height="348" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Where O where art thine couch reflections in the floor???</td></tr>
</tbody></table>
<span style="text-align: center;">With my current reflection-mapping approach, things don't really line up properly, as the virtual surfaces the reflections are mapped onto are effectively situated an infinite distance away. Thus the sun reflection on the floor at screen-left actually appears where the wall reflections should be. Fixing this requires a technique known as parallax correction - something that <a href="https://www.youtube.com/watch?v=yWBAE-LYcw8" target="_blank">Marmoset has an excellent solution for</a>, and which the fantastic Sebastien Lagarde documents on his <a href="http://seblagarde.wordpress.com/2012/11/28/siggraph-2012-talk/" target="_blank">blog here</a>. I'm not sure I'll implement it, as there are other things I'd like to move on to in this project.</span><br />
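The gist of the correction [box projection] is simple enough to sketch, though: intersect the reflection ray with the room's bounding box, then look up the cubemap using the direction from the probe centre to the hit point. A toy Python version [following Lagarde's description in spirit, not anyone's shipping shader code]:

```python
def parallax_corrected_direction(position, reflect_dir, box_min, box_max, probe_pos):
    """Box-projected cubemap lookup: intersect the reflection ray with the
    room's axis-aligned bounding box, then return the direction from the
    probe centre to that hit point (used as the cubemap lookup vector)."""
    # Distance along the ray to the box plane it exits through, per axis.
    t_exit = float("inf")
    for i in range(3):
        if reflect_dir[i] > 0:
            t_exit = min(t_exit, (box_max[i] - position[i]) / reflect_dir[i])
        elif reflect_dir[i] < 0:
            t_exit = min(t_exit, (box_min[i] - position[i]) / reflect_dir[i])
    hit = tuple(position[i] + reflect_dir[i] * t_exit for i in range(3))
    return tuple(hit[i] - probe_pos[i] for i in range(3))

# A floor point at x=2 reflecting straight up inside a 10-unit box centred
# on a probe at the origin: the corrected lookup leans toward +X, instead
# of pointing straight up as a naive infinite-distance lookup would.
print(parallax_corrected_direction((2, -5, 0), (0, 1, 0),
                                   (-5, -5, -5), (5, 5, 5), (0, 0, 0)))
```

Because the lookup now depends on the shading point's position inside the box, the couch reflections would finally land under the couch instead of drifting onto the walls.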
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">Optically in the Rift view I've got a new starfield in the background [I realise exposure-wise that stars would likely be well under the sun's brightness and thus invisible, but you know, VR!], some sun shafts creating beams of light, some dust motes floating in the room for a little atmosphere [after playing Alien:Isolation in VR I couldn't help it - the modelling and lighting in that game is a masterpiece!] and a few other tweaks in store.</span><br />
<span style="text-align: center;"><br /></span>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWi7gKEQOYsEjM2379xDP6eSFnvkyXMF-0HDC3npwcp7J-2K0otU1xYLEdBoN8fh6DzFCIZ2V4OnyVnOyIQOk_4YJIsYNxNL-BdmaWxqMqcKQbv6-C9oXcrpkVP3gHw-0uO1izlvr4BKU/s1600/riftView.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjWi7gKEQOYsEjM2379xDP6eSFnvkyXMF-0HDC3npwcp7J-2K0otU1xYLEdBoN8fh6DzFCIZ2V4OnyVnOyIQOk_4YJIsYNxNL-BdmaWxqMqcKQbv6-C9oXcrpkVP3gHw-0uO1izlvr4BKU/s1600/riftView.jpg" height="336" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">It's a place you can go and just sit and look at the sun.</td></tr>
</tbody></table>
<span style="text-align: center;">Still hitting 75fps no problem at all in OSX and Win7 so I'm not really near the limits of my GeForce 680MX yet. Speaking of which, the current OVR 4.3.1 runtime and Unity integration produce rock-solid head-tracking in Windows and it's kinda stunning. OSX is still a little swimmy. Getting close to producing a youtube clip [maybe 60fps?] for people to try out too. </span><br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">I'll wrap this up now, but next post I want to detail what I've found out about <strike>timewarping, prediction</strike> alpha transparency, forward rendering Vs. deferred rendering and the Unity Pro Profiler, and of course some scene updates. </span><br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">Looking for where this began? <a href="http://blog.julianbutler.com/2014/11/game-dev-realtime-rendering-and-oculus.html">Click here to visit the beginning</a>.</span><br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">So long for now!</span><br />
<span style="text-align: center;"><br /></span>
<span style="text-align: center;">-j</span><br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-6421533049563002012014-11-11T14:27:00.003+13:002014-11-11T14:31:49.671+13:00Game dev, realtime rendering and the Oculus Rift in my fan tribute to the movie SunshineI've been experimenting with the Oculus Rift DK-2 in Unity for some months now. It's incredibly fun to make a place from an idea you have and then go and visit it virtually. It literally keeps me awake at night when I have a moment of inspiration about something new I could try and then get all excited about how to bring it to life.<br />
<br />
Most recently I've started creating a bit of a tribute to the movie <a href="http://www.imdb.com/title/tt0448134/" target="_blank">Sunshine</a>. I really love the solar observation deck depicted near the start of the film:<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhiPMOsC3pQIx_vWhaetvV5r_rh-8JHVgSiIeEUV-t6Ptggok7votv_bCdxt9axwMjdrh7Z-QrNWhzBfFGL8dtgwcywcYLLRs7u0k3iCQumagFdu5axGvsesLtJZ2LihOLoS4fFJpACA90/s1600/sunDeck2.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhiPMOsC3pQIx_vWhaetvV5r_rh-8JHVgSiIeEUV-t6Ptggok7votv_bCdxt9axwMjdrh7Z-QrNWhzBfFGL8dtgwcywcYLLRs7u0k3iCQumagFdu5axGvsesLtJZ2LihOLoS4fFJpACA90/s1600/sunDeck2.jpg" height="264" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Sometimes it's small, sometimes it's big.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnPud52xtTzSQLVxaSZHIGQDXDLg-bu92gxR2iU-6DPkkemjvHuTMp0vyLR96-f7u1fK47J2Tq6zkBUibSMfwlwX6q14FORuYZFNxrQsMVRr15VUwH9RXysU-MIY7v08H3ibWwOEazjIE/s1600/sunDeck.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhnPud52xtTzSQLVxaSZHIGQDXDLg-bu92gxR2iU-6DPkkemjvHuTMp0vyLR96-f7u1fK47J2Tq6zkBUibSMfwlwX6q14FORuYZFNxrQsMVRr15VUwH9RXysU-MIY7v08H3ibWwOEazjIE/s1600/sunDeck.jpg" height="264" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><a href="http://www.imdb.com/name/nm0193295/" target="_blank">Cliff Curtis</a> asking the sun just how many characters of different ethnicities he can get away with playing.</td></tr>
</tbody></table>
Characters on the ship spend time in this portal just gazing at the sun as they head towards it in their ship [to blow it up of course. I'm not kidding BTW]. And for the first two thirds of the film it's as though the sun is a character in the movie - the main protagonist almost. As for the last third, well you'll have to watch it to see.<br />
<br />
I thought this might be a suitably contained idea to learn some realtime rendering skills and game dev abilities. I'm choosing not to focus on interaction so much as art direction. I'm aiming to make a place where you can just go and sit quietly for a while and watch the sun like the characters in the film did.<br />
<br />
Along the way to this I've had the rude awakening of just what realtime rendering means, and how unconstrained and sort of lazy we are in the film VFX world. In VFX, if something takes even 24 hours to render, well, that's kind of OK because the end result should be amazing, right? Well, in games the goal is often to pull off stunning visual complexity 60, 75 and up towards 120 times per second. And often on hardware that lags behind the sort of computing power available to me at work.<br />
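Those frame-rate targets translate into brutally small per-frame time budgets, which a couple of lines of Python make plain:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render one complete frame at a target rate."""
    return 1000.0 / fps

for fps in (60, 75, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 60 fps -> 16.7 ms, 75 fps -> 13.3 ms, 120 fps -> 8.3 ms
```

Everything in the scene - culling, drawing, lighting, post-effects - has to fit inside that handful of milliseconds, every single frame.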
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiq4J051GvsizKWOqUbXnQbsjawUHP7ewXcRsmDEL_sYaaMCCBeFRsQC_0tw4QyxlvY2Odg7KBjTigLo1kvBzeM-wiOZVrqkJASoxYXTUpMHBr-LQzEzZgEnWWUJ_1Dv6Qk2liIctWQwWg/s1600/sunDeck3.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiq4J051GvsizKWOqUbXnQbsjawUHP7ewXcRsmDEL_sYaaMCCBeFRsQC_0tw4QyxlvY2Odg7KBjTigLo1kvBzeM-wiOZVrqkJASoxYXTUpMHBr-LQzEzZgEnWWUJ_1Dv6Qk2liIctWQwWg/s1600/sunDeck3.jpg" height="264" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Another cool moment where the crew gather to watch Mercury transit across the face of the sun.</td></tr>
</tbody></table>
I'm developing on OSX on a 27" 2013 Apple iMac with an NVIDIA GTX 680MX GPU with 2GB of RAM. Sadly, I do not have a GTX 980. Although I'll be testing builds in Bootcamp under Win7, I'm creating and compiling in OSX because my main DCC toolsets are there at the moment.<br />
<br />
To do this, every aspect of scene creation needs to be efficient. Absolutely NOTHING should be computed that doesn't need to be:<br />
<br />
Extra faces/verts/edges on your model that are not contributing? Get rid of them.<br />
Small modelled-in details that could be represented more efficiently in a map? Map it.<br />
Extra lines in your shader doing nothing? Get rid of them.<br />
Extra geometry being transformed around that you can't see? Get rid of it.<br />
Using similar shaders on multiple different objects? Make them one. Make them all one!<br />
Fancy-looking particle collisions you can't see that don't really add much? Get rid of them.<br />
Amazing post-effects bringing your framerate down? Do you REALLY need them?<br />
<br />
This approach really forces you to consider economies that impact art direction. How fast are procedural textures compared to a painted map? Do you really need all the modelled curvature on the edge of a seat you'll never get close enough to see, if it's costing you 1 FPS? No. So it's a bit of a change from my day job at Weta.<br />
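As a toy illustration of the "make them all one shader" point [hypothetical object and material names, plain Python rather than any engine API], grouping objects by shared material shows how consolidation shrinks the number of batches and state changes a renderer has to issue:

```python
from collections import defaultdict

# Hypothetical scene description: (object, material) pairs.
scene = [
    ("seat_left", "grey_metal"), ("seat_right", "grey_metal"),
    ("window_frame", "grey_metal"), ("floor", "grey_metal"),
    ("sun_sphere", "sun_surface"), ("flare_card", "flare_additive"),
]

# Objects sharing a material can often be drawn together in one batch;
# each distinct material costs at least one draw call / state change.
batches = defaultdict(list)
for obj, mat in scene:
    batches[mat].append(obj)

print(f"{len(scene)} objects -> {len(batches)} material batches")
# With a unique material per object it would be 6 batches; sharing
# 'grey_metal' across the lounge geometry cuts it to 3.
```

Game engines do variations of this automatically [static/dynamic batching], but only if you hand them objects that actually share materials.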
<br />
So right now I've pretty much just got the sun, some flares, some particles and some observation lounge geometry, and that's about it. But I'm also getting 75fps on my 680MX in OSX, so that's a good sign too. And given the current state of the Oculus SDK and runtime [4.3beta], I should be able to keep a decent framerate as things progress, with a definite speed boost under Win7. The screenshots below are very work-in-progress and do not represent the final appearance of this demo.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5EAn9Cs9F_azwk7SDc3RjmTylRpaaTO-q3O4w1TJi3QtZEYJ451q0IksnaNSMJAbd5Wc6yvPaFHwr95ijtCzLKIk3VZuObFPjV1_IJ57cDjTUFTZIjExPYIjNx0T_AONCDHNzBR0TQkY/s1600/UnitySunDeck.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5EAn9Cs9F_azwk7SDc3RjmTylRpaaTO-q3O4w1TJi3QtZEYJ451q0IksnaNSMJAbd5Wc6yvPaFHwr95ijtCzLKIk3VZuObFPjV1_IJ57cDjTUFTZIjExPYIjNx0T_AONCDHNzBR0TQkY/s1600/UnitySunDeck.jpg" height="332" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My sun at the moment, slowly spinning, painted with map data from the <a href="http://sdo.gsfc.nasa.gov/" target="_blank">Solar Dynamics Observatory</a>.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0fkSaf_lLOPTlw4YHuu6dpGT7TGEvZUW37dDDX2jhky-7rMy8pczaa0GvSQYawvYCNasjHMpZLK9fT2W37XxVzukLFVJP8UYlSxOFgETcrh15reXuLsKS3s5xvoUsTVSOztDUiSKxIZo/s1600/UnitySunDeck2.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0fkSaf_lLOPTlw4YHuu6dpGT7TGEvZUW37dDDX2jhky-7rMy8pczaa0GvSQYawvYCNasjHMpZLK9fT2W37XxVzukLFVJP8UYlSxOFgETcrh15reXuLsKS3s5xvoUsTVSOztDUiSKxIZo/s1600/UnitySunDeck2.jpg" height="330" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Just like Minecraft, only hotter.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5lhMr6aRMS0Cdyt3RvoTsuaDXly2bRyvJH7UIHJC-oiJZOkPrwQ9NVSXS0OmJntARf2UKB_6p9-MFMi-AluOZRncBOw9UgzEb9JU5wOxbe2xrweteP_YfBhNLlWAA_WJovker_h0bXgA/s1600/UnitySunDeckOculus.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5lhMr6aRMS0Cdyt3RvoTsuaDXly2bRyvJH7UIHJC-oiJZOkPrwQ9NVSXS0OmJntARf2UKB_6p9-MFMi-AluOZRncBOw9UgzEb9JU5wOxbe2xrweteP_YfBhNLlWAA_WJovker_h0bXgA/s1600/UnitySunDeckOculus.jpg" height="360" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">And of course in the Rift, looking out at my first attempts to paint the solar flare shape textures.</td></tr>
</tbody></table>
Here are some of the things I'm aiming to include in the final version:<br />
<br />
* A viewer-controllable exposure facility so you can radiate yourself should you wish.<br />
* Key music from the movie soundtrack - or ambient ship thrum noise.<br />
* Viewer-triggerable Mercury transit.<br />
* Heat-haze effect to depict the atmosphere in the viewing lounge.<br />
* Sun-shaft optical effects.<br />
* A seated figure with mirrorshades on so you can share your epiphanies. [Paging Cliff Curtis to the solar observatory deck]<br />
<br />
Like I said, I'm not planning to focus much on interaction with this one - just atmosphere, really. But I'm having a blast steadily solving problems one after the other and learning classic game tricks to speed things up.<br />
<br />
A couple of tools from the Unity Asset Store I'm using include <a href="http://u3d.as/content/sonic-ether/se-natural-bloom-dirty-lens/7v5" target="_blank">Sonic Ether's Natural Bloom and Dirty Lens</a>, and also <a href="http://u3d.as/content/pro-flares/pro-flares/62J" target="_blank">ProFlares</a> [which needs some Oculus compatibility updates].<br />
<br />
Next up on my list is HDR cubic environment maps and physically-based rendering.<br />
<br />
So much to learn. So many ways to screw up ;-)<br />
<br />
-julian<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-71176391877152583202014-03-08T23:09:00.000+13:002014-03-08T23:09:03.983+13:00Night Vision Goggles<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.julianbutler.com/Other/Misc/i-VTKCSR7/0/XL/DSCF8424-XL.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://www.julianbutler.com/Other/Misc/i-VTKCSR7/0/XL/DSCF8424-XL.jpg" height="390" width="640" /></a></div>
<br />
He said it was the best thing I ever gave to him.<br />
<br />
They're not real military grade, <a href="http://n4g.com/user/blogpost/sangria/430144">but they work</a>. They came with the Prestige Edition of Call of Duty - Modern Warfare 2 [I did not buy that, I bought these from a friend at work].<br />
<br />
-jJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-22598216584638379422014-02-22T19:33:00.002+13:002014-02-22T19:33:37.455+13:00Lego Downhill Derby at the Ribble Street Races.<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-F7tgNt8/0/XL/DSCF8048-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-F7tgNt8/0/XL/DSCF8048-XL.jpg" height="398" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Jamie and Isobel on the course, getting ready to run.</td></tr>
</tbody></table>
This year as part of the <a href="http://islandbayfestival.org.nz/">Island Bay Festival</a>, the <a href="http://islandbayfestival.org.nz/index.php?cID=151">Ribble Street Races</a> held a Lego Downhill division. Jamie and I could not resist taking part, especially as we felt my old Lego from 1980 had wheels in it that would trounce the competition.<br />
<div>
<br /></div>
<a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/">Click here to jump straight to the photo gallery</a>.<br />
<br />
I still own most of the parts of this Lego Technic 8860 kit from 1980 - thanks to Mum and Dad for keeping hold of this for me!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://farm3.staticflickr.com/2541/4065510867_dabdc83174_o.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://farm3.staticflickr.com/2541/4065510867_dabdc83174_o.jpg" height="425" width="640" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://thelegocarblog.com/2011/11/20/lego-technic-8860-car-chassis-review/">Click here for a review of this kit</a></div>
<br />
So a week ahead of schedule we got down to work. We drew up a cheat sheet of what we felt were the main issues: terrain and hill gradient, wheel choice, centre of gravity and vehicle weight.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-tJmHSXb/0/XL/IMG_4912-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-tJmHSXb/0/XL/IMG_4912-XL.jpg" height="480" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Secret plans, secret plans... plot, hatch, scheme.</td></tr>
</tbody></table>
Here's what we came up with, the main concept being a very low centre of gravity, and being fast and mean [so mean in fact that during the test run we rode roughshod over another vehicle monster-truck style - oops! Oh well, mess with the bull, get the horns I always say]. <a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/">Check out the gallery for more images</a>.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-7vh3sD3/1/XL/IMG_1509-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-7vh3sD3/1/XL/IMG_1509-XL.jpg" height="327" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Say hello to 'Old Skool - 1980'. All your race R belong to US.</td></tr>
</tbody></table>
There was a huge variation in designs, with everything from boats [on wheels] and tanks to monster trucks and small nimble things that went fast but deviated off course very quickly. Jamie and I quickly found that our creation took its sweet time to get up to speed, but then went relatively straight and became somewhat unstoppable.<br />
<br />
The main race stipulations: your entry had to fit within a 40cm cube, all parts had to be actual Lego, no adhesives or extra weight could be used or fitted internally, and your vehicle MUST be able to transport at least one Lego mini-figure. Didn't say nuthin' about having THREE mini-figures.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-Bt26pQ8/0/XL/IMG_4927-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-Bt26pQ8/0/XL/IMG_4927-XL.jpg" height="420" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Helms-man, Navigator and Psy-Ops.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-RggR9Tf/0/XL/IMG_5103-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-RggR9Tf/0/XL/IMG_5103-XL.jpg" height="180" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Sweet, forgiving asphalt. Not the usual stone-chip NZ road surface.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-kbJ6C8b/0/XL/DSCF8086-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-kbJ6C8b/0/XL/DSCF8086-XL.jpg" height="388" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">For Glory! ... and a smash up at the end</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-X7gHQwh/0/XL/DSCF8057-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-X7gHQwh/0/XL/DSCF8057-XL.jpg" height="382" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">First look at some of the competition.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-jw7dgSz/0/XL/DSCF8103-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-jw7dgSz/0/XL/DSCF8103-XL.jpg" height="392" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Heat two for the open class, Jamie gets a last minute tip on direction from a race official.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-cnQkbrx/0/XL/DSCF8053-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-cnQkbrx/0/XL/DSCF8053-XL.jpg" height="382" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">City Arts Manager Martin Rodgers from the Wellington City Council doing a great job on the mic.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-83hTvLH/0/XL/DSCF8091-XL.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-83hTvLH/0/XL/DSCF8091-XL.jpg" height="356" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Unorthodox Death Machine on the loose!</td></tr>
</tbody></table>
We managed 2nd place in the Open Class! [open because if you had grown-up help you couldn't enter the younger age-brackets]. For Great Victory!! Here's Jamie on the podium letting the crowd in on a few secrets of our success:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-m3xkV3W/0/XL/DSCF8125-XL.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/i-m3xkV3W/0/XL/DSCF8125-XL.jpg" height="450" width="640" /></a></div>
<br />
We won a video rental, dessert pizza from Hell.co.nz and a nice ceramic mug. Congratulations to all the well-deserved winners in the other divisions [you can see more of the winners in the <a href="http://www.julianbutler.com/Events/Ribble-Street-Races-Lego/">gallery</a>], and well done Island Bay for hosting Wellington's very first Lego Downhill Derby.<br />
<br />
Next year I think I'll take a more back-seat stance and have our kids build and enter their own vehicles now that we know the lay of the land. We've got a heads-up on next year's design already.<br />
<br />
-j<br />
<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-24155225438081015952014-02-16T20:15:00.001+13:002014-02-16T20:30:19.149+13:00A post in which I reveal my keyboard-fetish dreams have finally come true.<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjed27yXMIsc_5zB5MO6MVDTcbE8Cjq2fU60jKYzCzHx8RV4E6c9Rom2rUh41xidO9_7zQ0xP9JyzNCOsSQCpq85KYijgK2rql1RpiUg0dMop0kJ6xaZ_0TBICHYZwNvWzcQo3FR5GpGJQ/s1600/das.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjed27yXMIsc_5zB5MO6MVDTcbE8Cjq2fU60jKYzCzHx8RV4E6c9Rom2rUh41xidO9_7zQ0xP9JyzNCOsSQCpq85KYijgK2rql1RpiUg0dMop0kJ6xaZ_0TBICHYZwNvWzcQo3FR5GpGJQ/s1600/das.jpg" height="237" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">From this sorry thing...</td></tr>
</tbody></table>
<br />
<a href="http://blog.julianbutler.com/2010/11/i-am-going-das.html" target="_blank">A while back</a> in an effort to force myself to learn to touch type at work, I blacked out the keys of my keyboard with duct tape. This worked. However, after two hot summers, the duct tape began to slide off and my co-workers said my keyboard was looking a little ghetto. It's true, my fingers were getting sticky and my typing full of errrrrrorsa nd musssstaaaaakes. Time for a replacement:<br />
<br />
Enter the <a href="http://www.keyboardco.com/keyboard/usa-filco-ninja-majestouch-2-tenkeyless-nkr-tactile-action-keyboard.asp">Filco Majestouch Tenkeyless Ninja</a>!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidTMVePicfKFdVIVVqEm055pj19WWBydOzNhSdHv1Db6qwtXzfT4T0vxVxhb0eLSbeN-5l4F3PF97KxvIzeOmKriOhHossSoB0GDb89T6B4-85uNze2FKWhw1B9FuheFNSb6ipuR28vAg/s1600/IMG_4946.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEidTMVePicfKFdVIVVqEm055pj19WWBydOzNhSdHv1Db6qwtXzfT4T0vxVxhb0eLSbeN-5l4F3PF97KxvIzeOmKriOhHossSoB0GDb89T6B4-85uNze2FKWhw1B9FuheFNSb6ipuR28vAg/s1600/IMG_4946.JPG" height="284" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">To the future and beyond.</td></tr>
</tbody></table>
<br />
Thanks to the fine sales people at <a href="http://www.keyboardco.com/">The Keyboard Company</a> I am now the proud owner of this masterfully fine thing. And yes, those are metal [zinc to be precise] WASD keys. I've also replaced the 'f' and 'j' with radially etched Line2R green keys from <a href="http://www.wasdkeyboards.com/index.php/textured-cherry-mx-keycaps.html#ad-image-4">wasdkeyboards.com</a> and replaced the escape key with a red one. I'm sporting <a href="http://deskthority.net/wiki/Cherry_MX_Brown">Cherry MX Brown</a> switches with silencing rings under the main letter keys and the return key.<br />
<br />
It is well nice to type on. I see many customisations in my <a href="https://www.google.co.nz/search?q=custom+keycaps">keyboard-fetish-future</a>.<br />
<br />
Onwards!<br />
<br />
-jJulian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com1tag:blogger.com,1999:blog-1829332005373102680.post-2742446068711420632014-02-12T22:45:00.001+13:002015-01-16T00:11:01.756+13:00Net-neutrality, AppleTV and just what did Steve Jobs 'crack' then?<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://magicdroidtv.com/wp-content/uploads/2013/10/cable_tv_streaming-tv.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://magicdroidtv.com/wp-content/uploads/2013/10/cable_tv_streaming-tv.png" height="289" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="font-size: x-small; text-align: start;">But you have cable TV, right?</span></td></tr>
</tbody></table>
We have two main competing ISPs [Telecom and Vodafone] offering set-top boxes that record and time-shift content from a single cable-TV style provider - SkyTV. The hardware they provide represents varying levels of competence and reliability, and requires extra fees to enable viewing of hi-def channels, all in the context of constant ad-breaks and promos for other content. But enough moaning; it's clear that business model is coming to an end, albeit all too slowly.<br />
<div>
<br /></div>
I'm just about ready to be a <a href="https://www.google.co.nz/search?q=cord+cutting" target="_blank">cord-cutter</a>. But here in NZ, data access and bandwidth caps produce hesitation in anyone not interested in paying additional $$$ in monthly bills to their ISP. Due to distribution deals with SkyTV, Apple and other content providers still don't legally offer anything near the range of TV series available in Australia. And I don't fancy having multiple iTunes accounts, VPN access and having to pay a stranger on Ebay a fee to buy a US iTunes voucher for me, scratch off the number and email it to me so I can buy content semi-legally in a timely fashion.<br />
<br />
Seriously, when can I give money to the people who make the content and have them make it available when it's ready? It's 2014!! I'm moaning again. Sorry. <a href="http://theoatmeal.com/comics/game_of_thrones" target="_blank">The Oatmeal sums it up fantastically if you haven't already seen it.</a><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://cdn.macrumors.com/article-new/2012/03/apple_tv_2012_interface.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://cdn.macrumors.com/article-new/2012/03/apple_tv_2012_interface.jpg" height="246" width="400" /></a></div>
<br />
<br />
So we began renting movies through iTunes instead of physical media [DVDs, Blu-ray discs] from our local movie rental shop. At first the iTunes delivery was stable and great. We have a 40Mb cable modem connection, which is more than enough to stream a 720p movie [even with a 15min wait at the start]. But about a year ago, quality of service from iTunes began to drop, with movies we rented pausing two-thirds of the way through and demanding we wait 20 mins for buffering. Why? Don't we have a fast enough internet connection for this?<br />
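Out of curiosity I ran the numbers. Assuming a 720p iTunes stream needs roughly 6 Mbps [my guess at a worst case, not a published figure], a 40 Mbps connection should cope easily, which is what made the mid-movie buffering so suspicious:

```python
# Rough sanity check: can a 40 Mbps line sustain a 720p stream?
connection_mbps = 40.0   # advertised cable modem speed
stream_mbps = 6.0        # assumed worst-case 720p bitrate (my guess)
movie_minutes = 120

headroom = connection_mbps / stream_mbps
movie_gb = stream_mbps * movie_minutes * 60 / 8 / 1000  # megabits -> gigabytes
print(f"{headroom:.1f}x headroom; a 2h movie is ~{movie_gb:.1f} GB")
```

With that much headroom, mid-stream stalls point at something other than the last-mile connection.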
<br />
Enter the Net-Neutrality debate.<br />
<br />
<div>
From <a href="http://en.wikipedia.org/wiki/Net_neutrality" target="_blank">Wikipedia</a>:<span style="font-family: Trebuchet MS, sans-serif;"> </span><span style="font-family: Verdana, sans-serif;">Net neutrality (also network neutrality or Internet neutrality) is the <a href="http://en.wikipedia.org/wiki/Principle">principle</a> that <a href="http://en.wikipedia.org/wiki/Internet_service_provider">Internet service providers</a> and governments should treat all data on the <a href="http://en.wikipedia.org/wiki/Internet">Internet</a> equally, not discriminating or charging differentially by user, content, site, platform, application, type of attached equipment, and modes of communication.</span></div>
<div>
<br /></div>
<div>
As the internet is increasingly commercialised, we've all long suspected ISPs of being tempted to 'shape' traffic volumes where they perhaps shouldn't. This has recently become a hotly debated topic in the US, and the forces for and against net-neutrality are slugging it out, with consumers wearing any fallout.</div>
<div>
<br /></div>
<span style="font-family: Verdana, sans-serif;">On January 14, 2014, the DC Circuit Court determined that the FCC has no authority to enforce Network Neutrality rules, as service providers are not identified as "common carriers".</span><br />
<div>
<br /></div>
<div>
I don't know how this is going to play out, but I'm on the side of legislation that protects the internet from too much commercial influence and perpetuates the abilities of anyone to use it fairly as a communication medium, from freedom-fighters to Facebook, Twitter to TradeMe and back. You wouldn't want to have to pay your ISP for top-tier access to your favourite sites on top of monthly access and bandwidth caps would you? Me neither. We've all had enough of 'over-the-top' services from cell providers huh. <a href="http://en.wikipedia.org/wiki/Dumb_pipe" target="_blank">Dumb-pipes</a> await.</div>
<div>
<br /></div>
A few days ago, <a href="http://boingboing.net/2014/02/07/verizon-support-rep-admits-ant.html" target="_blank">BoingBoing</a> covered the discovery by an independent blogger that Verizon in the US are aggressively throttling Netflix traffic. This blogger, <a href="http://davesblog.com/blog/2014/02/05/verizon-using-recent-net-neutrality-victory-to-wage-war-against-netflix/" target="_blank">Dave Raphael manages to capture the discussion</a> with a Verizon tech representative where it's admitted that this is in fact what is going on:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://media.boingboing.net/wp-content/uploads/2014/02/verizon_fail1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://media.boingboing.net/wp-content/uploads/2014/02/verizon_fail1.png" height="640" width="516" /></a></div>
<br />
So what's this got to do with AppleTV then?<br />
<br />
Well back when I bought AppleTV [2nd gen], I was high on the hope Steve Jobs was about to unveil an app store for it and we'd be able to do some of the things we do on the phone/iPad on the TV. That never came to pass, but Steve did, and all we were left with was the notion he'd 'cracked' it and we'd soon be blessed by something much better. And it's been just that, a notion.<br />
<br />
Slowly, Apple have been adding channels to AppleTV over the last two years. My impression was that this was the thin end of the wedge, and that Apple were gathering content makers together one by one so they could quietly begin to offer much more varied and higher-quality content than our traditional providers. Yet nothing has really materialised in terms of hardware, despite rumours about large screens, 4k displays, bezel-free designs, magical remote-control rings etc. Other rumours suggest Apple are hard at work tying up agreements and making deals behind closed doors, getting ready to do to TV what iTunes did to music sales.<br />
<br />
Then this report surfaces on MacRumours detailing Apple's progress on building their own content delivery network:<br />
<br />
<a href="http://www.macrumors.com/2014/02/03/apple-developing-cdn/" target="_blank">http://www.macrumors.com/2014/02/03/apple-developing-cdn/</a><br />
<div>
<br /></div>
<span style="font-family: Verdana, sans-serif;">"Apple built its retail store chain because Steve Jobs wanted to own Apple's interactions with its customers. With iTunes and iCloud, Apple controls the data and the service, but must outsource the less visible but still incredibly important job of reliably delivering data packets to users. With hundreds of millions of users downloading apps, music, TV shows and movies -- with many of those being streamed in real-time to the Apple TV -- ensuring quality of service for all users will be essential. "</span><br />
<div>
<br /></div>
And now I understand. Apple have already assumed that net-neutrality is going down the drain, and are positioning themselves to guarantee quality of service to their customers with their own content delivery network. And as Apple are a company that prefers to have all their ducks in a row before rolling out a new product [well, not always], I don't believe we'll see any large announcements about AppleTV or channels before these infrastructure updates are complete.<br />
<br />
Given the timeline for the data-centre completions, the focus on a watch-style product right now and the new iPhone6 rumours, I don't see TV announcements on the horizon for another year at least. Maybe I'm <a href="http://www.macrumors.com/2014/02/10/apple-tv-in-ios7/" target="_blank">wrong</a>, but I don't think so. Looking forward to ditching the cable box, though.<br />
<br />
Update: It's 2015 and nothing has changed regarding Apple's approach to TV. It's effectively still a hobby for them. The iPhone6 is here, the watch is about to hit and no TV in sight.<br />
<br />
-j<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="http://www.decisivemagazine.com/sites/default/files/images/Cord-cutting-2.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="http://www.decisivemagazine.com/sites/default/files/images/Cord-cutting-2.png" height="157" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Wouldn't it be great if these services were available in NZ without using a VPN?</td></tr>
</tbody></table>
<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com2tag:blogger.com,1999:blog-1829332005373102680.post-19806352088435732912013-12-14T17:50:00.000+13:002013-12-15T10:12:50.802+13:00My Fuji x100s Christmas Fauxhemian TransformationI've been craving the <a href="http://www.finepix-x100.com/" target="_blank">Fuji-x100</a> since it came out. The combination of the rangefinder aesthetic and sharp, fixed 35mm equivalent lens has me thinking constantly of how many situations I'd be able to use one in where toting the 5DmkII might be overkill, and the iPhone5 would be underkill.<br />
<div>
<br /></div>
<div>
Luckily I've waited long enough that the successor to Fuji's upstart digital rangefinder, the x100s, is now available and is going to be my Christmas present this year!<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjd8zmnkAj_SjBm2xvV5y3yNtoTZZAhY_6uiS3lFpubU1QRk_PffSU7Hbzjuxp2ov2GSaQopt3FVjguQU-22WNmyoP4YfO2Bbhi3U52pOibMCXk_4oTYOHqBL6yLSD_aYXLtdn8aNyQdmU/s1600/X100s-lede.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="382" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjd8zmnkAj_SjBm2xvV5y3yNtoTZZAhY_6uiS3lFpubU1QRk_PffSU7Hbzjuxp2ov2GSaQopt3FVjguQU-22WNmyoP4YfO2Bbhi3U52pOibMCXk_4oTYOHqBL6yLSD_aYXLtdn8aNyQdmU/s400/X100s-lede.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">David Hobby's x100s in its Lo-Fi travel duct-tape camouflage.</td></tr>
</tbody></table>
<div>
<br />
My usage theory goes like this:</div>
<div>
<br /></div>
<div>
1. Canon 5DmkII for assignments and clients and weddings etc.</div>
<div>
2. Fuji x100s for holidays, short trips and places where too much gear is a burden.</div>
</div>
<div>
3. iPhone5 for everything else.</div>
<div>
<br /></div>
<div>
David Hobby [<a href="http://strobist.blogspot.co.nz/" target="_blank">Strobist</a>] has been using the x100s for a while now and has an excellent series of posts covering its flexibility and attraction to photographers everywhere:</div>
<div>
<br /></div>
<div>
<a href="http://strobist.blogspot.co.nz/2013/03/in-depth-new-fujifilm-x100s.html" target="_blank">In-Depth: The New Fujifilm X100s</a></div>
<div>
<a href="http://strobist.blogspot.co.nz/2013/10/on-assignment-margo-seibert.html" target="_blank">On Assignment: Margo Seibert</a></div>
<div>
<a href="http://strobist.blogspot.co.nz/2013/12/fuji-follow-up.html" target="_blank">Fuji Follow-Up</a></div>
<div>
<br /></div>
<div>
And his YouTube run-through of the features is worth a look if you need a sense of what the camera can do and where it fits into his world. Click the image below to watch on YouTube:</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.youtube.com/watch?v=kPMnqzjLEAs" target="_blank"><img border="0" height="241" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEindJZIvRcBKLNq6OTZqItcIHWG45Okes3ilXGc-tn7O3CwQCTH436JvOUoOz_nXhLqc6hJzBGo6j7cNBdqpJBWF999H2jaU_o63W9Mv5eYn-JJhJGUCS93JVt_Hc2GK27QRWnDF4M7QSA/s400/strobist.jpg" width="400" /></a></div>
<div>
<br /></div>
<div>
And if you wanted more, a search on Flickr shows <a href="http://www.flickr.com/search/?q=fuji%20x100s" target="_blank">plenty of examples</a> by others.</div>
<div>
<br /></div>
<div>
I think my biggest stumbling block is going to be the x100s' excellent in-camera processing and jpeg output. This may mean not shooting RAW files at all, as the increase in shooting speed and flexibility afforded is pretty impressive. I... *think* I'm going to have trouble committing to this over the xmas break, and I won't have my computer along to compare images on, so maybe RAW+jpeg it will have to be.</div>
<div>
<br /></div>
<div>
I make use of the <a href="http://www.julianbutler.com/Other/Hipstamatic" target="_blank">iPhone5</a>'s panoramic shooting mode regularly: </div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://www.julianbutler.com/Other/Hipstamatic/i-QGhrnDG/0/X2/IMG_3755-X2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="186" src="http://www.julianbutler.com/Other/Hipstamatic/i-QGhrnDG/0/X2/IMG_3755-X2.jpg" width="640" /></a></div>
<div>
<br /></div>
<div>
and so I'm pretty stoked to read that the x100s has a mode for shooting this way too. It should mean higher res and sharper images in this format. </div>
<div>
<br /></div>
<div>
The way the lens flares out is great. The built-in ND filter is good. The ergonomics are very nice. The shooting modes and film emulation are fantastic. The leaf-shutter and wide aperture should mean much more interesting looks outdoors in full sunlight. In short, I cannot wait for Christmas Day this year!!</div>
<div>
<br /></div>
<div>
I hope your Christmas is filled with family, fun, sun and good cheer!</div>
<div>
<br /></div>
<div>
-j</div>
<div>
<br /></div>
Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-75249728311792615442013-09-30T23:37:00.001+13:002013-10-01T11:07:18.923+13:00So long Facebook and thanks for all the posts!<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9JuKKHfi-0OP9BnD-v_eFRx9D8tSzop9_7LvuyQkAVmDFRTSntqbXJK-A2KS7mcXe1u9TGK25BSqpAnVmYJKzMLqoodizHJOx2ByzPs5ncWj-TRPOttN8I0Zk6_04XomHX2JvrsHs_OI/s1600/information.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="294" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9JuKKHfi-0OP9BnD-v_eFRx9D8tSzop9_7LvuyQkAVmDFRTSntqbXJK-A2KS7mcXe1u9TGK25BSqpAnVmYJKzMLqoodizHJOx2ByzPs5ncWj-TRPOttN8I0Zk6_04XomHX2JvrsHs_OI/s640/information.jpg" width="640" /></a></td></tr>
</tbody></table>
Yeah, so I deleted my Facebook account. I haven't activated any bots to go through and remove me from old posts, cause I think that's kinda rude to my friends and family who are still there. But I'm gone, and I won't be leaving any more info with that company for future use or whatever.<br />
<br />
I'm not so naive that I think this will 'erase' me from that part of the internet. Far from it. Information I left inside Facebook is only private for now, and I know it's never really deleted, just stored for later. In the meantime they're already <a href="http://www.theverge.com/2013/9/21/4753944/facebook-deep-learning-artificial-intelligence-research" target="_blank">employing new algorithms</a> to figure out which of your posts are worth reading or not.<br />
<br />
I'm not bitter, and it was a free service and I definitely got some fun time out of it. I'm under no illusions about the nature of the arrangement, but I think the scales are finally tipping too far in their favour.<br />
<br />
So [aside from <a href="http://en.wikipedia.org/wiki/Criticism_of_Facebook" target="_blank">this stupendous list</a> of criticisms!] these are <b>my</b> reasons for departing [in no particular order]:<br />
<br />
<ul>
<li>Ever increasing monetization. [Wasn't it more fun when there were fewer ads?]. <b>Edit</b>: Now they want <a href="http://technologyadvice.com/facebook-autofill-tries-to-simplify-mobile-purchases/" target="_blank">your credit card too</a>!</li>
<li>Pretensions to politics. [<a href="http://fwd.us/">fwd.us</a> Really? Something about this just ain't right].</li>
<li><a href="http://www.nytimes.com/2013/09/12/technology/personaltech/ftc-looking-into-facebook-privacy-policy.html" target="_blank">Privacy policy changes</a> and fluctuations in the use of your material. What, don't you want to be famous? Your face could be used to sell vacuum cleaners!</li>
<li>Facebook's <a href="http://arstechnica.com/business/2013/09/facebook-suddenly-deletes-social-fixers-facebook-pages/" target="_blank">crappy treatment of 3rd party devs</a>. I believe Social Fixer likely made Facebook a better place to be, although I never used it. What's wrong with people making client-side tools to make Facebook's site more enjoyable?</li>
<li>The bloated arching rise of the <a href="http://www.chrisbrogan.com/junkweb/" target="_blank">Junk Web</a>. I love me some cat macros but some of my 'friends' are completely out of control on Facebook. And it's ugly.</li>
<li>The inane posts, <a href="http://vaguebook.org/" target="_blank">VagueBooking</a>, and general time wastage. I'd rather <a href="http://magicalnihilism.com/2009/11/07/get-excited-and-make-things/" target="_blank">get excited and make things</a> than sit and read about someone else eating their dinner.</li>
<li><a href="http://www.abine.com/blog/2012/how-facebook-buttons-can-track-you-across-the-web/" target="_blank">Cross-site tracking</a> and monitoring what I'm into, yo.</li>
</ul>
<br />
All this being said, I love you people and think the internet is a great place. Too bad it's being compartmentalised and monetized so aggressively. I'll be in touch, and you know, when we chat next I'll have more new stuff to tell you cause you didn't hear it all over Facebook!<br />
<br />
I may return to Facebook at some point in the future but likely only as a photography or app business entity as I fully recognise the power [hand raised to eye in salute] of 'the social' in this regard.<br />
<br />
-julian<br />
<br />
P.S. if you have interesting reasons why you are or are not on Facebook I'd love to hear from you in the comments! What did I miss? Why should I still be there? What did you have for breakfast? Don't hold back.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjW2-4uA6tVqy96U08r4jDkp0gkqL08JAcZ5ocSO1IeWtsGqB8Bg94Ckt7r6tDCLyJazvSJeVPtTU_81oSrFrNZsSmvASFAMuJHQUu1IzJOFklwrNIrMA4S8IO5-ZKypaK658h7BL1QRT8/s1600/facebookAndYou.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjW2-4uA6tVqy96U08r4jDkp0gkqL08JAcZ5ocSO1IeWtsGqB8Bg94Ckt7r6tDCLyJazvSJeVPtTU_81oSrFrNZsSmvASFAMuJHQUu1IzJOFklwrNIrMA4S8IO5-ZKypaK658h7BL1QRT8/s400/facebookAndYou.jpg" width="400" /></a></div>
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0tag:blogger.com,1999:blog-1829332005373102680.post-9706336404845916222013-09-19T23:14:00.000+12:002013-09-19T23:14:16.570+12:00Privacy<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmVZoF0zItCmFGmbzWUO2OodENdtsAe3wkUBAgc6zs71_UooiGB7EQ712ElN54uydHQK92nMckr_tLGd-ywcHr90NFh0S1xvgsvMpeME8U3Y_y_nKjK1sewhMZyQbZci5xckxGYVC3tfA/s1600/wescan.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmVZoF0zItCmFGmbzWUO2OodENdtsAe3wkUBAgc6zs71_UooiGB7EQ712ElN54uydHQK92nMckr_tLGd-ywcHr90NFh0S1xvgsvMpeME8U3Y_y_nKjK1sewhMZyQbZci5xckxGYVC3tfA/s400/wescan.jpg" width="295" /></a></div>
<br />
<br />
So... I don't know about you, but I find the furore surrounding <a href="http://en.wikipedia.org/wiki/Edward_Snowden" target="_blank">Edward Snowden</a>'s revelations regarding the NSA and big tech companies' betrayal of their users and their data astounding. I mean, we used to joke that privacy on the internet was a thing of the past, and now we know conclusively that it really is.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKxnXFu3wbQlCG__NxfrclGDd2kD8UCs_rotIFQsoMQJfbja-te2xWdM97RZyV85zbsy1DeRAzHvpgPm_TgasChT9jCEqr1FjwyougZu64EbkwVGda4fPiOf9m10NHijZagWXjd9mZSAU/s1600/EdwardSnowden.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKxnXFu3wbQlCG__NxfrclGDd2kD8UCs_rotIFQsoMQJfbja-te2xWdM97RZyV85zbsy1DeRAzHvpgPm_TgasChT9jCEqr1FjwyougZu64EbkwVGda4fPiOf9m10NHijZagWXjd9mZSAU/s400/EdwardSnowden.jpg" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Edward, we ALL owe you a beer, and probably a LOT more.</td></tr>
</tbody></table>
I realised we were living in a surveillance-state online only about 8-12 months ago, when I began to use <a href="http://www.ghostery.com/" target="_blank">Ghostery</a> in addition to <a href="https://adblockplus.org/" target="_blank">adblock</a> in my web-browsing. In using these tools I learnt more about online advertising's pervasive intrusion into our everyday web-surfing habits: cookies, image tags, tracking triggers in HTML mail that let companies know the moment a message has been opened, and more. But I still thought my communications were basically safe, and un-interesting enough that I wasn't really concerned that someone, somewhere might be reading or listening in. I wasn't using encryption beyond HTTPS or SSL, and the idea that I might want PGP or anything else seemed like an unnecessary encumbrance.<br />
<br />
<b>"If you have nothing to hide you don't need any rights."</b><br />
<br />
Things have changed dramatically in a few short months. The online landscape has forever altered. US spying programs with names like <a href="http://en.wikipedia.org/wiki/PRISM_(surveillance_program)" target="_blank">PRISM</a>, <a href="http://en.wikipedia.org/wiki/UKUSA_Agreement" target="_blank">Five-Eyes</a>, <a href="http://en.wikipedia.org/wiki/XKeyscore" target="_blank">XKeyscore</a>, <a href="http://en.wikipedia.org/wiki/Tempora" target="_blank">Tempora</a> and more yet to be revealed are illustrating [much to the US' chagrin] just how extensively ALL our communications are being hoovered up 24/7 into rolling caches of searchable records. And individuals like NSA head General Keith B. Alexander are <a href="http://www.dailydot.com/politics/keith-alexander-dutch-cybersecurity-speech-kpn/" target="_blank">struggling to stay on top of the leaks</a>, forced to engage in a comedic cat-and-mouse game of leak vs assurance in what is increasingly a massive abuse of the trust of every American citizen.<br />
<br />
And it's not just the US. The UK have a <a href="http://www.theguardian.com/uk/2013/jun/21/gchq-cables-secret-world-communications-nsa" target="_blank">three-day rolling store</a> of pretty much everything going in and out of the United Kingdom in electronic form, and metadata storage for up to 30 days. Metadata can be more revealing under analysis than the actual conversation in your phone call. Information about who you spoke to, when, and for how long can be <a href="http://kieranhealy.org/blog/archives/2013/06/09/using-metadata-to-find-paul-revere/" target="_blank">manipulated in ways that yield connections between parties</a> otherwise invisible at first glance.<br />
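If you're curious how metadata alone yields those connections, here's a toy sketch in the spirit of the Paul Revere piece linked above. All the names and groups are entirely made up; the point is that with nothing but membership records [who belongs to which group, no message contents at all], a simple co-occurrence count surfaces the person linking otherwise separate circles:

```python
from itertools import combinations

# Hypothetical metadata: group -> set of members. No content, just associations.
memberships = {
    "StLewis": {"alice", "bob", "carol"},
    "TeaClub": {"bob", "dave"},
    "NorthEnd": {"alice", "bob", "eve"},
}

def co_membership(groups):
    """Count, for every pair of people, how many groups they share."""
    counts = {}
    for members in groups.values():
        for a, b in combinations(sorted(members), 2):
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts

links = co_membership(memberships)
# alice and bob share two groups, making that the strongest hidden link
print(max(links, key=links.get))  # ('alice', 'bob')
```

Scale that up to every phone call and email header in a country and you can see why the agencies want the metadata as much as the content.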
<br />
<br />
<b>New Zealand</b><br />
<br />
Here in New Zealand, the ongoing saga of Kim Dotcom being illegally spied upon by the NZ government simply won't lie down and be quiet. This is likely because the <a href="http://publicaddress.net/onpoint/ich-bin-ein-cyberpunk/" target="_blank">GCSB and NZ Police force used PRISM's data</a> supplied by the NSA in their efforts to raid his Auckland home and place him in police custody on behalf of the US government. If the NZ government is using PRISM to look for copyright offenders, where do they draw the line with your information? PRISM was designed to facilitate Bush-era US government surveillance of foreign intelligence targets "reasonably believed" to be outside the United States, not to spy on what music you might be downloading.<br />
<br />
<a href="http://en.wikipedia.org/wiki/John_Gilmore_(activist)" target="_blank"><b>"The Internet interprets censorship as damage and routes around it."</b></a><br />
<br />
This famous quote by activist John Gilmore of the <a href="https://www.eff.org/" target="_blank">Electronic Frontier Foundation</a> seems to suggest that the internet can heal itself and work around problems like censorship. Organisations or governments that wish to know what your internet connection is carrying use technology like <a href="http://en.wikipedia.org/wiki/Deep_packet_inspection" target="_blank">deep packet inspection</a> and <a href="http://en.wikipedia.org/wiki/Internet_censorship_in_Australia" target="_blank">internet filtering</a>, designed to detect the nature of traffic on the internet. However, now that we know the NSA and GCHQ actually <a href="http://www.theguardian.com/uk/2013/jun/21/gchq-cables-secret-world-communications-nsa" target="_blank">fibre-tap the very undersea cables</a> that run between our countries, before the traffic ever reaches each country's landing points and ISPs, this sentiment seems more like a wistful, rose-tinted vision of the pre-Snowden era that we'd now like to magically come true. And it's really not going to.<br />
<br />
<br />
<b>Laura Poitras and Glenn Greenwald</b><br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGgqdV_ehbQUaoQcqeKDYWZ-lke78XoW0nlcZ4Uysrj38pd0y-gs8XPS_88QDDTwDI9Aqjz4r4oN4_WlH3Lgv2QXk7DSwp7fEJGA-uObHgmF8LqRLUazS_UIqq316bVh0cYI51dQw3zgE/s1600/PoitrasGreenwald.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="283" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjGgqdV_ehbQUaoQcqeKDYWZ-lke78XoW0nlcZ4Uysrj38pd0y-gs8XPS_88QDDTwDI9Aqjz4r4oN4_WlH3Lgv2QXk7DSwp7fEJGA-uObHgmF8LqRLUazS_UIqq316bVh0cYI51dQw3zgE/s640/PoitrasGreenwald.jpg" width="640" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
The two reporters chosen and initially contacted by Edward Snowden are now under tremendous pressure from multiple governments and go to <a href="http://www.nytimes.com/2013/08/18/magazine/laura-poitras-snowden.html?pagewanted=all" target="_blank">great lengths to protect themselves electronically</a>. They are at the pointy end of the very tools they are exposing, and the measured pace with which they reveal each new piece of information is vital in keeping these issues at the forefront of public awareness. If they revealed everything in one release, WikiLeaks-style, the scandal would likely froth over, the US would quickly return to pointless outrage over Miley Cyrus' VMA costume choice or other such matters, and it'd soon be business as usual.<br />
<br />
<br />
<b>Sympathy for the devil?</b><br />
<br />
<a href="http://en.wikipedia.org/wiki/File:Prism-week-in-life-straight.png" target="_blank">Google, Microsoft, Facebook, Yahoo, AOL, Skype and most recently Apple</a>...<br />
<br />
We all use something from one of these companies in some shape or form, right? And they all let the NSA in the door, and they all share data with the US government. They are now <a href="http://blogs.computerworld.com/cyberwarfare/22503/apple-google-microsoft-and-more-demand-transparency-post-prism" target="_blank">clamouring for permission</a> from the US government to reveal the amount and type of data requests placed upon them, in an effort to own up to their part in the information hoovering. I find it very hard to have sympathy for them at this point, and indeed <a href="http://www.zdnet.com/u-s-cloud-industry-stands-to-lose-35-billion-amid-prism-fallout-7000018974/" target="_blank">so it seems does Europe</a> and other large parts of the world. Confidence in cloud-based storage products is taking a bashing, as anything connected to the internet is vulnerable. I'm not about to trade in my cellphone for paper and pen, but it makes me think twice about *free* services like Gmail or Facebook.<br />
<br />
<br />
<b>Now the conversation about real privacy begins.</b><br />
<br />
I think we're lucky that the debate is happening. I believe we're lucky that a democratic, world-leading country is at the heart of these revelations and is attempting to deal with them in any sort of public fashion. I find it hard to imagine many other countries owning up to the nature and extent of programs like the NSA's, or discussing ways of <a href="http://www.nytimes.com/2013/09/07/us/politics/legislation-seeks-to-bar-nsa-tactic-in-encryption.html?_r=1&" target="_blank">backing up and out of the current situation</a>. I acknowledge that there are necessary lengths that governments need to go to in order to protect their people, and I don't pretend to know where that line gets drawn. But it simply does not include my internet t-shirt orders by default, right? Or your phone call to order pizza? Or our drunken text messages from last night? Or your Facebook status update about how you just fed your cat. Or indeed the photos of your cat on your cellphone.<br />
<br />
So now the rush to build proper <a href="http://www.wired.com/threatlevel/2013/09/the-scramble-to-build-encryption/" target="_blank">NSA-proof encryption begins</a>, as it's revealed that the NSA has worked hard to <a href="http://www.theguardian.com/technology/2013/sep/16/nsa-gchq-undermine-internet-security" target="_blank">undermine established encryption standards</a> so they can peer into hidden communications. Discussions and tips about how to <a href="https://www.schneier.com/blog/archives/2013/09/how_to_remain_s.html" target="_blank">remain unseen by the NSA and secure online</a> are highly informative. Can you reliably use Tor or not? And who wants to? Who wants to have to jump through all these hoops to protect our privacy? What is it worth to you or me, who previously took it for granted that no one was listening?<br />
<br />
I'm not sure but we are going to find out. Sooner rather than later thankfully.<br />
<br />
-j<br />
<br />
<br />Julian Butlerhttp://www.blogger.com/profile/08246865944018918628noreply@blogger.com0