Postpro-Orgy July 15, 2014 at 3:28 pm

…aka: Finally having some time to play again with my new weapon of choice - Realtime Studio/UNO.

The past days I spent most of my time porting lots of my image-processing stuff to run in RS/UNO. Whilst doing so I decided I should set up a quick trial/demo. The ingredients:

  • Different image-processing shaders acting as main effects as well as postpro filters
  • Sound as a source for the shaders to react to
  • A few images (the artists formerly known as ‘Flashmöb’)
  • Randomly moving blobs (or in this case magnifiers with refraction)

…all ending up in ‘Fragmöb‘, a post-processing fragment/pixel-shader orgy. To get the full filtered impact you really should let the music run a bit longer!!!

(WebGL enabled + Chrome´s best)
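For the curious: the ‘sound as source‘ part boils down to feeding the shaders a handful of smoothed band intensities each frame. A minimal sketch of that reduction step - names and band split are my own, not the actual Fragmöb code; in the browser the raw bins would come from Web Audio's AnalyserNode.getByteFrequencyData:

```javascript
// Collapse raw FFT byte bins (0..255, as Web Audio's
// AnalyserNode.getByteFrequencyData delivers them) into a few
// normalized band intensities (0..1) that a shader could read as uniforms.
function bandIntensities(bins, bandCount) {
  const perBand = Math.floor(bins.length / bandCount);
  const out = [];
  for (let b = 0; b < bandCount; b++) {
    let sum = 0;
    for (let i = 0; i < perBand; i++) sum += bins[b * perBand + i];
    out.push(sum / (perBand * 255)); // normalize to 0..1
  }
  return out;
}
```

Each frame you'd push those values into the shader (e.g. via gl.uniform1fv) and let the filters modulate distortion, glow or blob movement with them.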

Glasfabrik. Realtime Studio/UNO July 10, 2014 at 10:57 am

A few weeks ago I came into contact with the nice and very talented Outracks team. Outracks? What´s that? Apart from the fact that a few stray members of their team appreciate an excellent steak and can hold their drink (beware: Norwegians!), which makes ´em very congenial, they are working on an extremely interesting IDE for real-time graphics in 2D and 3D. Beyond that, with ‘UNO‘ they deliver a new programming language that, unlike traditional languages and APIs, has no barriers between CPU code, GPU shaders, graphics APIs and the various target platforms.

Sounds ambitious. Sounds highly interesting. Yeah! Sounds like great fun…

Unluckily, over the past 3 years I simply never had the time to take a look at Realtime Studio/UNO and tinker a bit with it. Until now! Over the past 3 weeks, after I got a premium introduction under the command of Simo (whose work is highly recommendable), I finally got the chance to get started with this weapon called UNO.

What I can tell by now: it´s quite comfortable and not hard to learn, even if it´s not like you get used to some new concepts in two ticks. But it´s worthwhile understanding, because one huge advantage of RS/UNO is that you get a super smart compiler, which mostly takes care of all optimizations when exporting your content to the right format for your target platform (WebGL, Android, iOS, Native and DotNetExe).

So WebGL please.

It felt a bit like cheating: writing your code, pressing a button and getting a fully optimized WebGL version of the shader I´d written in RS/UNO as a result. But for the moment, let´s release ‘Glasfabrik.’ (glass manufactory), once written in UNO, now in the ‘interwebz’:

(WebGL enabled + Chrome´s best)

More to come…

HTML5 - it felt like declared dead. June 25, 2014 at 5:10 pm

Recently I was charged by the Bochum-based social media marketing agency ‘adbites‘ to develop a pure HTML5/Javascript based Facebook app called Fleck´n´Battle to promote Henkel´s bathroom cleaner Biff.

Phew - can you spot the fault here? Correct: two times ‘enemy mine‘ lumped together in one project! Facebook and HTML5 - I´m pretty sure you could win any buzzword bingo with those two tangled bad monkeys. But to be honest, there´s nothing to bad-mouth about Javascript/HTML5 these days. Javascript´s performance has risen to a very comfortable and competitive level, plus, since I´ve lately spent nearly a year developing ‘Pagie‘, I felt very much at home realizing this project within a tense timeframe of ~15 days. And while programming all those animations, blendings and little effects, the final product almost felt like working with that declared-dead technology from the past aka Flash/Actionscript.

So, if you are not an island or hermit, have a Facebook-account and at least one friend, feel free to biff-battle like there´s no tomorrow…

Biff Facebook App

WebGL & canvas workshop DevState April 4, 2014 at 12:51 pm

Excited! At the upcoming Beyond Tellerrand conference I´ll be giving a WebGL / HTML5 canvas workshop together with my DevState buddy Sakri Rosenstrom. Sakri will do, train and cover the HTML5 canvas part - not least because he´s really been digging into making nice little (text) effects and snippets for canvas lately: you can see a bit of his collection via - or his latest tribute to 80´s oldschool series like this ‘He-Man´ish‘ text effect.

I´ll be training the WebGL part. This will cover:

  • Basics of setting up and working with WebGL
  • From your first flat triangle to your first textured cube
  • Working with vertex/fragment shaders
  • Advanced techniques and beyond…

…and some more artistic shaders and effects (like the cross-hatched ‘DevState´d’ shader below), raymarching and maybe some sound-related demo stuff (like this realtime sound-reactive little thingy).
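To give a taste of the ‘first flat triangle‘ step from the list above, here is roughly the bare minimum of data you end up handing to WebGL - a sketch of my own for illustration, not workshop material verbatim (the vertex array goes to gl.bufferData, the two GLSL strings to gl.shaderSource):

```javascript
// Three clip-space vertices (x, y pairs) - no matrices, no depth yet.
const triangleVertices = new Float32Array([
   0.0,  0.8,   // top
  -0.8, -0.8,   // bottom left
   0.8, -0.8,   // bottom right
]);

// Vertex shader: pass the 2D position straight through to clip space.
const vertexShaderSrc = `
attribute vec2 aPosition;
void main() {
  gl_Position = vec4(aPosition, 0.0, 1.0);
}`;

// Fragment shader: one flat color for every covered pixel.
const fragmentShaderSrc = `
precision mediump float;
void main() {
  gl_FragColor = vec4(1.0, 0.4, 0.0, 1.0);
}`;
```

Compile both shaders, link them into a program, bind the buffer and a single gl.drawArrays(gl.TRIANGLES, 0, 3) gets you your first triangle - everything after that (textures, cubes) builds on the same pattern.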

So I wrote a little shader - quasi as an appetizer and to warm up for this workshop. How about a nice little shader, plus an artistic gfx effect to make it look sketched and drafted, plus freely rotatable text (via mousedown & move), plus all the source code you need to experiment yourself? Here you go:

Sketchy DevState realtime toy
(need to have a WebGL enabled browser!)

Btw, needless to say that once again being part of such a great conference as the upcoming ‘Beyond Tellerrand‘ confy, and seeing that fantastic line-up of speakers, is a major honour. And ‘hey!’, there are still tickets available for our workshop

Any questions? Need some more info? I´ve set up a little Pagie for that:

DevState´s bt-confy workshop

It only remains for me to add: get your WebGL / HTML5 canvas workshop ticket soon & see you!

Distance estimated 3d fractals. March 25, 2014 at 1:25 pm

The last months I´ve spent nearly all of my time developing ‘Pagie‘ for the cool clique at Psykosoft. And it was, and still is, so much fun, because we have so many features still to come, new ideas and, of course, now that ‘Pagie‘ is in private beta, problems to solve too. Meanwhile, scripting my own ideas, snippets or experiments - aka just playing with gfx and code - came off badly, so last night I took some time to write at least one little shader experiment.

Time to play with WebGL and shaders again! The result: some distance estimated 3D fractals - and, to show off the example, a little showcase Pagie for it…

Distance Estimated 3D fractals via Pagie
(need to have a WebGL enabled browser!)

I know that´s not rocket engineering any more; over the past four years the 3D fractal field has seen massive impact and growth thanks to Mandelbulbs, more interesting hybrid systems like Spudsville, or diverse Kleinian systems. But still, with WebGL becoming more and more widespread in browsers and even on mobile devices these days, it´s really satisfying to write shaders again that can be accessed online (Shadertoy ftw!)

Distance Estimated 3D fractals
(need to have a WebGL enabled browser!)
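The core of any such raymarcher is the distance estimator. Here´s a minimal sketch of the classic power-8 Mandelbulb estimator - the textbook formula with a running scalar derivative, written down from memory for illustration, not the exact code behind my demo:

```javascript
// Classic power-8 Mandelbulb distance estimator:
// iterate z -> z^power + c in spherical coordinates while tracking
// the scalar derivative dr, then return the usual distance bound.
function mandelbulbDE(px, py, pz, iterations = 8, power = 8) {
  let x = px, y = py, z = pz;
  let dr = 1.0, r = 0.0;
  for (let i = 0; i < iterations; i++) {
    r = Math.sqrt(x * x + y * y + z * z);
    if (r > 2.0) break; // escaped
    // to spherical coordinates, scaled by the power
    const theta = Math.acos(z / r) * power;
    const phi = Math.atan2(y, x) * power;
    const zr = Math.pow(r, power);
    dr = Math.pow(r, power - 1) * power * dr + 1.0;
    // back to cartesian, plus the original point (the '+ c' part)
    x = zr * Math.sin(theta) * Math.cos(phi) + px;
    y = zr * Math.sin(theta) * Math.sin(phi) + py;
    z = zr * Math.cos(theta) + pz;
  }
  return 0.5 * Math.log(r) * r / dr; // lower bound on distance to the set
}
```

In the actual shader this lives in GLSL inside the raymarching loop: step along the ray by the returned distance until it falls below a small epsilon.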

And after all: it feels so good to play & experiment again, and to bring this blog back from being a dusty dead end to being a bit more up-to-date. Stay tuned, there´s more to come soon…

Pagie beta March 19, 2014 at 3:38 pm

This is madness… ehhhhr… NO! This is Pagie!

Finally, as promised during this year´s FITC Amsterdam session, we are in ‘Private Beta’ now with Pagie, one of my most beloved fav long-term projects, which I´m developing as lead architect and frontend developer (Javascript/HTML5/CSS3) at Psykosoft.

It´s been almost one year now since I joined the very talented and decent crew of slightly skilled creative coders at Psykosoft. The second reason for that was: being part of a team that contains, among others, Mario Klingemann and David Lenaerts is pure adrenaline. The first reason was, and still is: they promised challenging tasks and I´m still enjoying solving `em…

Pagie Beta
Visit website & become a beta-user.

But what the heck is Pagie? Well, to list just a few features at this state of release:

  • Simply drag´n´drop images, PDFs, music, whatever files you like from your desktop to Pagie
  • Type and edit text directly on Pagie´s stage
  • Scale, rotate, transform and re-arrange your stuff the way you want
  • Copy´n´paste images directly from the web, e.g. from Google Images, to Pagie
  • Embed your favourite Vimeo or Youtube videos, or Google Maps, or Shadertoys
  • Generate pixel-perfect responsive pages - to be shown on any screen/any device
  • …and many more to come…

To give you some first impressions, and as a proof of concept, I just quickly (~15 mins of work) edited/composed an example ‘CV’ page for myself with Pagie:

Pagie Beta

As I already said, this is an ongoing long-term project, so we are constantly polishing, refining, debugging and adding features day by day. Feel free to join and help us as a beta-user - just drop us a line (by pressing the contact button) and get access to Pagie first!

Bring the noise - yes, I do websites too… March 14, 2013 at 3:41 pm

Lately, agencies and potential clients have often asked me whether I accept ‘normal‘ projects as well - meaning whether I can concept, lay out, design or realize websites, microsites, games, mobile developments or social media related gadgets, for instance, too - SURE THING!

Yes, I do websites (and evil Javascript) too! :)

Ok, looking at the work and projects I´ve done in the past, you could get the impression I only accept sound-related visualizations (ok ok - let´s say dubstep-related…) - but that´s simply not true. On the contrary, I´m always in search of interesting job inquiries of all kinds, be it a concept or a design for a website/microsite/ad campaign or such, or 3D-related programming tasks, or social media related gadgets, or mobile developments, or effects and filters for all kinds of applications, or simply the realization and programming of a given task or project.

To give you a varied selection of my work, please visit the new category of my blog called ‘work‘ (no shit - it´s about my work)

Nevertheless, I have to point out that I just released videos of another project I´ve done in the past, and ehrrrrrrrr… ‘yep‘, it´s sound-related, and ‘yep‘, it´s dubstep again…

I was asked by Image-Line to design and develop a GLSL-based fragment shader as part of the new ‘ZGameEditor Visualizer’ release, to underline the enhanced features and usage of shader-based effects within the new program version.

Get more infos here.

visualization videos via Vimeo
More infos

Apart from that I developed two ‘demoscene‘-like sound-reactive realtime demos, both performed in one single fragment shader, showcasing the possibilities and power of using GLSL-based shaders for visualizing sound.

1. Demo ‘Shapeshifter‘ - View Video
2. Demo ‘Blueberry Shake‘ - View Video

So don´t hesitate to contact me about any projects, ideas, realizations, concepts, collaborations, commissions or exhibitions, or even just to say hello!

JJ… the doomed filter February 21, 2013 at 4:02 pm

A while ago I was asked to develop and design a set of filters (GLSL-based shaders) that could be used in realtime and projected live on a screen with Max/Jitter during the stage show for rapper JJ DOOM´s ‘Key to the Kuffs’ album release. The filters should take the rapper on stage during the show as input and generate some nice-to-weird-looking ‘comic‘-like effects of him as output.

So much for the theory. Unfortunately the filters were dropped just before completion because of - ehrrrr… let´s say the timing for testing the whole scenario was a bit tough - a.k.a. ’shit happens’.

Now I´ve decided to extract a part of the original filters and rewrite it, as a little finger exercise, with WebGL and Javascript to make it suitable for browsers. Doing this was fairly easy, because I could port my OpenGL GLSL to WebGL nearly 1:1. I honestly had a lot more struggles with all the necessary Javascript around it, like writing scripts for:

- Webcam input - which isn´t supported natively in every browser, so I used Flash for that
- Passing image data from Flash to Javascript
- Canvas-to-image conversions
- Enabling a ’save to disk’ option in pure Javascript to save screenshots

But in the end, everything fell into place and worked as expected:
WebGL based image processing

K´ …it´s not working as expected if Internet Explorer is your weapon of choice - then you get no WebGL at all. But in case you use a modern, up-to-date browser, you can try out this little project here.
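For flavour, here´s roughly what one typical building block of such a ‘comic‘ look - a posterize step - does per pixel, sketched as hypothetical CPU-side Javascript on flat RGBA data (as canvas getImageData().data delivers it). The real filter did this kind of thing on the GPU in GLSL:

```javascript
// Posterize: snap each color channel to a small number of levels,
// which gives the flat, banded shading of a comic look.
// Operates on flat RGBA byte data; alpha is left untouched.
function posterize(rgba, levels) {
  const step = 255 / (levels - 1);
  const out = new Uint8ClampedArray(rgba.length);
  for (let i = 0; i < rgba.length; i += 4) {
    for (let c = 0; c < 3; c++) {
      out[i + c] = Math.round(Math.round(rgba[i + c] / step) * step);
    }
    out[i + 3] = rgba[i + 3]; // keep alpha
  }
  return out;
}
```

Combine this with an edge-detect pass drawn as dark outlines and you´re most of the way to a cartoon filter.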


Wortgebilde - word creations December 18, 2012 at 5:55 pm

Achtung, Achtung! Beware: tales from the crypt ahead… or: I absolutely suck at blogging this year! In fact, there´s so much to show, talk about, preview or tell from the projects, experiments and work I´ve done so far this year, but the damn fact is: I´m simply missing the spare time to write about all that stuff.

So what´s new? Recently I even managed to successfully maximize that lack of time by joining the FMX festival board as curator for the ‘Realtime Graphics/Interactive/Flash/Demos/Whatever‘ slot. Meaning, I have the honour of inviting lots of stunning entities from the creative coding front (hint! I´ve already confirmed some usual suspects as well as some promising heavyweights from the scene) - which means I did a changeover in the case of FMX and will not speak myself but moderate there. Wanna know more? Stay tuned!

Beyond that, I found out that finding time to experiment again and write about it is quite simple: just take some vacation and you can do personal work again. Easy! So whilst revisiting some old image-processing works I´d done years ago, and trying to invite Jared Tarbell - one of my personal heroes when it comes to super-inspiring procedurally driven art and experiments - to FMX, I dug out one of Jared´s old experiments called the ‘Emotion Fractal‘. Still one of the works I like most…

I decided to quickly update the setup of Jared´s original source code for the ‘Emotion Fractal‘ and used it for some image-processing stuff on photos in a book I prepared as a Christmas gift. With that said, I´ve just released this little remnant for you to play with. Below is an example directly from the workbench:

Cola Claus
Launch image processor

How does it work:

  • You can upload your own images (2048 x 2048 pixels - !!! the upload may take a while depending on the image size)
  • Modify the chain of words to work with (separate each new word with a blank)
  • Take snapshots of the work in progress and/or download the completed composition to your disk

You can test and play with the whole enchilada here.
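If you´re wondering how the word placement works under the hood: the ‘Emotion Fractal‘ idea is recursive space subdivision - fill a region with a word, then recurse into the leftover space with ever smaller words. Below is a strongly simplified sketch of my own (not Jared´s actual code), which just records the layout rectangles a renderer would draw words into:

```javascript
// Recursively subdivide a region: reserve one slot for a word,
// then recurse into the three remaining quadrants with half-size
// regions, stopping once regions get too small to hold text.
function subdivide(x, y, w, h, minSize, out = []) {
  if (w < minSize || h < minSize) return out;
  out.push({ x, y, w: w / 2, h: h / 2 }); // slot for one word
  subdivide(x + w / 2, y, w / 2, h / 2, minSize, out);
  subdivide(x, y + h / 2, w / 2, h / 2, minSize, out);
  subdivide(x + w / 2, y + h / 2, w / 2, h / 2, minSize, out);
  return out;
}
```

For the image-processing variant, each slot additionally samples the underlying photo and tints its word with the local pixel color.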

Last but not least - Happy holidays!

Mimic scale9-grid on GPU - Shaders in their natural habitat #2 August 7, 2012 at 10:27 am

I know, I know - the frequency with which I´ve been keeping my blog up to date with new posts during the past few months is almost as slow as a snail´s pace…

But the lots of interesting tasks and projects I´ve been working on over the past months simply don´t leave the time it takes to write about them - I pledge to improve, though! Well, again, the things I´m working on are mainly tasks dealing with writing shaders and programming GPU-accelerated stuff - all in all a huge playground for me.

And since Adobe Flash Player 11 has GPU acceleration too, there´s the challenge of rethinking many built-in features of Flash and implementing ´em on the GPU as well. In the case of 2D GPU-driven graphics that means saying ‘bye bye!‘ to your beloved comfort of having a display list and to many legacy features like vector masks and scale-9, for instance; going GPU means rebuilding all this from scratch.

K´, so a very specific, optimized 2D renderer is one of the tasks I´ve been working on over the last months; for that I needed to rebuild features like nested clips, a.k.a. nesting and compounds. Another requirement was to get Flash´s well-known Scale9-Grid feature back, fit for use on the GPU. So ‘Pop! goes the weasel‘ - wanna have the scale9-grid back on the GPU?

Introducing ‘Vertex9‘ - a GPU based vertex solution mimicking an accurate scale9-grid:

Vertex9 ...mimic scale9-grid on GPU
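The core math behind any scale9 mimic is small: per axis, the two outer slices keep their pixel size while the middle slice absorbs all the stretch, giving you a 4x4 vertex lattice for 9 quads. A minimal sketch of my own (names and layout here are assumptions for illustration, not the actual Vertex9 source):

```javascript
// For one axis: given the source size, the grid interval [gridStart,
// gridEnd] and the target size, return the 4 vertex coordinates.
// Outer slices stay at fixed pixel size; the middle one stretches.
function scale9Axis(srcSize, gridStart, gridEnd, targetSize) {
  const left = gridStart;            // fixed left/top slice width
  const right = srcSize - gridEnd;   // fixed right/bottom slice width
  return [0, left, targetSize - right, targetSize];
}

// Combine both axes into the 16 lattice vertices (9 quads),
// with grid given like Flash's Rectangle for scale9Grid.
function scale9Vertices(srcW, srcH, grid, targetW, targetH) {
  const xs = scale9Axis(srcW, grid.x, grid.x + grid.width, targetW);
  const ys = scale9Axis(srcH, grid.y, grid.y + grid.height, targetH);
  const verts = [];
  for (const y of ys) for (const x of xs) verts.push({ x, y });
  return verts;
}
```

On the GPU this mesh is built once; resizing then only means updating the vertex positions (or doing the same math in the vertex shader), while the UVs keep pointing at the unchanged source slices.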

Btw, up next in September I´m speaking at the upcoming ‘Reasons to be Creative‘ festival in Brighton. There I´ll show all kinds of the actual stuff I´m working on right now - so just in case you don´t want to wait another half year for a new blog post here, maybe it´s a pleasurable occasion to grab a ticket - see you there?