In celebration of one year of YUKU, we're going to look back at the creative projects that have manifested via our platform and go into more detail about the background of each with the artists and creators involved. For our first feature, we asked Paul Fennell, creator of the incredible Fragment:Flow patch, some questions about his project. Fragment:Flow was used heavily in creating the visual art around Subp Yao's 'Infra Aqual' album, which was released in November. With his creation, Paul has shown that creative coding is as much an art form as any other, and has placed tools that were once accessible only to those who can code into the hands of those who cannot, but who still enjoy experimenting and creating.
— Introduce us to Fragment:Flow: what does it do and how is it used?
Fragment:Flow is an advanced platform for real-time audio-visual design created with Cycling74’s Max. It consists of a versatile chain of GLSL effect slots built around a core shader. Each slot can host a variety of processing effects installed in JXS or ISF format, and the resulting system can transform video files, live feeds or Spout textures in a wide variety of ways. Virtually every parameter can be modulated by incoming audio, resulting in a very immediate and “tactile” approach to experimental audio-visual design. It has uses across a number of creative disciplines, from VJ shows and installation art to still-image and video production. Indeed, one of the more unexpected pieces of feedback has come from users who’ve found that Fragment:Flow has replaced software like Adobe After Effects and greatly streamlined their video production workflow. It was primarily conceived as an experimental real-time tool, but provided you have a satisfactory way of recording the output, it possesses an immediacy that traditional software often lacks.
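(As a rough illustration of what one of those effect slots might host, here is a minimal sketch of a shader in the ISF format Paul mentions: a simple RGB-split effect whose “shift” parameter is exactly the kind of control that could be mapped to incoming audio in a host like Fragment:Flow. The effect and parameter names are illustrative only and are not taken from Fragment:Flow’s own shader packs.)

/*{
  "DESCRIPTION": "Hypothetical RGB-split effect, sketched in ISF format",
  "CATEGORIES": ["Glitch"],
  "INPUTS": [
    { "NAME": "inputImage", "TYPE": "image" },
    { "NAME": "shift", "TYPE": "float", "MIN": 0.0, "MAX": 0.05, "DEFAULT": 0.01 }
  ]
}*/

void main() {
    // Normalized coordinate of the current fragment
    vec2 uv = isf_FragNormCoord;

    // Offset the red and blue channels horizontally in opposite directions.
    // In an audio-reactive host, "shift" is the sort of parameter that could
    // be driven by an incoming audio envelope.
    float r = IMG_NORM_PIXEL(inputImage, uv + vec2(shift, 0.0)).r;
    float g = IMG_NORM_PIXEL(inputImage, uv).g;
    float b = IMG_NORM_PIXEL(inputImage, uv - vec2(shift, 0.0)).b;
    float a = IMG_NORM_PIXEL(inputImage, uv).a;

    gl_FragColor = vec4(r, g, b, a);
}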
— What's your creative and programming background?
I come from a Fine Arts background. Originally a painter, I became increasingly interested in video and sound installation art in my early 20s. I’d used computers creatively throughout my childhood, but I stumbled across Max/MSP completely by accident. My first few years working with the software were dominated by audio experiments and DSP processing, and I had very little programming experience at that time. It’s a testament to Cycling74’s democratizing efforts that, given time, I was able to produce something like Fragment:Flow. I’ve been using Max for 15 years now and have dabbled in a range of coding languages as part of that journey, but I was far from a traditional coder when I started out.
— When and how did the idea of making FF come to you?
It emerged organically from my personal experiments in Max. Jitter really captured my attention when C74 introduced their Gen/GL family of objects and more extensive shader support. I started experimenting with various reaction-diffusion and video-feedback techniques and was blown away by the behaviour that emerged. Suddenly I was generating visuals that resembled ink and paint, or took on a glitch and datamoshing aesthetic, but which responded to my input in a very immediate and organic way. I shared most of these early experimental patches with the Max community and the response was very positive, so the idea of creating something more deliberate, extensive and fit for public release developed from there.
— Have people surprised you in the ways they've used FF?
Yes, constantly. Within a few days of releasing the beta I had users projecting their Fragment:Flow presets onto local buildings, combining it with facial-recognition techniques to map it onto their faces, and piping it into Unreal to texture VR environments. The built-in Spout and Open Sound Control (OSC) support really helps to integrate it with other software. It’s very satisfying to see.
— What does the future hold for FF in terms of updates?
The software was conceived as a distribution platform for my shader work in Max, so I’ll continue to release new shader packs going forward. The forthcoming v1.1 update mostly consists of improvements to the core software, but I’ve created a new Video Trails shader which I’m really enjoying at the moment. Combined with some new keying methods, it behaves like a strange cross between traditional collaging and an Etch-A-Sketch. It’s interesting. I also have extensions planned for the New Year which will allow people to use the live output from Fragment:Flow to displace 3D meshes, and these will come in both VR and non-VR variants.
— Is there a community forum for FF users?
Yes, there’s a forum on the website: https://www.fragmentflow.com/
If you would like to own a piece of Paul's art, order a copy of Subp Yao's timeless 'Infra Aqual' via YUKU HERE