A real-time interactive audiovisual installation in VR, based exclusively on biofeedback data derived from the human autonomic nervous system, triggering 3D generative visual elements & sound. An exploration of identity and the “Technological Self” within a 3D landscape featuring spatial audio, enabling and supporting personal empowerment and self-realization while introducing Homuncular Flexibility concepts.
Input signals derive from:
Pulse Detector (heartbeat, heart rate), Arduino pulse sensor
Breath Sensor (breathing activity of the chest using a DIY respiratory belt), Arduino flex sensor
VR navigation is enhanced by the addition of the breathing signal, which enables additional control of the user's movement in the 3D space (a minimal mapping sketch follows below).
The evolution of the project includes additional input signals:
Blink Detection (Eye tracking) through cameras available with Meta Quest Pro VR or similar VR headsets.
Brain Wave Activity (real-time readings of alpha brain waves), Arduino EEG brainwave Sensor.
Created exclusively in TouchDesigner.
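As an illustration of the sensor pipeline, the minimal Python sketch below shows how pulse and breath values from an Arduino might be read over serial and how the breath signal could be mapped to a forward-movement speed. This is not the installation's actual code: the serial port, the "pulse,breath" message format and the value ranges are assumptions, and inside TouchDesigner the same mapping is handled by the built-in operators rather than a standalone script.

```python
# Minimal sketch (not the installation's actual code): reading hypothetical
# Arduino sensor values over serial and mapping breath to forward movement.
# Assumes the Arduino prints lines like "pulse,breath" (e.g. "72,512") at 9600 baud.
import serial  # pyserial

PORT = "/dev/ttyUSB0"               # assumption: adjust to your Arduino port
BREATH_MIN, BREATH_MAX = 300, 700   # assumed raw range of the flex sensor

def breath_to_speed(raw, lo=BREATH_MIN, hi=BREATH_MAX, max_speed=2.0):
    """Normalize the raw breath reading to 0..1 and scale it to a movement speed."""
    t = (raw - lo) / float(hi - lo)
    t = max(0.0, min(1.0, t))
    return t * max_speed

with serial.Serial(PORT, 9600, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            pulse, breath = (int(v) for v in line.split(","))
        except ValueError:
            continue  # skip malformed lines
        speed = breath_to_speed(breath)
        print(f"heart rate: {pulse} bpm | forward speed: {speed:.2f}")
```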
Dome preview & documentation from my participation with Pixel Pulse @ Athens Digital Arts Festival 2025
https://chromosphere.eu/editions/2025-athens/program/
Exports of audiovisual performance for Pixel Pulse Collective live, based on the concept of live music & live visuals with audio reactivity.
Created exclusively in TouchDesigner.
It is a combination of artistic work and game design in a virtual environment, where elements of user interaction and animation scenes are also incorporated. Created entirely in the Unity game engine as an assignment for the module Mixed Reality Illustrations (MA in Audiovisual Arts in the Digital Age at Ionian University).
Various 3D assets were created from photos that were turned into 3D objects using NVIDIA Instant NeRF, MeshLab and Meshmixer.
Game Scenario
We find ourselves navigating through a strange, almost dystopian, dream where we observe various objects symbolizing fragments of memory trapped in the subconscious.
The space is distorted (spatially and color-wise) while the textures and atmosphere, on a visual and sound design level, are almost claustrophobic and dreamlike.
The user can navigate through the space and interact with some of the objects, while the aim of the game is to get the user to find access to a certain point where they are asked to choose whether to wake up (exit the game) or stay in the dream (scene restart).
Future prospects regarding the game scenario include VR integration, AI creation of 3D objects given prompts from users' dreams, and generative gameplay patterns or 3D objects/particles created from data extracted from users' sleep-tracking/dream-stage apps.
Made with Unity game engine.
Unity Build: https://drive.google.com/drive/folders/1kuy-5rP6jBwpCTNV1AxaSSUcbkb8Ltl9?usp=sharing
The aim of this assignment is the creation of digital multimedia content for the corresponding module of the postgraduate program (MA in Audiovisual Arts) I am enrolled at. Specifically, I designed and customized an online 3D portfolio for some of my art works, which was implemented in a virtual space where users can navigate using avatars in three thematic rooms and interact with the exhibits.
At the same time, elements of gamification have been incorporated, as the user is invited to complete a quest that leads them to locate and visit all the rooms of the exhibition. After or during this quest, the user is given the opportunity to sit at specific points in the space where there are interactive portals that correspond to the "contact" section of a conventional online portfolio and provide links to social media, presence on other online platforms, etc.
Made with the Unity game engine, using the Spatial Creator Toolkit.
Spatial.io link: https://www.spatial.io/s/evangelos_space-66250b717d533ee7cda1bf25?share=6654834035832217121
Excerpt from Pixel Pulse Collective live audiovisual performance at Athens Digital Arts Festival (ADAF) 2024.
This performance, presented by Evankelos, was made using Elektron’s Analog Rytm MKII drum machine for the music and live triggering of TouchDesigner sketches via MIDI signals.
We mapped and projected onto the building of the former Santaroza Courthouse in Athens, Greece.
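For context, the sketch below illustrates the kind of MIDI routing involved: note-on messages from the drum machine are matched to visual scenes. It uses the mido Python library purely as an illustration; the input port, note numbers and scene names are assumptions, and the real performance handled MIDI directly inside TouchDesigner.

```python
# Illustrative sketch only: routing MIDI note-on messages from a drum machine
# to scene switches, roughly how the live TouchDesigner triggering worked.
import mido

NOTE_TO_SCENE = {36: "scene_kick", 38: "scene_snare", 42: "scene_hat"}  # hypothetical mapping

def trigger_scene(name, velocity):
    # In the real setup this would switch TOPs / set parameters in TouchDesigner.
    print(f"trigger {name} (velocity {velocity})")

with mido.open_input() as inport:          # opens the default MIDI input
    for msg in inport:
        if msg.type == "note_on" and msg.velocity > 0:
            scene = NOTE_TO_SCENE.get(msg.note)
            if scene:
                trigger_scene(scene, msg.velocity)
```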
Participation in a 7-Day workshop which was organized by the Greek National Opera (GNO) in partnership with the Digital Media Lab (DmLab) of the Technical University of Crete as part of the 2023 UNESCO Maria Callas Anniversary proposed by the Hellenic Ministry of Culture.
The workshop focused on further broadening the relationships between architecture and music, and between space and sound, through the use of new digital media.
It featured lectures from several mentors such as Marcos Novak [Director transLAB | Professor and Chair MAT/UCSB | Transarchitect | Artist | Theorist] and John Bardakos [artist | researcher | academic] among others, while the main focus was on intense TouchDesigner sessions with Georgios Cherouvim [Senior FX-TD | Computer Animator | Artist | Educator].
More specifically, participants experimented with the three-dimensional models (3D point clouds) of the scanned interior spaces of the GNO building designed by Renzo Piano that resulted from the preceding research project by DmLab, and explored ways of capturing interactions between the voice of Maria Callas and the spaces of the Greek National Opera. This led to the creation of virtual and hybrid installations that, taken as a whole, illuminated this iconic figure from an entirely different perspective. The resulting creations are to be exhibited in digital form on GNO TV, the online TV channel of the Greek National Opera, as well as in fit-for-purpose exhibition spaces located in both Athens and Chania.
For more information click here: https://dmlab.tuc.gr/project/visualising-the-voice-of-maria-callas
Promo video ↓
Assignment project created for the "Music, Media & Algorithms" module in the 2nd semester of the MA in Audiovisual Arts in the Digital Age (Ionian University).
An interactive system implemented in Pure Data, based on Laurie Spiegel’s “Music Mouse - An Intelligent Instrument” (1986).
It features real-time sound composition with algorithmic criteria, mapping mouse information as data input. This data is used in various ways to affect the system's operations, such as X and Y coordinates as notes on scales, arpeggiator time divisions, filters & their parameters, stereo pan effects based on coordinates, etc., featuring decision-making and probability functions.
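The following Python sketch roughly illustrates the mapping logic described above (the actual system is a Pure Data patch): normalized mouse coordinates are quantized to notes of an assumed C major scale, X also drives stereo panning, and a simple probability function stands in for the decision-making behaviour.

```python
# Sketch of the mapping logic only (the piece itself is a Pure Data patch):
# mouse X/Y are quantized to scale notes, X drives pan, and a probability
# function occasionally changes the arpeggiator time division.
import random

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]          # semitone offsets of the scale
BASE_NOTE = 48                            # MIDI C3

def coord_to_note(t, octaves=3):
    """Map a 0..1 coordinate to a MIDI note on the scale."""
    steps = len(C_MAJOR) * octaves
    i = min(int(t * steps), steps - 1)
    octave, degree = divmod(i, len(C_MAJOR))
    return BASE_NOTE + 12 * octave + C_MAJOR[degree]

def mouse_event(x, y):
    """x, y in 0..1 (e.g. mouse position divided by window size)."""
    melody = coord_to_note(x)
    harmony = coord_to_note(y)
    pan = x * 2 - 1                        # -1 = left, +1 = right
    division = random.choice([1, 2, 4]) if random.random() < 0.3 else 1
    return melody, harmony, pan, division

print(mouse_event(0.25, 0.8))
```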
Work in progress of a collaborative audiovisual interactive piece based on a Greek cultural heritage theme. Created in TouchDesigner.
Consisting of two sections, the visual piece is based on a constantly evolving generative pattern representing the moon, which is also affected by the movement of visitors captured by a camera. Samples of interviews are simultaneously generated and mixed with a pre-recorded drone that plays throughout.
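As a rough illustration of the camera interaction, the sketch below estimates visitor movement with simple frame differencing in OpenCV and reduces it to a single 0..1 value that could modulate the moon pattern. The actual piece performs this analysis inside TouchDesigner; the webcam index and scaling here are assumptions.

```python
# Illustrative only: estimating visitor movement from a camera with simple frame
# differencing, producing one 0..1 "motion" value that could drive the pattern.
import cv2

cap = cv2.VideoCapture(0)                 # default webcam (assumption)
ok, prev = cap.read()
if not ok:
    raise RuntimeError("could not read from the camera")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)        # pixel-wise change between frames
    motion = diff.mean() / 255.0          # 0 = still, towards 1 = lots of movement
    prev = gray
    print(f"motion amount: {motion:.3f}")
```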
Postgraduate Course Project Digital Poetry - https://avarts.ionio.gr/festival/2024/en/artworks/1444/
“I Am Not My Image” is an interactive audiovisual piece that uses a web camera to translate the subject's image into ASCII text characters. The visitor's image is re-formed using only the letters that make up the sentence "I Am Not My Image".
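A minimal Python sketch of this brightness-to-text idea is shown below. It works on a single saved frame rather than the live webcam feed of the piece, and the column count, brightness threshold and file name are assumptions.

```python
# Sketch of the brightness-to-text idea (the piece runs in real time on a webcam):
# each pixel of a downscaled frame either becomes the next letter of the sentence
# or stays blank, depending on its brightness.
from PIL import Image   # Pillow

SENTENCE = "IAmNotMyImage"

def image_to_sentence_ascii(path, cols=80, threshold=60):
    img = Image.open(path).convert("L")               # grayscale
    w, h = img.size
    rows = int(cols * h / w * 0.5)                    # 0.5 compensates for character aspect
    img = img.resize((cols, rows))
    pixels = img.load()
    out, k = [], 0
    for y in range(rows):
        line = []
        for x in range(cols):
            # Bright pixels get a letter of the sentence, dark pixels stay empty.
            line.append(SENTENCE[k % len(SENTENCE)] if pixels[x, y] > threshold else " ")
            k += 1
        out.append("".join(line))
    return "\n".join(out)

print(image_to_sentence_ascii("frame.jpg"))           # hypothetical captured frame
```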
Alongside the interactive quality of the work, on the left side of the screen, an introspective poem by Evangelos is typed repeatedly, trying to express the experience of interacting in online environments, a reference to the emotional weight, the dead ends and the struggle to maintain our identity through communication and interaction in dystopian conditions.
At the same time Evangelos doesn’t fail to point out that sometimes this identity is all we have to hold on to, referring to the somewhat positive aspect of online identity existence in dystopic times (e.g., COVID) but also in cases of psychological pressure, depression, and more profound purposes supported and facilitated through the use of avatars, etc.
With this piece Evangelos attempts to comment on how our interaction within online social networks, and our overall online presence affect the way we choose to portray ourselves and additionally how our own sense of identity is ultimately affected. He focuses on the distortion of the self-image and refers to the phenomenon of the "looking-glass self," a term coined by Charles Horton Cooley in 1902, according to which we choose to present ourselves based on a reflection of how we think others see us, to please them.
Using the direct statement "I Am Not My Image," Evangelos aims to express his opposition, to remind, and to draw a distinct line against this phenomenon, all while maintaining empathy and understanding of the underlying reasons and motivations that lead us to such behaviors.
"Dissloving" is an interactive audiovisual installation, part of the thematic title "Memory Void," assigned to students during the "Interactive Environments - Installations" Module of the MA: Audiovisual Arts in the Digital Age program.
The project focuses on the experience where memories intertwine and ultimately fade away, such as short-term memories in the case of Alzheimer's disease or older traumatic or non-traumatic memories that transform, distort, and eventually disappear. The main function of the project relies on the property of light particles (photons) to carry and contain memory, which is then imprinted on a light absorbing medium through the process/technique used in developing photographic film.
In this project, implemented in a dark room, a projector displays photographs onto a photosensitive/phosphorescent surface. During the exhibition, there is a person who greets visitors and captures web camera photographs of anyone who wishes to participate. Then these photographs are stored and automatically displayed in random order on the phosphorescent surface as a collection of memories instantly created.
The duration of each projected photograph is 10 seconds, allowing enough time for the light to be absorbed by the surface. Consequently, each photograph blends with the next one, creating unique new patterns and distorted images every time. Afterwards, there is a scheduled break of about 5 minutes in the projections, so that we can observe how the imprint of the photographs fades over time as the phosphorescent paint loses its intensity. Then the projection of the photographs is reactivated. Alongside the visual aspect of the project, the sound part evolves and is synchronized with the two phases of the project.
Created in TouchDesigner.
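A timing-only sketch of the projection cycle described above follows (the installation itself runs as a TouchDesigner network): each captured photo is shown for 10 seconds in random order, followed by a roughly 5-minute pause so the phosphorescent after-image can be watched fading. The folder name and the number of photos per cycle are assumptions.

```python
# Timing sketch only: 10-second projections in random order, then a ~5-minute
# pause so the phosphorescent imprint can be observed fading, then repeat.
import random
import time
from pathlib import Path

PHOTO_DIR = Path("captured_photos")       # hypothetical folder of webcam captures
SHOW_SECONDS = 10
BREAK_SECONDS = 5 * 60
PHOTOS_PER_CYCLE = 12                     # assumption: photos shown before each break

def project(photo):
    # Stand-in for sending the image to the projector output.
    print(f"projecting {photo.name} for {SHOW_SECONDS}s")

while True:
    photos = list(PHOTO_DIR.glob("*.jpg"))
    for photo in random.sample(photos, min(PHOTOS_PER_CYCLE, len(photos))):
        project(photo)
        time.sleep(SHOW_SECONDS)
    print("break: letting the phosphorescent imprint fade...")
    time.sleep(BREAK_SECONDS)
```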
The audiovisual work Paralysis by Analysis presents a combined experience of two chapters that unfold in two- and three-dimensional spaces respectively. First, an audio-reactive visual piece is presented in a two-dimensional space, in two layers influenced by the musical composition being played. Then we move to three dimensions, where we watch a particle system being born, evolving and dying while, at the same time, the musical piece is transformed into a hypnotic drone.
With this work, Evangelos attempts to narrate the phenomenon and the psychological burden of over-information, a process in which we are constantly receiving stimuli and data that we try to process and assimilate. In recent years, with the rapid development of technologies and their applications, our everyday life has been significantly affected, as whether we want to or not, we find ourselves in the position of having to analyse as much information as we can, being constantly bombarded with data from everywhere, while at the same time, not having the necessary time to process it, due to our heavy obligations. This results in more anxiety, stress and various psychological side-effects, while at the same time people end up paralysed under the weight of the burden, unable to balance their thoughts and cope.
In order to express and narrate the above phenomenon and set of processes, Evangelos uses the term Paralysis by Analysis, which comes from the science of economics and expresses the disturbance of thinking, information processing and ultimately decision making, due to the huge amount of data we receive. According to this terminology, people, being unable to analyse the set of options given to them (due to the amount of information), are unable to make decisions and choose the right option. There are too many variables, and humans bend under their weight, unable to balance and analyse them.
Created in TouchDesigner.
Video edit in DaVinci Resolve.
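As an illustration of the audio-reactive first chapter, the sketch below computes an RMS loudness envelope with NumPy and maps it to a hypothetical per-layer scale parameter. The actual piece performs its audio analysis inside TouchDesigner; the block size and mapping are assumptions, and the input here is a synthetic test tone.

```python
# Illustrative sketch of audio reactivity: an RMS envelope of the music drives
# a visual parameter per layer (here, a scale factor printed per block).
import numpy as np

SR = 44100                                # sample rate
BLOCK = 1024                              # analysis block size

def rms_envelope(samples, block=BLOCK):
    """Return one RMS value per block of audio samples (mono, float in -1..1)."""
    n = len(samples) // block * block
    blocks = samples[:n].reshape(-1, block)
    return np.sqrt((blocks ** 2).mean(axis=1))

# Synthetic stand-in signal: a 110 Hz tone with a slow swell.
t = np.linspace(0, 4, 4 * SR, endpoint=False)
audio = np.sin(2 * np.pi * 110 * t) * np.linspace(0, 1, t.size)

for i, level in enumerate(rms_envelope(audio)[::20]):
    layer_scale = 1.0 + 2.0 * level       # e.g. map loudness to a layer's scale
    print(f"block {i:3d}: rms={level:.3f} -> scale={layer_scale:.2f}")
```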
Various interactive mini-projects using p5, a JavaScript library for creative coding.
Voice recording sampler & synthesizer - also available as a mobile application
An exhibition featuring solely visual extractions, giving physical existence to Evangelos’ digital art. It consisted of 6 digitally printed, large-scale canvases presenting a selection of Evangelos’ glitch art.
Evankelos transports us to an artificial world of distortion with multiple episodes.
With the use of algorithms and creative coding tools, he composes a new tropical landscape. The concept of randomness plays a pivotal role in the artist's work: on the surface, the depiction seems accidental, yet it hides a structured composition with smaller boxed narration, with synthesis and naturalistic abstraction. "φ-gital" comes from physical vs digital. Here, the combination of its meanings is twofold. On the one hand, the composition and the material are reminiscent of a painting effect, while the production process is digitally created on a small screen. The result may seem pop, but it is way more expressionistic, with gestural determination and glitched perception, which reinforces the paradox.
The visual result leaves open whether it is a landscape, a sky-view photo, building parts or plants. However, Evankelos plays with time, as his works resemble multiple freeze-frame montages of a science fiction film. The colors are contemporary and reveal almost photographic sensibilities of everyday observation. This duality is conceptually reinforced in his subject matter, as with careful observation some phytomorphic elements become apparent, next to urbanistically made Tetris bricks. If we penetrate the artificial world of Evankelos, we are teleported to 2093, where the tropical φ-gital aesthetic is not only revived in simulation glasses, giant screens and electronic 3D games, but exists among us.
Niovi Kritikou / Curator
Check my online store for more prints.
The original video art was projected in the exhibition space ↓
“Dissolving” - A four part audiovisual piece (2021)
“Dissolving” is a four-part audiovisual piece created by Evangelos. Each section contributes to the narrative, trying to explore and understand the implications of the new dynamics that now shape people’s relationships, connection and communication, in relation to the underlying internal psychological processes.
We find ourselves in an unprecedented and dystopic situation where people try to get a grasp on the new reality. We struggle to adjust and connect with each other, while simultaneously trying to reconnect with ourselves and find peace on an internal level.
These ongoing processes may reveal themselves in melancholic, positive or negative ways, impacting people and societies. People dissolve into bits and bytes, becoming pixels, trying to communicate and find new ways to connect. Maybe they will erase their past, reinvent themselves and explore their characters, investing in more substantial and healthy choices.
The main sections of “Dissolving” are:
1. Τέλος - feat. Logout (Dissolving)
2. Introspecting
3. Erasing
4. Connecting
These audio/visual pieces consist of original musical tracks combined (and in an open dialogue) with visual elements created on a live coding visuals platform.
Pieces 1 & 2 are two different DIY web cam filters. Watching closely, you can distinguish a person’s movement and how it affects the visual outcome.
“Qv2” - An audiovisual EP (2020)
click to read context"Quarantine v2.0"
A live album created quickly, on the spot, by applying the Technoself method of Deantoni Parks. I was simultaneously playing an electronic drum pad and a MIDI keyboard, using Ableton Live as my DAW.
The video was created using Hydrasynth (an online, browser-based live-coding visual synthesizer). The signal from my web camera was processed through the live code, and the camera was recorded through multiple screens (TV and laptop), creating a feedback effect.
“Untitled #3” - Music Album Visuals (2022)
Untitled #3 is a collection of 7 video clips accompanying the respective tracks of my album of the same name.
Videos were created using various live-coding techniques (original code), a web camera and glitch art software. The footage was then edited with simple video editing software.
“Planitia” - An audiovisual journey into star clusters (2021)
All tracks were created by exploring the Technoself method and subsequently interspersing them with extra layers. Performed using a minimal electronic drum set-up, Ableton Live, electric guitar and synth.
The visualizer was created by stitching together several short 3D videos that I initially created, generated and directed on my mobile phone using an Android app.
Please watch in the highest possible resolution.
“The Third Limb” - An interactive media piece (2020)
An online media piece based on the concept of "Homuncular Flexibility" (Jaron Lanier, 2006), which describes the ability of the human brain to adapt to and control bodies different from its own.
This interactive 3D piece was created as an assignment for the Technoself School of Philosophy of Deantoni Parks and it is a tribute to this concept.
It explores the implications and possibilities that may emerge and be incorporated into live music performance, and in particular into the Technoself method, by introducing a third (virtual) hand that the human body could learn to control and use, for example as a third hand playing the keyboard.
HOW TO: With your keyboard arrow keys you control the normal left & right limbs, and with mouse movement you control the third, extra limb. The left and right limbs behave normally and are pictured in warm colors, as they are natural to the human brain, while the third limb is not well formed, is pictured in a cold color, is bigger in size, and seems abnormal and difficult to control. This is a representation of how the brain's first interaction with a third limb may feel, until it learns to control it. The music playing in this piece comes from my album "Untitled#2" ( https://justgazing.bandcamp.com/album/untitled-2 ) and it features a normal drum beat, where I am playing with 2 hands/sticks, while on top of that I chopped an old jazz solo and replayed it with a MIDI keyboard. I chose to incorporate it in the 3D piece as a possible sonic experience that might be played live if a third hand were present.
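To make the control scheme concrete, here is a small pygame sketch of the same mapping under stated assumptions: the arrow keys nudge the two natural limbs, while the third limb follows the mouse with a deliberate lag to suggest how hard it is to control at first. It only mirrors the interaction logic; the original piece is a separate online 3D work.

```python
# Control-mapping sketch only (the original is an online 3D piece): arrow keys
# move the two natural limbs, the mouse drives the third limb, which trails
# behind to feel "abnormal and difficult to control".
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

left, right = [200.0, 240.0], [440.0, 240.0]   # natural limbs (warm colors)
third = [320.0, 240.0]                          # extra limb (cold color)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    keys = pygame.key.get_pressed()
    dx = (keys[pygame.K_RIGHT] - keys[pygame.K_LEFT]) * 4
    dy = (keys[pygame.K_DOWN] - keys[pygame.K_UP]) * 4
    for limb in (left, right):                  # arrows move both natural limbs
        limb[0] += dx
        limb[1] += dy

    mx, my = pygame.mouse.get_pos()
    third[0] += (mx - third[0]) * 0.05          # lag: the third limb trails the mouse
    third[1] += (my - third[1]) * 0.05

    screen.fill((10, 10, 10))
    pygame.draw.circle(screen, (230, 120, 60), (int(left[0]), int(left[1])), 15)
    pygame.draw.circle(screen, (230, 160, 60), (int(right[0]), int(right[1])), 15)
    pygame.draw.circle(screen, (80, 120, 230), (int(third[0]), int(third[1])), 30)
    pygame.display.flip()
    clock.tick(60)

pygame.quit()
```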
“Technoself tribute” - An interactive media piece (2020)
This interactive and audio-reactive video clip serves as an abstract representation of the Technoself method proposed by Deantoni Parks.
Unfortunately, the microsite is not currently available.
THE VIDEO
This video clip was created within a dataflow (visual patching) development environment using a WebGL-based tool (cables.gl).
To capture the real-time improvisational spirit of the Technoself method, the video clip is interactive and is being directed on the spot by the viewer. Similar to the Technoself method, the user has a basis of elements to work and interact with, while the result may differ every time.
To represent the force of the current state of mind, we use the mouse cursor position, which affects the positioning of the 3D elements (drum set and cyborg hand) in the 3D space.
To represent the drumming factor of the musician playing (and the left hand of a right-handed drummer), the LEFT mouse button is assigned to change the position of a 3D drum set and also to reveal a human figure.
To represent the machine/programming factor of the musician playing (and the right hand of the same drummer), the RIGHT mouse button is assigned to change the position of a 3D cyborg hand and also to reveal a numerical coding of the MIDI signal information that is fed from the keyboard to the machine (music software).
To represent the fact that the human being is the controller of the musical output and the embodiment of the Technoself method, acting as a musician but simultaneously as a machine, shaping his identity by adopting new technologies, we can see the 3D elements rotating around the center of the 3D space, where the human figure remains in an unchanged position. Moreover, the human figure cannot be displaced by the mouse cursor, a fact that represents that the musician's current state of mind and effort is focused solely on the music.
THE MUSIC
The essence of this technique is based on a real-time, improvisational approach to playing & composing music on the spot. The musician acts simultaneously as a programmer and a drummer, creating patterns, rhythms and textures in real time, converting his accumulated experience, state of mind and feelings of the moment into music. This conversion is streamed live using one hand on a MIDI keyboard, triggering samples of sliced music (and controlling several of their parameters), while the rest of the limbs play the drum set, reacting in real time to the sounds created from the keyboard, and vice versa.
“Pixel Warz” - A series of live-coding generated images & videos (2021)
A series of glitch images & videos created with Hydrasynth live-coding visual language.
Different images melt together under various coding rules, creating a glitchy, futuristic environment which sometimes resembles zoomed-out cityscapes and at other times zoomed-in computer parts.
Videos are accompanied by original ambient & mysterious music.
Cassette Product design & implementation (2022)
Artwork, concept, design and implementation for a cassette album release of mine.
Each package included 2 mini postcards from a batch of around 80 original glitch art images, plus handmade key locks, pencils & wood powder, all created from used drum sticks (last photo).
Various album covers. Graphic design work (2020-2023)
Various glitch art images (2021-2022)
Various video clips. Camera, stop motion & editing work (2012-2023)