Real-time music visualization technology (RTMV)

Hi Everyone,

I am an Embedded Systems Engineer. My hobby is developing real-time music visualization technology. The technology allows the creation of fully autonomous, intelligent electronic devices that translate a musical audio stream into visual light images.

I call the technology itself RTMV. Initially it was developed as a 2D visualization, but a full-fledged 2D prototype turned out to be too expensive for me to manufacture today. So I attempted a simplified version: 1D visualization. The idea was to get a relatively inexpensive instrument that you simply connect an audio signal to and enjoy light images while listening to music.

This project was called CLUBBEST and was initially focused on club music, but time has shown it is suitable for visualizing any piece of music. Of course, there are many limitations associated with using inexpensive components and a low-performance MCU, but even so I have received a lot of positive feedback. For me, the most inspiring reviews were when, after watching, subscribers wrote: “I saw the music.” Yes, I created this technology in order to see the music, and I get that kind of feedback all the time as the technology develops.

Initially, I made CLUBBEST as a tool for listening to music at my PC. But as it turned out, it can also serve the workplace of a composer, musician, or DJ, and can be used as a decorative element in the interior of clubs, cafes, and restaurants.

You can find videos of the visualization technology at work on my YouTube channel, including prototypes of two systems: CLUBBEST M 68 and CLUBBEST M 100.

I believe that even at the existing level of development it is possible to create a commercial instrument for DJs that could be used in live performances. Such an instrument could be placed in front of the DJ console facing the audience (or behind the DJ, to the left and right). I believe that light interpretation enhances the perception of a piece of music several times over.

The key point of the idea is that a DJ, for example, could focus on the music during a live performance and leave the visualization to the electronics.

In the current prototypes, the final optical device is a linear arrangement of light sources of the same type. To work with the lighting fixtures used on stage, a configurator is naturally needed in which the user specifies the type of each device and its location. For RTMV it is important to know how the light sources are positioned relative to the viewer, their type, and their functionality.
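As an illustration, an entry in such a configurator might look like the sketch below. The field names and values are my own assumptions for clarity; the post only says that the type, location, and functionality of each source must be known.

```python
from dataclasses import dataclass

# Hypothetical configurator entry: fixture type, position relative to the
# viewer, and the capabilities the visualizer is allowed to drive.
# These field names are illustrative, not part of RTMV itself.

@dataclass
class Fixture:
    fixture_type: str    # e.g. "led_bar", "moving_head", "par"
    x_m: float           # position relative to the viewer, in metres
    y_m: float
    z_m: float
    capabilities: tuple  # e.g. ("dimmer", "rgb", "pan", "tilt")

# Two LED bars placed symmetrically left and right of the viewer.
stage = [
    Fixture("led_bar", -2.0, 0.0, 1.5, ("dimmer", "rgb")),
    Fixture("led_bar",  2.0, 0.0, 1.5, ("dimmer", "rgb")),
]
print(len(stage))  # 2
```

A structure like this would let the visualization engine reason about symmetry and viewer-relative placement before assigning light roles.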

I’m currently trying to contact inMusic to interest them in the development of this technology and the creation of new tools that would allow a different look at the light show.

But right now, it would be important for me to hear your opinion on how correctly and closely the light visualization expresses the theme of a piece of music. In the latest videos on my channel, I tried to collect different styles so that the capabilities of the algorithms can be evaluated.

If you have any questions, feel free to ask, I am ready to provide more technical information to understand the very idea of visualization.

I will be grateful for your feedback.


You know that Prime already integrates with several very established lighting systems, yes?

No, but I’m interested in the opinion of professionals and feedback on my idea of music visualization.

Here I propose to evaluate how RTMV technology copes with the visualization of music. The video shows a real show created with the help of modern controllers; below it is the CLUBBEST M100 running in real time. Which do you think conveys more information during visualization? There is a link to the original in the description of the video.

1 Like

Hello. While it is great to see that you developed your own lighting fixture with a sound-to-light option, you need to understand that you are way behind professional devices, which can go much further than that. As a professional lighting designer (running my own company and cooperating with multiple others in the Dutch market), I can assure you that selling this will be ultra hard. You also need to take into account that there are now software solutions that translate audio or video signals into real-time pixel-mapped shows or light movement, running millions of pixels, connected with media servers (Arkaos, Green Hippo, Resolume, etc.) and professional DMX controllers (Avolites, Hog4, Grand MA2/3, ChamSys, Obsidian and many more). All these devices can do the same thing and much more. So to push yourself into that market with an LED bar that reacts to music… well, there are already devices that can do that. Sorry if that spoils your marketing dream, but that is just the truth.

1 Like

Your review will not spoil anything for me. Until I can find devices that can actually compete with RTMV, could you post links where they can be seen or read about? :grinning:

I just literally gave you the names of the devices that can turn any LED bar into exactly that…

1 Like

Maybe this sideline is best pitched at the Hercules and basic MIDI controller markets rather than the established co-working franchises.

1 Like

Unfortunately, none of the devices you mention can synthesize visual images from an ordinary audio signal in real time. There is not a single video showing such a visualization option.

I offer a technology that is capable, for example, of performing real visualization of a live DJ broadcast without human intervention; it is completely autonomous. Moreover, it is a visual arrangement technology, i.e. when playing the same track over and over, a completely new visual show is synthesized each time. Such an instrument can be used at home for listening to music (as I do). If you are, say, the owner of a bar, I highly doubt you would invite a lighting engineer to write lighting tracks for every composition you want your visitors to hear and see.

Most often, when someone tries to illuminate their venue, the result is that the music plays while the “light bulb” blinks on its own. I offer a real connection between light and music, and here you will not argue with me.

The CLUBBEST prototype was built two years ago; I just didn’t have time to show it. Now I have time and I’m interested in reviews. I am 100% sure I am moving in the right direction in development. Yes, I’m looking for a company that will be interested in this.

And if I approach some company and they don’t answer me, that does not necessarily mean I was the unlucky one this time. :wink: :grinning:

Here is another variant of real-time visualization.

Personally, I really like this visualization. :smile:

1 Like

Great Job!!! Continue what you are doing.

1 Like

Check the latest beta of Obsidian Onyx. NDI video into light, already working. It should be released soon to full consoles. Turning any audio or video into light, without human intervention.

Where I think your system is interesting is that it allows you to transform audio signals into light signals in real time, just in a plug & play way, without having to do any prior analysis of the tracks as soundswitch would do or having to do complex DMX programming.

The strength of your technology lies in the fact that, unlike other solutions, it’s possible to transform audio signals into light signals in real time.

Of course, your system is extremely versatile and could be useful in a variety of situations, for musicians, DJs etc… But for me, you need to clearly target a main audience so that it’s clear in people’s minds and meets a well-defined need.

Let me explain:

If your answer to the question “What’s it for?” is “it’s for visualising music in a luminous way”, your message isn’t clear: it’s too abstract, and people won’t understand what it’s for in terms of concrete applications.

On the other hand, if you present it as a product “that instantly synchronises audio with lighting, without complex programming, without prior analysis, in real time and without human intervention”, then yes, your message will be interpreted by many DJs and event professionals as if you had just invented “DMX 2.0”, and your product could well find its audience. In my opinion, there’s clearly a market for a 100% autonomous product if it’s done well.

I think your work is clearly promising, but if you want it to be sold to professionals, try to do demonstrations with professional equipment, for example moving heads or other lighting effects usually driven over DMX. Make your technology a protocol that third-party manufacturers of professional lighting fixtures could integrate under licence.

Or you could concentrate initially on the “home DJ” market if, for the moment, you can only drive LED light bars.

Clearly, if tomorrow I could have a box capable of converting my audio into this kind of show, plug & play and fully autonomous without any intervention on my part, I would be the first customer.

1 Like

Thank you for your feedback, this support means a lot to me.

I will prepare more information on this technology so that you can understand the design of the device, its capabilities, the algorithms I use, and the problems I encountered in implementing the design.

I will provide photos and specifications of the prototypes. I have made several copies, and if this prototype design is of interest to you, I will simply make you a gift of a device kit, on the condition that you make at least one video demonstrating what you liked or did not like :wink:.

My own interest is simple: I am a listener of music and you are creators of music, and I’m really interested in the opinions of people who create it.

Well, if in the end it turns out that you think that this technology is worth developing, then this will be the highest praise for me and an incentive to work!

In which country are you located?

Ukraine, Dnipro.

Visualization of music has fascinated me since childhood. But all the devices I built had one significant drawback. I would build a new visualizer according to some new “super” scheme, and no matter how much I admired its work in the first days, after a week I would see that it was just chaotic flashing of light and nothing more. There was no connection between light and music in these devices. This is roughly how those first devices looked (I took this picture from the Internet).


The reason was that these visualizers were mainly built on the same old scheme: frequency division of the signal into three or four (or more) light channels. The algorithm was so primitive that the entire visualization was reduced to a few simple effects. The human brain is built in such a way that it very quickly memorizes these effects on a subconscious level, and the so-called fatigue effect sets in. This drawback was inherent in all devices built on this principle, which even now is used for automatic music visualization in many branded controllers.
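For clarity, the classic frequency-division scheme described above can be sketched in a few lines of Python. The band edges, sample rate, and threshold are illustrative assumptions, not part of RTMV; the point is to show how little musical information survives this approach.

```python
import numpy as np

# Classic "frequency division" visualizer: split the spectrum into three
# fixed bands and drive one light channel from the energy in each band.
# Band edges and sample rate below are illustrative assumptions.

SAMPLE_RATE = 44100          # Hz, assumed audio sample rate
BANDS = [(20, 250),          # bass   -> channel 0
         (250, 2000),        # mid    -> channel 1
         (2000, 16000)]      # treble -> channel 2

def band_energies(frame: np.ndarray) -> list[float]:
    """Return the spectral energy in each band for one audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in BANDS]

def light_channels(frame: np.ndarray, threshold: float) -> list[bool]:
    """Naive sound-to-light: a channel is ON when its band energy exceeds
    a fixed threshold -- exactly the 'blinking light bulb' behaviour that
    causes the fatigue effect described above."""
    return [e > threshold for e in band_energies(frame)]

# Example: a pure 100 Hz tone lights only the bass channel.
t = np.arange(4096) / SAMPLE_RATE
print(light_channels(np.sin(2 * np.pi * 100 * t), threshold=50.0))
```

With only three on/off channels and no memory of the piece, every track collapses to the same handful of effects, which is why the brain tires of it so quickly.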

For myself, I decided that in order to build a real visualizer, I had to begin with developing a theory on whose basis electronic devices could be created. As the theory developed, I built visualization devices and, based on the results, determined the direction of further development.

I will not go into all the technical details here; I think they would not interest you. I will only touch on the fundamental principles taken as a basis. What I describe is my understanding of music and nothing more; I do not impose it on anyone. I use these concepts in my developments, but comments and criticism are welcome: you can always find interesting ideas in them that have a rational grain.

Basic concepts.

  1. In any piece of music there are two components: melody and rhythm.
  2. Melody is the smooth component, like a wave that passes through the whole piece of music.
  3. Rhythm - or, one could say, the pattern - can include a solo instrument, the performer’s voice and, of course, the rhythm itself; it is the opposite of melody, the dynamic component of a musical work.
  4. The visualizer must perform a “light arrangement” of a piece of music. What does that mean? It must create a new visualization each time, even for the same piece. The visualizer must be intelligent to some extent.
  5. The output optical device (OOD) can be a screen, or a set of various light sources with a certain arrangement in space, both relative to each other and relative to the viewer.
  6. The symmetry principle. This is a principle of beauty. I did not invent it, but I arrived at it while creating the output optical device. If the formed light images are symmetrical, they are always perceived as beautiful. Look at any light installation and you will see that the principle of symmetry is present everywhere.
  7. The visualizer must be completely autonomous, with no adjustments or settings, or at most a single button: power on.
  8. The frequency analysis mechanism must be able to distinguish one note from another.
  9. The essence of visualization is the creation of light images that have a visible connection with the piece of music.
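As a side note on item 8, distinguishing one note from another has a concrete cost with a plain FFT: adjacent semitones differ by a factor of 2^(1/12) ≈ 1.0595, so the absolute gap between notes shrinks at low pitches and the analysis window must grow. The sketch below is my own illustration of that constraint, not RTMV’s actual DSP.

```python
import math

# To separate two neighbouring semitones with a plain FFT, the bin width
# fs/N must be smaller than the frequency gap between them. The numbers
# here are illustrative, not RTMV internals.

SEMITONE = 2 ** (1 / 12)   # ratio between adjacent semitones (~1.0595)

def min_fft_length(note_hz: float, sample_rate: float = 44100.0) -> int:
    """Smallest power-of-two FFT length whose bin width is finer than
    the gap between note_hz and the next semitone up."""
    gap = note_hz * (SEMITONE - 1)        # Hz between adjacent notes
    n = sample_rate / gap                 # samples needed for that bin width
    return 2 ** math.ceil(math.log2(n))   # round up to a power of two

# A2 = 110 Hz: the gap to A#2 is only ~6.5 Hz, so a long window is needed.
print(min_fft_length(110.0))   # 8192 samples, ~186 ms at 44.1 kHz
# A5 = 880 Hz: the gap is ~52 Hz, so a much shorter window suffices.
print(min_fft_length(880.0))   # 1024 samples, ~23 ms
```

This is the basic tension in any real-time note-level analyzer: low-pitch resolution demands long windows, while responsiveness to rhythm demands short ones.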

To be continued…

1 Like

Very interesting project. Can I send all this info to some friends of mine? They’re in close contact with Sonar Festival and some local promoters in Barcelona. Maybe they’d be interested too.


Yes, you can. I will be very grateful to you for this. Reviews are very important to me. Especially constructive criticism.

1 Like

Just some thoughts on it: I think the Hue lights are already doing that. You can make music zones and then let Spotify connect to the Hue lights to get pulsating lights.

The Engine Lighting basic version also does that. I mean, right now you are limited to Hue and Nanoleaf, but they also give you a real-time light show.

1 Like