Lyria Camera turns your surroundings into a live soundtrack. Using Gemini’s visual understanding and the Lyria RealTime API, it transforms what your camera sees into evolving music that matches your mood and movement, turning every moment into an immersive audiovisual experience.
Excited to share Lyria Camera! An app that turns your camera into a live musical instrument.
It uses Gemini to interpret what you see and Lyria RealTime to compose music that evolves with your surroundings.
A beautiful glimpse into the future of multimodal creativity.
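For anyone curious how the two pieces might fit together, here's a rough sketch (not the actual Lyria Camera code): a camera frame is described by Gemini, and that description steers Lyria RealTime's weighted prompts. The model names and SDK calls follow the published google-genai examples; the frame capture and audio playback are placeholders.

```python
# Hypothetical sketch: Gemini describes a camera frame, Lyria RealTime turns
# that description into streaming music. Not the app's real implementation.
import asyncio
from google import genai
from google.genai import types

client = genai.Client(http_options={"api_version": "v1alpha"})


def describe_frame(jpeg_bytes: bytes) -> str:
    """Ask Gemini to turn a camera frame into a short musical prompt."""
    response = client.models.generate_content(
        model="gemini-2.0-flash",
        contents=[
            types.Part.from_bytes(data=jpeg_bytes, mime_type="image/jpeg"),
            "Describe this scene as a short music prompt (mood, tempo, instruments).",
        ],
    )
    return response.text


async def main(jpeg_bytes: bytes):
    async with client.aio.live.music.connect(
        model="models/lyria-realtime-exp"
    ) as session:
        # Steer the generated music with the scene description.
        prompt = describe_frame(jpeg_bytes)
        await session.set_weighted_prompts(
            prompts=[types.WeightedPrompt(text=prompt, weight=1.0)]
        )
        await session.set_music_generation_config(
            config=types.LiveMusicGenerationConfig(bpm=100, temperature=1.0)
        )
        await session.play()

        # Stream generated audio chunks; playback is left to the reader.
        async for message in session.receive():
            pcm_chunk = message.server_content.audio_chunks[0].data
            _ = pcm_chunk  # e.g. write to an audio output stream


if __name__ == "__main__":
    asyncio.run(main(open("frame.jpg", "rb").read()))
```

In the real app the prompt update would run continuously as the camera moves, so the music evolves with the scene rather than being set once.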
This is such a fun use of real-time vision + audio generation — turning your surroundings into a soundtrack is a whole new interaction layer. Super curious how responsive the mood/movement mapping feels in real environments. Very cool concept! 🎧📷✨
@Lyria Camera by Google DeepMind: This is fascinating! Turning your camera into a live musical instrument using Gemini to interpret what you see and Lyria RealTime to compose evolving music is such a creative application of multimodal AI. The idea that music adapts to your surroundings in real time feels like a glimpse into the future of interactive art.
I'm curious: how responsive is the system? Does the music shift immediately as the camera moves, or is there a slight delay for interpretation? And can users influence the style or genre of music being generated, or does the AI determine that entirely from the visual input?
Also, what kinds of environments work best? I imagine natural scenes versus urban landscapes would produce very different soundscapes.
Really beautiful concept!
This is such an innovative app! Have you seen those trends on TikTok where it's "upload a picture and TikTok will show you what sound goes with it"? So it's cool to see an app that can do that on its own!
This is something new to me, a really out-of-the-box concept. Congrats on the launch. 🎉
Stunning. :) I'm super curious whether this can go the other way around: connecting music + camera to DMX controllers to light up a stage. Awesome work! Upvoted!