Vibe Chords
Context
I built Vibe Chords around a simple idea: sometimes you know the feeling you want before you know the chords. Musicians and producers often think in words first, like "dark", "warm", "dreamy", or "aggressive", but most music tools still expect them to start from theory, notation, or manual trial and error.
I wanted to make that first step easier. Instead of forcing the user to already know the progression they want, the app lets them describe a vibe in plain English and turns that into something playable, structured, and useful right away.
The result is an AI-powered chord progression generator that takes a text prompt, produces a musically valid progression, suggests tempo and scale information, explains the harmonic direction, and lets the user hear it immediately in the browser. It also includes a visualizer section so the progression is not just playable, but easier to see and understand at a glance. From there, users can keep iterating conversationally, ask for darker or brighter variations, and export the result as MIDI for a DAW.
The Build
I built the app with Next.js, TypeScript, Tailwind CSS, and shadcn/ui, with the generation layer powered by the Gemini API. The core product flow is intentionally simple: a user enters a mood or creative prompt, the server sends that to Gemini, the model returns structured chord data, and the UI turns that into something interactive instead of a raw text answer.
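As a rough sketch of that server-side step, here is what the prompt-building and Gemini call could look like. The function names, model name, and prompt wording are my assumptions for illustration, not the app's actual code; the request body follows the public Gemini REST `generateContent` shape.

```typescript
// System instructions ask the model for strict JSON so the UI can render it,
// not a chatty paragraph. Wording here is illustrative.
const SYSTEM_INSTRUCTIONS = [
  "You are a chord progression assistant.",
  "Reply with JSON only: { chords, bpm, scale, mode, moods, explanation }.",
].join("\n");

// Pure helper: turn the user's vibe into the full prompt sent to the model.
export function buildPrompt(vibe: string): string {
  return `${SYSTEM_INSTRUCTIONS}\n\nVibe: ${vibe.trim()}`;
}

// Server-side call (sketch). Uses the Gemini REST endpoint; the exact model
// name and SDK wrapper in the real app may differ.
export async function generateProgression(
  vibe: string,
  apiKey: string
): Promise<string> {
  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [{ parts: [{ text: buildPrompt(vibe) }] }],
      }),
    }
  );
  const data = await res.json();
  // The generated text lives in the first candidate's first part.
  return data.candidates[0].content.parts[0].text;
}
```

Keeping the prompt construction in a pure function makes it easy to test and to extend later for follow-up requests like "make it jazzier".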
That structure mattered a lot. I did not want the AI layer to feel vague or chatty. The app expects a clean response shape with the progression, BPM guidance, scale, mode, mood tags, and a short explanation, so the result feels like a usable music tool instead of a novelty prompt box.
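That contract can be sketched as a TypeScript interface plus a runtime guard. The field names below mirror this writeup but are my assumptions, not the app's actual schema:

```typescript
// Assumed response contract; field names are inferred from the description
// above, not taken from the real codebase.
export interface Progression {
  chords: string[];     // e.g. ["Am", "F", "C", "G"]
  bpm: number;          // suggested tempo
  scale: string;        // e.g. "A minor"
  mode: string;         // e.g. "aeolian"
  moods: string[];      // mood tags shown in the UI
  explanation: string;  // short harmonic rationale
}

// Runtime guard so the UI never renders a malformed model reply.
export function isProgression(value: unknown): value is Progression {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    Array.isArray(v.chords) &&
    v.chords.length >= 2 &&
    v.chords.every((c) => typeof c === "string") &&
    typeof v.bpm === "number" &&
    v.bpm >= 40 &&
    v.bpm <= 240 &&
    typeof v.scale === "string" &&
    typeof v.mode === "string" &&
    Array.isArray(v.moods) &&
    v.moods.every((m) => typeof m === "string") &&
    typeof v.explanation === "string"
  );
}
```

With a guard like this at the boundary, everything downstream (playback, visualizer, export) can assume well-formed data.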
Once a progression is generated, the frontend does the rest of the product work. Users can play the chords in the browser, adjust BPM and octave, see the harmony through the visualizer section, revisit earlier ideas through a session history panel, and export the progression as a MIDI file. I also added follow-up generation so the experience feels iterative. You can start with one vibe, then refine it with requests like "make it jazzier" or "add more tension" without throwing away the original idea.
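Both playback and MIDI export rest on the same piece of note math: turning a chord symbol into concrete pitches at a chosen octave. A minimal sketch, assuming a small chord vocabulary (major, minor, flat-seventh) far simpler than whatever the real app parses:

```typescript
// Semitone offset of each root name within an octave.
const PITCH: Record<string, number> = {
  C: 0, "C#": 1, Db: 1, D: 2, "D#": 3, Eb: 3, E: 4, F: 5,
  "F#": 6, Gb: 6, G: 7, "G#": 8, Ab: 8, A: 9, "A#": 10, Bb: 10, B: 11,
};

// Map a chord symbol like "Am" or "G7" to MIDI note numbers at a given
// octave, so the same data can drive browser playback and MIDI export.
export function chordToMidi(symbol: string, octave = 4): number[] {
  const m = symbol.match(/^([A-G][#b]?)(m?)(7?)$/);
  if (!m) throw new Error(`Unsupported chord: ${symbol}`);
  const [, rootName, minor, seventh] = m;
  const root = 12 * (octave + 1) + PITCH[rootName]; // C4 = MIDI 60
  const third = minor ? 3 : 4;                      // minor vs major third
  const notes = [root, root + third, root + 7];     // triad: root, third, fifth
  if (seventh) notes.push(root + 10);               // flat seventh (dom/min 7)
  return notes;
}
```

The octave parameter is what a UI control like the one described above could plug into: shifting it by one moves every note twelve semitones.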
I also spent time on the surface area around the core generation loop. The splash screen, sample prompts, dark and light themes, and fallback examples all help the app feel approachable even for people who are not deeply comfortable with theory yet. That was important to me because the whole point of the product is lowering the barrier between musical intent and musical output.
Challenges
Getting AI output into a shape the UI could trust was one of the biggest challenges. A music app cannot do much with a loose paragraph response. The generation layer had to be constrained enough that the app could reliably render chords, metadata, and explanations without breaking the experience whenever the model answered creatively.
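One concrete version of that problem: models often wrap JSON in markdown fences or surround it with prose. A hypothetical helper (not the app's real code) can extract the first JSON object and fall back to a bundled example progression when parsing fails, which is one way the fallback examples mentioned earlier could come into play:

```typescript
// Bundled fallback so the UI always has something playable to show.
const FALLBACK = {
  chords: ["Am", "F", "C", "G"],
  bpm: 90,
  scale: "A minor",
  mode: "aeolian",
  moods: ["dark", "warm"],
  explanation: "A familiar minor-key loop as a safe default.",
};

// Extract the first {...} block from the model's reply, tolerating
// ```json fences and extra prose, then sanity-check it before trusting it.
export function parseModelReply(text: string): typeof FALLBACK {
  const match = text.match(/\{[\s\S]*\}/);
  if (!match) return FALLBACK;
  try {
    const parsed = JSON.parse(match[0]);
    if (Array.isArray(parsed.chords) && typeof parsed.bpm === "number") {
      return parsed;
    }
  } catch {
    // malformed JSON: fall through to the fallback example
  }
  return FALLBACK;
}
```

The sanity check here is deliberately minimal; a fuller guard over every expected field is what actually keeps the rendering layer safe.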
Balancing creative flexibility with musical usefulness also took thought. If the model is too rigid, every result feels generic. If it is too open-ended, the output becomes inconsistent or impractical. The interesting part was finding a middle ground where prompts still feel expressive, but the returned progression stays coherent enough to play, vary, and export.
Making the product feel interactive instead of AI-dependent was another important challenge. The value is not just "AI gave you four chords." The value is the loop around that result: playback, the visualizer, fast iteration, visible theory context, session history, and MIDI export. Without those pieces, it would feel like a demo. With them, it starts to feel like a real tool.
What It Taught Me
This project taught me a lot about building a small AI product that still has to behave like real software. The model output is only one layer. What actually makes the experience useful is the contract around it, the product decisions, the UI responsiveness, and the tools that help the user do something concrete with the result.
It also pushed me to think carefully about how AI should fit into creative workflows. The best part of Vibe Chords is not automation for its own sake. It is that the app helps turn a vague idea into a playable starting point quickly, without making the user give up control. That balance between assistance and authorship is what made the project interesting to build.