is rap.
Rap music made me cry, sing, and dance, and it is also how I learned the English language. I still remember the first time I sat down and tried to understand Biggie and Tupac. I remember learning about Brooklyn, about California, the slang, the struggles, and all the other powerful messages behind those songs.
Rap music is the best example of how lyrics alone can make music a piece of art. Some of the best hip-hop songs are literally a sample, some drums, and a person reciting poetry over a 4/4 beat.
And while reading the lyrics might not even be necessary for someone who already knows the language, for non-English speakers like myself, tools like Genius made life much easier. Genius is an online collection of song lyrics: if you’re looking for the lyrics of a song, Genius is your guy. Thanks to Genius, even if I didn’t understand what Biggie was saying, I could sit down and read the lyrics, then Google them and translate them. Not only that: when a rapper (or a singer in general) makes a reference that is too hard to understand, Genius clarifies it for you through sidebar snippets.

But how does Genius do that? How do they produce such insightful, up-to-date, useful snippets?
Well, back in 2009, when Genius was born, I would say these kinds of snippets were mainly produced manually: users would add their comments, maybe a moderator would review some of them, and that’s it, closed deal.
However, today, with the powerful AI technologies we have, the pipeline can be made much smoother and more efficient. While I do not believe an agentic AI could do the job of a music expert (for many reasons), I believe a person with that domain knowledge could be helped by an agentic AI, which would provide them with the right tools to create the snippets.
And that is what we are doing today. 🙂
We will use Streamlit, Python, and OpenAI to build a super simple web app that, given a song’s lyrics, clarifies what a selected piece of text means. More specifically, we are going to allow the user to ask questions about that piece of text, making the Genius idea more “interactive”. We are also going to give our AI Agent access to web search results, so the LLM can look at other songs and resources when crafting the answer.
To spice things up (and for copyright purposes lol), we are going to also create our own songs, using another AI agent.
Exciting! Let’s get started! 🚀
If you are interested in the final result of this experiment, skip to the third section. If you want to craft the magic with me, start from the next section. No judgments. 🙂
1. System Design
Let’s first design our system. This is how it looks:

More specifically:
- The User has the ability to create the song from scratch using an AI Agent. That’s optional; a batch of songs has already been generated and can be used instead.
- The User can select a part of the text and ask a question.
- The AI Agent can generate the response in a “Genius-like” style.
The AI Agent is also equipped with:
a. The “Internal Song Knowledge”, which consists of the extracted features/metadata of the song (e.g., vibe, title, theme, etc.).
b. The Web Search Tool, which allows the agent to surf the web to look for songs and add context to the question.
This design has good modularity, meaning it is easy to add bits and pieces to increase the complexity of the system. For example, if you want to make the song generation more sophisticated, you can easily tweak the Song Generation Agent without going crazy over the other parts of the code.
Let’s build this piece by piece. 🧱
2. Code
2.1 Setup
The whole code can be found in this GitHub folder.
The structure of our work will be as follows:
- A lyrics generator, which is generate_madeup_lyrics.py
- The lyrics Q&A logic, which is qa.py
- The web app itself (the file we will run through Streamlit), which is lyricsgpt_app.py
- A bunch of helpers (such as utils.py, constants.py, config.py, etc.)
The data will be stored in a data folder as well.
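Roughly, the folder looks like this (the exact layout in the repo may differ slightly):

```
.
├── lyricsgpt_app.py           # the Streamlit web app
├── generate_madeup_lyrics.py  # the (optional) lyrics generator
├── qa.py                      # the lyrics question-answering logic
├── utils.py
├── constants.py
├── config.py
└── data/
    └── generated_lyrics.json  # pre-generated songs
```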

I don’t want to bore you with all the tiny details, so I will only describe the main components of this structure. Let’s start with the core: the Streamlit app.
2.2 Streamlit App
Please note: you will need to have an OpenAI API key ready, for both the Streamlit app and anything else that requires LLM generation. Outside of the Streamlit app, the easiest way is to set it as an environment variable: os.environ["OPENAI_API_KEY"] = "api_key". Within the app, you will be prompted to copy-paste it. Don’t worry, it’s all local.
The whole thing runs with the following command:
streamlit run lyricsgpt_app.py
Where lyricsgpt_app.py is the following block of code:
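(The version below is a condensed sketch rather than the exact file from the repo; in particular, the helpers imported from utils.py and qa.py are illustrative names.)

```python
import streamlit as st

# Illustrative helper names; the real ones live in the lyricsgpt module.
from utils import load_songs        # hypothetical: reads data/generated_lyrics.json
from qa import answer_question      # hypothetical: wraps the OpenAI call

st.title("LyricsGPT 🎤 — a Genius-like lyrics explainer")

# 1. OpenAI API key (kept local, used only for this session)
api_key = st.text_input("Paste your OpenAI API key", type="password")

# 2. Song selector: pick one of the songs stored in the data folder
songs = load_songs("data/generated_lyrics.json")
song_title = st.selectbox("Pick a song", list(songs.keys()))
st.text_area("Full lyrics", songs[song_title]["lyrics"], height=250)

# 3. The passage of interest, copy-pasted by the user
passage = st.text_area("Paste the part of the lyrics you care about")

# 4. The question about that passage
question = st.text_input("What do you want to know about it?")

# 5. The answer box, filled in by the LLM
if st.button("Explain it!") and api_key and passage and question:
    answer = answer_question(
        api_key=api_key,
        song=songs[song_title],   # lyrics + metadata (title, vibe, theme, ...)
        passage=passage,
        question=question,
    )
    st.markdown(answer)
```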
The full file is quite long, but pretty straightforward: each block of code represents a piece of the web app:
- The title of the web app
- The song selector, which allows the user to select the lyrics from the data folder (more about this later)
- The box for the block of text, where the user can copy and paste the part of the lyrics of interest
- The question box, where the user can ask a question about the part of the lyrics of interest selected above
- The answer box, where the LLM can answer the question.
But this is “only” the executor; the dirty work is done by the lyricsgpt module and its objects/functions. Let’s see a few!
2.3 Lyrics Generator
This part is optional and not included in the web app, which focuses on the Genius-like Lyrics Explainer. Feel free to skip this section if the explainer is all you are interested in. But I have to say it: this part is pretty cool.
For the lyrics generation, the game is simple:
- You give me a title, vibe, theme, and some hidden message in a twist.
- I give you the lyrics.
For example:
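(An illustrative reconstruction of the kind of input, based on the output below; the real prompts live in SONG_PROMPTS.)

```
Title: Glitter in the Rearview
Vibe:  bittersweet synth-pop
Theme: driving away from a relationship that looked perfect from the outside
Twist: the narrator is the one who chose to leave
```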
Produces:
Verse 1
Underneath the city lights, whispers danced along the skyline, we were shining through the night, like diamonds in a midnight sky. But love was just a high-rise dream, built on fragile seams, now I’m pulling off the freeway, leaving echoes in between.
Chorus
And it’s all glitter in the rearview, fading into shades of blue, every laugh and every tear, drifting out of view. As the road unrolls ahead, I let the memories stew, it’s just glitter in the rearview, letting go of me and you.
[Some more LLM Generated Text]
Outro
So I drive into the silence, where the past can’t misconstrue, leaving glitter in the rearview, and the shards of me and you. It’s just glitter in the rearview, a love story gone too soon.
Pretty cool, right? The code to do that is the following:
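(A minimal sketch of the idea rather than the exact generate_madeup_lyrics.py from the repo; the model name and prompt wording here are illustrative.)

```python
import json
from openai import OpenAI

from constants import SONG_PROMPTS  # the list of song requests shown below

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment


def generate_lyrics(song: dict, model: str = "gpt-4o-mini") -> str:
    """Turn a title/vibe/theme/twist request into full lyrics."""
    prompt = (
        f"Write the full lyrics of a song titled '{song['title']}'.\n"
        f"Vibe: {song['vibe']}. Theme: {song['theme']}.\n"
        f"Hide this twist in the story: {song['twist']}.\n"
        "Structure it with Verse / Chorus / Bridge / Outro sections."
    )
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    generated = [{**song, "lyrics": generate_lyrics(song)} for song in SONG_PROMPTS]
    with open("data/generated_lyrics.json", "w") as f:
        json.dump(generated, f, indent=2)
```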
You can play with it by modifying SONG_PROMPTS, which looks like this:
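Something along these lines (again, an illustrative sketch; the real entries are in the repo):

```python
SONG_PROMPTS = [
    {
        "title": "Glitter in the Rearview",
        "vibe": "bittersweet synth-pop",
        "theme": "driving away from a relationship that looked perfect from the outside",
        "twist": "the narrator is the one who chose to leave",
    },
    # ...add your own entries here and rerun generate_madeup_lyrics.py
]
```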
Every time you generate a song, it goes into a JSON file; by default, that is data/generated_lyrics.json. You don’t necessarily have to generate a song: there are already some examples I made in there.
2.4 Lyrics Explainer
The coolest thing about this whole Agentic era is how much time you save when building this stuff. The whole question-answering logic, plus the AI’s ability to use online web search, fits in this block of code:
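(A trimmed-down sketch of the kind of logic that lives in qa.py; the function name and arguments match the app sketch above, but the real ones may differ.)

```python
from openai import OpenAI

SYSTEM_PROMPT = (
    "You are a Genius-style lyrics annotator. Use the song metadata and, "
    "if provided, the web search results to explain the selected passage."
)


def answer_question(
    api_key: str,
    song: dict,
    passage: str,
    question: str,
    web_results: str | None = None,
    model: str = "gpt-4o-mini",
) -> str:
    """Answer a question about a lyrics passage, grounded in the song's metadata."""
    client = OpenAI(api_key=api_key)

    # Internal Song Knowledge: the metadata stored alongside the lyrics
    context = (
        f"Song title: {song['title']}\n"
        f"Vibe: {song.get('vibe', 'unknown')}\n"
        f"Theme: {song.get('theme', 'unknown')}\n"
        f"Full lyrics:\n{song['lyrics']}\n"
    )
    # Web search results fetched separately and parsed directly into the prompt
    if web_results:
        context += f"\nWeb search results:\n{web_results}\n"

    user_prompt = (
        f"{context}\n"
        f"Selected passage:\n{passage}\n\n"
        f"Question: {question}"
    )
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_prompt},
        ],
    )
    return response.choices[0].message.content
```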
This does it all: it answers the question, reads the lyrics metadata, and integrates information from the web if prompted/needed.
I didn’t want to get too fancy, but you can actually equip the agent with the web_search tool. In this case, I’m parsing the information directly; if you give the LLM the tool, it can decide when and whether to search online.
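For reference, letting the model decide on its own looks roughly like this with OpenAI’s Responses API and its built-in web search tool (the tool type name and model support change over time, so check the current docs):

```python
from openai import OpenAI

client = OpenAI()

# The model decides if and when to search the web while answering.
response = client.responses.create(
    model="gpt-4o",
    tools=[{"type": "web_search_preview"}],  # OpenAI's built-in web search tool
    input="Which other famous songs use a rearview mirror as a breakup metaphor?",
)
print(response.output_text)
```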
Ok, but does this work? What do the results look like? Let’s find out!
3. The magic!
This is an example of the web app.
1. Copy-paste your OpenAI API key and select a song. Say “Glitter in the Rearview”.


2. Select the part of interest. For example, let’s say I’m an Italian guy (which I am, lol) who uses meters, so I don’t know how far a mile is. I would also like to know whether there is a reference to something specific when the singer says 13 miles (“Thirteen miles to freedom” is the first line of the bridge).

3. See the magic!

Let’s try something harder. Here, I pasted the whole second verse and asked the AI which singers would write something like this.

The AI points out Taylor Swift and Adele. The Taylor Swift reference in particular is extremely accurate, as songs about breakups and love stories are among her greatest hits. She also sings about her popularity and how her life is affected by it in songs like “I Know Places”:
Lights flash and we’ll run for the fences, Let them say what they want, we won’t hear it
Taylor Swift – “I Know Places”
I will admit I had to Google that.
4. Some thinking…
Now, this is far from perfect: it is a plug-and-play weekend project, less than an MVP. However, it offers three broad takeaways:
- When provided with the right tools and metadata, the LLM really shines and provides insightful information (like the Taylor Swift suggestion)
- Building an LLM wrapper is way easier than it was even 5 months ago. The way this technology is evolving allows you to be more productive than ever.
- Agentic AI can really be applied everywhere. Rather than getting scared about it, it’s best to embrace it and see what we can do with it.
5. Conclusions
Thank you for spending time with me; it means a lot ❤️. Here’s what we have done together:
- Designed a Genius-inspired system powered by Agentic AI that can explain song lyrics interactively.
- Developed the backend components in Python and Streamlit, from the lyrics generator to the Q&A engine.
- Built the AI agent with internal song knowledge and a web search tool for contextual answers.
- Built an app that can interpret lyrics intelligently.
- Had some fun while doing that, I hope. 🙂
I want to close with my 2 cents. On my daily commute, I listen to podcasts (usually interviews) with artists who make music, where they explain their songs, the references, and the lyrics. If someone replaces those musicians with an AI, I am going to RIOT. Not (only) because I like the people themselves, but because I believe they explain things with a passion, a depth, and an empathy that LLMs are not able to provide (and I don’t think they ever will).
However, if we provide music critics with these kinds of AI tools, their work becomes much easier, and they can 10x their productivity.
6. Before you head out!
Thank you again for your time. It means a lot ❤️
My name is Piero Paialunga, and I’m this guy here:

I’m originally from Italy, hold a Ph.D. from the University of Cincinnati, and work as a Data Scientist at The Trade Desk in New York City. I write about AI, Machine Learning, and the evolving role of data scientists both here on TDS and on LinkedIn. If you liked the article and want to know more about machine learning and follow my studies, you can:
A. Follow me on Linkedin, where I publish all my stories
B. Follow me on GitHub, where you can see all my code
C. For questions, you can send me an email at piero.paialunga@hotmail



