What Sonauto's users are actually doing (and what that means for the product)

Mimir · February 23, 2026 · 3 min read

The Playlist Problem You Didn't Expect

When you think about an AI music platform, you probably imagine people typing prompts and hitting generate over and over. And sure, that's happening on Sonauto. But what's really interesting is what happens after the song is created.

Users are organizing their generated tracks into dozens of themed playlists with genuinely creative names. We're talking 40+ playlists in the trending section alone, with collections like "Soul Sisters 1, 2, 3" and collaborative shout-outs like "For Floofyc4t, and Slaykage." People aren't just making music — they're curating it, naming it, arranging it into narratives. This curation behavior is showing up as a primary retention signal, right alongside the core creation workflow.

The platform clearly understands this matters, since playlists are featured prominently in discovery. But here's where it gets tricky: empty playlists are showing up in that same trending section. When someone browses for inspiration and clicks into a placeholder with no content, it breaks the discovery loop at exactly the wrong moment. A simple quality filter that hides empty collections would protect the trust users place in that trending algorithm — especially important when you're building for a global audience across 90+ languages.
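That filter is simple to sketch. Assuming a playlist record exposes a track count (the field and type names here are hypothetical, not Sonauto's actual schema), hiding empty collections from trending is a one-line predicate:

```python
from dataclasses import dataclass

@dataclass
class Playlist:
    name: str
    track_count: int  # hypothetical field; the real schema may differ

def trending_candidates(playlists: list[Playlist]) -> list[Playlist]:
    """Keep only playlists with at least one track in the trending feed."""
    return [p for p in playlists if p.track_count > 0]

playlists = [
    Playlist("Soul Sisters 1", 12),
    Playlist("Placeholder", 0),
    Playlist("For Floofyc4t", 5),
]
visible = trending_candidates(playlists)
# The empty "Placeholder" collection never reaches the trending section
```

The point isn't the code, it's the product decision: an empty collection should fail a quality gate before it competes for a discovery slot.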

The Iteration Story Hidden in the Data

Sonauto users aren't one-and-done creators. They're iterating. A lot. You see it in the naming patterns: "Painted pitch black ex1, ex2" or numbered series that span multiple attempts. What's especially telling is that second songs from the same user often outperform first attempts — we saw examples with 46 likes on iteration two versus 18 on the original.

This makes sense. People are learning what works through trial and error, refining prompts, switching between model versions (v2.2 and v3-preview are both actively used), and building toward better results. But right now, the platform doesn't surface that learning process. There's no version history showing how a song evolved, no easy way to compare what changed between v3-preview and v2.2, no timeline showing which prompt led to which output.

A version history feature with side-by-side playback comparison would make that refinement workflow explicit. Users could see their own progression, understand what each change actually did, and iterate faster. Given that your most engaged users are the ones generating 7+ songs across genres within tight timeframes, supporting that experimentation loop more directly would reinforce exactly the behavior that's already driving retention.

From Solo Curation to Social Creation

Here's what struck me: users are showing all the social signals — 46+ likes on songs, comment threads, creative playlist dedications to other users — but the interaction only flows one way. Nobody can actually co-create a playlist. The person who makes it is the only one who can add to it.

Given how much energy users are already putting into playlist curation, collaborative editing feels like the natural next step. Let multiple people contribute to a shared collection with role-based permissions. Suddenly that "Soul Sisters" series becomes a living project multiple creators can build together. Those dedication playlists turn into actual collaborative spaces rather than one-way tributes.
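Role-based permissions can stay simple here. A minimal sketch, assuming three roles and a handful of actions (all names hypothetical), is just a lookup table:

```python
from enum import Enum

class Role(Enum):
    OWNER = "owner"
    EDITOR = "editor"
    VIEWER = "viewer"

# Hypothetical policy: which actions each role may perform on a shared playlist
PERMISSIONS = {
    Role.OWNER: {"add_track", "remove_track", "invite", "delete_playlist"},
    Role.EDITOR: {"add_track", "remove_track"},
    Role.VIEWER: set(),
}

def can(role: Role, action: str) -> bool:
    """Check whether a collaborator's role permits the given action."""
    return action in PERMISSIONS[role]

# An invited editor can add songs but can't delete the whole collection
assert can(Role.EDITOR, "add_track")
assert not can(Role.EDITOR, "delete_playlist")
```

A flat table like this is enough for shared playlists; the design question is less about the mechanism and more about defaults — whether a dedication playlist invites its dedicatee as an editor automatically.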

With a user base spanning 90+ languages and genres, collaborative playlists could transform Sonauto's international reach into genuine network effects. Instead of individual curators working in parallel, you'd have cross-geography, cross-genre collections that draw from the platform's full creative diversity.

Putting It Together

What makes Sonauto interesting isn't just the AI music generation — it's that users have found ways to build creative identity around what they generate. The curation, iteration, and social behaviors are already there. The opportunity is to build features that make those workflows more explicit, more collaborative, and more reliable.

We used Mimir to pull this analysis together from 15 public sources, looking at what users were actually creating and how they were organizing their work. Sometimes the most interesting product insights come from watching what people do after they get what they came for.

Ready to make evidence-based product decisions?

Paste customer feedback into Mimir and get ranked recommendations in 60 seconds.

Try Mimir free