Feedback on bliss public API


I have mostly finished this project (GitHub - Polochon-street/bliss-rs: A song analysis library for making playlists), and was wondering - since the aim is to have people use the public API - how easy is it to understand from an outside point of view? And how good is it in terms of standards / usability?

The project is basically a tool to compute the distance between songs in order to make playlists.
The public API doc (which should be enough for people to start using it) is here

Since it's also useful as a plug-in for existing audio players, I've made a trait (Library in bliss_audio - Rust) that is supposed to make implementations for external audio players easier (see GitHub - Polochon-street/blissify-rs for a reference implementation).
How does that look as well?

I could really use some reviews to make sure everything makes sense :slight_smile: - or to change stuff if it doesn't :grinning_face_with_smiling_eyes:



This looks very cool. Do I roughly understand what is going on under the hood:

  1. You point the CLI tool at a few audio files
  2. The aubio library (C) does the heavy lifting of Fourier analysis -> feature detection -> dense vector representation
  3. You can generate a playlist using (cosine?) distance?

how easy is it to understand from an outside point of view? And how good is it in terms of standards / usability?

It looks simple enough; all the items have a description and most of them have a doc example :+1:. You do appear to have a few feature flags that aren't documented, though - what do those do?

Also a :+1: for the clean examples on the front page of the API docs.

I do have a few nits about the code, if that's okay:

Your Song struct has a few String fields with the comment ("" if empty). Using a value like "" as a sentinel for a missing value is a brittle design. Option<String> would be better and conveys clearly that the value can be missing.
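To illustrate the point, here's a minimal sketch with a hypothetical, simplified Song-like struct (SongTags is invented for the example; the real Song struct has more fields): with Option<String>, callers must explicitly decide what to do when a tag is missing, instead of silently passing "" around.

```rust
// Hypothetical, stripped-down stand-in for the crate's Song struct.
struct SongTags {
    title: Option<String>,
    artist: Option<String>,
}

fn main() {
    let song = SongTags {
        title: Some("Some Title".to_string()),
        artist: None, // clearly "missing"; no "" sentinel needed
    };
    // The type forces callers to handle the missing case:
    let artist = song.artist.as_deref().unwrap_or("<unknown artist>");
    let title = song.title.as_deref().unwrap_or("<untitled>");
    println!("{} - {}", artist, title);
}
```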

Crates that define their own errors usually also define a type alias like type BlissResult<T> = Result<T, BlissError> which can be used in function signatures etc.
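A sketch of the pattern, using a made-up single-variant BlissError for brevity (the real error type presumably has more variants):

```rust
use std::fmt;

// Hypothetical stand-in for the crate's error type.
#[derive(Debug)]
pub enum BlissError {
    DecodingError(String),
}

impl fmt::Display for BlissError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            BlissError::DecodingError(s) => write!(f, "decoding error: {}", s),
        }
    }
}

impl std::error::Error for BlissError {}

// The alias keeps function signatures short throughout the crate:
pub type BlissResult<T> = Result<T, BlissError>;

fn analyze(path: &str) -> BlissResult<f32> {
    if path.is_empty() {
        return Err(BlissError::DecodingError("empty path".to_string()));
    }
    Ok(0.5)
}

fn main() {
    assert!(analyze("song.flac").is_ok());
    assert!(analyze("").is_err());
}
```

This mirrors what std does with std::io::Result<T> = Result<T, std::io::Error>.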

Instead of Song::new(path: &str) -> Result<Self, BlissError> taking a &str, take an AsRef<Path>.
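The generic bound looks like this (a sketch with a hypothetical minimal Song; the real constructor also does the analysis and returns a Result): accepting impl AsRef<Path> lets callers pass &str, String, &Path, or PathBuf with the same constructor.

```rust
use std::path::{Path, PathBuf};

// Hypothetical minimal Song for demonstration purposes.
struct Song {
    path: PathBuf,
}

impl Song {
    // P can be &str, String, &Path, PathBuf, &OsStr, ...
    fn new<P: AsRef<Path>>(path: P) -> Song {
        Song {
            path: path.as_ref().to_path_buf(),
        }
    }
}

fn main() {
    // All of these compile against the same signature:
    let a = Song::new("song.flac");
    let b = Song::new(String::from("song.flac"));
    let c = Song::new(Path::new("song.flac"));
    assert_eq!(a.path, b.path);
    assert_eq!(b.path, c.path);
}
```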

Thanks for the reviews! :slight_smile:

@fabio: Steps 2 and 3 are perfectly correct (though the chroma features are computed manually through a rehash of librosa / research papers, but that's a detail :stuck_out_tongue:). Though where did you read about the cosine distance? We used it for bliss v1 in C, but I don't think I mention it anywhere here (because it doesn't work so well).

Step 1 is a bit more tricky, because the two CLI tools are just small demos that are not that useful in the real world - they analyze songs, but then you would need to store the results somewhere to make the playlists. In the real world, I think someone would have to either code or use a plugin for an audio player (or just make text-file playlists).
What made you think the CLI tools should be used? I should probably correct some documentation there :slight_smile:

@mejrs, thank you for your very valid points. I'll replace the String fields with Options and make a BlissResult type :muscle:
I'm less sure about the Path though - I remember talking with some people who found using the whole Path / PathBuf thing rather cumbersome for simple cases such as this one (since if it's not a valid path, ffmpeg will just throw an Err directly). I've tried experimenting with it, and the compiler also doesn't really like AsRef<Path> directly, so I'm not sure - but that's probably just me not knowing how to handle them :grinning_face_with_smiling_eyes:


De/Serialization is something you may want to consider.

Here's a sketch of one way.
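(The linked sketch isn't reproduced here. A minimal, dependency-free version of the idea might look like the following - the struct name, the line format, and the field choice are all assumptions for illustration; in practice serde with #[derive(Serialize, Deserialize)] plus serde_json would handle both directions for you, and for arbitrary paths you'd want proper escaping rather than a tab separator.)

```rust
// Hypothetical record: a song path plus its analysis vector.
struct AnalyzedSong {
    path: String,
    analysis: Vec<f32>,
}

impl AnalyzedSong {
    // Serialize to one tab-separated line, e.g. for an append-only store.
    // (Assumes the path itself contains no tab characters.)
    fn to_line(&self) -> String {
        let values: Vec<String> = self.analysis.iter().map(f32::to_string).collect();
        format!("{}\t{}", self.path, values.join(","))
    }

    // Parse one line back; returns None on malformed input.
    fn from_line(line: &str) -> Option<AnalyzedSong> {
        let (path, rest) = line.split_once('\t')?;
        let analysis = rest
            .split(',')
            .map(str::parse)
            .collect::<Result<Vec<f32>, _>>()
            .ok()?;
        Some(AnalyzedSong {
            path: path.to_string(),
            analysis,
        })
    }
}

fn main() {
    let song = AnalyzedSong {
        path: "a.flac".into(),
        analysis: vec![0.5, -1.0],
    };
    let line = song.to_line();
    let back = AnalyzedSong::from_line(&line).unwrap();
    assert_eq!(back.path, "a.flac");
    assert_eq!(back.analysis, vec![0.5, -1.0]);
}
```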

Are you familiar with AcousticBrainz / Essentia?

Nice! It is really hard to implement that much number-crunching completely from scratch, but it is great when you can start chipping away at some old code and start implementing parts in Rust.

What made you think the CLI tools should be used?

These two lines from one of the examples in the GitHub repo - I thought they looked like a Linux shell script:

    let first_path = std::env::args().nth(1).ok_or("Help: ./distance <song1> <song2>")?;
    let second_path = std::env::args().nth(2).ok_or("Help: ./distance <song1> <song2>")?;

Though where did you read about the cosine distance?

I didn't. I have done work with Word2Vec and Collaborative Filtering in the past where cosine distance is the most common similarity measure, so I naively assumed this was basically Song2Vec!

Thanks again for the feedback.

I've implemented most of the changes here (Some review comments by Polochon-street · Pull Request #13 · Polochon-street/bliss-rs · GitHub) - I'm still unsure about using PathBuf. Adding Path::new in the tests was not so much of a problem, but the fact that it doesn't implement the Display trait, so you have to use the debug formatter to print Song.path, grinds my gears a bit.


De/Serialization is something you may want to consider.


These two lines from one of the examples in the GitHub repo - I thought they looked like a Linux shell script

I think this is something I need to address, but I'm not sure how. This is a lib where people have to get their hands dirty to integrate it into an existing setup (after all, how is the lib supposed to know which songs should be analyzed, and where that analysis should be stored?), but at the same time, it would be nice to ease these steps as much as possible.

I made the Library trait for that reason, but I'm starting to think the current implementation (blissify-rs/ at master · Polochon-street/blissify-rs · GitHub) might be a bit overkill.
It's using SQLite to store songs, but I could actually just store the analyzed songs in a serialized text file (or whatever Serde allows me to do).

@quinedot, you seem to have some ideas about that - what do you think would be most judicious? :grinning_face_with_smiling_eyes:

You can use Path::display to get a Display-able thing from a path. Still a bit of an inconvenience, but nicer than Debug.
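Concretely, that looks like this (the path string is just an example):

```rust
use std::path::Path;

fn main() {
    let path = Path::new("/music/song.flac");
    // Path::display returns a wrapper implementing Display,
    // so no {:?} (Debug) formatting is needed:
    let shown = format!("song at {}", path.display());
    assert_eq!(shown, "song at /music/song.flac");
    println!("{}", shown);
}
```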


I reach for JSON by default these days. It's common and usually computer friendly enough, human friendly enough, and robust enough to do the job.

If you used MusicBrainz IDs or some other universal song identifier, you could have some sort of store that's independent of the file path (but I also agree that's probably better suited to its own library, or paired with a larger song-management system or something). The same goes for storing the data in the files themselves, for containers that support that.


I'll use JSON for blissify, then. It seems way better than SQLite anyway :grinning_face_with_smiling_eyes:

I've used Essentia to compare their values to mine for well-known descriptors (it's written somewhere in the code, I think), but that's mostly it - I admit implementing MusicBrainz IDs is on the todo list, but rather far down.

Thanks a lot for your (and everyone's) feedback, I think it's looking better now :slight_smile:

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.