"rerun-if-changed-env" does not trigger a rerun?

While cargo may not know these features are mutually exclusive, it is not invalid to use them that way.

Many on this forum would disagree with you, and say that this is universally wrong.

(My opinion is that, while this is not inherently wrong, there must be a single authority who decides whether to enable these features: either a binary crate or the end user. As soon as a library like rayon enables one of these features unconditionally, that library is in the wrong.)


As for the generic consts being noisy: I would group the other junk into a default type parameter.

struct Atomic<T, const N: usize, M = DefaultMemoryScheme> { ... }

pub trait MemoryScheme {
    const CACHE_SIZE: usize;
    const CHECK_THRESHOLD: u32;
    const ADVANCE_THRESHOLD: u32;
}

pub enum DefaultMemoryScheme {}

impl MemoryScheme for DefaultMemoryScheme {
    const CACHE_SIZE: usize = 7;
    const CHECK_THRESHOLD: u32 = 21;
    const ADVANCE_THRESHOLD: u32 = 42;
}

Rust applies type parameter defaults quite aggressively, which makes Atomic unergonomic for somebody who wants to use their own MemoryScheme. But we're already used to this problem with custom hashers for HashMap, so we already know the solution: the user can easily define a type alias.

type Atomic<T, const N: usize> = atomic_lib::Atomic<T, N, MyMemoryScheme>;

enum MyMemoryScheme {}

impl atomic_lib::MemoryScheme for MyMemoryScheme {
    const CACHE_SIZE: usize = 9;
    const CHECK_THRESHOLD: u32 = 99;
    const ADVANCE_THRESHOLD: u32 = 9999;
}

Thanks for the suggestion.
That does look like it may indeed be workable. The problem, however, is that concurrent memory reclamation/management mechanisms rely heavily on static and thread-local data structures, and since Rust does not support generic statics either, I can't introduce these parameters that way. I ran into the same issue when I contemplated how to integrate different memory allocators with my management schemes generically, which I eventually had to give up on.

So, behind the various interface types, I would need something like:

struct ThreadLocal<P: Parameters> { ... }

But without generic statics, such a type cannot be an actual thread-local variable. For the counter thresholds I suppose I could make only the type's methods generic, since only the implementation has to know them, but for the (array) sizes of internal buffers or caches that is again not possible.

I suppose for now I will have to abandon the idea of using compile-time parameters in this fashion, although I might try parametrizing the methods of thread-local variables.
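
For reference, a minimal sketch of that methods-only parametrization might look like the following; ReclaimState, its field, and the reduced MemoryScheme trait are hypothetical illustrations, not from an actual crate:

use std::cell::Cell;

// Hypothetical reduced version of the MemoryScheme trait above, keeping only
// the runtime threshold that the implementation needs to see.
pub trait MemoryScheme {
    const CHECK_THRESHOLD: u32;
}

// The thread-local value itself is not generic (Rust has no generic statics),
// so its layout -- e.g. any internal buffer sizes -- is fixed here.
struct ReclaimState {
    ops_since_check: Cell<u32>,
}

thread_local! {
    static RECLAIM_STATE: ReclaimState = ReclaimState {
        ops_since_check: Cell::new(0),
    };
}

impl ReclaimState {
    // Individual methods can still be generic over the scheme, so the
    // threshold constant never has to appear in the thread-local's type.
    fn should_check<S: MemoryScheme>(&self) -> bool {
        let n = self.ops_since_check.get() + 1;
        if n >= S::CHECK_THRESHOLD {
            self.ops_since_check.set(0);
            true
        } else {
            self.ops_since_check.set(n);
            false
        }
    }
}

// Hypothetical entry point; the caller picks the scheme.
pub fn on_operation<S: MemoryScheme>() -> bool {
    RECLAIM_STATE.with(|state| state.should_check::<S>())
}

As noted above, this only helps for the threshold-style parameters; anything that determines the size of the thread-local's internal buffers still has to be fixed in the type itself.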

Do your parameters have some typical defaults? Maybe you could use Cargo features as presets? e.g. "small", "medium", "large"?

This is what I had initially, but I felt it was also lacking, since the range of valid values could not be expressed with only a few pre-selected sets. Interestingly, using features in this manner also makes them implicitly mutually exclusive.

In the end I went back to using a large set of features for type-size-related parameters ("cache-size-1", "cache-size-2", "cache-size-4", etc.) and runtime configuration for all runtime parameters. I now use these features such that, if multiple sizes are selected (e.g. by different unrelated crates in the same dependency graph), the library itself chooses the smallest value.
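
For illustration, a minimal sketch of that "smallest selected size wins" resolution; the feature names follow the ones above, but the concrete values and the fallback are made up:

// Exactly one of these definitions is active, whichever combination of the
// (additive) cache-size features ends up enabled in the dependency graph.
#[cfg(feature = "cache-size-1")]
pub const CACHE_SIZE: usize = 1;

#[cfg(all(feature = "cache-size-2", not(feature = "cache-size-1")))]
pub const CACHE_SIZE: usize = 2;

#[cfg(all(
    feature = "cache-size-4",
    not(any(feature = "cache-size-1", feature = "cache-size-2"))
))]
pub const CACHE_SIZE: usize = 4;

// Hypothetical fallback if no size feature is enabled at all.
#[cfg(not(any(
    feature = "cache-size-1",
    feature = "cache-size-2",
    feature = "cache-size-4"
)))]
pub const CACHE_SIZE: usize = 8;

Since feature unification only ever adds features, this keeps every combination of crates compilable while still honoring the smallest requested size.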

This way, it would also be possible to solve the issue of defining numeric compile-time parameters through cargo in general: just require a selection rule (like min/max), or alternatively a fallback or default option, in case multiple conflicting values are selected somehow.

