Design patterns for composability with traits (i.e. typeclasses)?

You have to prove you implement all the required traits when you insert the type into the runtime-polymorphic container, so that is when the validation that you provide all the dictionaries takes place, before the types are erased. What we do is gather all the traits required of the elements in the collection from all the different modules (as pub traits are part of the module API, this information is already there), and use that final trait list to check that the types implement all the required traits (i.e. Foo and Shape).

We don't care about DLLs because they only support 'C' types, so we have to import DLLs as foreign 'C' functions anyway. It would also be unsafe to add new requirements to a collection that already has elements in it.

I think you forgot the meaning of the Halting Problem.

There's no problem with the Halting Problem here; why would you think there is?

Think again. :wink:

You will have to explain, because I think you must be missing something. I have already explained how the implementation would work... there's no recursion, because there are only two phases we need to care about, static polymorphism and runtime polymorphism; there is no infinite regression of polymorphisms...

Or you will have to think on it. :wink:

Hint: how do you differentiate between different execution paths in terms of what the collection will need as trait bounds at any point in the code? This is why I said you'd need to turn the linker into a compiler... you'll end up needing to put the conjunction of every trait in the program into every collection. Absolute gridlock. Because the dynamic runtime execution is undecidable.

Btw, I had your idea a long time ago and dismissed it.

Hint: how do you differentiate between different execution paths in terms of what the collection will need as trait bounds at any point in the code?

You don't need to, as "extend" would be a top-level primitive like "trait" and "impl".

Of course not; you would only put in the traits that are explicitly extended by the programmer.

It's simple really: for the collection to be proved sound statically, it must provide a dictionary for every trait that you will use on that collection. If any element is lacking one of the dictionaries, then it is unsound.

And you think I don't know that. Amazing.

Try again to read what I wrote in the prior post.

Your proposal has no way of differentiating which execution paths require which dictionaries; thus you will need to put every dictionary that could possibly be required by any execution path into every collection that shares possible execution paths.

Execution paths are not known until runtime, so you cannot use that information statically. You have to include all dictionaries that might be needed at runtime, which means all the dictionaries used by any possible path through the code.

How do you send your collection over the internet?

A union of data types that I can send, which can be perfectly checked statically at compile time by both ends of independently compiled interoperation. Your proposal requires knowing the universe of uses at compile time.

As I said far upthread, you'd kill independent compilation and modularity.

Another problem is that your proposal doesn't allow the creator of the collection to be independent of the implementations of the data types that the consumer of the collection will independently provide. You force the creator of the collection to know all the implementations and all the traits that will ever be expected on the collection. That is gridlock. You break the entire point of the independence of trait and impl. You invert the control over typing from bottom-up (decentralized) to top-down (centralized). The horror.

My proposal is bottom-up typing, yet still statically typed. Every module (even a dynamically linked one, such as one loaded over the internet) must inject its trait impl dictionaries into its global hashtable (but not a global hashtable shared by all communicating modules).

So in the Box for the data type, we store its unique id (a cryptographic hash); then we can use this, along with the trait's cryptographic hash, to look up the impl dictionary. Yeah, it is slower; that is the price of solving the Expression Problem. For most use cases, afaics it is entirely worth it. We obviously won't use it when we need absolute performance in the critical sections of a program.

You serialise at one end, and the other end has to be compiled to expect exactly the same set of dictionaries. Really, this is not how network protocols work, for many good reasons.

This sounds really wacky... of course you cannot do this; you have to send type information. The whole point of static checking is that it happens at compile time. To me it sounds like you are looking for a dynamically typed language.

Nope, the trait information needed is part of the module API.

Not true: the user of the collection can extend it with whatever traits they want. The key is that the compilation of the collection's constraints is deferred.

This does not sound like static type checking; it does not even sound like compilation. It sounds like an interpreter. Further, it sounds like, to cope with the internet, this global hash table would truly have to be 'global', i.e. sitting on a big server somewhere, shared by all users of the language?

Yeah, it's not a compiler; it's doing method lookup in a hash. That's an interpreter.

You've not understood anything. I think I've lost my desire to explain to you, because you aren't even really trying to learn.

The discussion is getting redundant. You are repeating the same misunderstandings. How else could I possibly explain it to you?

You must really think I am dumb. Have you ever entertained the idea that you are the one thinking about it incorrectly? I spent an entire thread talking about static typing, and just because you can't grasp the concept, you conclude that I am talking about dynamic typing. Geez.

Of course both sides of the communication have the declarations of the data types and APIs. Why would you think otherwise?

Well if I didn't think you had something interesting to say, I wouldn't be spending my valuable time discussing it would I :slight_smile:

How do you ensure both ends have the same definitions of the types?

How does a compiler ensure it has declarations? Oh come on now. Without a declaration, it can't compile anything.

Whether we exchange data on the stack or over an internet connection doesn't change the fact that the statically compiled types determine what that data is. This is CompSci 101.

It doesn't. You can have different versions of the software at each end. The protocol definitions have to be shared, but there is nothing that enforces that. I can write an HTTP client that breaks the protocol, but it might do something interesting. In any case, none of this can be proved statically; you have to dynamically check the server's protocol version at runtime.

This is disingenuous. My operating system can have a different version too (or a virus) and lie to my program about which version of its APIs it is providing. A remote program can lie too. You are taking this thread off on irrelevant tangents. If two programs want to interoperate, they have to agree on an API declaration. Period.

Cf. Philip Wadler:

(note my comment above his)

Well, reading that comment, I would say that I have been focusing on S-modularity, which is what most programming languages have.

Where do you actually find I-modularity?

Well, they have to agree on the API statically for OS calls, but network protocols and disk files will dynamically interpret the data in the file or network packet, precisely because you can't trust the other end.

The global hash table is not dynamic, except in the case where the sender is controlling the polymorphism resolution. Modules only need to know the hashtable of the trait impls which they consume, which can be built at compile time unless the sender is resolving the polymorphism. It is statically checked by the use of S-modularity, i.e. an API declaration.

This of course does mean that APIs can't be polymorphic at runtime. But the salient point is that, given a module consuming a collection of declared data types, the provider of that API does not need to know which trait impls the consumer of that API requires. Your proposal would invert the control and require that the sender of the collection know all the trait impls of the receiver. Your proposal requires global knowledge. My proposal only requires local knowledge of the hashtable.

Edit: I explained in a subsequent post that the dynamic injection of the module's hashtable depends on whether the sender or the receiver is resolving the polymorphism in the API.