I already have a lexer/parser, but my knowledge is unfortunately patchy when it comes to the semantic model, symbol resolution, type checking, and cross-reference resolution (and the PLTD Stack Exchange site is no help; I don't get any answers there). I've previously tried something and got as far as a language server, but gave up on the project.
I'd rather use something that facilitates implementing this semantic machinery, because there's a lot of type inference to do.
To add:
I also need to account for package resolution as in Java, where source files are referenced lazily.
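To make the laziness requirement concrete, here's a rough sketch of what I mean (all names are mine, purely for illustration): a resolver that parses a package's sources only on first reference and caches the result, so unreferenced files are never loaded.

```rust
use std::collections::HashMap;

// Illustrative only: packages are parsed on first demand and cached,
// so source files that are never referenced are never loaded.
struct Resolver {
    cache: HashMap<String, Vec<String>>, // package path -> exported symbol names
}

impl Resolver {
    fn new() -> Self {
        Resolver { cache: HashMap::new() }
    }

    // Resolve a package, loading it only the first time it is asked for.
    fn resolve(&mut self, package: &str) -> &Vec<String> {
        if !self.cache.contains_key(package) {
            let symbols = Self::load_and_parse(package); // expensive, done once
            self.cache.insert(package.to_string(), symbols);
        }
        &self.cache[package]
    }

    // Stand-in for lexing/parsing the package's source files.
    fn load_and_parse(package: &str) -> Vec<String> {
        vec![format!("{}::ExampleSymbol", package)]
    }
}

fn main() {
    let mut r = Resolver::new();
    let first = r.resolve("com.example.util").clone();
    let second = r.resolve("com.example.util").clone(); // cache hit, no re-parse
    assert_eq!(first, second);
}
```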
The default name system needs to be two-part, like XML qualified names, e.g. (prefix, local_name).
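Roughly this shape (again, just an illustrative sketch, not a real API):

```rust
// Illustrative only: a two-part qualified name, similar to an XML QName.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
struct QualifiedName {
    prefix: Option<String>, // e.g. a namespace prefix; None for the default namespace
    local_name: String,
}

impl QualifiedName {
    fn new(prefix: Option<&str>, local_name: &str) -> Self {
        QualifiedName {
            prefix: prefix.map(str::to_string),
            local_name: local_name.to_string(),
        }
    }
}

fn main() {
    let a = QualifiedName::new(Some("xs"), "string");
    let b = QualifiedName::new(Some("xs"), "string");
    assert_eq!(a, b); // structural equality lets names serve as symbol-table keys
}
```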
Maybe I don't need a full framework, just libraries with built-in algorithms for things like bidirectional type checking and for the semantic model, plus guidance on how to wire them into my type checker.
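By bidirectional type checking I mean the standard split between an inference mode that synthesizes a type and a checking mode that verifies an expression against an expected type. A minimal sketch over a toy expression language (everything here is illustrative, not from any particular library):

```rust
// Hedged sketch of bidirectional type checking for a tiny expression language.
// `infer` synthesizes a type; `check` verifies an expression against an expected type.
#[derive(Debug, Clone, PartialEq)]
enum Ty {
    Int,
    Bool,
}

enum Expr {
    IntLit(i64),
    BoolLit(bool),
    If(Box<Expr>, Box<Expr>, Box<Expr>),
    Annot(Box<Expr>, Ty), // a type annotation switches from checking to inference
}

fn infer(e: &Expr) -> Result<Ty, String> {
    match e {
        Expr::IntLit(_) => Ok(Ty::Int),
        Expr::BoolLit(_) => Ok(Ty::Bool),
        Expr::Annot(inner, ty) => {
            check(inner, ty)?;
            Ok(ty.clone())
        }
        // `if` synthesizes by inferring one branch and checking the other against it.
        Expr::If(c, t, f) => {
            check(c, &Ty::Bool)?;
            let tt = infer(t)?;
            check(f, &tt)?;
            Ok(tt)
        }
    }
}

fn check(e: &Expr, expected: &Ty) -> Result<(), String> {
    match e {
        // Checking mode pushes the expected type into the subterms.
        Expr::If(c, t, f) => {
            check(c, &Ty::Bool)?;
            check(t, expected)?;
            check(f, expected)
        }
        // Fallback: infer and compare (the usual mode-switch rule).
        _ => {
            let actual = infer(e)?;
            if &actual == expected {
                Ok(())
            } else {
                Err(format!("expected {:?}, found {:?}", expected, actual))
            }
        }
    }
}

fn main() {
    let e = Expr::If(
        Box::new(Expr::BoolLit(true)),
        Box::new(Expr::IntLit(1)),
        Box::new(Expr::IntLit(2)),
    );
    assert_eq!(infer(&e), Ok(Ty::Int));
}
```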
I'd suggest my project, Lady Deirdre, which aims to address exactly the problem you described: providing a common framework for custom language semantic modelling. The linked page includes a comprehensive book on this topic. My other project, Ad Astra, demonstrates how to use Lady Deirdre in practice (symbol resolution in a lazy fashion).
More generally, what you're looking for is usually called "incremental computation". Beyond my work, there are several solutions in the Rust ecosystem: Salsa, Adapton, and Anchors.
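To give a feel for the core idea (this is my own simplified sketch in plain Rust, not the actual API of Salsa or any of the libraries above): derived values are memoized and tagged with the revision they were computed at, and a query recomputes only when its inputs have changed since.

```rust
use std::collections::HashMap;

// Illustrative sketch of incremental computation: memoize derived values,
// tag them with a revision, and recompute only after an input changes.
struct Db {
    revision: u64,
    inputs: HashMap<String, String>,        // source text keyed by file name
    derived: HashMap<String, (u64, usize)>, // cached word counts, with revision
}

impl Db {
    fn new() -> Self {
        Db { revision: 0, inputs: HashMap::new(), derived: HashMap::new() }
    }

    // Mutating an input bumps the global revision, invalidating stale caches.
    fn set_input(&mut self, file: &str, text: &str) {
        self.revision += 1;
        self.inputs.insert(file.to_string(), text.to_string());
    }

    // Derived query: recomputed only if the cache predates the current revision.
    fn word_count(&mut self, file: &str) -> usize {
        if let Some(&(rev, count)) = self.derived.get(file) {
            if rev == self.revision {
                return count; // cache hit: nothing changed since last computation
            }
        }
        let count = self.inputs.get(file).map_or(0, |t| t.split_whitespace().count());
        self.derived.insert(file.to_string(), (self.revision, count));
        count
    }
}

fn main() {
    let mut db = Db::new();
    db.set_input("main.src", "let x = 1");
    assert_eq!(db.word_count("main.src"), 4);
    assert_eq!(db.word_count("main.src"), 4); // served from cache
    db.set_input("main.src", "let x = 1 + 2");
    assert_eq!(db.word_count("main.src"), 6); // input changed, so recomputed
}
```

The sketch invalidates on any input change; the real libraries are finer-grained, tracking which inputs each query actually read so that unrelated edits don't trigger recomputation.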
I also recommend my article explaining how these algorithms work in practice, as well as an article by the author of Anchors.
To be precise, the full commercial license currently costs 5k (not 20k), and only applies if you earn >= 200k. That revenue cap is aimed at businesses that want to use my work in commercial software; for non-commercial hobby projects, LD is free of charge.