Are there more new programming languages now than there used to be?

Computing has definitely become much more "by the people" since the days when nearly everyone had their own language, because the local implementation of a PL and its use were basically inseparable. Building a compiler or interpreter was the Linux From Scratch of programming languages. And we've also seen languages essentially die off in popularity on exactly the hill of broadcasting "we're superior/more pure" relative to "impure" replacement technology which subsequently became much more popular.

That got me curious. Taking Wikipedia's History of programming languages article as some kind of indication of what languages were in use when, I calculated the following, showing the approximate number of languages created/in use per year over the decades:

Approx time span     Languages per year
1950's, 1960's       1.4
1960's, 1970's       1
1980's               1.5
1990's               2
2000's, 2010's       1

Seems to me that languages have been sprouting like weeds at about the same rate since the dawn of computing.
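
For anyone who wants to reproduce the table, the arithmetic is just a count divided by the span in years. Here's a minimal Python sketch, with hypothetical per-decade counts standing in for an actual tally of Wikipedia's list:

```python
# Minimal sketch of the arithmetic behind the table above.
# The counts are hypothetical placeholders, not a real tally of
# Wikipedia's "History of programming languages" article.
languages_per_decade = {
    "1950s": 14,
    "1960s": 14,
    "1970s": 10,
    "1980s": 15,
    "1990s": 20,
    "2000s": 10,
    "2010s": 10,
}

for decade, count in languages_per_decade.items():
    print(f"{decade}: {count / 10:.1f} new languages per year")
```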

1 Like

While the rate of languages sprouting has stayed fairly constant, the number of programmers has boomed in that time; as a result, the number of users per significant programming language is much, much higher than it was in the past. If there are 1,000 programmers in the world, and they invent 15 languages between them, that's many more languages per programmer than if there are 1,000,000 programmers in the world who use 30 languages between them.
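
Working through those hypothetical numbers (the ones given above, not real counts):

```python
# Worked version of the hypothetical numbers above:
# 1,000 programmers inventing 15 languages vs. 1,000,000 using 30.
early = 15 / 1_000        # 0.015 languages per programmer
today = 30 / 1_000_000    # 0.00003 languages per programmer
print(f"Early era: roughly {early / today:.0f}x more languages per programmer.")  # ~500x
```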

3 Likes

Earlier in the timeline, at or near the start, from what I understand, relatively more people would have been using their own language projects, since sharing code was a logistical limiting factor. Having one was also a marker of credibility among programmers and researchers, so it was as if everyone had a language going. Of course, a lot of the "owned" languages may have been customized flavors of some other project, similar to how we now have many flavored distributions of a few main operating systems.

1 Like

I'm not sure I see the significance of programmers per language.

If we now have a thousand times more programmers in the world than some time in the past, how come we don't see a thousand times more new languages being created per year today?

Mind you, from what I have seen of programmer content on YouTube, maybe there are a thousand times more languages being created.

I agree; I think there are relatively fewer languages that are purely personal projects rather than being designed with features for some wider audience.

Go back to the original quote that triggered your table:

If languages really were "sprouting ... at about the same rate since the dawn of computing" per programmer, then you'd expect the number of new languages per year to go up in proportion to the number of programmers; instead, it's roughly constant, implying that the rate at which programmers invent programming languages has fallen over time, so that instead of (say) 1 in 100 programmers inventing a language, you now have 1 in 1,000,000.

I agree, and I think my point is that some of the criteria we use to dissect languages, such as those shared in the paper, echo a time when the qualities of an implementation related much more closely to the reputation of the individual authoring it than they do today. So we have a lot of scientific views of what makes a language fit properly into categories. Categories which, I think, are sometimes less reflective of what makes a good language design for most people, and more an inherited checklist of properties used for critique.

1 Like

To be fair to the paper cited here, it was written at a time when a lot of programming was done in assembler or BASIC, the two languages deemed important enough to teach us in a CS course as teenagers in 1974. Or in things like ALGOL and FORTRAN, which had GOTO. The art of programming was new and a lot of spaghetti code was being written. People had only just started thinking about how to organise code in a nice way that would be clear to those who had to work with it later. It was the time when the rules of "structured programming" were being proposed as a way to achieve this, which they also taught us in that same CS course in 1974. That "structured programming" approach is not language specific; you can do it in assembler if you like (actually, having worked on a lot of assembler projects, I wish their authors had).

As such I don't see that paper as being about any language in particular and even less about the reputations of any individuals.

Actually, I see that paper as an argument for "zero cost abstractions" to better organise code, be they only functions/procedures rather than the zoo of such things we have today.

I do agree with your comments about language categories. The idea that a language has to be this, or it has to be that, and if it does not do this or that perfectly it's no good. When it comes to getting things to work I don't care about your categories.

2 Likes

That's more or less what has happened.

“Roughly constant” here means “the number of languages Wikipedia deems important”, which is more or less equal to “the number of languages that have become famous enough to be mentioned in the non-IT press”.

Open that link that @ZiCog gave us:

Ctrl+F FOCAL — not there… 1968…
Ctrl+F MESA — not there… 1976…
Ctrl+F PostScript — not there… 1982…
Ctrl+F ABAP — not there… 1983…
Ctrl+F Limbo — not there… 1995…
Ctrl+F Sawzall — not there… 2003…

These are only the languages that I remember off the top of my head, without digging too much!

And who could forget the Go! language and its authors' complaints that Google stole their thunder?

The number of languages has grown, but the number of languages that reach notoriety is the same, because the number of people is approximately the same.