
By 1981, Wirth had long since moved on: in terms of his own efforts, Pascal had been superseded by Modula ('73-'76) and Modula-2 (first compiler released in '79).

So, sure, by 1981 the original Pascal was already old, and Pascal as a class of languages was still on a growth trajectory not because of "pure" Pascal, but because it was a very simple base to build something more advanced on top of.

You'll note that Kernighan very specifically recognises and addresses the distinction between Pascal's purpose as a teaching language and his criticism of it for "writing real programs". That was relevant at the time precisely because a lot of people were using Pascal to "write real programs" and opted to extend Pascal rather than use more advanced languages as their basis. But it had far less relevance to the origins and purpose of Pascal.

Many of the lessons Kernighan describes are lessons Wirth was aware of as well when designing Modula, Modula-2, and the various versions of Oberon, though his approach was always to look for ways of solving these issues through careful simplification rather than through ever more complex languages.

By 1981, when considering writing "real programs" for production use, Kernighan gave valid criticism. In 1970, when considering Pascal's aim as a tool to teach structured programming, it would not have been.

So, yes, "modern", even then.



Conjecture: both patterns you outline are the same phenomenon, and stem from Pascal being a wonderful Algol canvas from which to create new languages.

Pascal was the LLVM of its day, but with the added benefit that, because of its simplicity, it could spread as a pure idea, implementable by a single mind, which meant it would be used as the basis for language and system design.

The only way it could be more useful is if Wirth had spec'd a stable s-expr syntax as well.


It was too early for that at the time, unfortunately. p-code was an attempt at making it machine-independent (though perhaps a serialisation / s-expression form would have done better), but the performance loss was too great for it to get wide traction.

Interestingly, one of Wirth's PhD students, Michael Franz, wrote his dissertation on Semantic Dictionary Encoding, a method reminiscent of Lempel-Ziv encoding applied to an intermediate representation, in a way that let the decompressor / code generator reuse templated generated code. Unfortunately, by that point Java was underway (Franz's thesis was published in '94), and Franz moved on to work on various code generation and verification efforts for Java [I think; I haven't followed his newer work that closely].
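SDE proper operates on a program's syntax tree and has the loader generate code from dictionary templates as it decodes; as a loose analogy only (this is not Franz's actual algorithm), here's an LZW-style dictionary coder over a flat stream of IR tokens, showing how the dictionary grows "template" phrases out of previously seen code as it goes:

```python
# Toy LZW-style dictionary coding of an IR token stream.
# The dictionary starts with single tokens and learns longer
# phrases as it sees repeated structure.

def encode(tokens):
    dictionary = {}
    for t in tokens:                      # seed with each distinct token
        dictionary.setdefault((t,), len(dictionary))
    seed = dict(dictionary)
    out, phrase = [], ()
    for t in tokens:
        candidate = phrase + (t,)
        if candidate in dictionary:
            phrase = candidate            # keep extending the match
        else:
            out.append(dictionary[phrase])
            dictionary[candidate] = len(dictionary)  # learn a new phrase
            phrase = (t,)
    if phrase:
        out.append(dictionary[phrase])
    return out, seed

def decode(codes, seed):
    inv = {v: k for k, v in seed.items()}
    prev = inv[codes[0]]
    result = list(prev)
    for c in codes[1:]:
        entry = inv[c] if c in inv else prev + (prev[0],)  # KwKwK corner case
        result.extend(entry)
        inv[len(inv)] = prev + (entry[0],)  # mirror the encoder's learning
        prev = entry
    return result
```

In SDE the dictionary entries are code-generation templates rather than raw token runs, so the decoder can emit (and reuse) generated code as it decompresses instead of just reproducing the input.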

But it doesn't end there. One of Franz's PhD students at UC Irvine was Andreas Gal, who did his thesis on trace-based just-in-time compilation, went on to do TraceMonkey, and became CTO of Mozilla.

[I still think there's interesting stuff to be done with Franz's SDE work; I've periodically looked at it ever since I first read it in '94, but have never had the time to devote to it, and keep hoping someone else will revisit it]


I was speaking metaphorically of Pascal being the LLVM of its day, more that it was a known base, portable, easy to explain and the first tool someone would reach for.

Templated IR code that has been LZ-compressed sounds excellent: Thumb++ or a uop buffer engine. From the post below, it sounds like the instructions could even, pathologically, be in uop or VLIW form, and the SDE would effectively discover the compact CISC for your specific workload. Why not push the decode engine into the instruction decoder?

I found this awesome post that covers SDE, http://hokstad.com/semantic-dictionary-encoding

I think one way to explore this would be to implement compressed binary loaders, like those done for demos etc., but target wasm: wrap it in a decompressor and have the wasm program decompress itself as it executes, perhaps on a method-by-method basis.


> I was speaking metaphorically of Pascal being the LLVM of its day, more that it was a known base, portable, easy to explain and the first tool someone would reach for.

Yeah, I got that. My point was that they did actually try to take the next step as well, but it was too early for that part.

> I found this awesome post that covers SDE, http://hokstad.com/semantic-dictionary-encoding

That's mine, in fact :)

SDE doesn't necessarily "discover" an instruction set per se, but the templates the decoder puts into the dictionary do effectively recover a lot of structure, and you'd certainly be able to do simple stats and dependency analysis to identify constructs that could be turned into dedicated instructions.
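As a toy illustration of the "simple stats" part (nothing to do with SDE's actual internals), even just counting adjacent token pairs in an instruction stream surfaces candidates for fused or dedicated instructions:

```python
from collections import Counter

def frequent_pairs(tokens, top=3):
    """Count adjacent instruction pairs; frequent ones are fusion candidates."""
    return Counter(zip(tokens, tokens[1:])).most_common(top)
```

Real dependency analysis would of course look past adjacency, but frequency counts over dictionary templates are the cheapest possible starting point.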

I also think, e.g., Gal's work on tracing has a lot of potential to be combined with this, used as a profiling/feedback stage that lets the decoder make better code-generation choices over time, provided you maintain enough information to map optimised traced sequences back to the template fragments they came from.

> I think one way to explore this would be to implement compressed binary loaders, like those done for demos etc., but target wasm: wrap it in a decompressor and have the wasm program decompress itself as it executes, perhaps on a method-by-method basis.

That's an interesting idea. It would save having to build the final native instruction-emission stage while testing. In fact, it'd be possible to build tools that simply demonstrate round-tripping the s-expression syntax as a proof of concept.
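Such a round-trip proof of concept needs nothing more than a parser and a serialiser. A minimal sketch in Python (the wasm-text-ish input is just an example, not a real module):

```python
import re

def parse(src):
    """Parse one s-expression into nested lists of atom strings."""
    tokens = re.findall(r"\(|\)|[^\s()]+", src)
    def walk(i):
        if tokens[i] == "(":
            node, i = [], i + 1
            while tokens[i] != ")":
                child, i = walk(i)
                node.append(child)
            return node, i + 1               # skip the closing paren
        return tokens[i], i + 1              # plain atom
    node, _ = walk(0)
    return node

def unparse(node):
    """Serialise the tree back to canonical single-spaced form."""
    if isinstance(node, list):
        return "(" + " ".join(unparse(c) for c in node) + ")"
    return node
```

Round-tripping `parse`/`unparse` on canonically spaced input returns the input unchanged, which is the whole proof of concept.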


So glad I met you :)

> decoder to make better code generation choices

I understand what you're saying, but there are lots of layers between the decoder, code generation, and the resulting uops. But if we're sending a compressed instruction stream, why not go up a level of abstraction and send the IR, MIR, HLIR, s-expressions, then s-expressions with types and effects? OK, I'm just gonna send the LaTeX of the paper. ;)

Speaking of round-tripping the s-expr syntax:

https://www.reddit.com/r/WebAssembly/comments/lil841/very_fa...



