packetlost 2 days ago

Very interesting! For those who do not pay too much attention to Forth, the Forth standard [0] still sees fairly regular updates. GForth, the implementation that I've spent the most time with, gets several commits per week, so it's not dead, though I suspect that it's a small handful of individuals who use it frequently that are keeping it alive and not widespread use.

Forth is absolutely worth learning, if for no other reason than understanding how stack-based interpreters work. The CPython VM, as an example, behaves similarly in some respects to the average Forth program.
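
To make the comparison concrete, here is a sketch in Python: `dis` exposes CPython's stack-machine bytecode, and a few lines implement a toy Forth-style evaluator over a list used as the data stack (the tiny word set here is invented for illustration, not any real Forth's):

```python
import dis

# CPython compiles to stack-machine bytecode, much as Forth words
# shuffle a data stack; dis makes the push/pop structure visible.
def add_mul(a, b, c):
    return (a + b) * c

dis.dis(add_mul)

# A minimal Forth-style evaluator: numbers push, words pop and push.
def forth_eval(source):
    stack = []
    binops = {"+": lambda x, y: x + y,
              "-": lambda x, y: x - y,
              "*": lambda x, y: x * y}
    for word in source.split():
        if word in binops:
            y, x = stack.pop(), stack.pop()   # top of stack is the second operand
            stack.append(binops[word](x, y))
        elif word == "dup":
            stack.append(stack[-1])
        else:
            stack.append(int(word))
    return stack

print(forth_eval("2 3 + 4 *"))   # [20]
print(forth_eval("5 dup *"))     # [25]
```

Both loops do the same thing at heart: fetch the next instruction/word and mutate a value stack.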

There are definitely better examples out there, but here[1] is an implementation of Mage the Awakening 2e's (tabletop game) spellcasting system in Forth that I wrote a while ago to help with playing.

[0]: https://forth-standard.org/

[1]: https://git.sr.ht/~chiefnoah/spellbook/tree

  • 7thaccount a day ago

    It's also important to point out that a forth standard probably helps a few of the commercial vendors keep some level of interoperability, but in general, the hobbyist community (I would guess the bulk of forth users today) ignores the standard almost entirely and just implements their own forth (commonly a new one for each project).

    The old saying is "if you've seen one forth, then you've seen one forth". It isn't a language like Python, but more like a language philosophy.

    The creator of Forth (Chuck Moore) wasn't a fan of standards and felt they led to bloat. He was uniquely lucky and gifted, in that he would figure out the minimum implementation of both hardware + software to solve a problem, and by hardware...I mean he designed his own chips. So at the end of the day, he had a holistic tool where the hardware and software were designed together to solve a specific problem. He wouldn't implement anything he didn't need. It's a fascinating process that I would guess wouldn't work for 99.999% of people, as most of us aren't at that level of knowledge/efficiency.

    Forth and Lisp are both super freaking cool.

    • coliveira 17 hours ago

      Not only was Chuck Moore not a fan of standards, he still isn't a fan of standards, because he is alive and active.

      • DonHopkins 17 hours ago

        I used to do FORTH. I still do, but I used to, too.

    • stevekemp a day ago

      I've always suspected that most people learn Forth by implementing one, rather than by picking an existing implementation and using it for something real/significant.

      • PaulHoule 13 hours ago

        That was me.

        I wrote a FORTH in 6809 assembly for the TRS-80 Color Computer running this OS

        https://en.wikipedia.org/wiki/OS-9

        which was very much UNIX-influenced and could support three users logged in simultaneously. I wrote to the Forth Interest Group and got back a reference card for the standard and used that as a guide. The big thing I did differently was file I/O; at that point in time FORTHs often accessed individual blocks on the disk, your code editor would edit one block at a time, etc. If you already have an OS, text editor, and such, it makes more sense to provide an API similar to the C API for the filesystem, so that's what I did, and the standard caught up.

vdupras 2 days ago

This is too much to bear and I had to create an account for this!

Might I interest you in the opposite? Sort of a Lisp implemented in Forth that can compile Lisp functions to native code, all of this in less than 500 lines of Dusk code.

https://git.sr.ht/~vdupras/duskos/tree/master/item/fs/doc/co... http://duskos.org/

  • kragen a day ago

    Condolences on finally opening an HN account. I greatly appreciate your work even though I disagree with much of it. DuskOS in particular seems excellent.

    I hadn't looked at comp/lisp before. It definitely looks like a real Lisp to me. In https://git.sr.ht/~vdupras/duskos/tree/master/item/fs/mem/co... I see that you have dynamic typing, implemented in assembly, and from https://git.sr.ht/~vdupras/duskos/tree/master/item/fs/mem/co... to line 70 you have a garbage collector. From https://git.sr.ht/~vdupras/duskos/tree/master/item/fs/comp/l... I see you have closures, but I'm not sure how memory allocation for their captured variables works; from https://git.sr.ht/~vdupras/duskos/tree/master/item/fs/comp/l... I infer that you are just using dynamic scoping, thus ruling out both upward funargs and tail-call elimination?

    What's the most interesting program you've written in comp/lisp?

    • vdupras a day ago

      > What's the most interesting program you've written in comp/lisp?

      None. I wrote comp/lisp because my lack of Lisp knowledge seemed to me like a liability as I invest more and more time in Forth. I had nothing I felt like writing in Lisp, so I thought it would be fun to implement one.

      While I can't call myself a Lisp programmer, I think I can say I now have an idea of what it's about. It didn't grow on me, so I'm leaving it aside for now. Maybe I'll come back to it later.

      > I infer that you are just using dynamic scoping, thus ruling out both upward funargs and tail-call elimination?

      Tail-call elimination is possible with my implementation within the same function, but it's not "natural". I'm not sure what it would take to have it work like we see in other Lisps, or if it's possible at all. It's been a while since I swam in that pond; the details aren't fresh in my mind.

      As for scoping, I'm unfamiliar with the theory. I don't think it's what we call "lexical scoping", because variable names have very little meaning in the compilation process. They're eliminated very early, and only a "compile-time Parameter Stack offset" is kept in the AST.

      I don't know what you mean by "upward funargs", but if it's about accessing arguments from an outer scope, yes, it's possible. That's exactly the point of the last link you pointed to: lambda generation. A call to a closure generates a lambda which has its outer scope "injected" onto the PS.

      • kragen a day ago

        "Upward funargs" is where you return a closure which captures the bindings of your local variables, so it can access arguments from an outer scope which no longer exists. Like this:

            scheme@(guile-user)> (define (make-adder x) (lambda (y) (+ x y)))                                                 
            scheme@(guile-user)> (define x+3 (make-adder 3))
            scheme@(guile-user)> (define x+4 (make-adder 4))
            scheme@(guile-user)> (x+3 7)
            $1 = 10
            scheme@(guile-user)> (x+4 8)
            $2 = 12
        
        In Scheme those bindings are even mutable:

            scheme@(guile-user)> (define (counter n) (lambda () (set! n (1+ n)) n))
            scheme@(guile-user)> (define booga (counter 53))
            scheme@(guile-user)> (booga)
            $3 = 54
            scheme@(guile-user)> (booga)
            $4 = 55
        
        This makes Scheme closures a fully general substitute for Smalltalk-like objects.

        Upward funargs generally require a non-stack-like allocation discipline for the local-variable bindings.

        While implicit variable capture like Scheme's is the most common way to do closures, it isn't the only way; C++ and old versions of Python (before nested_scopes) have a form of closures where you instead explicitly list the variables you want to capture. Old dynamically scoped Lisps like Lisp 1.5 had a different solution which I don't understand; see AI memo AIM-199, "The function of FUNCTION in LISP".
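
        For comparison, both capture styles can be sketched in modern Python; the default-argument idiom below is the explicit-capture workaround that pre-nested_scopes Python code used, similar in spirit to a C++ capture list:

```python
# Implicit capture (Scheme-style): modern Python closes over the
# binding of x automatically.
def make_adder(x):
    return lambda y: x + y

# Explicit capture: before nested_scopes (Python 2.1), inner functions
# could not see enclosing locals, so the idiom was to capture x via a
# default argument -- roughly analogous to a C++ capture list.
def make_adder_explicit(x):
    return lambda y, x=x: x + y

print(make_adder(3)(7))           # 10
print(make_adder_explicit(4)(8))  # 12
```

Both forms give you upward funargs; the difference is only whether the captured names are listed by the programmer or inferred by the compiler.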

        Usually dynamic scoping is incompatible with tail-call elimination (except in the special case of tail recursion) because you can't eliminate a tail call if you have more work to do after it returns; usually popping some variable bindings off a context stack amounts to "more work to do". Maybe your parameter stack has a way for that to happen implicitly, for example by implicitly restoring a parameter-stack offset on function return.
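
        That interaction can be sketched in a few lines of Python: a toy dynamic-scoping interpreter in which every call pushes a binding frame and must pop it after the callee returns, so even a syntactic tail call leaves work pending (all the names here are invented for illustration):

```python
# Sketch of dynamic scoping via a global binding stack. Each call must
# pop its bindings after the callee returns, so even a syntactic tail
# call has pending work and cannot simply be turned into a jump.
bindings = []  # stack of {name: value} frames

def lookup(name):
    # dynamic lookup: search the most recently established frames first
    for frame in reversed(bindings):
        if name in frame:
            return frame[name]
    raise NameError(name)

def call(fn, frame):
    bindings.append(frame)      # establish dynamic bindings
    try:
        return fn()
    finally:
        bindings.pop()          # this cleanup is the "more work to do"

def inner():
    return lookup("x") + 1

def outer():
    # a "tail call": the last thing outer does, yet the finally-pop of
    # outer's own frame still has to run after inner returns
    return call(inner, {"x": lookup("x") * 10})

print(call(outer, {"x": 4}))  # 41
```

Eliminating the tail call here would require popping outer's frame *before* jumping to inner, which changes what inner's dynamic lookup sees.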

        I hope this helps. Keep being awesome!

        • vdupras a day ago

          Yeah, to be honest I'm too out of this context to speak of deep technicalities, but a quick test seems to indicate that Dusk's comp/lisp can do something similar to your first example, unless I misunderstood it:

            $ make run
            stty -icanon -echo min 0; ./dusk ; stty icanon echo
            Dusk OS
            79KB used 31MB free ok
            needs comp/lisp
             ok
            lisp (defun make-adder (x) (lambda (y) (+ x y)))
             ok
            lisp (defun x+3 (x) ((make-adder 3) x))
             ok
            lisp (defun x+7 (x) ((make-adder 7) x))
             ok
            lisp. (x+3 7)
            10 ok
            lisp. (x+7 8)
            15 ok
          • kragen a day ago

            That could work without upward funargs because, in your version, (make-adder 3) and (make-adder 7) aren't live at the same time, so they might be using the same "x" instead of two separate "x" bindings. My hunch is that that "x" is in memory space that would be overwritten if you called another largish function in between calling make-adder and invoking its return value, but I haven't grokked comp/lisp fully enough to be sure that that's true.

      • anthk 13 hours ago

        Can you write a ZMachine interpreter?

        • vdupras 13 hours ago

          Of course I can, and so can you!

          • anthk 12 hours ago

            I'm still trying to fix Malyon for Emacs, as it hangs on this free adventure (Spiritwrak)

            https://jxself.org/git/?p=spiritwrak.git;a=summary

            You can get inform6 from the same page of the repos, and informlib by cloning spiritwrak with

                    git clone --recursive https://jxself.org/git/?p=spiritwrak.git;a=summary
  • wwweston 15 hours ago

    I came into this thread to ask if anyone had done this, given an intuition that it seems like a fairly natural path to bootstrapping Lisp on metal. Didn't even have to ask!

  • wild_egg a day ago

    This is excellent as well. I don't really jibe with the Collapsnik aspect, but I absolutely love powerful systems that can run in constrained environments.

    Thanks for sharing, I'll be diving into this for a while

mepian 2 days ago

The author is also one of the developers of the Symbolics Virtual Lisp Machine used in Open (and now Portable) Genera: http://pt.withington.org/publications/VLM.html

  • amszmidt 2 days ago

    Open in the "closed" sense. The legal status of anything from Symbolics is a minefield. Even this "portable Genera" might be of dubious legality...

    "Open" here was a popular term back in the day and a bunch of companies used it .. sorta like AI, LLM, and blockchain.

    • fsckboy a day ago

      >"Open" here was a popular term back in the day and bunch of companies used it .. sorta like AI, LLM, and blockchain.

      nah. what you are describing is Open in the present day, like OpenAI.

      back in the day it meant "open standards, allowing cooperation between secretive competitors". IBM and Digital could both implement to an open standard instead of their traditional proprietary ones so their devices would interoperate.

      This type of openness is what Microsoft liked to "embrace, extend, extinguish", meaning "we'll get kudos joining the interoperating consortium, we'll look cutting edge by adding features which would make us not actually interoperate but are useful so people will adopt them, then we will have killed off this threat to our monopoly"

      in that same period of time, open source started to be called that because what was open was the source.

      • anthk a day ago

        Back in the day there was ITS, Maclisp and Emacs on top of TECO.

        • lispm a day ago

          That was gone at a time when Lisp Machines were commercialized (81/82). Emacs there was called Zmacs and was written in Lisp. Maclisp was replaced by Lisp Machine Lisp / ZetaLisp / Common Lisp. UNIX was then a thing.

          Open Genera in the early 90s then ran on OSF/1 UNIX (https://en.wikipedia.org/wiki/OSF/1 , renamed to Digital UNIX) on DEC Alpha. OSF meant Open Software Foundation ( https://en.wikipedia.org/wiki/Open_Software_Foundation ), a group propagating "open standards" for UNIX.

          • larsbrinkhoff 21 hours ago

            ITS, Maclisp, Emacs, and TECO were not gone in 1982. But admittedly on their way out.

          • anthk a day ago

            Curiously, GNU was born in that era, to give Unix users the freedom of ITS, Maclisp, Emacs, and the rest of the projects from MIT.

            Instead of having the slow-ish boot times of a Lisp machine, GNU's plan was to put several memory-protected concurrent interpreters under Unix, giving the user far more power by default with GNU/Hurd than a common Unix user has. With namespaces, for instance, you could forget about suid and/or needing group perms to mount a device or access different media and soundcards.

    • Y_Y 2 days ago

      I'm sure if they weren't incentivised by the certainty of creating a legal minefield in the future then this technology would never have been created and we'd be sitting at home banging sticks together.

      • amszmidt a day ago

        Most of this "technology" was created at the MIT AI Labs.

        Genera did a few cool things, but in the grand scheme of things it is essentially not much (the Dynamic Listener probably being the most interesting thing; the rest of the system was essentially the same).

        The MIT version of the Lisp Machine is free software without any legal conundrums. So there you go …

        • lispm a day ago

          > rest of the system was essentially the same

          not true

          • Y_Y a day ago

            I usually decry this sort of response as virtually useless and not leading to fruitful discussion. However, I have to commend lispm and amszmidt on having a fascinating (if tense) discussion on the history of Genera and its variants.

          • amszmidt a day ago

            Very true, try doing an M-x Source Compare. Genera is backwards compatible with System 99. And any differences are trivial.

            • lispm a day ago

              I'm looking at a Symbolics file system. Where do I find Dynamic Windows in System 99? Common Lisp? CLOS? The Document Examiner? X11? NFS?

              • amszmidt a day ago

                Nobody was saying that Genera didn't do anything novel, Dynamic Windows being one and the Dynamic Listener being another. Overall, the guts are essentially the same: ZMacs, Zmail (also not part of the base System), Flavors, and anything else in the base layer -- to the point that you can compare all of the files that get stuffed into the cold load image without much effort.

                System 99 has support for Common Lisp. System 301 will hopefully have better support than Genera at that (seeing that Genera doesn't really support CL very well anyway).

                The Document Examiner is a separate system (and LMI did it differently, with Gateway), and not part of the base System. X11, and NFS likewise. So listing a bunch of orthogonal systems isn't really what was being discussed.

                • lispm a day ago

                  The core Symbolics SYSTEM system I see here is version 501.51. It consists of a bunch of other systems like SI, Embedding, L-Bin, Tables, Scheduler, Netboot, Common Lisp, SCT, Garbage Collector, Flavor, Error System, Language Tools, Network, Lisp Compiler, Bitblt, Fonts, FS, Bin, Time, TV, Presentation Substrate, CP, Dynamic Windows, Debugger, I Linker, Fep-FS, ...

                  I don't see much of that in System 99 from MIT.

                  Stuff like X11 and NFS is also not random stuff. Open Genera uses it for its console and its files.

                  > seeing that it doesn't really support CL very well anyway

                  Well enough that it is able to run a lot of its software written in Common Lisp. It's able to run stuff like PCL, ASDF, ITERATE, ...

                  System 99 is from 1984 (?), Symbolics started in 81/82. Open Genera appeared 1992/93. You are ignoring a full decade of development.

                  Earlier:

                  > Genera did a few cool things but in the grand scheme of things it is essentially not much (the Dynamic Listener probably being the most interesting thing, the rest of the system was essentially the same).

                  I would guess that less than 20% of the core SYSTEM is essentially the same (but often quite a bit enhanced). For the rest of the software, I would think it's even less. Which is not surprising, since System 99 did not support the 3600 systems, the G-Machines, the I-machines (XL400, XL1200, XL1201, UX400, UX1200, NXP1000, MacIvory 1/2/3, ...), Open Genera, ..., because all that was developed later by Symbolics.

                  • amszmidt a day ago

                    > I would guess that less than 20% of the core SYSTEM is essentially the same (but often quite a bit enhanced). For the rest of the software, I would think it's even less.

                    ZWEI, TV, and SI make up the majority of the base system, which is far more than "20%", and that is ignoring EH, WINDOW, and anything else that is essentially processor agnostic.

                    The target processor is not that important from the user's point of view, to the point that the Lambda and CADR shared the exact same source code and microcode. The Explorer I was essentially a Lambda. Much was "feature" protected with #+/#-, to the point where many of the systems worked on Genera as well.

                    So that seems on par with the CPU architectures that Symbolics made, which is "just" a change to the compiler .. and not the overall system .. which is what was being discussed. But yes, Symbolics did design its own hardware and made sure it worked on their system .. not sure what your point is, other than giving an incorrect view of history and of what Symbolics actually did to the Lisp Machine system parts (Symbolics' work was much more interesting when it came to other systems, like the 3D stuff and whatnot).

                    • lispm a day ago

                      ZWEI is no longer in the base system. EH does not exist, a Window system does not exist. Dynamic Windows is in the base system.

                      > The target processor is not that important from the users point of view

                      It's important for the operating system and the processor was used for very different machine hardware: standalone workstations, headless systems, embedded in a Mac, embedded in a SUN, embedded in a network switch, ...

                      > Lambda and CADR shared the exact same source code, and microcode.

                      But not the 3600, which was a 36 bit architecture and not the Ivory, which was a 40 bit microprocessor. A lot of the core routines were different on a 3600 and an Ivory machine. The VLM was even different, running on a 64bit microprocessor, implementing the emulator in C and assembler. There is low-level functionality which has two or three different implementations.

                      The Ivory machines could be very different from a LAMBDA and a CADR, for example as an embedded CPU in a very different host machine (Mac & SUN). That had a lot of consequences for the operating system.

                      • amszmidt a day ago

                        > ZWEI is no longer in the base system. EH does not exist, a Window system does not exist. Dynamic Windows is in the base system.

                        Nobody will use Genera without Zmacs or the error handler (aka Debugger)…

                        System 200 ran on the CADR and 3600. Using the same code base, with different microcode. With just a bunch of #+ and #-.

                        And no, the target is not so important that it makes up "80% of the system".

                        You're going off on tangents and constantly raising things that aren't being discussed; we aren't talking about the host architecture but about the Lisp Machine system, which is mostly agnostic of the target CPU. And between those two, Genera and the MIT system are essentially the same when it comes to the base layer. To the point that you can take the Dynamic Listener or even DW and have it running on an MIT system without much work.

                        Genera 9 (which started development just a few years ago, going off of a system that hadn't been touched for 30 years and is under a dubious legal situation) is essentially Genera 8.5 with several fixes.

                        Genera was carefully designed to keep compatibility with the MIT system. And that is one of the reasons why the features and code are so similar.

                        • lispm a day ago

                          > Nobody will use Genera without Zmacs or the error handler (aka Debugger)…

                          Nobody will run Genera without TCP/IP and a lot of other stuff.

                          The error handler is not the same anymore.

                          > you can take the Dynamic listener or even DW and have it running on a MIT system without much work

                          I think that's pure fantasy.

                          > Genera was carefully designed to keep compatibility with the MIT system.

                          I don't think that was a goal, given that the MIT system was not relevant anymore and the vendors all had their own forks. TI converted much of the old code to Common Lisp with a translator. Symbolics wrote most new code in Common Lisp.

                          Without an extensive implementation of Common Lisp incl. CLOS, basically nothing runs on the old system.

                          The times of a few conditional reader macros was soon over.

                          Does Portable Common LOOPS (PCL) run on the MIT Lisp Machine?

                          https://github.com/binghe/PCL/tree/master/impl

                          It lacks a port, but there are ports to TI and Symbolics. That should be simple.

                          • amszmidt 21 hours ago

                            > Nobody will run Genera without TCP/IP and a lot of other stuff.

                            Plenty of people still do, e.g. on Ivory or 36xx machines.

                            > I think that's pure fantasy.

                            I ported the Dynamic Listener (with large parts of DW) a bunch of years back to run on MIT System 78.

                            > ... MIT system was not relevant anymore ...

                            I realise you like to raise Symbolics to the skies, but to say that the MIT system, which was then continued by LMI, was "not relevant" is just making up history. Both systems were still heavily used until the demise of Symbolics and LMI, and portability was absolutely one goal, since breaking people's code was not considered nice.

                            > Does Portable Common LOOPS (PCL) run on the MIT Lisp Machine?

                            No idea, LOOP on the MIT Lisp Machine is the standard Loop, feel free to port it though -- patches welcome as they say.

                  • amszmidt a day ago

                    SI, Scheduler, GC, Flavor, EH, Compiler, TV, etc. all date back to before Symbolics entered the picture. And if you don't see that in System 99 .. then you must not be looking very hard, since they are explicitly mentioned there. You mention a bunch of things that are also _not_ part of Genera, specifically a bunch of Ivory stuff -- which is just the processor target.

                    > Well enough that it is able to run a lot of its software written in Common Lisp. It's able to run stuff like PCL, ASDF, ITERATE, ...

                    No it is not. You are thinking of Portable Genera.

                    > System 99 is from 1984 (?), Symbolics started in 81/82. Open Genera appeared 1992/93. You are ignoring a full decade of development.

                    System 99.32 is from 1987, which continued via LMI into System 130. Genera, or rather its predecessor, is from 1978 or thereabouts, when it got forked from System 78 and re-branded as System 200. All three (four if you count TI -- which did quite a bit more renaming of things, making it hard to follow) saw parallel development going on from there into the early 1990s.

                    Genera is a Lisp Machine system based on the work done at MIT, where most of the guts are still the same. To the point that how you rebootstrap the system is the same, how the core areas work, how the scheduler works, how the windowing system works, how Flavors works, etc.

                    Open Genera did very little on top of Genera at that, mostly targeting and making it work on the Ivory. You are purposefully conflating Portable Genera, Open Genera and Genera.

                    • lispm a day ago

                      I have literally started a VLM (Portable Genera 2.0.6) with a Genera 9.0.6 world. In the "herald" the software says that it is running Genera 9.0.6. You can claim "You are purposefully conflating Portable Genera, Open Genera and Genera.", but I'm actually looking at it now and you don't. Please don't tell me this BS, when I'm right now reading the screen in front of me.

                      > You mention a bunch of things that are also _not_ part of Genera

                      Of course they are. I have the thing right in front of me, running. I'm typing to it. I'm looking at the system definition of SYSTEM 501. On a live running Genera 9 on a VLM. If I looked at my MacIvory running Genera 8, it also would not look much different.

                      Stuff like EH has long been superseded. The Scheduler has been redesigned & rewritten. TV has mostly been superseded by Dynamic Windows. The Garbage Collector has been extended with new GC variants.

                      > Open Genera did very little on top of Genera at that, mostly targeting and making it work on the Ivory.

                      Open Genera does not work on the Ivory processor. It's a Virtual Lisp Machine.

                      > Genera is a Lisp Machine system based on the work done at MIT, where most of the guts still the same. To the point that how you rebootstrap the system is the same, how the core areas look work, how the scheduler works, how the windowing system works, how flavours works, etc.

                      The Scheduler works differently (the old scheduler polled, the new one is event-driven), the window system is now Dynamic Windows, Flavors has been updated to New Flavors & the new CLOS (the Common Lisp Object System), ...

                      • r40694 a day ago

                        you know you don't need to tell everyone how you have palter's vlm to look at rel-8-5's sysdcl, which has been leaked and hosted on public sites since forever. for example https://archives.loomcom.com/genera/symbolics/sys.sct/sys/sy... rel9's sysdcl is not substantially different anyway; the list of module components is the same.

                        ams's perhaps hyperbolic point is that genera is significantly SYSTEM, that symbolics' contribution is a kind of obvious extension of the grand vision that was already there, in its totality and potential, in MIT's work. I think it's a valid argument, which I don't think can be resolved just by listing names of subsystems.

                        for example you can't just say "oh they replaced tv and window with dynamic windows", because dw uses both tv and legacy, for lack of a better term, window. if you look at the flavor definition of a basic dynamic-window, it uses tv:stream-mixin, tv:graphics-mixin, and tv:minimum-window. and tv:minimum-window is a venerable SYSTEM flavor. not to mention that other systems (like zwei) still use tv window directly. how thick a layer is dynamic-window on top of tv? answering that question requires systems-level knowledge and investigation.

                        other symbolics extensions are of similar nature.

                        • p_l 13 hours ago

                          Can't talk about the rest, but some spelunking led me to the finding that TV was a "legacy" component in later Genera versions, working properly only on the main console and, due to significant emulation work, on MacIvory through RPC to the host Mac.

                          DW/CLIM treated TV as one possible driver, if not sidestepped it, the problem was that some software (iirc mainly related to some S-Graphics products) had some hardcoded TV dependencies from earlier versions - it's mentioned as part of the porting plans for OpenGenera, because OpenGenera has no working TV subsystem at all, because TV didn't work over X11.

                        • lispm a day ago

                          > you know you don't need tell everyone how you have palter's vlm to look at rel-8-5's sysdcl which has been leaked and hosted on public sites since forever

                          I'm not sure what you are talking about. It's not related to what I wrote.

                          > how thick a layer dynamic-window is on top of tv? answering that question require systems level knowledge and investigation.

                          Dynamic Windows introduced a new UI look and feel, different from the old "static windows". DW has new APIs even for reimplementations of old UI features from TV, like the new drawing interface in the graphics package, which replaces the old TV flavor messages with generic functions. It also has a lot of new features, like the presentation system. Later applications typically will use the new DW interfaces and new features. Both DW and TV are documented in "Programming the User Interface", with DW providing much of the high-level application window features, described in the first chapters.

                          There is another version of it, in another implementation, which is CLIM, which is then based on CLOS. Even later applications were expected to use the CLIM interfaces, to be able to write portable user interfaces able to run on various other Lisp systems. Both DW and CLIM are substantially different from TV. Neither DW nor CLIM is in the MIT software.

                          Other symbolics extensions are of similar nature.

                          • amszmidt a day ago

                            > Other symbolics extensions are of similar nature.

                            Sure, and those extensions can run on the MIT Lisp Machine.

                            But we are talking about the base system. Not extensions. And the reason one can take those systems and, for the most part, just run them on another system is because they are so similar!

                            • lispm a day ago

                              Good luck with that. I don't think that's possible.

                    • mepian a day ago

                      >Open Genera did very little on top of Genera at that, mostly targeting and making it work on the Ivory.

                      Ivory support was implemented in Genera 7.3. Open Genera added the virtual machine to run on DEC Alpha hardware, it was Genera 8.3 + VLM, upgraded to Genera 8.5 with Open Genera 2.0.

                      • p_l 13 hours ago

                        Additionally, the VLM is otherwise "Ivory rev. 5", which is not compatible at the compiled-code level with Ivory rev. 0-4, which were the physical Ivories. Small differences in a few places, including IIRC in page sizes.

                      • lispm a day ago

                        and now it's a new portable VLM (DEC Alpha, x86-64, ARM64) with Genera 9.

      • eschaton a day ago

        The Symbolics intellectual property wasn't a legal minefield until the past decade and change, where it was acquired from Andrew Topping's estate by a longtime MIT Lisp guy who has been a complete idiot for not opening it up to the world.

        However, he only actually owns it if Andrew Topping actually paid the Symbolics bankruptcy executor for it in full. (Not doing so would entirely fit Topping's character, based on his criminal rap sheet.) If Topping didn't fully pay for it, then it still belongs to whoever or whatever the successor of the Symbolics bankruptcy executor is; I think that was ABN AMRO.

        Good luck finding any of that out!

        • Y_Y 17 hours ago

          Are you talking about John Mallory? I imagine someone has already tried to figure this out, but it's indeed not easy to find much online.

          It even appears there's ongoing development, but in some secretive fashion: https://hachyderm.io/@gmpalter/109553491209727096 (by the same dude who wrote TFA).

          Could anyone more knowledgeable provide further direction?

  • anthk a day ago

      Not free. This is useless for preservation. Just compare it against Medley/Interlisp.
kunley 2 days ago

I hope there will come time for Forth to get more attention, or dare I say, to stop being so underrated.

It is such a great tool, even just for stretching your brain cells.

  • sph a day ago

    I recently posted a paper on the Permacomputing Aesthetic (https://news.ycombinator.com/item?id=41818119) which argues there is a counter current of developers that want to go against the maximalist world of computing and rediscover programming as art on constrained systems. In this world, where we do not idolize the large teams of code monkeys, languages like Forth can see a resurgence. Forth is a bad language for a team of people of dubious and different skillset, but in the hands of a single person it is the most moldable environment, even moreso than Lisp.

    I have been designing some sort of "persistent" Forth OS, that boots into an interpreter and persists your defined words across sessions and reboots, so you can incrementally build your own personal computer from scratch.

    • mepian a day ago

      >even moreso than Lisp

      How exactly?

      • kragen a day ago

        Let's take Laxen and Perry's F83 implementation as an example.

        F83 implements an assembler, a memory hex dumper, an interpreter, a sort of bytecode compiler, a decompiler, a single-stepping source-level debugger, a screen editor (which could jump to the definition of any function and jump back and forth between source and documentation, and which tagged each screen of source code with the last edit date and author's initials), cooperative multitasking with thread-local data, a print spooler, and virtual memory. It can also compile programs into standalone executables, and in fact it's written in itself, so it can cross-compile itself for other architectures and operating systems; it supported CP/M-86, MS-DOS, and CP/M for the 68000.

        The Forth programming language it implements is low-level but relatively powerful; among other things, it includes structured control flow, Turing-complete compile-time macros, closures, pointers, and multiple return values.

        This is nothing unusual for a Lisp system, of course. What is different is that F83 is only about 2100 lines of code by my count. That's what makes it "more moldable" than a Lisp system with the same features, which would require around an order of magnitude more code.

        There are some major compromises made in pursuit of this level of minimality, though: no type-safety (not even to the extent of calling functions with the right number of arguments), no built-in structs (though you can build them with its macro system and closures), no bounds-checking on arrays, error-prone manual access to the virtual-memory system, no syntax, no stack-allocated named variables, etc.
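        The execution model behind all of this is simple enough to sketch. What follows is a hypothetical, minimal illustration (not F83, and far from a real Forth compiler) of the core loop: a data stack, a dictionary of words, and a `: name ... ;` compiler that extends the dictionary at runtime — which is what makes the system so moldable.

        ```python
        # Minimal sketch of a Forth-style interpreter: a data stack, a
        # dictionary mapping word names to actions, and ":" to define new
        # words. Real Forths compile definitions to threaded code; this
        # toy just re-interprets the body text.
        def make_forth():
            stack = []
            words = {
                "+":   lambda: stack.append(stack.pop() + stack.pop()),
                "*":   lambda: stack.append(stack.pop() * stack.pop()),
                "dup": lambda: stack.append(stack[-1]),
            }

            def run(source):
                tokens = iter(source.split())
                for tok in tokens:
                    if tok == ":":                  # compile mode: read a definition
                        name = next(tokens)
                        body = []
                        for t in tokens:
                            if t == ";":
                                break
                            body.append(t)
                        words[name] = lambda body=body: run(" ".join(body))
                    elif tok in words:
                        words[tok]()                # execute an existing word
                    else:
                        stack.append(int(tok))      # anything else is a literal
                return stack

            return run

        run = make_forth()
        run(": square dup * ;")      # extend the language with a new word
        print(run("3 square 4 +"))   # → [13]
        ```

        The whole interpreter fits in a few dozen lines, which hints at how a self-hosted system like F83 stays near 2100 lines: the language's own extension mechanism is the implementation mechanism.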

        • mepian a day ago

          Thank you, that's interesting. The classic Lisp machine systems are very compact by modern standards but not 2100 lines compact, although they had about two decades of continuous development to accumulate cruft (and useful features like various network protocols support and sophisticated graphics as well).

          • kragen a day ago

            Yeah, LispM system software is orders of magnitude larger than that, but, for example, David Betz's XLISP for CP/M is 2800 lines of C, and it doesn't include any of those facilities except for an interpreter. Now, the interpreter does have some things F83 doesn't: a fully dynamic object system which supports adding new methods at runtime, named stack-allocated variables, dynamic typing, a string type, a file-pointer type with character I/O, Lisp lists, S-expression reading and printing, and a garbage collector. But it doesn't have an assembler, a memory dumper, a bytecode interpreter, a debugger of any kind, an editor of any kind, any kind of version control, any kind of multitasking, or virtual memory. For many of these, adding them in Lisp is a lot more difficult than adding them in Forth.

            Probably the most influential use of XLISP is as AutoCAD's built-in scripting language, AutoLISP, but it's also the basis of Nyquist.

      • transfire a day ago

        Probably because Lisp requires a GC. With Forth you have to write your code to fit a finite memory footprint.

        • mepian a day ago

          >you have to write your code to fit a finite memory footprint

          This sounds like the opposite of having the more moldable environment.

          • sph a day ago

            A bare metal Forth has access to your entire RAM. Perhaps that's what parent meant as "finite" memory.
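            That memory model is also easy to sketch: classically, a Forth dictionary grows by bumping a single pointer (HERE), and ALLOT reserves cells from whatever RAM remains — no allocator, no GC. This is a hypothetical illustration of that discipline, not any particular Forth's implementation:

            ```python
            # Sketch of Forth-style dictionary allocation: one flat memory
            # array, HERE points at the first free cell, ALLOT bumps it.
            # There is no GC; space is only reclaimed by rolling HERE back.
            MEMORY_CELLS = 1024            # stand-in for "your entire RAM"
            memory = [0] * MEMORY_CELLS
            here = 0                       # Forth's HERE pointer

            def allot(n):
                """Reserve n cells; fail when the finite memory is exhausted."""
                global here
                if here + n > MEMORY_CELLS:
                    raise MemoryError("dictionary full")
                addr = here
                here += n
                return addr

            buf = allot(16)                # roughly: CREATE BUF 16 ALLOT
            memory[buf] = 42
            print(buf, here)               # → 0 16
            ```

            Fitting a program into that fixed footprint is a constraint, but it is also why a bare-metal Forth can own the whole machine with no runtime underneath it.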

    • packetlost a day ago

      I think I partially fit into the counterculture and have had similar ideas for a persistent Forth OS. I'd love to see your work!

  • ink_13 2 days ago

    Too bad I no longer have my HONK FORTH IF LOVE THEN bumper sticker

  • packetlost 2 days ago

    I doubt it ever will. There are still pockets on the internet and a few tireless maintainers, but I doubt it'll ever see much of a resurgence. It's just too different. Maybe during "bring-up" of embedded systems, but even that I think is mostly on the way out, sadly.

  • codr7 a day ago

    I don't see standard Forth going anywhere; same for Common Lisp, Scheme & Smalltalk.

    They had a good run, and they're still fun to play around with and useful for teaching and stealing ideas from. But the world has moved on in too many ways.

    There's still a lot of ground to explore between Forth and Lisp though, and plenty of modern ideas to steal from scripting languages.

    • vindarel 18 hours ago

      Common Lisp is still kicking ;)

      (not its standard sure)

      • codr7 6 hours ago

        I don't know about kicking, it's not exactly thriving.

        I prefer Lisp for tricky problem solving, but it's unfortunately a no go professionally outside of prototyping ideas.

  • whobre a day ago

    That time was the 1980s. After 1995 or so, Forth sadly started to disappear.

ofalkaed a day ago

The use of CODE/;CODE/;ENCODE for defining words in lisp is great. I can see myself using this a good amount especially if I can get it playing well with Tk either through the FFI or lTk/nodgui.

crest a day ago

Doing it the other way around would be a lot more impressive. ;-)

  • astrobe_ a day ago

    There are more useful things to do with Forth :-P

    • crest a day ago

      Depends on what you need. Forth is great for explorative programming on microcontrollers.

pmarreck a day ago

“CL-Forth compiles with LispWorks but crashes running the Forth test suite.”

wouldn’t this normally preclude doing any actual work with it?

  • shawn_w a day ago

    I wonder if they were testing with the paid, commercial version or the free evaluation version of LispWorks, which has significant restrictions.

    Most people playing around with this will be using sbcl though. It's overwhelmingly the most popular free common lisp system.

    • armitron 13 hours ago

      Not to mention more performant than LispWorks.

  • kragen a day ago

    It sounds like it would preclude doing any actual work with LispWorks, yes. But maybe someone at Harlequin will take interest and fix whatever the LispWorks bug is, or in the unlikely case that it's a CL-Forth bug, propose a fix. (It could be a CL-Forth bug, because Common Lisp allows you to turn off safety.) It doesn't preclude doing any actual work with CL-Forth, though, because you can use better Common Lisps.

  • anthk a day ago

    Use SBCL.