
Ada was also ignored because a typical compiler cost tens of thousands of dollars. No open-source or free compiler existed during the decades when popular languages could be had for free.

I think that is the biggest factor of all.


Ada’s failure to escape its niche is overdetermined.

Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the iAPX 432 “mainframe on a chip” with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.

And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.


The first validated compiler for Ada that ran on the IBM PC was released in 1983.

The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.

I think more research is warranted by you on this subject.


I’m not saying it wasn’t possible, I’m saying the larger ecosystem was never going to embrace a language that was as heavyweight as Ada. In 1983, most PC system software was written in assembly!

Janus Ada 1.5 ran on CP/M.

I sometimes wonder what "Turbo Ada" would have looked like, but I think it would have probably looked like later versions of Borland Pascal. Things like generics and exceptions would have taken some of the "turbo" out of the compiler and runtime -- the code generator didn't even get a non-peephole optimizer until 32-bit Delphi, it would have been too slow.

It might be nice to have Ada's tasks driven by DOS interrupts, though. I think GNAT did this.


I have not seen it, but there is something close to what you ask about: Turbo Modula-2 (an implementation of MODULA-2 written by Martin Odersky), as both MODULA-2 and PASCAL were Niklaus Wirth-invented languages that look very similar to Ada:

"Shortly before we finished our compiler, Borland came out with Turbo Pascal, and they were considering going into the Modula-2 market as well. In fact, Borland decided to buy our Modula-2 compiler to be sold under the name of Turbo Modula-2 for CP/M alongside an IBM PC version they wanted to develop. We offered to do the IBM PC version for them, but they told us they had it already covered. Unfortunately that version took them much longer than planned. By the time it came out, three or four years later, their implementor team had split from the company, and it became known as TopSpeed Modula-2. In the absence of an IBM-PC version, Borland never put any marketing muscle behind Turbo-Modula-2, so it remained rather obscure." -- https://www.artima.com/articles/the-origins-of-scala


I think Oracle PL/SQL was also based on Ada, basically Ada + embedded SQL. So it may be the most widely used version of "Ada".

Pretty much [close enough for government work]; see: https://stackoverflow.com/questions/7764656/who-is-diana-and...

Wow, that’s a factoid I’d love to learn more about!



Ada did not take off during the glory MS-DOS/Windows 3.x Borland days exactly because of Turbo Pascal.

Note that Martin Odersky, of Scala fame, was one of the main developers of Turbo Modula-2, which Borland killed shortly thereafter. They were also rather quick to get rid of Turbo Basic (which I had quite some fun with).

With Turbo Pascal absorbing the Object Pascal ideas from Apple, and some of the key features of Modula-2, there was not much left from Ada 83 that was relevant for MS-DOS programmers.

However, nowadays, between FreePascal, Delphi, and Ada, I would probably pick Ada, given its industry role, and besides AdaCore there are actually six other vendors still in business.


Sun was the first UNIX vendor to introduce the idea of splitting UNIX into user and developer SKUs, and Sun eventually also had an Ada compiler.

When companies bought the Solaris developer tools, that did not include the Ada compiler; that was extra, and it wasn't cheap.

Having already paid for C, C++, Assembly, why would anyone pay extra for Ada if not obliged to do so?


I used it a bit at Uni and remember enjoying it, but can you say what was slow about it: compilation, runtime, or all of it?

I’ve never directly played with Ada but my understanding is that it was very much both.

Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).

These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.

Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.


I had to take some course that was something like "Programming Language Theory". As a result I had to look at the specifications for dozens of different programming languages. I remember looking at the features of some languages and scratching my head trying to figure out how some of this would ever be practically implemented by a compiler. Later on I found out lots of stuff is just implemented by a runtime anyway, which led me to realize that those fancy language features are often better as a library.

I took a course exactly like that. I wonder if we went to the same school, or it’s due to curriculum standardization. The professor was particularly enthusiastic about Ada, so I had assumed the course was largely his creation.

academics loved ada when it came out. it was a very sophisticated language for its day.

same is true today. the spec/body separation made for an actual delineation between design and implementation.


the accreditation was through ABET at the time from what I recall

compilation. run time was good, and you could turn off things like run time range checking, if you wanted to.

A huge factor. I used Ada for years, and the fact that everyone I worked with did hobby projects in other languages didn't help it. And most of us liked Ada.

It had other warts: the string handling wasn't great, which was a huge problem. It was slow too, in a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS's primitives, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.


GNAT has existed since at least the mid-90s, and in that time period plenty of companies used non-OSS compilers.

In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.


True, but at that time it was already too late. C/C++ had won.

Moreover, for a very long time GNAT was quite difficult to build, configure, and make coexist with other gcc-based compilers, far more difficult than building and configuring the tool chain for any other programming language (i.e. you could fail to get a working environment, without any easy way to discover what went wrong, which never happened with any other programming language supported by gcc).

I have no idea what the reason for this was, but whatever it was, it had nothing to do with any intrinsic property of the language.

I do not remember when it finally became easy to use Ada with gcc, but this might have happened only a decade ago, or even more recently.


the gnat people needed to make a living. there were several impediments to widespread use of gnat, like the runtime license.

not that it was their responsibility to provide a free compiler to the masses.


The article gives another reason "A second answer is aesthetic. Ada's syntax is verbose in a way that programmers with a background in C find unpleasant. if X then Y; end if; instead of if (x) { y; }. procedure Sort (A : in out Array_Type) instead of void sort(int* a)."

I think this should not be underestimated. There is a huge number of small C compilers. People write their own C compiler because they want to have one.

That doesn't happen with Ada. Very few people liked Ada enough to write a compiler for a subset of the language. For example, an Ada subset similar to the feature set of Modula-2 should be quite doable with modest effort.


> I think this should not be underestimated.

You're right but it's broader than "C folks like terseness."

C is famously hard to read. Before Perl we used to joke that C is a write-only language: you can't understand what your own code means just weeks later.

Combine this with its lack of bounds checking, pointer arithmetic, and other dangerous features, and the result is a language that's macho for geeks: it's hard, it's dangerous, but it's small and it's fast.

It's a motorcycle for nerds. Ada is a tank.

Nerds get to establish dominance over lesser nerds by doing hard stuff in hard languages and making it fast. This bestows nerd street cred: geek cred.

Ada was used by contractors who needed stuff to work and money was no object.

C was used by hackers to do cool hacker stuff that was perceived to be fast and low level.

It's not low level: machine architectures haven't resembled the C abstractions since the 1970s.

https://queue.acm.org/detail.cfm?id=3212479

A modern low-level language would be some brain-bending combination of APL and Lisp with n-dimensional tensor algebra or something.

But C looks cool and hard and you will blow both feet off if you don't hold it just right.

And there are good free versions. So you can be poor and still demonstrate your machismo.

The result: a software industry requiring weekly multi-gigabyte online patches, keeping millions in work.

C makes programmers a cheap fungible commodity.

https://www.loper-os.org/?p=69


The real problem is that Ada forces you to plan ahead and most developers don't really know how to do that.

I'd say that is even more so with Rust and Rust got popular in a very short amount of time.

I think this was a genuine generational change. I am pretty sure Rust would never have become popular 20 years earlier because the priorities back then were so different (that was the era of languages like Ruby and Perl, where conciseness and low verbosity were the most valued aspects).

As a Gen-Xer, I remember that in the Usenet flamewars the C and C++ folks used to deride Pascal/Modula-2/Ada advocates as straitjacket programmers, whereas they themselves would be called cowboy programmers.

Ironically the author of Fil-C calls classical C, YOLO-C. :)


When Ada came out a lot of programmers couldn't even touch type. You're right there's been a generational change, and a lot of the Ada stuff won:

    * strong typing
    * lots of annotations
    * keywords over syntax, support for long variable and token names
    * object focus (Ada 83 had some limitations on inheritance so it wasn't OO strictly speaking) 
    * exceptions
    * large standard library
These things were controversial in the 1980s. They are not today.

I think that is not correct.

One of the big differences between K&R C and C89 is the introduction of function prototypes. Strong typing was certainly considered positive for compiled languages. Of course C is a lot less strict than Ada.

If we compare the Rust subset that has similar functionality to C, then there is not much difference. You get 'fn'. There is 'let', but Rust often leaves out the type, so 'int x = 42;' becomes 'let x = 42;' in Rust. Rust has 'mut' where C has 'const'. Rust introduced '=>' and removed '->' from object access, moving it to the return type of a function.

The C language has support for long variable names. Some early linkers didn't, but that's an implementation issue; people were certainly unhappy about that.

C++ started in the 80s. Objects were not controversial back then. The same applies to exceptions.

I don't have a metric for the size of a standard library. For its time, the C library in Unix systems had a large number of functions. Later that was split into a C standard part and a POSIX part. But that was for practical reasons. Lots of non-Unix systems have trouble implementing fork().

I have no clue what you mean with annotations. If you mean non-function annotations along with code, then generally Rust programs don't have those.


Exceptions were controversial into the 90s which is why Java went down that whole checked-exceptions rabbit hole. The argument was that an exception was essentially a GOTO (or even COME FROM) which broke functional abstraction.

The Ariane 5 crash involved an exception and that was the central "Ada is unsafe actually" argument from C people.

In fact "exceptions are bad" is so baked into a lot of C people's brains that they left them out of Go!

Short variable names were a technical limitation in early languages but style guides were still arguing against long, descriptive variable names in languages like C into the 2000s.

Objects were also likewise controversial, and you can see that in the design of Ada 83, where they were inspired by OO languages like Smalltalk but also hesitant to adopt stuff like inheritance. Inheritance was, again, seen as a way to break encapsulation (it kinda is), but also a lot of object implementations were slow and memory-inefficient in the 80s. Smalltalk was pretty much the reason why the Apple Lisa failed as a product.

OO became a massive buzzword in the 90s but by that time it had already been around for quite a long time.

By annotations I mean mostly type annotations, of course there's also aspect annotations and other stuff ex: Ada SPARK.


Function prototypes were actually taken back into C from C++, originally.

Back when I learnt C, I think you couldn't go beyond 8 or 12 characters in the symbol tables of compilers like Small-C.


not just the priorities, the overall skill and education of programmers.

in the 1980s/1990s i was a dumb kid. problems of large systems were not on my mind. having to type begin/end instead of {} was, i thought, a valid complaint.

with experience, education, and hindsight, most of the advantages of the ada language were not understood by the masses. if ada came out today, it would have taken off just like rust.


I'd say that if the original Ada had been introduced at the same time as Rust development started, people would still pick Rust. Ada is also a product of its time and would have to be modernized quite a bit.

Given how similar the syntax is of C, C++, Javascript, and Go, I think a language with the syntax of Ada would have a hard time.


The GNU Ada compiler (GNAT) was first released in 1995: https://en.wikipedia.org/wiki/GNAT

That's a decade too late.

Let's just be honest that even if there had been a free compiler in 1985 or earlier, there's no way that someone like Linus Torvalds or RMS would have written various groundbreaking pieces of software in Ada. It was just in an entirely different headspace.

I was around then, and culturally there just wasn't this (legitimate) concern with safety in the more "hacker" and Unix community generally. C won headspace at the time precisely because it was minimal and close to the metal while providing the minimum of abstraction people wanted. Which was on the whole fine because the blast radius for mistakes was lower and the machines were simpler.


> while providing the minimum of abstraction people wanted

Yes, I think this is key. I wasn't around in 1985, but on every attempt to write something in Ada I've found myself fighting its standard library more than using it. Ada's stdlib is an intersection of common features found in previous century's operating systems, and anything OS-specific or any developments from the last 30 years seem to be conspicuously absent. That wouldn't be so much of a problem if you could just extend the stdlib with OS-specific features, but Ada's abstractions are closed instead of leaky.

I'm sure that this is less of a problem on embedded systems, unikernels or other close-to-hardware software projects where you have more control over the stdlib and runtime; but as much as I like Ada's type system and its tasking model I would never write system applications in Ada because the standard library abstractions just get in the way.

To illustrate what I mean, look at the Ada.Interrupts standard library package [0] for interrupt handling, and how it defines an interrupt handler:

  type Parameterless_Handler is
    access protected procedure
    with Nonblocking => False;
That's sufficient for hardware interrupts: you have an entry point address, and that's it. But on Linux the same package is used for signal handling, and a parameterless procedure is in no way compatible with the rich siginfo_t struct that the kernel offers. To wit, because the handler is parameterless you need to attach a separate handler to each signal to even know which signal was raised. And to add insult to injury, the gnat runtime always spawns a signal handler thread with an empty sigprocmask before entering the main subprogram so it's not possible to use signalfd to work around this issue either.

Ada's stdlib file operations suffer from closed enumerations: the file operations Create and Open take a File_Mode argument, and that argument is defined as [1]:

  type File_Mode is (In_File, Inout_File, Out_File);  --  for Direct_IO
  type File_Mode is (In_File, Out_File, Append_File); --  for Stream_IO
That's it. No provisions for Posix flags like O_CLOEXEC or O_EXCL nor BSD flags like O_EXLOCK, and since enum types are closed in Ada there is no way to add those custom flags either. All modern or OS-specific features like dirfd on Linux or opportunistic locking on Windows are not easily available in Ada because of closed definitions like this.

Another example is GNAT.Sockets (not part of Ada stdlib), which defines these address families and socket types in a closed enum:

  type Family_Type is (Family_Inet, Family_Inet6, Family_Unix, Family_Unspec);
  type Mode_Type is (Socket_Stream, Socket_Datagram, Socket_Raw);
Want to use AF_ALG or AF_KEY for secure cryptographic operations, or perhaps SOCK_SEQPACKET or a SOL_BLUETOOTH socket? Better prepare to write your own Ada sockets library first.

[0] https://docs.adacore.com/live/wave/arm22/html/arm22/arm22-C-...

[1] https://docs.adacore.com/live/wave/arm22/html/arm22/arm22-A-...


To be fair, the file-handling is probably the 'crustiest' part of the standard library. (To use the posix-flags, you use the Form parameter.)

The best way to use Ada, IMO, is type-first: you define your problem-space in the type-system, then use that to solve your problem. -- Also, because Ada's foreign-function interface is dead easy, you could use imports to handle things in a manner more amenable to your needs/preferences; it's as simple as:

    Function Example (X : Interfaces.Unsigned_16) return Boolean
      with Import, Convention => COBOL, Link_Name => "xmpl16";
You can even put pre-/post-conditions on it.

Yes, agreed on Ada.Interfaces and the FFI, it's one of the best. The only thing "missing" is auto-import of the definitions in C header files (but there be different dragons). gcc -fdump-ada-specs works fine, but it's effectively a duplication of (non-authoritative) information. That's fine if you're targeting one system, but when targeting multiple systems a single "with Interfaces.C.Syscall_H" quickly becomes a maze of alternative package bodies and accompanying conditional compilation logic.

> The best way to use Ada, IMO, is type-first: you define your problem-space in the type-system, then use that to solve your problem

I guess that goes to the core of the argument I was trying to make: not that Ada is bad, but that the low-level abstractions in Ada's stdlib are a case of premature optimization. Luckily, I take much less issue with the Numerics and Container parts of the standard library.

> To use the posix-flags, you use the Form parameter

Do you have any examples/documentation on the use of the Form parameter? According to the RM, it's a String argument so I wouldn't have expected it to support flags.

(Also, to correct myself on the signalfd issue: there is GNAT.Signals.Block_Signal to mask signals on the Interrupt_Manager thread)


Ok, so the Form parameter is implementation defined; this was to allow the implementations the 'wriggle room' to interface with the host-system.

For GNAT, these two pieces of documentation are instructive: https://docs.adacore.com/live/wave/gnat_rm/html/gnat_rm/gnat... https://gcc.gnu.org/onlinedocs/gcc-4.9.1/gnat_rm/FORM-String... (This second one is older documentation, but illustrates how platform-specific Form parameters could be used.)

    Ada.Text_IO.Create (
        File => File,
        Mode => Ada.Text_IO.Out_File,
        Name => "test.txt",
        Form => "shared=no"
      );
The "maze of alternative package bodies and accompanying conditional compilation logic" is an artifact of C's approach to 'portability' using the preprocessor. Typically, the conditionality should be stable once you abstract it (using the compiler's project-management to select the correct body for a particular configuration) -- As a stupidly trivial example, consider the path separator, for the specification you could have:

    Package Dependency is
       Package OS is
          Function Separator return String;
       End OS;
    End Dependency;
    -- ...
    Package body Dependency is
       Package body OS is separate;
    End Dependency;

    -- Windows
    separate (Dependency)
    Package body OS is
      Function Separator return String is ("\");
    End OS;

    -- Classic Mac
    separate (Dependency)
    Package body OS is
      Function Separator return String is (":");
    End OS;

    -- VMS
    separate (Dependency)
    Package body OS is
      Function Separator return String is (".");
    End OS;

    -- UNIX-like
    separate (Dependency)
    Package body OS is
      Function Separator return String is ("/");
    End OS;
Then, in the rest of your program, you program against the abstraction of DEPENDENCY.OS (and likewise for whatever other dependencies you have), and thus separate out the implementation dependency.

Not really, the state of compilers pretty much sucked back then. GCC was the only real free compiler in the 80s and it wasn't really ready for prime time until the late 80s. You were paying (lots) of money for a compiler no matter what language you chose. And if you were targeting a new language the compiler was sure to suck.

Even in the late 90s Jamie Zawinski had a rant against C++. His argument for not using it? The compilers suck! C++ was the main "competitor" of Ada, and it was a decade or more behind Ada for most of that time.

The "killer feature" of C++ against Ada (when it came to fighting against compiler maturity) was really that you could pretend to be writing C++ code but really just keep writing C-with-classes.

If Ada had put a Modula-2 or Pascal compatibility mode in the language and produced a reference compiler based on a stable compiler for one of those languages, the history might have been different, because people could have just written "PascAda" while waiting for the compilers to catch up.


The Ada compiler for OpenVMS was over $200,000 in the 1990s.

Probably because only defense contractors used it. Now imagine that kind of gouging occurring for everything else they spend money on.

Ada was designed to solve different problems in harsher environments than other PLs at the time. Mostly, it was designed for the defense and aeronautics industries and had to compete against other PL designs to become a govt standard, similar to how weapons of war are developed and chosen. Think developing for hardcore code audits. There is no way the language could check all the boxes and remain compatible with, say, Pascal or Modula syntax.

Given some of the other issues, I’m not sure it would have mattered, but it certainly didn’t even allow the experiment to be run. I would not have wanted to compile Ada in the 1980s on that hardware. Given all the checking, the compiler must have been horribly slow (imagine compiling Rust on that same 1980s hardware).

I was a student between 1990 and 1993 and Ada was the main language. Compilation speed was not an issue. I remember that Eiffel was very slow to compile, but not Ada.

Between 1994 and 1999, I worked with Ada on VAX machines. A full recompilation took 2 hours because the machine was slow, not because of the language. Other languages were similarly slow (Pascal, C). C was slow because of the lack of precompiled headers (many headers had to be parsed many times). With Ada (Alsys Ada), there were "libraries": black-box directories containing object code and already-parsed package specifications.

Between 1999 and 2002, I handled projects in Ada, C++ and Java. C++ was slightly slower than Ada (slow linking). Java was a lot faster. Nowadays, Ada compilation is faster than C++.

I always found it funny when Rust came about. Maybe I'm misremembering from when I first deep-dove into Ada, but I can't help feeling that Ada was our first "Rust"-like language. Maybe Delphi/Pascal is the only other really close one that became mainstream enough before Rust did?

Rust emerged from the language-enthusiast community, not a formal industry committee, and in some ways that was its superpower.

I and many others have looked at Ada with some appreciation for decades. But the actual "community" around the language was foreign to me: government, defense contractors, etc., places that frankly wouldn't even hire me.

It's got appealing constructs, and I grew up with the Wirth languages so I wasn't put off by its syntax and style... and I even sometimes considered rewriting my OSS C++ pieces in it because I was so desperate for something better. But it was just a self-limiting box.


I agree, as someone who is fascinated by it. I worked for a defense contracting company, and not even they used it. It's such a strange gem of a language, so much potential lost.

And the CPU that was designed to implement Ada also failed miserably: the iAPX 432.

https://en.wikipedia.org/wiki/Intel_iAPX_432


The claim that it was designed for Ada was just marketing hype, like today's attempts to sell processors "designed for AI".

The concept of iAPX 432 had been finalized before Ada won the Department of Defense competition.

iAPX 432 was designed based on the idea that such an architecture would be more suitable for high level languages, without having at that time Ada or any other specific language in mind.

The iAPX designers thought that the most important feature that would make the processor better suited for high-level languages would be to not allow the direct addressing of memory but to control the memory accesses in such a way that would prevent any accesses outside the intended memory object.

The designers made many other mistakes, but an important one was that the object-based memory-access control they implemented was far too complex in comparison with what could be implemented efficiently in the available technology. Thus they could not implement everything in one chip and had to split the CPU across multiple chips, which created additional challenges.

Eventually, the "32-bit" iAPX432 was much slower than the 16-bit 80286, despite the fact that 80286 had also been contaminated by the ideas of 432, so it had a much too complicated memory protection mechanism, which has never been fully used in any relevant commercial product, being replaced by the much simpler paged memory of 80386.

The failure of 432 and the partial failure of 286 (a very large part of the chip implemented features that have never been used in IBM PC/AT and compatibles) are not failures of Ada, but failures of a plan to provide complex memory access protections in hardware, instead of simpler methods based on page access rights and/or comparisons with access limits under software control.

Now there are attempts to move some parts of memory access control back into hardware, like ARM's CHERI-based designs, but I do not like them. I prefer simpler methods, like the conditional traps of IBM POWER, which allow cheaper checking of out-of-bounds accesses without any of the disadvantages of approaches like CHERI, which need special pointers that consume resources permanently, not only where they are needed.


The other CPU that was designed for Ada succeeded spectacularly:

https://datamuseum.dk/wiki/Rational/R1000s400


I do not know much about the architecture of Rational/R1000s400, but despite that I am pretty certain the claims that it was particularly good for implementing Ada on it were not true.

Ada can be implemented on any processor with no particular difficulties. There are perceived difficulties, but those are not difficulties specific to Ada.

Ada is a language that demands correct behavior from the processor, e.g. the detection of various error conditions. The same demands should be made for any program written in any language, but the users of other computing environments have been brainwashed by vendors that they must not demand correct behavior from their computers, so that the vendors could increase their profits by not adding the circuits needed to enforce correctness.

Thus Ada may be slower than it should be on processors that do not provide appropriate means for error detection, like RISC-V.

However that does not have anything to do with the language. The same problems will affect C, if you demand that the so-called undefined behavior must be implemented as generating exceptions for signaling when errors happen. If you implement Ada in YOLO mode, like C is normally implemented, Ada will be as fast as C on any processor. If you compile C enabling the sanitizer options, it will have the same speed as normal Ada, on the same CPU.

In the case of the Rational R1000s400, besides the fact that it must have had features that would be equally useful for implementing any programming language, it is said that it also had an Ada-specific instruction for implementing task rendez-vous.

This must have been indeed helpful for Ada implementers, but it really is not a big deal.

The text says: "the notoriously difficult to implement Ada Rendez-Vous mechanism executes in a single instruction", I do not agree with "notoriously difficult".

It is true that on a CPU without appropriate atomic instructions and memory barriers, any kind of inter-thread communication becomes exceedingly difficult to implement. But with the right instructions, implementing the Ada rendez-vous mechanism is simple. Already an Intel 8088 would not have any difficulties in implementing this, while with 80486 and later CPUs maximum efficiency can be reached in such implementations.

While in Ada the so-called rendez-vous is the primitive used for inter-thread communication, it is a rather high-level mechanism, so it can be implemented with a lower-level primitive: the sending of a one-way message from one thread to another. One rendez-vous between two threads is equivalent to two one-way messages (i.e. from the 1st to the 2nd, then in the reverse direction). So implementing correctly the simpler mechanism of sending a one-way inter-thread message allows the trivial implementation of rendez-vous.

The rendez-vous mechanism has been put in the language specification, despite the fact that its place would have better been in a standard library, because this was mandated by the STEELMAN requirements published in 1978-06, one year before the closing of the DoD language contest.

So this feature was one of the last added to the language, because the Department of Defense requested it only in the last revision of the requirements.

An equivalent mechanism was described by Hoare in the famous CSP paper. However CSP was published a couple of months after the STEELMAN requirements.

I wonder whether the STEELMAN authors arrived at this concept independently, or whether they had read a preprint of Hoare's paper.

It is also possible that both the STEELMAN authors and Hoare were independently inspired by the Interprocess Calls of Multics (1967), which were equivalent to the rendez-vous of Ada. However, the very close coincidence in time between the CSP publication and the STEELMAN revision of the requirements makes it plausible that a preprint of Hoare's paper prompted this revision.


The 286 worked perfectly fine. If you take a 16-bit unix and you run it on a 286 with enough memory then it runs fine.

Where it went wrong is in two areas: 1) as far as I know, the 286 could not correctly restart all instructions that referenced a segment that was not present, so swapping didn't really work as well as people would have liked.

The big problem, however, was that in the PC market, 808[68] applications had access to all memory (at most 640 KB). Compilers (including C compilers) had "far" pointers, etc., that allowed programs to use more than 64 KB of memory. There was no easy way to do this in 286 protected mode. Also, a lot of programs were essentially written for CP/M. Microsoft and IBM started working on OS/2, but progress was slow enough that soon the 386 became available.

The 386 of course had the complete 286 architecture, which was also extended to 32-bit. Even when flat memory is used through paging, segments have to be configured.


The 286 worked perfectly fine as an improved 8086 for running MS-DOS, an OS designed for the 8088/8086, not for the 286.

Nobody has ever used the 286 "protected mode" in the way intended by its designers.

Extended-memory managers, like HIMEM.SYS, used protected mode only briefly, just to access memory above 1 MB.

There were operating systems intended for the 286, like XENIX and OS/2 1.x, but even those used only a small subset of the features of the 286 protected mode. Moreover, only a negligible fraction of 286 computers were used with OS/2 1.x or XENIX, compared with those running MS-DOS/DR-DOS.


That factor was downstream of its complexity though. It's far harder to implement a compiler for Ada 83 than even for modern C.

This. Nothing can compete with free.

There were effectively no free compilers in the 80s. If you had an expensive UNIX workstation it might come with one, but everyone in the micro world had to pay, or else write in assembly or use a BASIC interpreter.

Granted, some were pretty cheap, at least by the early 90s.


Strange comment. GNAT?

GNAT was too late to the party.

I’m old enough to remember when the FSF said that blocking spam was censorship. Good to see them wake up.

What makes you say that? Devs use stacked PRs in small and large repos today.

Their examples show combined backend and frontend changes in the same monorepo in different PRs.

As far as splitting work into different PRs that need coordinated merging, I've only ever encountered that when it's a long lived refactor / feature.


I will speculate that the DDoS attacks are funded by companies and governments that benefit from not being held accountable for their past deeds. I suspect X, Google, China, the DPRK, Hungary, etc.

If Dijkstra blamed Knuth it would have been the best recursive joke ever.


I hear you, friend!

While you were seeing those problems with Java at Google, I was seeing them with Python.

So many levels of indirection. Holy cow! So many unneeded superclasses and mixins! You can’t reason about code if the indirection is deeper than the human mind can grasp.

There was also a belief that list comprehensions were magically better somehow and would expand to 10-line monstrosities of unreadable code when a nested for loop would have been more readable and just as fast but because list comprehensions were fetishized nobody would stop at their natural readability limits. The result was like reading the run-on sentence you just suffered through.


This is the way.


Poking values into your YAML or JSON files has benefits over building them with templates.


Ouch!


Oh, damn! I should have picked a name that references "you'll poke your eye out!"


So how many angels can you fit on the head of a pin?


Here we see Go haters in their natural habitat, the HN comment section.

Watch as they stand at the watering hole, bored and listless. A sad look on their faces, knowing that now that Go has generics, all their joy has left their life. Like the dog that caught his tail, they are confused.

One looks at his friends as if to say, "Now what?"

Suddenly there is a noise.

All heads turn as they see the HN post about UUIDs.

One of the members pounces on it. "Why debate this when the entire industry is collapsing?"

No reply. Silence.

His peers give a half-hearted smile, as if to say, "Thanks for trying" but the truth is apparent. The joy of hating on programming languages is nil when AI is the only thing looking at code any more.

The Go hater returns to the waterhole. Defeated.


I think you're massively misreading the tone of the comment you're replying to.

