The release of Windows 95 was weird. There were PC users talking about how amazing Microsoft were, to have come up with all the things their marketing people were shouting about, such as pre-emptive multitasking and plug-and-play. Then there were all the Amiga (and Mac) users, completely underwhelmed, pointing out "we've had all these things for years, how has it taken so long?".
Yeah, one of those things is not like the others. While AmigaOS was pre-emptive, Mac Systems 6-8 weren't; they were cooperative. Everyone who used 6 and 7 can remember that copying a file meant you couldn't do anything else, and the Finder finally got multithreaded copying in 8, but it was still cooperative. At the time I used various platforms daily, namely AmigaOS, Mac Systems 6-8, and IRIX, and the difference was obvious. IRIX and its hardware were of course from the future, though at 10x the price or more.
Even classic Mac OS only had cooperative multitasking. Near the end it gained some very limited pre-emptive capability, but it was mostly only usable for background computation.
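To make the distinction concrete, here's a toy sketch (all names hypothetical, not any real OS's API) of what cooperative multitasking means: tasks run only as long as they choose to, and a task that never yields, like a long file copy in classic Mac OS, blocks everything else.

```python
# Toy cooperative scheduler: tasks are generators that must explicitly
# yield control back. Nothing can preempt a task that refuses to yield.

def copy_file(chunks):
    for i in range(chunks):
        # ...copy one chunk of the file...
        yield f"copied chunk {i}"   # voluntarily hand control back

def blink_cursor(ticks):
    for i in range(ticks):
        yield f"blink {i}"

def run(tasks):
    """Round-robin scheduler: no preemption, relies on tasks yielding."""
    log = []
    while tasks:
        task = tasks.pop(0)
        try:
            log.append(next(task))  # run the task until its next yield
            tasks.append(task)      # re-queue it at the back
        except StopIteration:
            pass                    # task finished, drop it
    return log

log = run([copy_file(2), blink_cursor(2)])
# The two tasks interleave only because each one yields voluntarily;
# remove the yields from copy_file and blink_cursor never runs.
```

A pre-emptive kernel (AmigaOS, IRIX) instead interrupts tasks on a timer, so the copy can't starve the UI even if its author never thought about sharing.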
"Wut" indeed! I was only skimming it anyway, but stopped there. I'm sorry, that paragraph is so effed up, I can't take anything else seriously from this author.
This is too often the problem with stuff about Steve Jobs. People worship him, and credit him with inventing everything. So, even ignoring how thoroughly mangled that quoted section is in every way, now he's the inventor of OOP. Did he also invent a time machine to take OOP back to the 1960s?
> Apple was not days away from going bust. They were months away... They just...
This is historical revisionism, and there's a lot of it around, where Apple is concerned. Since those days, Apple has done a great job of controlling the narrative in the media, and has managed to bury a great deal of what was written back then.
Microsoft was in the middle of one of their antitrust investigations, where they were accused of monopolising the computer market. They had demonstrated other machines in the courtroom, running non-Microsoft OSes and office suites, including an Amiga and a Mac. But Commodore had already gone bust, so only Apple was left.
Then came the news that the previous post was referring to: Apple was on the brink of bankruptcy. By all accounts of the time, Microsoft absolutely shat themselves, expecting the biggest fine in antitrust history. They could not allow Apple to fail, so investing was their only option. Nowadays, even that investment is sometimes framed as yet another amazing feat that could only be carried out by the deity that is Steve Jobs. As part of the deal, Jobs even had to drop Apple's still-ongoing OS look-and-feel lawsuits against Microsoft.
The Microsoft deal was originally negotiated by Gil Amelio, and while the monetary investment is what got the headlines and is what people remember, the most important part of the negotiations to Apple was that Microsoft committed to keep developing Microsoft Office for the Mac, which they had been threatening to cancel due to the platform's insignificance. Without Office, the Mac had no future.
Yeah that was a big part of it, but I wouldn't go so far as to call it more important than preventing immediate bankruptcy.
"We'll give you a wodge of cash and we'll keep supplying Office for Mac, so you can continue to supply the market with a rival to our OS at a volume that's insignificant to us, but just significant enough to prevent Windows from falling under the DoJ's definition of a monopoly".
Not sure what point exactly you are making, but the Wall Street Journal ran a bunch of stories about Apple engaging in what was later known as 'Enron-style accounting'. They were a big company, and they did have a serious cashflow problem, so they needed a bailout from someone (which happened to be Microsoft rather than Wall Street).
Also, I disagree with GP's point: Apple is definitely not NeXT. NeXT was an enterprise software company. If they had been more successful, they would have ended up in the same category as Oracle.
What? NeXT as an enterprise software company is one of the weirdest takes I've ever heard in my three decades in the industry. They were a workstation manufacturer with impressively cute UIs and an interesting software stack on top of Mach.
NeXT became an enterprise software company when it shut down its hardware division around 1993. At first it only sold its operating system, which got ported to x86, PA-RISC, and SPARC. Then, NeXT started selling development tools and libraries. The OpenStep API was developed as part of a joint project with Sun. OpenStep is an Objective-C API that is based on NeXTstep’s libraries, but made to be portable. OpenStep was the native API for the OPENSTEP (note the capitalization) operating system and was also available for Sun Solaris and even for Windows. I have a CD named OPENSTEP Enterprise, which is installable on Windows NT and Windows 95. There was also Portable Distributed Objects, which was NeXT’s take on distributed objects, which was big in the 90s (like CORBA). Finally, NeXT had a web server named WebObjects that had major customers such as Chrysler in 1996.
At the time Apple purchased NeXT, NeXT was definitely an enterprise software company. The black workstations were gone, the operating system was not marketed to casual users but to developers and others who needed software that used the OpenStep API, and it sold various developer tools.
All that is true, but it's only the first part of the story. The OpenStep stuff was also not really successful and effectively became a very expensive MS Windows dev tool (or at least that's where 99% of the revenue came from).
NeXT's only really successful product was WebObjects. (Which IMO was a terrible take on a web server framework, and it was just about to be obliterated by J2EE when Apple bought them out.)
ETA: I guess it's fun to romanticize this and pretend they only made cool black computers and portable Unix software. But if NeXT had been successful, HN would hate their fucking guts.
I can believe that, but I recall some trade-press article about more than 100 companies selling non-Java 'web middleware' that got bowled over by J2EE, and otherwise NeXT would have just been another one of those. That was Sun's strategy, not NeXT's.
WebObjects was fundamentally just a bad abstraction, so good thing too.
Hey PJ, I like your posts because you have the historical background on a lot of this stuff that industry has mostly forgotten.
But... since you mentioned it, I actually have read the J2EE and WebObjects documentation, and I conclude that WebObjects was shit. It drew the 'Web MVC' line in completely the wrong place. Nobody ever cared about DOEs or whatever; they just wanted a database driver. You look at this huge pile of industry crap and it's no wonder Rails was successful.
Three decades ago, they would relentlessly snail-mail spam us with those weekly industry tabloids like 'ComputerWorld' and 'PCWeek'. These were always fun to read at lunch, even if they were all obvious advertisements, but certainly better info than vaguely remembering something from your stoner phase and then sticking your neck out.
I dislike it here because I like Mullvad, but yes, I think it’s fair to go straight to public disclosure.
Someone with likely substantial qualifications put in time to find this. The company is in it for profit (at least partially). What's fair for the company is fair for the individual. The company can either offer to pay for bugs under the terms they want, hire more security folks to find the bugs themselves, or just accept that researchers get to do whatever they want with their findings.
I'd tell Mullvad, but there are companies I don't respect enough to feel compelled to give a heads up. Perhaps the author feels that way about Mullvad; it's entirely within their rights to use this to publicly shame Mullvad.
This ought not be considered anything close to common courtesy. This is work. Mullvad is engaged in the business of making money. They should show how serious they are with your money.
Since when do you have professionals giving you examinations out of common courtesy? Out of courtesy can I get a free cancer screening?
If a doctor performed a cancer screening on me, for free and without me asking, then yes: as a matter of courtesy I would still expect that doctor to tell me if he found cancer, rather than my reading about it on his blog later.
>Since when do you have professionals giving you examinations out of common courtesy?
Maybe when they decide of their own volition, without any external pressure, to go and poke around your system?
"Hey, I'm a mechanic, I was looking at your car parked out there and noticed something incredibly dangerous that needs immediate fixing. I'll tell you what it is for $1,000."
Even better, the mechanic writes a blog post about the dangers of non-functioning brakes, but doesn't tell the car owner, because they didn't have a sign advertising their "car issue bounty program".
Seems to be a systemic issue with computer guys feeling entitled to financial compensation for strange reasons. See also, people licensing their software as "open source" and then being mad when people make money off it.
Even better, the mechanic writes a blog post about how the locks on that guy's car don't work, and how anyone could just steal it, but doesn't tell the guy because, after all, the guy wasn't paying him to.
Both of y'all are confusing the individual with the corporate.
The mechanic writes a blog post about how the locks on [a car model] don't work, and how anyone could just steal [cars], but doesn't tell the [car company] because, after all, the [company] wasn't paying him to.
Especially when the car company spends on 'certifications' (security audits, in this case) and specifically markets them as a differentiator. That said, uncoordinated public disclosures in cybersecurity are bad form, given the well-established existing norms and culture; but at least let's get the analogies right.
When their 'common decency' directly benefits a money-making corporation with shareholders and directors, then yes, they should definitely get some money out of it.
If you create a third-party app for some closed-source, insecure back end, that's on you for trusting them or not doing your due diligence.
Time and time again, private companies have rug-pulled things like API access for third-party apps (such as Twitter/X). Building third-party clients for private systems should already be approached with heavy scepticism, and you should always be prepared for the worst.
Most HN readers/writers are American, and of course they won't do anything unless they personally profit from it; the entire culture is built around this mindset. Meanwhile, Mullvad is Swedish, and we tend to assume we all want to help build a better world together. Mix the two, and you get this conversation :)
So what, suddenly Swedes can't live outside of Sweden? Kind of interesting to make complaints about generalizations and in the same comment falling for the same trap yourself.
You identified as Swedish, and then shifted the goal post to the country where you (apparently) currently reside.
What's your point again? Something about all Americans whether they live in the US or not? Are you trying to be incoherent? Daft? Representative of all Swedes by origin?
> Most of HN readers/writers are American, of course they won't do anything unless they personally profit off it, the entire culture is built around this mindset
American culture is highly varied. For some this is true, for others this is wrong and highly insulting.
It's OK for the country to have a pervasive culture yet not every resident or citizen of the country to be a part of that culture, or even actively work against it. If you're not one of them matching that description, it shouldn't be insulting, as it's not about you in the first place.
Maybe not everything is aimed towards you, especially if you don't feel like the description actually matches you :)
That is a lot of words for "my negative stereotypes about you and your country are fine, actually. Don't take it personally, bro. Maybe you're one of the few good ones!"
Every time someone makes a cultural comment here, the reply is always "America is a big country". America can be a big country and still have common cultural elements. It's not inaccurate to say that citizens of a large country mostly share some common characteristics. Those characteristics are what makes them one country.
It can be very interesting to read opinions in such places as "Letters to the Editor" in newspapers from the 1800s. The conviction that "things were so much better in the past" and "everything's gone to shit" (in the face of clear evidence to the contrary) is, and always has been, an integral part of the human condition.
> Rapes have not increased by some 200% over the last 10 years?
No. There has been an increase in reported rapes across many Western countries, due to a combination of much higher awareness (especially due to celebrities), and clearer laws and guidelines for the police on how to report, including many cases where offenses that would previously have been reported as "sexual assault" are now reported as "rape". There was also a huge jump in the reporting of historic offences, so if you want to know when they happened, you need to make adjustments from when they were reported.
> the social housing in London area is not majority occupied by foreigners?
No, it is not. That's (famously) completely made-up.
You appear to have been taken in by exactly the type of propaganda the article is exposing.
Only 47.6% of social housing in the London area is occupied by foreign-born people, which is technically not a majority. But that figure is from the 2021 Census, rather old, and by 2026 it may actually be a majority.
Less than half (47.6%) of London’s social housing was occupied by people describing themselves as born outside of the UK in the 2021 Census.
If someone tells you that some assertion about the world is completely made up and that you're falling for propaganda for believing it, this gives you extremely little information about whether or not that assertion is true, or whether closely-related assertions to that one are also true.
> I still use an old PC on Win7 as my primary machine
So do I. I've had to deal with 10 and 11 at work and had the same sort of problems, so I've refused to "downgrade" this PC.
It particularly used to really piss me off that when I was partway through working on something and had several applications open with data loaded, if I tried to leave it like that overnight so I'd be able to continue immediately the next morning, chances were Windows would decide to update and reboot, closing everything.
I found several ways online to supposedly stop it from doing that, but nothing ever worked.
Although 7's UI is much better than the flat nonsense we get these days, I don't find the UI to be the biggest problem. If using Windows 11, I'd want to replace the underlying OS, not keep it and replace just the UI. So while this project looks interesting, to me it's not fixing the real problem.
Just thinking more about how we're told it's "insecure". It's unfortunate that so many tech people are so gullible when it comes to the industry's marketing around this.
Many of us know a huge proportion of news stories come from PR firms that just want to sell us something (it comes up on HN every now and then). In the mid-2000s or so, Microsoft had a particular problem selling Office - there was no reason to upgrade to the current version, because the older one already did everything you wanted. Until that time, established practice was to buy new software only if you wanted its new features; the vendor had to give you a good reason to pay for it. To some of us, the PR that immediately followed the stories of struggles to sell their newer versions - PR that suddenly exploded everywhere - was obvious and transparent. "You must upgrade because old software is insecure!" But it grew into the monster we have today. Some people literally panic if they discover an older piece of software.
Think of young people growing up with that being blasted at them constantly. It must have contributed to the has-to-be-new-and-shiny mindset of Javascript developers, where they're terrified to touch anything that hasn't been updated for a few months.
That long, sustained, and paradigm-shifting PR campaign has been a huge win for many software vendors, and for Microsoft in particular. (Of course, after that, and after a few failed attempts, they managed to get the subscription-based model to work for Office, which in that particular case, bypasses the mess left by their earlier selling strategy anyway.)
But... Old software is often going to be insecure on the network. Are you arguing that an OS from 2013 with a browser from the same time is fine on the Web?
Who's using a browser from 2013? When I said I'm running Windows 7, I'm specifically talking about the OS, including an awful lot of updates it's had since 2013, not all software I run on it. Updates added such things as support for the later versions of TLS, several years ago. Although Google and Mozilla have dropped official W7 support from Chrome and Firefox, there are forks that add it back, which is why I'm running up-to-date browsers.
If we were talking about even older browsers though... 20 years ago, because of the insecure way browsers generally worked, everybody used third-party antivirus suites, e.g. Norton Internet Security, which seemed to cause as many problems as they solved. But browsers (and OSes) haven't been so open for years - we don't have quite that class of problems anymore, where just visiting a site was enough to get the browser to download and run all sorts of nasties. I don't remember quite when it was that we'd left the most dangerous period behind, when the security of browsers and OSes had been considerably hardened, but it was before 2013. Windows 7 was, and is, much safer on the network than XP, by design.
Fair. As long as people are careful about what they're executing on the OS and it isn't arbitrarily exposed to the network it is less of a problem. My comment about browsers was due to me thinking that a lot of software stops building for old OS targets. I guess W7 is still getting modern support from vendors.
FWIW I'm running CachyOS and for the first time in my life have moved 95% away from Windows (still maintain a partition that I use every few weeks for a game that can't run on Proton). KDE 6.6 is a delight to use and everything "just works" for me, I don't have to worry about ungodly telemetry, and software fixes come in quickly.
> It particularly used to really piss me off that when I was partway through working on something and had several applications open with data loaded, if I tried to leave it like that overnight so I'd be able to continue immediately the next morning, chances were Windows would decide to update and reboot, closing everything.
Whenever I use a recent(ish) Windows (rarely :-), it's annoyances like this that make for a poor UX. Again & again.
When you put a computer to sleep/hibernate, you expect it to come out of sleep in a similar state as before. When you select "shut down", you expect that. Not "installing update 1..20, then shut down".
It keeps amazing me that within Microsoft, after having done so many OSes used by millions, some eggheads think that breaking user expectations is a good design decision. It is not.
Regarding Britain, "conserve" used to mean posh jam, but nowadays it seems to be more of a marketing word - a brand trying to pretend they're posh, similar to how pretentious restaurants use French words for no obvious reason.
"Smooth jam" here in the UK is sometimes labelled as jelly.
I've always heard it came from Xerox PARC, but even if it originated at Apple, it would have been one of their OS devs. It's nuts how the cult of Steve Jobs leads some to label him as the inventor of everything.