1. There were far more users of the classic Mac OS than there were users of NeXTstep/OPENSTEP. Mac OS X has many of OPENSTEP’s underpinnings, but it wasn’t OPENSTEP 5.0; it was Mac OS X, a continuation of the Mac but with new underpinnings. The interface was different enough to represent a new direction for the Mac but without turning the Mac UI/UX into that of NeXT.
2. At the time NeXTstep was under development (mid-late 1980s), the case law surrounding UI look-and-feel and how much borrowing and inspiration one could have before it became infringing wasn’t settled. Apple had lawsuits with Digital Research and Microsoft over whether GEM and Windows infringed on the Macintosh’s look-and-feel. Recall that NeXT was formed after Steve Jobs’ failed coup at Apple against then-CEO John Sculley. Apple sued NeXT due to Jobs’ poaching of key Apple employees who worked with him on the Macintosh and allegations that NeXT was going to use Apple’s intellectual property (in some ways NeXT could be thought of as the evolution of the “Big Mac” project Steve Jobs worked on before his departure). They ended up settling out of court, but given Apple’s litigious nature and given the history of how NeXT came to be, it was very wise for NeXTstep to feature a UI/UX that was a radical departure from the Macintosh. While I don’t think a lawsuit about right-hand scroll bars would succeed, having them on the left helps defend against allegations that NeXTstep ripped off the Mac.
We had the same requirement at my high school in Sacramento back in the early 2000s. I was given the option to test out of it, since I already knew how to use Office, which I had been using at home since fifth grade for reports and presentations. I had to study harder for Excel and Access, since most high school students don’t need sophisticated spreadsheets or databases, but I passed the exam on my first attempt.
A far better computer literacy course was the one I took at Sacramento City College as a dual-enrollment student in summer 2004, which was the prerequisite to programming courses. Even though I already knew how to program in QBASIC, Visual Basic 6 and C++, I still had to take this course. Anyway, we learned very basic computer architecture (the roles of the CPU, memory, storage, buses, etc.), the role of the operating system and the difference between it and applications, computer networking, the Web (with an introduction to HTML and CSS), the history of computing, and a brief introduction to programming, with exercises in C++ and even Scheme (the professor showed us his copy of SICP and threatened students who talked during his lectures with Scheme homework assignments).
It was a fun class. The professor knew I was a Linux fan, but I had a hard time downloading a distro at home due to my having dial-up. He gave me some FreeBSD install CDs. I became a fan of FreeBSD since, and exploring FreeBSD led me down a rabbit hole where I devoured the history of Unix and BSD. By the time I graduated from high school, I wanted to be a systems software researcher like Ken Thompson and Dennis Ritchie. This shaped my early career; I’ll never forget meeting Marshall Kirk McKusick my senior year of college at USENIX FAST 2009.
It turned out that the computer literacy course I was required to take at Sacramento City College, despite my already being computer literate, had far-reaching impacts on my life.
Your words resonate with me. Even before LLMs, I’ve been disappointed with the general direction the software industry took in the 2010s. Today’s software industry is not the industry of Licklider, Engelbart, Bob Taylor, Alan Kay, Woz, Stallman, Ritchie, Thompson, Pike, Joy, and many others whom I admire, who helped establish an ethos of computing that fostered a sense of freedom, creativity, and wonder.
Instead, what we have today is a computing ecosystem dominated by powerful players who care about money and control. Speaking from the standpoint of a Bay Area resident, since roughly 2012, the field has been increasingly taken over by people who are in it for the money. Combine that with Alan Kay’s observation that computer science is a “pop culture” that often lives in the moment and has little regard for the past, and also combine that with the “move fast and break things” attitude that permeates modern software development, and this has created an environment that seems hostile to the types of nerdy pursuits that the industry once encouraged. The working environments of many major software companies and the products they release are a reflection of the values of the companies’ executives, managers, and shareholders.
While I’m not anti-AI, I see agentic coding as another step in the direction that the software industry was already heading towards, where it can move even faster and break even more things.
There is still wonder, joy, and freedom in computing, but I feel this is increasingly confined to the hobbyist world and certain niches in research environments.
21 miles, 30-45 minutes each way in the East Bay region of the San Francisco Bay Area. I have to leave home before 6:30am to achieve a ~30 minute commute; commute conditions deteriorate after 6:30am. If I’m unlucky, though, it could be 60 minutes in stop-and-go traffic.
When I took an introductory programming class at Sacramento City College in fall 2004 during my senior year of high school, we spent the first half of the semester designing our programs using flowcharts and pseudocode. We were encouraged to check the logic of our flowcharts and pseudocode. In the second half of the semester, we implemented those programs in C++.
I haven’t seen this pedagogical practice in any other introductory course since. I believe it’s a holdover from the early days of computing, when programmers didn’t have access to personal computers or even interactive computing, which meant they needed to spend more up-front time on design. Think of the punch-card era, for example.
I teach introductory programming in C++ at Ohlone College in Fremont, and I have my students write C++ on Day 1, starting with “Hello World” and going from there without flowcharts.
I think it's for all intents and purposes impossible to program like this in this century. Like imagine just writing x + y in C++. Are you seriously going to enumerate every declaration of operator+ in the translation unit in your head to see if it's eligible (don't forget ADL)? And then every single possible implicit conversion or promotion that could make other ones eligible? And then go through all the overload resolution rules that practically no humans have memorized (with any template instantiations that may come into play) to figure out if the declaration you wanted is actually the best match? That's before you even look at its definition...
I collect HP calculators: I have an HP 12C, an HP 15C Collector's Edition (there are a few of them left still for sale), an HP 32Sii, and an HP 48SX. I sometimes use them, but whenever I'm in front of a computer (which is almost all the time), I find myself using the Unix dc command.
Handheld calculators are nice, but outside of exam settings, I could use a smartphone or a computer, though calculators are nice when I want to work distraction-free through something that requires performing calculations. I believe this is why HP largely exited the calculator market: HP's target market was professionals, and cheap computers and smartphones killed the calculator market for them, similar to how electronic calculators killed the slide rule. Texas Instruments, however, is still in the calculator business, largely due to their successful courting of American middle and high schools, as well as ETS and other testing agencies, beginning in the 1990s. I don't know the situation in Japan regarding calculator usage, but I see Casio scientific and graphing calculators proudly displayed at electronics stores such as Yodobashi Camera and Bic Camera.
HP-35 (1972, first scientific, first in space) - in leather case
TI-30 (1976, first low-cost scientific)
HP-12C (1981, financial, c. 2000 remanufacture)
HP-15C (1982, advanced scientific) - in leather slipcase
HP-16C (1982, computer programming) - in leather slipcase with manual
TI-30 SLR (1982, TI’s first solar-powered scientific)
HP-17B II (1990, financial)
TI-85 (1992, TI’s first with link port)
TI-82 (1993)
TI-92 (1995, TI’s first with computer algebra system)
I use the HP-16C pretty regularly when I'm working on network protocol programming. I have good apps that do it, but there's something about having the calculator right in front of my keyboard rest and turning to it that I like more. In a pinch or outside the house I'll use JPRN instead.
The scary thing is I don’t know of a place that would be better for Americans in the next 20-30 years:
- Western Europe needs to figure out quickly how to adapt to a likely diminished or non-existent American role in NATO while at the same time dealing with a very assertive Russia.
- Canada, Australia, and New Zealand are nice but have astronomically high housing prices.
- Japan is struggling with three decades of stagnation, an aging population, and the weak yen.
- Taiwan faces an existential crisis should mainland China attempt to repossess it.
- South Korea is a bright spot, but it has to deal with North Korea and (to a lesser extent) China.
- The developing world is still developing and still needs more time to approach a standard of living that matches that of the developed world.
- China is probably going to overtake the United States in terms of economy, and it has a high standard of living in its urban areas, but living in China means living under CCP rule.
I’m not optimistic about the US in the next few decades, but I’m not optimistic about other developed countries, either. I’m in my late 30s; sadly, if the next 20-30 years are rough, that’s the rest of my working life…
> - Canada, [...] are nice but have astronomically high housing prices.
Vancouver, Toronto, and the surrounding areas have crazy housing prices, but the rest of the country is still mostly okay. These are the two biggest English-speaking cities, and about a third of the country lives in one of them, so they're where most new immigrants tend to go, but there are still tons of other great cities in Canada with better housing prices.
Calgary's housing has gone kind of out of control too. It's slowed down now, but for a while it was crazy. I own my house in Calgary and my property value went up by ~250k since I bought in 2018. Lucky for me but kind of insane growth
I'm in Calgary too, and housing has definitely jumped a lot since Covid, but we're still not quite at the Vancouver/Toronto insanity of $1MM for a 1-bedroom apartment (and hopefully it stays that way!).
Yeah, I agree. As much as I like seeing the value of my property grow, I don't think it's overall very good for the city or society if no one can afford homes
> Western Europe needs to figure out quickly how to adapt to a likely diminished or non-existent American role in NATO while at the same time dealing with a very assertive Russia.
Primarily, Western Europe seems to be the last chance for democracy. Like, the last democracies standing.
That assumes only nation-state-level entities. This has been a very poor assumption: even within my lifetime, the last ~40 years, we've seen nations like Yugoslavia and the USSR break up, and some of the successor states (e.g., Slovenia, Croatia, Ukraine, and the Baltic States) have developed robust democracies where previously totalitarian communist governments existed.
Within the territory of the U.S., states like California, NY, and Massachusetts continue to have robust democracies even if the federal government doesn't. In California's case it's often a bit too robust, and we often get ourselves into trouble with ballot propositions that have a lot of popular support even when they're economically unworkable.
The Mac isn’t a monopoly, but choices for desktop operating systems are indeed limited. I use macOS, Windows, and Linux on a regular basis. The only one that’s improving is the Linux ecosystem. I prefer macOS to Windows, but macOS is not as polished in 2026 as it was in 2016 or especially in the Snow Leopard era.
I do sympathize with the viewpoint that many academics are not in a position to give good advice about industry since many of them either never worked in industry or had limited exposure via internships. Additionally, the values of academia are sometimes different from industry. Academia, at least in its purest form, is about advancing and disseminating knowledge, while industry is about serving customers through providing products and services.
With that said, I discovered that I’m an academic at heart after nine years in industry, though I left right before agentic coding took off. I got tired of “moving fast and breaking things,” of prioritizing shipping things and “the bottom line” over everything else.
With that said, agentic coding, in my opinion, only amplifies a long-standing trend: shipping matters more than craftsmanship. Even without LLMs, software engineering has long had a “git ’er done!” attitude. To be fair, market effects matter greatly in software businesses. Quality matters insofar as the software isn't completely unusable, but many software companies succeed without building carefully crafted software. Even Apple, which has a reputation for perfectionism, doesn’t make perfect software.
Academia has its own problems (publish-or-perish, low pay compared to other occupations that require heavy investments in education, politics, etc.), but it seems to allow more breathing room for computer scientists to focus on the craft of programming without as much pressure to ship (publish-or-perish aside).
It’s even more extreme in the Bay Area. While San Francisco is a job center, there are also major suburban job centers such as Palo Alto, Cupertino, Mountain View, and Sunnyvale. The problem is living close to work is painfully expensive for all but the most well-off employees. A Google executive could comfortably afford a nice house in Los Altos or Palo Alto and have an easy commute. A Google engineer could commute from Fremont or Pleasanton, which would be grueling in a car, but is comfortable on a Google shuttle bus with leather seats and WiFi. But if you’re a teacher working for a school in Mountain View, my condolences. If you want to afford to buy, you’re looking at a grueling commute from either a middle-class exurb like Tracy or from a high-crime, impoverished area like East Oakland. Even renting an apartment closer to work would be daunting in terms of cost.