You're assuming all women in your cohort start not pregnant. However, given a random sampling of women across the entire human race, if you have approximately 14,000 women, statistics says you'll have a baby within a month. That is to say, the chance of at least one of those women being 8 months pregnant gets close enough to 1, given about 14,000 randomly selected women.
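The arithmetic behind that claim can be sketched; the per-month birth probability below is an illustrative assumption for the sake of the calculation, not a sourced figure:

```rust
fn main() {
    // Illustrative assumption (not from the comment): the probability that a
    // randomly selected woman gives birth in any given month.
    let p: f64 = 1.0 / 2000.0;
    let n: f64 = 14_000.0;
    // P(at least one birth among n women) = 1 - (1 - p)^n
    let p_at_least_one = 1.0 - (1.0 - p).powf(n);
    // With these numbers the result is about 0.999, i.e. near-certain.
    assert!(p_at_least_one > 0.99);
    println!("P(at least one) = {:.4}", p_at_least_one);
}
```

The exact per-month probability doesn't matter much; once n·p is comfortably above a handful, the "at least one" probability saturates toward 1.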
Also, you can get a baby tonight if you steal one from the maternity ward.
The real question is: how do LLMs turn The Mythical Man-Month on its head? If we accept AI-generated code, can an agentic AI swarm make software faster simply by parallelizing, in a way that 9 women can't make a baby in 1 month, because they're AIs, not humans, and communicate in a different way?
The pitfall of AI coding is that every shiny tangent that was previously a distraction is now a rabbit hole to be leaped into for an afternoon, if you feel like it. It's like that ancient Chinese curse: may you live in interesting times. Everybody can recreate an MVP of Twitter in a weekend now, when previously that was just a claim a certain type of person made.
> You're assuming all women in your cohort start not pregnant. However, given a random sampling of women across the entire human race, if you have approximately 14,000 women, statistics says you'll have a baby within a month. That is to say, the chance of at least one of those women being 8 months pregnant gets close enough to 1, given about 14,000 randomly selected women.
There's a good point in here along the lines of "if you need X in a month, and someone else has something that's 90% of what you want X to be, can you buy it from them before starting any crazy internal death marches instead?"
> The real question is: how do LLMs turn The Mythical Man-Month on its head? If we accept AI-generated code, can an agentic AI swarm make software faster simply by parallelizing, in a way that 9 women can't make a baby in 1 month, because they're AIs, not humans, and communicate in a different way?
This is quite possibly only a one-time shift from a changed baseline, though. Give it a few years and "the fastest way an LLM tool can do it" will be what gets tossed out as an estimate, and stakeholders will still want you to do it in a tenth the time...
9 pregnant women produce one baby/month on average (assuming no miscarriages or late births, etc.).
On paper your CPU can execute at least one instruction per core per cycle, but that's on average too; if you actually have only one instruction to run, it takes several cycles.
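The latency-versus-throughput point can be made concrete with a toy pipeline model (the 4-cycle latency is an illustrative number, not any particular CPU):

```rust
// Toy pipeline model: each instruction takes `latency` cycles end to end,
// but a new independent instruction can start every cycle, so n of them
// finish in latency + (n - 1) cycles.
fn pipeline_cycles(latency: u64, n: u64) -> u64 {
    latency + n.saturating_sub(1)
}

fn main() {
    // A single instruction pays the full latency...
    assert_eq!(pipeline_cycles(4, 1), 4);
    // ...but a long run of independent instructions averages ~1 cycle each.
    assert_eq!(pipeline_cycles(4, 1000), 1003);
}
```

Same shape as the pregnancy joke: throughput only looks like 1/cycle (or 1 baby/month) when the pipeline is kept full.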
Actually, I like quite a lot of the subtle jokes on HN. They are harder to notice, fewer to find, and many a time I don't get them. But when I do get one (or someone explains it to me, perhaps out of pity), I chuckle, laugh, and laugh again. And I remember those comments.
I think the occasional joke is fine but when you have too many then the comments get diluted. It's exactly that kind of thing that makes me hate Reddit and so many other places: spam.
I like to think everyone came to the conclusion that it would strengthen the piece if most comments on it appear to miss the point and are slightly robotic.
> As a rule of thumb, Macs will not run any version of macOS older than the one they shipped with when they launched. Apple provides security updates for older versions of macOS, but it doesn’t bother backporting drivers and other hardware support from newer versions to older ones.
So the answer is “no”, they probably won’t be able to downgrade on the models that are about to be released.
In the long run https://github.com/google/crubit will very likely solve this for Rust even if it's a bit specific to Google's use cases right now as per readme.
For many relevant specs you can find "draft versions" on the open web that are essentially the final version without the official stamp, so there isn't that much of a need.
Funny, another commenter on this post was saying the opposite, that Rust was likely being used to just port existing features and that was easier because there were probably good tests for it already.
If you've actually written considerable amounts of Rust and C++, these statistics don't require justification. In my opinion it's completely expected that Rust code is easier to write correctly.
As a relatively novice programmer who's worked in tech for decades but not as a software developer: I take issue with the idea that you need to write considerable amounts of Rust and C++ for these statistics to be expected. In fact, despite Rust's initial vertical learning curve I'd say that any junior developer trying to implement anything with any degree of complexity at all in Rust and C++ would see the benefits.
At the very least, the fact that IDE integration can tell you all kinds of stuff about what you're doing/doing wrong and why accelerates things greatly when you're starting out.
The problem for junior developers is that Rust will be incredibly frustrating to learn by perturbation, because the compiler will reject most random changes to the code. That is the point, of course, but C++ will compile programs that then crash, giving you a very misguided feeling that you're making progress, and that feeling is very important in the process of gaining new skills.
I don't see a way around it: programming without garbage collection is hard. Rust makes that very clear very quickly, which is also the point, but it's at odds with making the learning curve accessible.
> The problem for junior developers is that Rust will be incredibly frustrating to learn by perturbation
Yes, this is the biggest issue with Rust that I've seen; most languages will let you do something wrong, and then as you learn you get better. Rust will refuse to compile if you're not doing things correctly (and normally I would put 'correctly' in quotes, but correctness in Rust is well defined).
The first time I tried to experiment with learning Rust was a disaster. I just wanted to decode some JSON and filter it, but -- oops! -- I don't own that variable. Okay, well I can pass it somewhere else mutably, right? But then that function does the work and returns something that... what's a lifetime? What's a 'a mean? How do I... screw it, I'll go back to Python.
Eventually, after the tooling and the tutorials got better I came back to it and really enjoyed what I've seen so far and even rewrote one of my own personal tools in Rust[1] to experiment with. It's nothing impressive, but it was fun to do.
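For anyone hitting the same wall, the ownership friction described above often dissolves once borrowing clicks. A minimal sketch with made-up data (no real JSON decoding, and nothing from the linked tool): filter through a shared borrow instead of moving the collection into the helper.

```rust
// Filtering without giving up ownership: the helper takes a shared borrow
// (&[String]) instead of consuming the Vec, so the caller keeps it.
fn long_names(names: &[String]) -> Vec<&String> {
    names.iter().filter(|n| n.len() > 3).collect()
}

fn main() {
    let names = vec!["ada".to_string(), "grace".to_string()];
    let filtered = long_names(&names);
    assert_eq!(filtered.len(), 1);
    // `names` is still usable here because it was only borrowed.
    assert_eq!(names.len(), 2);
}
```

No explicit `'a` needed: lifetime elision ties the returned references to the input borrow automatically.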
The logic in my comment wasn't that you need to have written considerable amounts of code to be expecting this, just that not expecting this would make me think you hadn't. If that makes sense.
On your second point, I think IDE integration for C++ is similar to what it is for Rust; it's just that Rust's errors and tooling are a million times better regardless of IDE.
Oh, the more junior the developers, the quicker they will get any benefit. That's common for any language that enforces correctness, but the C++ vs. Rust comparison isn't even fair; C++ is an incredibly hard language to use.
Apple should have modernized ObjC instead of making Swift the lingua franca. Both speed of iteration and flexibility (on which web-stack-rivaling productivity features would have been possible) are gone forever.
Swift Concurrency is a tire fire that not even their async-algorithms team can use completely correctly, and useful features like typed throws are left half finished. The enormous effort the constant further bastardization of Swift takes is at least in part the reason for the sorry state the dev tooling is in. Not even a $4T company can make a reliable SwiftUI preview work in their own IDE. Variadic generics (a seemingly pure compiler feature) crash at runtime if you look at them the wrong way. Actors, the great lighthouse of their structured concurrency, are unusable because calls to them are unordered. They enforce strict concurrency checking now, but the compiler is too dumb to infer common valid send patterns; and their solution to make this abomination work in real codebases? Introduce a default that lets _everything_ in a module run on the main thread!
Swift has so many issues they would honestly be better off just moving to Rust rather than fix Swift. Seriously. The fact that it's so easy to get the compiler to spend exponential time resolving types that it very often just shits the bed and begs you to rewrite your code for it to stand a chance is shameful coming from, as you say, a $4T company. Points to deep problems with Swift.
While the C calling convention continues to rule operating systems and FFIs, I think C will continue to limp along. Hopefully one day that can be fixed; it's annoying that C is what I have to reach for to call SomeLib no matter what language I'm using.
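For illustration, this is roughly what leaning on the C calling convention looks like from the Rust side; `add` is a made-up function, not from any real library:

```rust
// A Rust function exposed over the C calling convention, the lowest common
// denominator the comment describes. In a real library you'd also pin the
// symbol name (e.g. with #[no_mangle]) so other languages can link it.
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    assert_eq!(add(2, 3), 5);
}
```

The same keyword works in the other direction (`extern "C" { ... }` declarations for calling into a C library), which is exactly why every FFI story bottoms out at the C ABI.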
I remember that quite well. However, the backlash was very specific; as far as I remember it was never directed at the company as a whole, let alone the person of, say, Eric Schmidt.
Eric Schmidt didn’t present as a creepy weirdo.
He also didn’t make the company a reflection of himself. That kept the glasshole backlash compartmentalized.
Strange things happen when a leader merges the company brand with his personal brand. It can strengthen the company brand (in the case of a plucky can-do technologist), but the company brand starts to get colored by the personality of the person (in the case of a person who goes off the deep end and starts saying weird and inflammatory stuff).
Why do Hunyuan, OpenAI 4o and Qwen get a pass on the octopus test? They don't cover "each tentacle", just some. And Midjourney covers 9 of 8 arms with sock puppets.
Good point. I probably need to adjust the success pass ratios to be a bit stricter, especially as the models get better.
> midjourney covers 9 of 8 arms with sock puppets.
Midjourney is shown as a fail so I'm not sure what your point is. And those don't even look remotely close to sock puppets, they resemble stockings at best.
I think that might be true of the language committee, but there's presumably a huge crowd of people with existing c++ code bases that would like to have a different path forward than just hoping that the committee changes priorities.
That is what many of us have done moving into managed languages, with native libraries when required to do so.
The remaining people driving where the language goes have other priorities in mind like reflection.
Of the profiles that were supposed to be so much better than the Safe C++ proposal, none made it into C++26, and it remains to be seen whether we will ever see a sensible preview implementation for C++29.
C++ 26 doesn't have the technology, but it wouldn't matter anyway because what's crucial about Rust isn't the technology it's the culture.
If WG21 were handling Rust instead, f64 would implement Ord, and people would just write unsafe blocks with no explanation in the implementation of supposedly "safe" functions. Rust's technology doesn't care, but its culture does.
Beyond that though, the profiles idea is dead in the water because it doesn't deliver composition. Rust's safety composes. Jim's safe Activity crate, Sarah's safe Animals crate and Dave's safe Networking crate compose to let me work with a safe IPv6-capable juggling donkey even though Jim, Sarah and Dave have never met and had no idea I would try that.
A hypothetical C++ 29 type safe Activity module, combined with a thread safe Animals module, and a resource leak safe Networking module doesn't even get you something that will definitely work, let alone deliver any particular safety.
> If WG21 were handling Rust instead, f64 would implement Ord, and people would just write unsafe blocks with no explanation in the implementation of supposedly "safe" functions. Rust's technology doesn't care, but its culture does.
I'm sure you think this was somehow succinctly making your point, but I can't see any connection at all, so if you did have an actual point you're going to need to explain it.
OK? I don't see how that's connected. It's not controversial that f32 and f64 are partially ordered; the problem in C++ is that the difference between "Partially Ordered" and "Totally Ordered" is semantic, not syntactic, in their language, and all semantic mistakes are just IFNDR (ill-formed, no diagnostic required), so it's a footgun.
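A small Rust sketch of how that semantic difference is made syntactic (nothing here is specific to the comment, just standard library behavior):

```rust
fn main() {
    let mut v = vec![2.0_f64, f64::NAN, 1.0];
    // v.sort(); // does not compile: f64 is PartialOrd but not Ord,
    // because a comparison involving NaN has no answer:
    assert_eq!(f64::NAN.partial_cmp(&1.0), None);
    // Wanting a total order anyway has to be said explicitly:
    v.sort_by(|a, b| a.total_cmp(b));
    // IEEE 754 totalOrder puts (positive) NaN after every number.
    assert!(v[2].is_nan());
    assert_eq!(v[0], 1.0);
}
```

In C++ the equivalent mistake (handing floats to something that assumes a strict weak ordering) compiles fine and is undefined behavior at runtime; in Rust it simply isn't a program.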
It's "9 women can't make a baby in one month".