Hacker News | compiler-guy's comments

The nice thing about C++ is that if you want to use plain old C arrays, you can. And there will be no extra overhead over plain C.

You will lose many nice features, like fancy strings and easy array resizing (which may or may not be acceptable to you), but you (mostly) don't pay for what you don't use.

This does seem pretty complicated, and I doubt I will ever use it. But for some, the trade-off is worth it, and they get to make the choice.


Language evolves. Historical meaning can inform our understanding of modern usage but doesn’t dictate or govern it.

Linguists call this “semantic drift” and it occurs in every language we know of. https://en.wikipedia.org/wiki/Semantic_change

Wikipedia's first English example is:

Awful – Literally "full of awe", originally meant "inspiring wonder (or fear)", hence "impressive". In contemporary usage, the word means "extremely bad".


Possibly even more directly, the etymological fallacy: https://en.wikipedia.org/wiki/Etymological_fallacy

Just because we can (and should) understand what people mean when they use words wrongly doesn't mean that words don't have correct meanings. Abandoning those meanings makes the language poorer, for no benefit at all.

Why aren't you responding in Old English, then? Surely that would be more correct, since it is closer to the origin.

Why aren't you busy burning old manuscripts and books, since they don't conform to your "new meaning" of words?

If some engineer optimizes something in the Google search stack that makes it, on average, just 0.01% faster (not 1%, but one-one-hundredth of a percent), then they have paid their salary for the entire year. Almost in perpetuity. No matter what level they are.

Very small gains, multiplied across enormous amounts of compute over long periods, add up fast.

And that's why Google can spend so much money on teams with fairly narrow scopes.


The bureaucracy at Google has grown and grown. And then grown some more. But it is nowhere near as bad as the GP makes it sound.

Both Google and Microsoft are bigger, and with more products than Meta.

But both Google and Microsoft also massively overhired around the same timeframe as Meta, and are still digging themselves out of a mess of their own making. And making their teams pay for that stupidity.


A quick browse of adafruit.com (or any similar vendor) reveals plenty of displays that are 128, 240, or 320 pixels wide. At 6 pixels of width per character, that's only 21, 40, or 53 characters per line. Seems quite useful to me.

There are also several 32x32 LED panels, on which one could imagine wanting some text.

Also, this kind of thing is just interesting, regardless of the usefulness.


As true and amusing as the Wadsworth constant is, it has very little to do with software engineering.

It could be set up such that the AI can "fire" them, in the sense that they no longer work at the store and their wages no longer count against the experimental establishment's costs, but they still get paid to do something else, or to do nothing at all.

I doubt the experiment is set up that way, but that would be an ethical way to do it.


You have to fix them at some point, but not in the middle of doing other things. Right now. With no possible way to make progress elsewhere while deferring this decision.


One of the things that makes jj worth trying out is simply the fact that it is different from git, and having exposure to more than one way of doing things is a good thing.

Even if you don't adopt it (and I didn't), it's easy to think that "this way is the only way", and seeing how systems other than your own preferred one manage workflows and issues is very useful for perspective.

That doesn't mean you should try everything regardless (we all only have so much time), but part of being a good engineer is understanding the options and tradeoffs, even of well loved and totally functional workflows.


Being a good engineer is also understanding when something is a waste of time, because the gain is insignificant 99% of the time.


Using "good engineering" as an argument against learning is definitely an interesting approach.


I think the sibling's point needs to be made more sharply: this could've gone somewhere good, "I evaluated it and found the gain was not worth the cost to change", but instead went to "the gain from a change is insignificant 99% of the time, so it's not worth understanding it".

The latter is poor engineering.


It seems like all of your comments are like this. Consider stopping that!

