another one, this is the 2nd most frequent thing people write here, not sure how to even approach answering :)
so I’ll do what I was taught in first grade never to do and answer a question with a question - how much time per week does a bricklayer spend laying bricks? they are looking at these new “robots” laying bricks automatically and talking on BrickLayerNews “man, brick laying has not been the bottleneck for a long time.”
But to answer your question directly: a lot of time, if other people do their jobs well. Last week I had about 7 hours of meetings; the rest of the time I was coding (so say 35 hours), minus breaks I had to take to stretch and rest my eyes.
Interesting! I guess it really varies between jobs, roles, and companies.
That's never been my experience, but I have an odd skill set that mixes design and dev.
I’ve always spent a lot of time planning, designing, thinking, etc.
How detailed are the tickets if you spend all your time coding? You never have to think through architecture, follow up on edge cases the ticket writers didn’t anticipate, help coworkers with their tasks, review code, etc.?
I think this is it. You're a bricklayer. No, the bottleneck for erecting buildings is not bricklaying.
I won't take the time to write a dissertation to try to convince you (why bother?), so how about we just start with this: even zoning laws and demographic analysis come before a single brick is laid.
is it so unreasonable to think it is not about the laying of the bricks?
writing software, if you know what you are doing, is very similar to laying bricks. write the smallest possible functions that do one thing and do it well, then compose them, like bricks, to make a house (which is what bricklayers do).
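A minimal sketch of that "small functions as bricks" idea. The config-parsing task and all function names here are my own invention for illustration, not something from the thread:

```python
# Each "brick" does exactly one thing; the "house" is their composition.

def drop_comment(line: str) -> str:
    """Remove everything after a '#' comment marker."""
    return line.split("#", 1)[0]

def strip_whitespace(line: str) -> str:
    """Trim leading and trailing whitespace."""
    return line.strip()

def is_nonempty(line: str) -> bool:
    """Keep only lines that still have content."""
    return bool(line)

def parse_config(text: str) -> list[str]:
    """Compose the bricks: clean each line, keep the non-empty ones."""
    cleaned = (strip_whitespace(drop_comment(line)) for line in text.splitlines())
    return [line for line in cleaned if is_nonempty(line)]

print(parse_config("a = 1  # comment\n\n b = 2"))
# → ['a = 1', 'b = 2']
```

Each function is trivially testable on its own, which is the practical payoff the comment is pointing at; whether composition alone scales to a whole system is exactly what the reply below disputes.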
comments like this come from places where it is more like a bunch of chefs in an Italian restaurant making spaghetti (code) :)
No, that's a common mechanistic view of building software, but it's not really accurate. Unlike with bricks, the way you arrange your components and subcomponents affects the entire system. It's a complex phenomenon.
Of course, your view is quite common, especially in the managerial class, and it often leads to broken software development practices and the idea that you can increase output just by increasing input. One step away from hiring 9 pregnant women to make a baby in a month.
This was really frustrating me. YT started recommending this channel and I could recognize the voice as an AI impersonation, but had no way to know if it was at least reading something really written by Feynman. Eventually I concluded it wasn't, but there were no clear criteria under which I could report the channel. I'm not sure it's even against YT's TOS.
I think you're not technically wrong, but you're defining NAT differently from the majority of people you're arguing with (those who assume NAT also implies a firewall blocking inbound connections), and the remaining minority (the "on the WAN subnet" crowd) are outright dismissing the idea that an attacker close enough to send packets destined for non-internet-routable addresses to your router is a reasonable attack vector.
Is the latter something that was/is actively exploited?
It's there in the name - "Rust graphics community" as opposed to simply "the graphics community". Language fetishization above the end goal - or, rather, the language fetish is the real goal.
If you want heavy concurrency, to get some value out of all those CPUs on a desktop, you need all the help you can get from the language. Unity and Unreal pushed it through in C++, but both required huge efforts.
If gRPC overhead is critical to your system, you've probably already lost the plot on performance in your overall architecture.
You make a fair point about smart pointers, and median "modern C++" practices with STL data structures are unimpressive performance-wise compared to tuned custom data structures, but I can't imagine that idiomatic Java with GC overhead on top is any better.
He definitely made games. Chris Crawford was one of the first known names in game design, a few years ahead of contemporaries like Sid Meier whom I expect you'd still recognize. Crawford seemed to alternate between computer war games, with reliable prospects for commercial success in the 80s, and more experimental fare about managing nuclear reactors, geopolitics, and such - the difference being he seemed to get bored by the whole thing and completely disembarked in pursuit of whatever it was he intended to achieve via Erasmatron, Storytron, etc. It's fascinating to read his writings over that period. A sort of tragic paradox overshadowed all of it: if he was so bored of mechanistic, algorithmic, and predictable computer game mechanics, why keep pursuing computer games as his chosen medium? It may have been a blind alley in the end, but someone had to explore it.
Nevertheless, it is quite sad - however, it's difficult for me to relate to the experiences of someone who lived through that first wave of personal computing and played a notable part in it - perhaps through that lens, anything was possible.
The 8x16 font from the Atari ST's hi-res mode is pretty slick if you like something bold and a little futuristic.
https://github.com/ntwk/atarist-font (or rip it directly from the ROM)
It is of some practical use, but there are a lot of slightly incompatible versions of the IBM PC and of MS-DOS, so it doesn't offer the kind of strong reproducibility that I'm looking for.
A DOS-executable is far more write-once-run-anywhere now than Java ever was, and the best thing is that no one is going to ever deprecate any API in DOS. There is no software rot in a dead OS.