Hacker News | ShinTakuya's comments

I mean, supply chain attacks are a thing that could have happened even in the earlier days. Linux almost got backdoored in 2003.

Also, given the number of remote code execution exploits that have occurred in web browsers over the years, it's hard to know for sure that what you installed hasn't been hijacked, unless you spent all your time on gnu.org.


Yes, but the probability of the average user getting pwned was so small that it wasn't worth the constant firewall babysitting.

No, there was a big internal project (which they communicated about publicly; search for the blog posts relating to it) to address it, involving roughly a year of effort from a big chunk of the developers.


Thanks. Do you have a view on whether the project went far enough?


Rovo is backed by the usual LLM providers; Atlassian isn't training its own models.


If you're experiencing this you're either a very junior dev or you're not as senior as your title might suggest...


They’re not facing this. They’re just lying.


You're assuming performance has been the core priority, or even a priority at all, and I think this is a bad assumption to make. I would estimate a much smaller number of people-months of work if I were you.

Developers tend to assume performance is the only problem a product can solve, when in reality there's a lot more to it than that.


Maybe in the past companies wouldn't take the extra time for performance enhancements, but now they're apparently saying that AI is sooo good at speeding up work that they don't need all of these extra people. So if their product were sped up, it would enable their customers to work faster and lay off their extra employees (or just keep everyone and do more stuff faster).

So are they doing this to make the product better, or, as others have mentioned, have they run out of room to innovate and grow their market, so they need to cut costs?


Or, you know, most steam deck users aren't using them constantly and so they don't get picked up in the survey.


The average Linux gamer is likely to have a very different setup to the average Linux user in general. It's a subset of a subset.


As far as I'm aware it keeps a history of how frequently you visit each directory, so yes, it will select the one you've visited more often (assuming you don't always start at the base one and work your way down).
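
For illustration, a frecency-style ranking (the rough idea behind tools like z/zoxide) can be sketched like this. The weights and the `best_match` helper here are made up for the example, not the real formula any of those tools use:

```rust
use std::collections::HashMap;

// Rough frecency score: visit count weighted by how recently the directory
// was last visited. The weights are invented for this sketch.
fn frecency(visits: u32, hours_since_last_visit: u64) -> f64 {
    let recency_weight = match hours_since_last_visit {
        0..=1 => 4.0,
        2..=24 => 2.0,
        25..=168 => 0.5,
        _ => 0.25,
    };
    visits as f64 * recency_weight
}

// Pick the highest-scoring directory whose path contains the query.
fn best_match<'a>(
    history: &HashMap<&'a str, (u32, u64)>,
    query: &str,
) -> Option<&'a str> {
    history
        .iter()
        .filter(|(path, _)| path.contains(query))
        .max_by(|(_, &(va, ta)), (_, &(vb, tb))| {
            frecency(va, ta).partial_cmp(&frecency(vb, tb)).unwrap()
        })
        .map(|(path, _)| *path)
}

fn main() {
    let mut history: HashMap<&str, (u32, u64)> = HashMap::new();
    history.insert("/home/me/work/src", (50, 2)); // frequent and recent
    history.insert("/home/me/old-project/src", (5, 500)); // rare and stale
    // Both paths contain "src", but the frequently visited one wins.
    println!("{:?}", best_match(&history, "src")); // Some("/home/me/work/src")
}
```
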


No, the issue is that the one I want _isn't_ the most recent, because 90% of the directories I visit contain the string 'src'.


There's nothing inherent about C++ that makes it more suited than Rust for game engines, though; Rust supports careful management of memory too. Nothing besides inertia, that is (i.e. libraries, existing code, etc.), and that of course is more than a big enough reason to stick with it.


Rust supports careful management of memory only by dropping into unsafe code, at which point that particular area of the code offers no benefit over C/C++, let alone any value in a rewrite. There is no type system for reinterpreting bits of memory.
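
To make the "reinterpreting bits" point concrete, here's a minimal sketch (the byte values are just an example): the standard library offers a checked conversion, but reading a typed value straight out of raw memory, as a binary asset parser in a game engine might, requires an `unsafe` block the compiler can't verify for you.

```rust
fn main() {
    // Little-endian bit pattern of 1.0f32 (0x3f800000).
    let bytes: [u8; 4] = [0x00, 0x00, 0x80, 0x3f];

    // Safe, checked conversion provided by the standard library.
    let checked = f32::from_le_bytes(bytes);

    // In-place reinterpretation of the same memory; the compiler can't
    // prove this is valid, so it requires `unsafe`. `read_unaligned`
    // avoids UB from the u8 array's weaker alignment guarantee.
    let reinterpreted = unsafe { (bytes.as_ptr() as *const f32).read_unaligned() };

    assert_eq!(checked, 1.0);
    assert_eq!(reinterpreted, 1.0);
    println!("both read back as {checked}");
}
```
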

And for the safe parts, the posts I've read from people who have spent a non-trivial amount of effort with the language don't paint a clear picture either of whether there was really a benefit to the language overall.

So to say that "the world has moved on" in light of all of this is pure hubris.


There are. The count is obtained by extrapolating from randomly selected areas of sky. This is more like another detailed picture of a small patch of sky.

