
> open-source models with up to 120 billion parameters

Sad, 120B models are definitely feasible to self-host. I'd be more interested in cloud providers (especially one with computational resources like the Schwartz group) hosting the larger 500B+ models. Probably just too expensive/unsustainable when you can cut costs roughly 5x by serving smaller models that cover the needs of 90% of customers.


It's still possible to run any LLM in a loop and optimize for LoC while preserving the desired outcome.
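A minimal sketch of such a loop, assuming hypothetical `ask_llm` and `tests_pass` callables (neither is a real API; any LLM client and test runner could be plugged in):

    # Hypothetical sketch: repeatedly ask an LLM for a shorter rewrite and
    # keep a candidate only if it is smaller AND still passes the tests.
    def minimize_loc(code, ask_llm, tests_pass, rounds=5):
        best = code
        for _ in range(rounds):
            candidate = ask_llm(
                "Rewrite this so it behaves identically but uses fewer lines:\n"
                + best
            )
            shorter = len(candidate.splitlines()) < len(best.splitlines())
            if shorter and tests_pass(candidate):
                best = candidate  # only accept strictly smaller, still-correct code
        return best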

Could it be that slop PRs are rejected/commented on less frequently due to (unfortunately) increased acceptance of them? As it turns out, when going all-in on AI for the leaf parts of a program, the quality of that code doesn't matter much anymore compared to building the foundation.

Arguably it's only a matter of making LSP features available to the coding agent via tool calls (CLI, MCP), so that the model doesn't start making such changes "manually" but uses the deterministic tools instead.
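As a rough illustration (Python, with a purely hypothetical `lsp-rename` CLI standing in for whatever wraps the language server), the agent would invoke a deterministic tool like this instead of editing occurrences itself:

    import subprocess

    # MCP-style tool description the agent would see (names and schema are made up).
    RENAME_TOOL = {
        "name": "rename_symbol",
        "description": "Rename a symbol project-wide via the language server.",
        "input_schema": {
            "type": "object",
            "properties": {
                "file": {"type": "string"},
                "line": {"type": "integer"},
                "column": {"type": "integer"},
                "new_name": {"type": "string"},
            },
            "required": ["file", "line", "column", "new_name"],
        },
    }

    def rename_symbol(file, line, column, new_name):
        # Delegate the edit to the language server so every reference is
        # updated consistently, instead of trusting the model's string matching.
        result = subprocess.run(
            ["lsp-rename", file, f"{line}:{column}", new_name],  # hypothetical CLI
            capture_output=True, text=True, check=True,
        )
        return result.stdout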


Part of why I'm not terribly fond of CLI harnesses and prefer ones built into editors like Zed. They can (but sadly rarely do) access structured information about your codebase that's more sophisticated than searching for every string that matches.


Age, possibly. Ternus has 6 more years until retirement than Federighi.


I remember that ASN.1 does something similar. You'd feed an ASN.1 definition to a code generator (e.g. producing C) and not have to worry about parsing the actual structure anymore!


Literally every schema-based serialisation format does this. ASN.1 is a pretty terrible option.

The best system for this I've ever used was Thrift, which properly abstracts data formats, transports and so on.

https://thrift.apache.org/docs/Languages.html

Unfortunately Thrift is a dead (AKA "Apache") project, and it doesn't seem like anyone has tried to do this since. It probably didn't help that there are so many gaps in that support matrix. I think "Google have made a thing! Let's blindly use it!" also contributed to its downfall, despite Thrift being better than Protobuf (it even supports required fields!).

Actually I just took a look at the Thrift repo and there are a surprising number of commits from a couple of people consistently, so maybe it's not quite as dead as I thought. You never hear about people picking it for new projects though.


FB maintains a distinct version of Thrift from the one they gave to Apache. fbthrift is far from dead as it's actively used across FB. However in typical FB fashion it's not supported for external use, making it open source in name (license) only.

As an interesting historical note, Thrift was inspired by Protobuf.


Very true. ASN.1 is mostly not a great fit, yet it has been the choice for everything to do with certificates and telecommunication protocols (even the newer ones like 5G, for things like RRC and NGAP), mostly for its bit-level support and especially its long-term stability. And looking back in time, ASN.1 has definitely proven its LTS.

Actually never heard of Thrift until today, thanks for the insight :)


Honestly, first time I've seen someone praising Thrift in a long time.

Wanted to do unspeakable and evil things to the people responsible for choosing it, as well as its authors, the last time I worked on a project that used Thrift extensively.


How come? I haven't used it for like a decade but I remember it being good.


Lots of network issues coming from the Thrift RPC runtime apparently not handling anything well.

I recall threatening to rewrite everything with ONC-RPC out of pure pettiness and a wish to see the network stack stop going crazy.

