Hacker News: paulddraper's comments

> if the only penalty the company/agency gets

What is the penalty for the government?


Elon Musk

> I would love to figure out how to stop that from happening automatically.

AGENTS.md


> AGENTS.md

-- which will be ignored just often enough that you can never quite trust it.


Yup. No matter how much you tell it to keep things simple, modular, crisp, whatever, it generates tons of garbage much too often.

Btw, it may be obvious, but AFAIK Claude by default reads only CLAUDE.md, not AGENTS.md.

And yet still less often than the average developer.

By all means elaborate. I can't imagine "don't have stupid ideas or write messy code" is going to make much difference.

To elaborate: That advice isn’t as objective as you think.

What one developer calls clean the other calls messy.

My advice is to use it, then document the issues when it gets messy. It takes some time, but no more than recruiting, training, paying another engineer.


I think the issue is deeper than prompts, agents.md, smart flows, etc. I think the problem is that LLMs are searchers, trained on preferring some results. So, if the dumb solution is there, and the smart solution is not there, they won't spit it out.

> Sometimes HN drives me crazy.

You can tell the difference between those who build businesses and those who simply use them.


To state the obvious, "good engineering/design practices" will not tell you what features are used or not.

> Git has served us well for 20+ years

Funny. I think that, but the usual HN narrative is that Git is UX hostile.


> This was bad enough that Node.js eventually changed unhandled rejections from a warning to a process crash, and browsers added unhandledrejection events. A feature designed to improve error handling managed to create an entirely new class of silent failures that didn’t exist with callbacks.

Java has this too.
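To illustrate the Java analogue (a sketch, not from the original comment): a CompletableFuture whose result is never observed swallows its exception silently, much like an unhandled promise rejection. Attaching a handler or joining surfaces it.

```java
import java.util.concurrent.CompletableFuture;

public class SilentFailure {
    public static void main(String[] args) throws Exception {
        // The async task throws, but nothing ever calls join()/get()
        // or attaches exceptionally(), so the failure is dropped.
        CompletableFuture.runAsync(() -> {
            throw new RuntimeException("lost forever");
        });
        Thread.sleep(100); // give the task time to fail: no output, no crash

        // Observing the result handles the exception explicitly:
        CompletableFuture<Integer> f = CompletableFuture
            .supplyAsync(() -> 1 / 0)        // ArithmeticException inside the future
            .exceptionally(e -> -1);         // recover instead of dropping
        System.out.println(f.join());        // prints -1
    }
}
```

(Node's fix was to crash the process on unhandled rejections; Java's default is still silence unless the future is joined or handled.)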


You can choose for attackers not to use AI?

"Artisanal art" as it were.

You are in violent agreement.

> inference is indeed profitable


> So you're asking for some type of equity that's private?

To read more: https://en.wikipedia.org/wiki/Private_equity


Anthropic revenue is ~$30B/year.

Revenue is a meaningless measure though; what's the spend:income ratio? Excluding capital investments, what's the cost of operations?

At a very minimum, to repay the +$100b in investment within a reasonable timeframe, what's the minimum figure they have to bank post-tax each month?
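Back-of-envelope on that question, assuming a hypothetical 10-year payback window (the window is my assumption, not from the thread):

```java
public class Payback {
    public static void main(String[] args) {
        double investment = 100e9;  // ~$100B raised, per the thread
        int years = 10;             // assumed payback window (hypothetical)
        double perMonth = investment / (years * 12);
        System.out.printf("~$%.0fM/month%n", perMonth / 1e6); // ~$833M/month
    }
}
```

So even ignoring interest and investor return expectations, that's on the order of $800M+ of post-tax surplus needed every month.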


Since when is revenue meaningless? It's an indication of market acceptance. Anthropic has one of the most expensive plans; they didn't undersell other models. Open-weight models would otherwise dominate if cost were the only factor.

Also, investment is not money in the bank; they can't withdraw $100b tomorrow. The funding is a commitment disbursed over several years, so they don't have to repay it until well after they've actually received it.


> Since when is revenue meaningless?

When you're selling $10 bills for $1, revenue is meaningless.


It is meaningless when what you sell costs more than what your customer pays for it.

I could sell $100B of GPUs at 90% of their cost tomorrow and I have market acceptance.


Because at some point, you have to turn a profit. That's why people are wondering about the margins: if revenue is $30B but expenses are $60B with repayment of current investment factored in, then massive revenue increases or massive expense cuts are required to make the business profitable. What's the business impact if they do either?

> At a very minimum, to repay the +$100b in investment within a reasonable timeframe, what's the minimum figure they have to bank post-tax each month?

I am completely confident that Amazon of all companies is totally fine with not taking a return for a long time.

Amazon didn't book a profit for its first decade as a company. Burning cash to get as big as possible is the standard modus operandi.


Reportedly, they lost $4B last year.

By all accounts they are in striking distance of profitability if they wanted it.

It makes sense; Anthropic is by far our biggest vendor expense outside of AWS. And I suspect that is true at a number of companies.


> By all accounts they are in striking distance of profitability if they wanted it.

By their accounts they are in striking distance of profitability. Until they go public, all we can do is estimate how much they burn by looking at how quickly they need more capital: this latest Amazon deal ($5b investment, with $100b returned over 5 years) tells me that their previous raises have been spent.

