> It's all circumstantial but everything points towards "desperately trying to cut costs".
I have been wondering if it's geared more toward reducing resource usage, given that at the moment there's a known constraint on AI datacenter expansion capacity. Perhaps they are struggling to meet demand?
I once decided to deny new customers in order to be able to service current demand at the quality we wanted. It backfired and made people want our product even more. Our phones were blowing up. That approach can have unintended consequences!
Signup prices seem higher now than three months ago.
This is actually the least frustrating method because people who can't afford to pay are not as angry as people who paid and aren't getting served (like when sign-in emails don't arrive for hours or days), or people who have paid for a long time to suddenly see quality decrease.
But it might not be best for business: Having more users than you can handle might suck, but if you're popular enough, people are still gonna put up with it.
Bad for business, and probably unwise for the kind of product people pop their head in to check on, then stop paying for, and return to much later to see whether it's still not much more than a parlor trick for them.
I wish they would just rip off the bandaid to stop everybody's entitled whining.
"We're sorry, what we were able to give you for $100/mo before now needs to be $200/mo (or more). We miscalculated/we were too generous/gave too much away for too little. It's a new technology, we are seeing a ton of demand, we are trying to run a business, hope you understand. If you don't want it, don't pay for it."
> "We're sorry, what we were able to give you for $100/mo before now needs to be $200/mo (or more). We miscalculated/we were too generous/gave too much away for too little. It's a new technology, we are seeing a ton of demand, we are trying to run a business, hope you understand. If you don't want it, don't pay for it."
Anthropic's thing has always been that they are perceived as slightly ahead of the competition, if they 2X their pricing then the competition that used to be "slightly worse" suddenly becomes an absolute bargain and guts their user base.
This is my take too, although I'm not prepared for a max400 reality to replace the max200, but... I hate all of the whingeing. Piggies at the buffet line seem to be the loudest on this subject.
can't tell if you're being facetious but yes, there's not enough cash in the world to double energy/silicon fab capacity in a year. Infrastructure takes time, hardware is hard, and you have to be willing to bet that the demand will be there 5 years from now to make an investment today.
Honestly, I wish they couldn’t subsidize with VC cash and such and offer below cost to begin with. Like I wish it were illegal. Basically this allows things like Uber, more or less putting taxis out of business and then being worse than what they replaced.
I’d like to see a lot more than entitled whining. I would like to see the fist of regulation slammed down on the back of these tech shenanigans, where they know they’ll never be able to match the prices they’re starting with.
I wish they would too. I’d respect them more for the transparency. I think everyone’s enshittification sensors have rightly been dialed up over the years. So without explanations for the regressions, it just feels like another example.
It sounds more like a "driver program" gatekeeper so you are arguing about semantics. I'm not claiming that there is no problem, just that an argument based on the distinction between "hardware" and "driver" is void.
https://en.wikipedia.org/wiki/Mens_rea#Levels_of_mens_rea_wi... is relevant here. There are exceptions - I don't know the specifics especially in relation to US law - but the starting point is that unknowingly causing the situation to exist doesn't make you guilty automatically. You have to intend it.
> Doesn't this mean that solar/wind are insanely lucrative?
This is how markets are supposed to work. It provides an economic incentive for production to increase, which is what we want.
Consider what happens if you develop a farming method to produce potatoes for a fraction of the usual cost, but you can only meet 10% of total demand at your local market. What price are you going to sell your potatoes for when you show up to the market? You (like any free market seller) want to maximise your return, so you'll be able to sell for a fraction under the previous market rate, undercutting everyone else. Your farming method would be extremely lucrative.
Sure, but those same free markets will happily see those expensive producers go out of business. In the electricity scenario, that would mean blackouts.
If you triple the price, a new gas plant doesn't appear out of thin air. And the result won't really be lower consumption either, because most people would be on fixed-rate contracts (I'm not in the UK so I don't know specifically, but this is very common elsewhere).
> Sure, but those same free markets will happily see those expensive producers go out of business.
No, because remember you are only able to meet 10% of market demand. The expensive producers will still get 90% of the business, and the market price for their product will remain basically the same. This is what we observe in the electricity markets today: the price to us is the cost of the most expensive product. The cheaper producers who cannot meet the full market demand still get to sell at the cost of the most expensive product.
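That marginal-pricing mechanism (a "merit order" auction) can be sketched in a few lines. The costs and capacities below are invented for illustration; the point is that the cheap producer is paid the price set by the most expensive dispatched unit:

```python
# Toy merit-order auction: every dispatched producer is paid the marginal
# cost of the most expensive unit needed to meet demand.
def clearing_price(offers, demand):
    """offers: list of (marginal_cost, capacity); returns (price, dispatched)."""
    dispatched = []
    remaining = demand
    for cost, capacity in sorted(offers):        # dispatch cheapest first
        if remaining <= 0:
            break
        take = min(capacity, remaining)
        dispatched.append((cost, take))
        remaining -= take
    price = dispatched[-1][0]                    # most expensive dispatched unit
    return price, dispatched

# Hypothetical numbers: solar covers only 10% of demand at near-zero cost,
# so gas still sets the price, and solar pockets the difference.
offers = [(5, 10), (60, 50), (80, 60)]           # (cost £/MWh, capacity MW)
price, mix = clearing_price(offers, 100)
print(price)                                     # 80: everyone is paid £80/MWh
```

The solar producer sells 10 MW at £80 against a £5 cost, which is the "insanely lucrative" part; the gas plants at the margin barely break even.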
> The cheaper producers who cannot meet the full market demand still get to sell at the cost of the most expensive product.
Which would mean it's super lucrative and your same laws of economics will tell you that that means they'll be building like crazy.
My whole point was that as soon as you get to a day where no gas is needed, you've lost the ability to react quickly, because no supplier will leave a gas plant sitting around just for that.
The markets accommodate that, though. Market participants buy energy futures because they do need guaranteed future energy. Solar and wind producers cannot sell such futures (or rather, they can, but will be forced to buy from gas plants to fulfil them when they can't generate). So in practice, wholesale buyers continue to buy a mix.
Yes, but here’s the thing: you don’t have a monopoly over your potato farming method. Lots of new farms are built, and the more that do, the more the average price of a potato drops. Your expected return starts to drop. Yours - and everyone else’s - profit margins get squeezed.
Investors begin to refuse to build new potato farms because a return on their investment gets worse whenever anyone decides to build a new farm.
But the people need potatoes and more potato farms! The government issues an incentive scheme to guarantee a minimum price for each potato sold. Potential farm owners bid against each other for the lowest price, but it means they can build a farm and expect to break even.
> Investors begin to refuse to build new potato farms because a return on their investment gets worse whenever anyone decides to build a new farm.
If they all refuse, then they're leaving money on the table. One investor could invest in 10% of production only, and that would be very lucrative. It would be exactly my low-cost-potato scenario.
In practice, they don't all refuse, or all invest. The market finds a balance. In time, producers switch to the new method, because anybody who doesn't leaves an opportunity for someone else to take their business and make more money.
This takes time, though. If we want things to go quicker, then we need to guarantee return on investment for longer, which is exactly what the government does by guaranteeing prices to renewable energy producers.
> This takes time, though. If we want things to go quicker, then we need to guarantee return on investment for longer, which is exactly what the government does by guaranteeing prices to renewable energy producers.
Yes exactly. The incentives to renewables producers exist to ensure accelerated growth. This may mean we are paying more for renewables in the short term (though no more than fossil fuels) but the investment should pay dividends in future.
> This might be obvious, but all of those things have a single common denominator: Microsoft, over you, getting to decide what your computer is doing.
Sure, but Microsoft have to strike a balance, too. If they push too hard in this direction, they'll lose their users to Macs on one side (probably the majority) and Linux on the other (a minority in number, but perhaps significant in expertise and clout). Once an exodus begins, it's much harder to stop. So where we are in that balance, and the state of user mindshare migration, is still interesting to discuss.
You cannot git push something that is not committed. The solution is to commit often (and do it over ssh if you forget on a remote system). It doesn't need to be a presentable commit. That can be cleaned up later. I use `git commit -amwip` all the time.
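As a sketch of that workflow (run in a throwaway directory; the file names, counts, and messages are made up):

```shell
set -e
repo=$(mktemp -d)                     # throwaway repo just for the demo
cd "$repo" && git init -q
git config user.email dev@example.com
git config user.name  dev

echo one > notes.txt
git add notes.txt && git commit -qm "start"

# Checkpoint often; the message can be junk for now
echo two   > notes.txt && git commit -qam wip
echo three > notes.txt && git commit -qam wip

# Clean up later: squash the wip commits into one presentable commit
git reset --soft HEAD~2
git commit -qm "Flesh out notes"
git log --oneline                     # two tidy commits remain
```

On a real branch you'd push the wip commits somewhere as a backup first, then rewrite and `git push --force-with-lease` once they're squashed.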
Sure, you might neglect to add a file to your commit, or commit at all, but that's a problem whether you're pushing to a central public git forge or not.
TCP has an "urgent data" feature that might have been used for this kind of thing; it's what Ctrl-C in telnet used, etc. It can be used to bypass any pending send buffer and be received by the server ahead of any unread data.
Just googling it now and TCP urgent data seems to be a mess.
Reading the original RFC 793, it's clear that the intention was never for this to be OOB data, but to inform the receiver that they should consume as much data as possible and minimally process/buffer it locally until they have read up to the urgent data.
However, the way it was historically implemented as OOB data seems to be significantly more useful - you could send flow control messaging to be processed immediately even if you knew the receiving side had a lot of data to consume before it'd see an inline message.
It seems nowadays the advice is just to not use urgent data at all.
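For what it's worth, here's a minimal Linux-flavored sketch of the OOB-style usage over loopback (payloads invented; behavior is notoriously platform-dependent, which is part of why the advice is to avoid it):

```python
import select
import socket

# Loopback pair: client sends bulk data the server hasn't read yet,
# then a single urgent byte (TCP only carries one OOB byte at a time).
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
cli = socket.socket()
cli.connect(srv.getsockname())
conn, _ = srv.accept()

cli.sendall(b"bulk data the server has not read yet")
cli.send(b"!", socket.MSG_OOB)          # the urgent byte

# Urgent data shows up as an "exceptional condition" in select()
_, _, exc = select.select([], [], [conn], 5)
assert exc, "no urgent data signalled"

urgent = conn.recv(1, socket.MSG_OOB)   # read the urgent byte out of band
normal = conn.recv(4096)                # then the in-band stream, minus that byte
print(urgent, normal)
```

With `SO_OOBINLINE` off (the default), the urgent byte is pulled out of the in-band stream entirely, which is the "processed ahead of unread data" behavior described above.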
Unfortunately there can be many buffers between you and the server which "urgent data" doesn't skip, by design. (There were also lots of implementation problems.)
It absolutely could have happened when the ecosystem norm is `curl https://third.party/installer|sudo sh`. That was the normal method for third parties to ship software before snaps came along.
We have Flatpaks to solve this problem too now, but AFAICT, while Flatpak does support sandboxing, the UX is such that most non-power-users aren't enforcing sandboxing on the Flatpaks they install, so in practice the feature isn't present where it's most needed.
Whether they are derivative works in the context of copyright law (which the GPL relies upon) has not yet been decided by the courts AFAICT. So your assertion may be your personal opinion but we don't know if the law agrees or not yet. From some quick searches it seems that the answer isn't a slam dunk one way or another and is still working its way through the courts.