Almondsetat's comments | Hacker News

your loss

I'm sorry, are you trusting an LLM to touch a camera that costs like a new car?

Only a little bit of touching for the really expensive one. The Credo 50 was less than 1K though.

Also, Phase One support/repair is absolutely phenomenal, and unless you toast the sensor, repairs are “fairly” economical.


Windows Subsystem for Linux

Says who? Not all devices can have the same level of repairability by laypeople. What if I complained that today's CPUs are too miniaturized and that in my time I could swap the individual vacuum tubes in case something went wrong?

If CPU failure were a leading cause of device obsolescence, your argument would make sense. Next, the EU or other regulators should explicitly regulate software mechanisms that prevent owners of a device from installing an alternate OS, enabling open-source or aftermarket OS developers to support devices that mainstream vendors no longer want to support.

>Says who?

The EU, just now.


So the EU is the objective truth of the universe, I guess

No, not everything can be repairable or replaceable, but batteries can and should be.

On Windows: download a 3GB exe and install

On Linux: add repository and install cuda-toolkit

Does that take a few hours?


We're so not accustomed to moon pictures taken with "normal" cameras. These almost look like 3D renders to me; it's incredible


Why is normal in quotes? Do you mean visible light vs. filtered monochrome with false-color output, or infrared/radio/X-ray like some other telescopes use? Is that the abnormal you are referring to? The Apollo images were taken with Hasselblad film cameras that were "normal" cameras[0].

[0]https://www.hasselblad.com/about/history/hasselblad-in-space...


I think they mean "consumer" or "off the shelf" as opposed to "custom built for the mission"; we've already seen iPhone and GoPro photos


What's custom built? They're using stock Nikon bodies and, as you've said, iPhones and GoPros. Where's the confusion?


A camera you can order on Amazon in the next 5 minutes


I am curious to see how being name-dropped by NASA affects the sales of these cameras. It's probably not as much as I am imagining, but still not an insignificant number. Probably less than the Nutella that accidentally flew off the shelf live on stream, straight across the shot




I understand what cameras were used. I'm asking why "normal" was in quotes. What else would there have been?


>This barely mentions Windows Forms

Apparently, you do too, since what you said is basically the same as what the article said (.NET wrapper for C#, fastest prototyping to date)


Can you build a computer at home?

There is absolutely nothing self-sufficient about computer hardware


Or generate electricity? Or grow enough food to survive? Medicines?

"Self-sufficiency" arguments coming from tech nerds are so tiring.


No, and that's the reason we're now paying twice what we paid a couple years ago. But I can write software at home.

We're already vulnerable to enshittification in so many areas, why increase the list? How does that work in my favor at all?


When people talk about software or computers being "fun" in the past, it reminds me of how advertisements for children's food talk about how their cereal brings "fun" to breakfast.

What does that even mean? They seem like empty words to me, from people too accustomed to TV commercials.


"fun" and "play" are ambiguous words in English.

There is the trivial meaning, where the subject of the sentence is apparently whiling away time, achieving nothing of note except perhaps pretending to be in an imaginary land.

Then there is another sense, one that includes the thrill of experimentation, the disappointment of failure, the doggedness of persistence, and the satisfaction of victory and success when the puzzle is complete, understood, and the whole thing is working as desired or expected. This is why we call programming "fun" and if you are having fun doing it for yourself, you should perhaps be very careful where you end up doing it for work, if you do.

You could do that on computers of the 1990s and still have the feeling of a broad system, but one which was not unfathomably deep. Those systems could be completely understood by one human brain, and being able to do that, or striving to, was enormously engaging. But people who waxed lyrical about such things were often seen as weirdos, and humans generally don't like that, so instead they reach for a word with universal meaning: "fun". Of course, words that have universal meaning, for which everyone has their own interpretation (though they may not be aware of it), ironically tend to lose all shared meaning in the strictest sense.

What's sometimes overlooked in the Smalltalk story is that Alan Kay was leading the "Learning Research Group", which is why he refers to educational theorists like Jean Piaget. In some of Alan's talks he goes into some detail showing how children can learn about calculus by watching and visualizing the acceleration of a ball as it falls and bounces. This sort of thing is a serious kind of fun because it actually has a positive benefit, much like sport does for many people.
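
To make that concrete, here is a rough sketch of the kind of exercise Kay describes (toy numbers of my own, not taken from his talks): sample a falling ball's height at each tick, then take differences. The first differences (velocity) grow linearly, while the second differences (acceleration) come out constant, which is exactly the calculus insight the children stumble into.

    g, dt = 9.8, 0.1  # gravity (m/s^2) and time step (s); illustrative values
    heights = [100 - 0.5 * g * (i * dt) ** 2 for i in range(6)]  # free fall from 100 m

    velocity = [b - a for a, b in zip(heights, heights[1:])]        # first differences
    acceleration = [b - a for a, b in zip(velocity, velocity[1:])]  # second differences

    print([round(h, 3) for h in heights])       # 100.0, 99.951, 99.804, ...
    print([round(v, 3) for v in velocity])      # -0.049, -0.147, -0.245, ... (linear)
    print([round(a, 3) for a in acceleration])  # -0.098 every time (constant)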

On the other hand, the use of the word in "making breakfast fun for children" in the advertising sense is a disgusting perversion, and is in no way reasonably comparable to the idea of "computers being fun in the past".

Now, if you'll excuse me, I'm going to have my breakfast consisting of dippy eggs and soldiers, and marvel at the viscosity.


Because the true goal is AGI, not just nice little tools to solve subsets of problems. The first company which can achieve human level intelligence will just be able to self-improve at such a rate as to create a gigantic moat


There’s no evidence that the current architectures will reach AGI levels.

Of course OpenAI wants you to think they will rule the world, but if we've reached the plateau of LLM capabilities regardless of the amount of compute we throw at them, then local models will soon be good enough.


> The first company which can achieve human level intelligence will just be able to...

They say prostitution is the oldest industry of all. We know how to achieve human-level intelligence quite well. The outstanding challenge is figuring out how to produce an energy-efficient human-level intelligence.


There's no particular reason to assume a human level AI would be able to improve itself any better than the thousands of human level humans that designed it.


Sure, but: that single AI with the intelligence of a top-tier engineer or scientist will have immediate access to all human knowledge. Plus, what do you think happens the moment it optimizes itself to run in 2, 4, 8, 16, etc. parallel instances?


Well, A) a "top-tier engineer/scientist" is a significant step above a generic human; B) the human engineers/scientists also have immediate access to the same database; C) the humans have been optimizing it for even longer, so what makes us think the AI can optimize itself by even a couple percent?

For example, if the number of AIs you can run per petaflop started to scale with the cube root of researcher-years, then even if your researcher AIs are quite fast and you can double your density in a couple years, hitting 5x will take a decade and hitting 10x will approach half a century.
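
As a toy sketch of that arithmetic (my own calibration and assumptions, not the exact figures above): suppose density D is the cube root of cumulative researcher-years R, and researcher-years accrue in proportion to the AIs running (dR/dt = k * D). Solving that gives D growing like the square root of time, so calibrating to a doubling in ~2 years:

    def time_of_multiple(m, t1):
        # D(t) = sqrt(t / t1), where t1 is when density hits 1x, so t = m^2 * t1
        return m * m * t1

    # 1x at t1 and 2x at 4*t1, so the doubling takes 3*t1 = 2 years => t1 = 2/3
    t1 = 2.0 / 3.0

    for m in (2, 5, 10):
        print(f"{m}x density ~{time_of_multiple(m, t1) - t1:.0f} years after 1x")
    # -> 2x after ~2 years, 5x after ~16 years, 10x after ~66 years:
    #    the same ballpark as the decade / half-a-century figures above.

The exact numbers move with the calibration, but the shape is the point: sublinear returns make each further multiple disproportionately expensive.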

