
Why is error correction seen as an exclusively digital thing? All sorts of things have error correction.


I believe Deutsch might argue that when error correction is implemented, a system becomes digital. Take Babbage's early computers, for example: "thus Babbage's computers assigned only ten different meanings to the whole continuum of angles at which a cogwheel might be oriented. Making the representation digital in that way allowed the cogs to carry out error-correction automatically"
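
A quick sketch of that mechanism (Python, and entirely my own illustration, not anything from the book): snapping a cog's continuous angle to the nearest of ten digit positions discards any drift smaller than half a position, so small errors never accumulate:

    # Toy illustration: a cogwheel digit read by snapping the continuous
    # angle to the nearest of 10 positions (36 degrees apart).
    def read_digit(angle_deg):
        return round(angle_deg / 36.0) % 10

    drifted = 3 * 36.0 + 9.0      # digit 3, plus 9 degrees of mechanical slop
    print(read_digit(drifted))    # -> 3: the slop is discarded, not propagated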

What examples are you thinking of?


Interesting take on it. I was thinking mainly of DNA, with its explicit error-correcting machinery that tries to fix replication mistakes (at least as I understand it; I don't do much of the squishy science), but then I branched off into control loops. Sure, if you're talking encodings, entropy, etc., then digital is kind of implied, but I don't think the quote I replied to requires such things:

> in many fields there comes a point when one of the incremental improvements in a system of knowledge or technology causes a sudden increase in reach, making it a universal system in the relevant domain

If you're learning to balance a stick on your finger, there's no digital algorithm executing. You're just trying to tune your feedback loop to move the base of the inverted pendulum so as to keep it upright. Yet there's a clear point where you move from being able to balance the stick for 1 second, to 2 seconds, to 5 seconds, and suddenly you can now do it indefinitely until you mess up or get bored.
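
That feedback loop is easy to caricature in code (a toy sketch; the model, gains, and Euler integration are my own simplifications, nothing from the thread): a proportional-derivative controller that accelerates the pendulum's base in proportion to the tilt and its rate of change:

    import math

    # Toy feedback loop: a PD controller accelerating the base of an
    # inverted pendulum to keep it upright. Gains are my own guesses.
    g, L = 9.81, 1.0          # gravity (m/s^2), stick length (m)
    kp, kd = 40.0, 10.0       # feedback gains; too small and the stick falls
    dt = 0.001                # integration step (s)

    theta, omega = 0.2, 0.0   # initial tilt (rad) and angular velocity (rad/s)
    for _ in range(int(5.0 / dt)):
        u = kp * theta + kd * omega                        # base acceleration
        alpha = (g * math.sin(theta) - u * math.cos(theta)) / L
        omega += alpha * dt                                # Euler integration
        theta += omega * dt

    print(f"tilt after 5 s: {theta:.6f} rad")              # ~0: still upright

Nothing in the loop is digital; everything is a continuous quantity. But in this toy model the jump from "falls over in seconds" to "stays up indefinitely" is abrupt: once kp exceeds g (with kd positive) the linearized closed loop is stable, and below that no amount of patience keeps the stick up, which is one way a smooth change in a parameter produces a sudden increase in reach.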


That stick carries no data, so this isn't "error correction" in the sense the term normally carries in this kind of context. And if you tried to encode any data in the angle, it would immediately be destroyed. Or, if you distinguished between "stick up" and "stick not up", you'd now be applying error correction to a digital system with two states.



