windlessstorm's comments

But we convey a lot at a much higher bandwidth, because we use parallel channels, each with its own low-bitrate limitation.


Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth. - Arthur Conan Doyle


That is such a useless quote.

I throw a ball to my friend. The ball doesn't turn into chocolate and fly away, etc.

What happened to the ball? Caught or not?

To take the quote literally, neither: the rarest bird on earth capable of catching the ball swoops down and lifts it away while I win the lottery, and so on.


Site got taken down by the big HN hug, I guess.


Let us know here if you are planning to do it :)


All code pushes have been stuck for 30 minutes. Should have started hosting our own instance.


This makes me wonder how information is stored in nature. Like, how are the characteristics of the particles and fields, and all the interaction rules, stored or embedded?

If we knew how many bits nature takes to store some data, and how it stores them, could we use this knowledge of the structure to compress the data and store it more optimally? Or does nature have the most optimal storage ever?


Statistical mechanics is basically the study of how nature stores information. Or at least, what this means for certain kinds of physics.

This article is about relatively new things, but if you rewind 100 years, you find people thinking hard about this in simpler contexts. Gibbs is the big name; he essentially re-wrote thermodynamics (the study of heat, which until then had been thought of as some kind of invisible fluid) in terms of the statistics of microscopic particles. This often involves counting the number of different possible states, for instance of all the atoms in a gas. Gases whose molecules have two atoms (like N_2, compared to He) have a larger heat capacity precisely because there are more different ways these bi-atoms can be oriented, i.e. more information is needed to write down their states completely.

The older, thermodynamic, description is in a sense an optimally compressed representation of this microscopic picture. It keeps only what little information is visible to giants like us, who cannot see the individual atoms.
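The state-counting argument above can be sketched numerically with the classical equipartition theorem: each quadratic degree of freedom contributes R/2 to the molar heat capacity at constant volume. A minimal sketch (my own illustration, not from the comment; it ignores quantum freeze-out and vibrational modes):

```python
# Equipartition sketch: each quadratic degree of freedom contributes R/2
# to the molar heat capacity at constant volume (classical approximation).
R = 8.314  # gas constant, J/(mol*K)

def molar_cv(translational=3, rotational=0):
    """Molar heat capacity C_v from counted degrees of freedom."""
    return (translational + rotational) * R / 2

# Monatomic He: 3 translational degrees of freedom -> C_v = 3R/2.
cv_he = molar_cv()

# Diatomic N2: 3 translational + 2 rotational -> C_v = 5R/2.
# The extra rotational states are exactly the "more ways to orient
# the bi-atoms" mentioned above: more states to count, more heat stored.
cv_n2 = molar_cv(rotational=2)

print(cv_he, cv_n2)
```

The diatomic gas comes out with a larger C_v purely because more numbers are needed to specify each molecule's state.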


Scientists don't really ask where nature stores the laws it operates under. This is because the laws of nature (such as general relativity) are human conceptions: we don't think these laws are the true laws of nature, only approximations to them. If the actual laws of nature are different from the ones found in science textbooks, they probably have different storage requirements. For instance, right now physics theories use a number of different constants, such as the masses of all the massive particles (electrons, quarks, neutrinos). But we think the true laws of nature will require fewer constants to define, or better still no constants at all; the numbers will emerge automatically from some consistency requirements. It seems pointless to go looking for places where nature is storing the electron mass when there might be no such place. In other words, don't confuse your map of reality with reality itself.


Along these lines, also see the Anthropic Principle [0] which states that "observations of the universe must be compatible with the conscious and sapient life that observes it."

[0] [RABBIT HOLE WARNING] https://en.wikipedia.org/wiki/Anthropic_principle


Also a silly, unfalsifiable argument without any external reference frame. It might well be that the universe is ultimately incompatible with sentient life, and just incidentally appears compatible in the current timeframe.

The simpler version of it is "God's will".


> Like how the characteristics of the particles and fields and all the interaction rules are stored or embedded?

That's an interesting thought; I assume the storage is the particles themselves. An electron acts like an electron because it is an electron. You'd have to change the particle itself to get it to act some other way.

That said, we are already exploring some ways to store information like nature: https://en.wikipedia.org/wiki/DNA_digital_data_storage


So that's where my question is. Can we store the complete information an electron carries optimally, like in a space smaller than the electron itself? This leads to the question: is it possible to save the current state of the universe in a space smaller than the universe without losing any information?


Well, for starters, the holographic principle hints at the sheer level of symmetry in the state space of the universe -- so redundant that a whole dimension can be shaved off. (3d space -> 2d space.)


Or let's call these hoppers. There would be small hopper stations every five or ten blocks. You hop into one hopper, and it flies you to a nearby station, avoiding all the traffic and zig-zag roads. You hop into another hopper to incrementally get to your destination. This way these drones don't need huge batteries or any other complicated features for long-distance flight.


Tried similar stuff a few months back. Start a simple socket listening on port 80. Use any browser to make a connection. After receiving a connection on the server, send a properly formatted HTTP response (precede it with 200 OK, and it must include Content-Length; everything else I found was optional). Now scale this to include custom root locations. Extending this to handle all the specifications and the thousands of common protocols a server should support starts to feel like a headache, so that's where I stopped.

Next I wanted to replace the browser with my own client. Soon realized how huge an obstacle I was heading towards. Got bogged down by the complexity of even a simple browser and still haven't gotten started on this. Maybe someday :D
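The first experiment described above (raw socket, one connection, hand-written 200 OK with Content-Length) can be sketched in a few lines. This is my own minimal reconstruction, not the commenter's actual code; the port and body are arbitrary:

```python
import socket

def serve_once(port=8080):
    """Accept a single connection and reply with a minimal HTTP response.

    As the comment notes, a status line and Content-Length are enough
    for a browser to render the body; most other headers are optional.
    """
    body = b"hello"
    response = (
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Length: " + str(len(body)).encode() + b"\r\n"
        b"Content-Type: text/plain\r\n"
        b"\r\n" + body
    )
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.recv(4096)      # read (and ignore) the browser's request
            conn.sendall(response)
```

Pointing a browser at http://127.0.0.1:8080/ after calling `serve_once()` shows the body; everything beyond this (persistent connections, chunked encoding, virtual hosts) is where the headache begins.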


Tiny microprocessors have been in almost all devices supporting DMA for a long time, right? So what do these RISC-V versions bring new to the table?


> So what do these RISC-V versions bring new to the table?

Not much, except being open, so WD can produce them without paying anyone royalties.


But in that case, there would not be much of a point in making such an announcement, right? From a user perspective, I do not care at all what ISA the microcontroller inside the HDD/SSD uses if it is not user accessible.

Unless they pass the savings on to their customers, that is. And even then, I am not so sure. Shaving a couple of cents off the price of a disk drive does not seem like a big deal to me.

Their stockholders might care, though.


Depends on what you care about, I suppose. I find it interesting, because I am interested in low-power processors.

This move, if it works well for WD, could lead to more attention being paid to a more open competitor to ARM, which would provide some competition and put downward pressure on ARM pricing. That, in turn, could have some potentially interesting second-order effects.

But yeah, if you only care about consumer prices and visible features, this is probably pretty boring stuff.


Mmmmh, now that I think of it: does WD have their own fabs, or do they buy their chips from other vendors?

And if it's the latter - would WD buying a couple of billion chips a year have any effect on prices?

And now that you mention it: a company like WD announcing they will use RISC-V in their disks means they are serious about this, which in turn might make it easier for others to consider RISC-V a serious option.

I am very excited about RISC-V in theory, but unless somebody builds a "Raspberry-V", so to speak, it will probably be a long time before I get to play with one of these. I also think a high-performance implementation of RISC-V could make for an attractive component of a desktop machine / workstation. The Talos Raptor / II seems to be a sweet machine, but it is totally outside my budget. A less-high-end machine built around a RISC-V might change the equation.


nullprogram.com

