Hacker News | dgan's comments

> For you, the turquoise is green!

No, turquoise is turquoise. You gave me two options, and you act like I didn't know that word existed


This is the first commercial website where I just couldn't find the price, nor the "buy" button...

The UK site has them out of stock, and about 350 quid.

So I won't be replacing my Aeropress or Bialetti pot with one this month.


They linked to the manual page; if you go to the root of the site, there's a link to a different domain, which is the "Official authorised online store for all Cafelat items."

Can't wait to look like an idiot after sending a heart to my boss while trying to select the hostname

Maybe he will like it! Who knows?!

I bought NAC on a whim just to try it. I don't know about the chemical interactions, but going out with colleagues at that time taught me that it's basically impossible to get drunk on it. Usually a pint of beer is enough to make me feel at least a little dizzy, but when taking NAC, it was all like drinking water

If you all think NAC is great, wait till you try liposomal glutathione (glutathione is one of the things NAC is a precursor for, one of the general take-out-the-trash compounds for your cells). Of all the supplements I’ve tried, it has probably the most immediately noticeable positive effect (maybe because you take it by leaving it under your tongue to be absorbed sublingually for a bit before swallowing). Generally leaves me feeling great, even if I was kind of dragging and tired beforehand.

NAC taken before consuming alcohol has a positive effect apparently, but taken afterwards it's detrimental as mentioned here: https://en.wikipedia.org/wiki/Acetylcysteine


Took me a while, because I pronounce "Pfizer" as "pfee-tseh-r" in my head

That's the original pronunciation

Not sure why this is voted down, it's true.

On mice.

Just a note: “research about the safety of taking NAC every day for the long term is limited.” cf. a concerning 2019 animal study regarding higher risks of cancer https://doi.org/10.1172/jci.insight.127647 also discussed at https://www.science.org/content/blog-post/n-acetyl-cysteine-...

Same! I thought I was going crazy but the effect is clear and reproducible. My hangovers are also less bad.

When I go out drinking with my pharmacist buddy, we take NAC before going out. He swears it makes hangovers less likely. I can't say I've noticed that particular effect, but I do seem to sleep a bit better on those nights.

I don't have an opinion on the efficacy of such poisoning, but your comment is about as useful as "when being violently attacked, do not resist, as you only make yourself suffer for longer"

I think it is pretty obvious that the challenge with all abstract mathematics in general, and category theory in particular, isn't that people don't understand what a "linear order" is, but that it is so distant from daily routine that it seems completely pointless. It's like pouring water over perfectly smooth glass

You're more right than you'd think. The whole point of mathematics is precise thinking, yet the article is very inaccurate.

Nobody seems to care or notice. I'm watching in disbelief as nobody points out that the article is full of inaccuracies. See my sibling thread for a (very) incomplete list, which should disqualify this as serious reading: https://news.ycombinator.com/item?id=47814213

My conclusion can only be that this ought to be useless for the general practitioner, since even wrong mathematics gets appreciated the same as correct mathematics.


> Nobody seems to care or notice. I'm watching in disbelief as nobody points out that the article is full of inaccuracies.

I don't know. I finished my graduate studies in math a few years ago, and pretty much every textbook by well-known mathematicians was packed with errors. I just stopped caring so much about inaccuracies. Every math book is going to have them. Human beings are imperfect, and great mathematicians are no exception. I'd just download the errata from the uni website and keep it open while reading.


Is there a "mind-blowing fact" about category theory? Like the first time I heard that one can prove, with group theory, that there is no general solution in radicals for polynomial equations of degree 5 or higher, it was mind-blowing. What's the counterpart for category theory?

A thing is its relationships. (Yoneda lemma.) Keep track of how an object connects to everything else, and you’ve recovered the object itself, up to isomorphism. It’s why mathematicians study things by probing them: a group by its actions, a space by the maps into it, a scheme in algebraic geometry defined as the rule for what maps into it look like. (You do need the full pattern of connections, not just a list — two different rings can have the same modules, for instance.) [0]
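A concrete way to see this, scaled all the way down: in a preorder viewed as a category, "all the arrows into n" is just the set of elements below n. A hypothetical Python sketch (the universe and names are mine) for the divisibility order:

```python
# Baby Yoneda in the divisibility preorder on {1, ..., 12}: an object is
# pinned down, up to isomorphism, by the set of arrows into it, which here
# is just its set of divisors.

UNIVERSE = range(1, 13)

def hom_into(n):
    """All arrows x -> n in the divisibility category: the divisors of n."""
    return frozenset(x for x in UNIVERSE if n % x == 0)

# Group objects by their incoming-arrow profile.
profiles = {}
for n in UNIVERSE:
    profiles.setdefault(hom_into(n), []).append(n)

# Each profile identifies exactly one object: same arrows in, same object.
assert all(len(objs) == 1 for objs in profiles.values())
```

The real lemma is far stronger (it tracks the functorial action, not just the bare sets), but the slogan survives the simplification.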

Writing a program and proving a theorem are the same act. (Curry–Howard–Lambek.) For well-behaved programs, every program is a proof of something and every proof is a program. The match is exact for simple typed languages and leaks a bit once you add general recursion (an infinite loop “proves” anything in Haskell), but the underlying identity is real. Lambek added the third leg: these are also morphisms in a category. [1]
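The slogan is easiest to see with the terms written out. A hedged Lean 4 sketch (names are mine): the K combinator and the tautology A → (B → A) have literally the same body.

```lean
-- Curry–Howard in miniature: one lambda term, two readings.

-- As a program: the constant-function combinator K.
def k {α β : Type} (a : α) (_ : β) : α := a

-- As a proof: every A implies (every B implies A).
theorem aImpBImpA (A B : Prop) : A → B → A := fun a _ => a
```

The arrow type of the program and the implication in the proposition are the same connective; that coincidence is the correspondence.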

Algebra and geometry are one thing wearing different costumes. (Stone duality and cousins.) A system of equations and the shape it cuts out aren’t related, they’re the same object seen from opposite sides. Grothendieck rebuilt algebraic geometry on this idea, with schemes (so you can do geometry on the integers themselves) and étale cohomology (topological invariants for shapes with no actual topology). His student Deligne used that machinery to settle the Weil conjectures in 1974. Wiles’s Fermat proof lives in the same world, though it leans on much more than the categorical foundations. [2]

[0] https://en.wikipedia.org/wiki/Yoneda_lemma

[1] https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspon...

[2] https://en.wikipedia.org/wiki/Stone_duality


We should call it the "relationship lemma". That way its function is contained within its name, and it would not require the definition step every time.

We should strive to name all things by their function, not by their inventor or discoverer, IMO. But people like their ribbons.


In my field of study, it's basically never the case that a person names the thing after themselves. My theory goes: often a discovery is presented in a paper by someone, who gives it a usually only barely passable name. For a time, only a handful of experts in the field know about it, and none of them care to write general explainers for the layman. So they call it what's easy, "[Name] [concept]", because they're used to talking in names all the time. Academic experts carry a large library of people's names tied to the concepts in their papers; I know my PI certainly did, since every query was met with the name of whoever had solved it, to go look up.

Anyway, the discussion begins with these people, who all use the name to reference the paper that contains the result. As the discussion expands, it remains centered on this group, and you have to talk _with_ them and not at them, so you use the name they do. This usage slowly spreads until eventually it gets written into a textbook, taught to grad students, then to undergrads, and at that point it becomes hopeless to change the name.

I share the frustration with naming; we can come up with much better names for things now. But until we give stipend bonuses for good naming, the experts will never care to do so. Still, I wholeheartedly disagree that the problem as a whole can be reduced to "people like their ribbons". Naming something after yourself is gauche and would not be tolerated, in my field at least. The other professors would create a better name simply out of spite for your greed.


Well, this is more applied and less straightforwardly categorical, but thinking along the lines of looking solely at compositional structure, rather than at all the properties of functions we usually take as semantic bedrock in functional programming (namely referential transparency), is how you start doing neat arrowized tricks, like tracking state in the middle of a big hitherto-functional pipeline. For instance, automata (functions which return a new state/function alongside a value) can be neatly woven into pipelines composed via arrow composition, in a way they can't be in a pipeline composed via function composition.
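A minimal Python sketch of the automaton trick (all names hypothetical): each stage returns a value plus its own successor, and the composition operator threads that extra output through, exactly where plain function composition has nothing to thread.

```python
# Automaton-style stages: each call returns (output, next_version_of_stage).

def compose(f, g):
    """Pipe f into g, threading each stage's successor along."""
    def step(x):
        y, f_next = f(x)
        z, g_next = g(y)
        return z, compose(f_next, g_next)
    return step

def counter(n=0):
    """Stateful stage: tags each input with how many inputs came before it."""
    def step(x):
        return (x, n), counter(n + 1)
    return step

def double():
    """Stateless stage written in the same shape."""
    def step(pair):
        x, seen = pair
        return (x * 2, seen), double()
    return step

pipe = compose(counter(), double())
out1, pipe = pipe(10)
out2, pipe = pipe(11)
assert out1 == (20, 0) and out2 == (22, 1)  # state rides through the pipeline
```

This is essentially Mealy-machine composition, the thing arrows package up: `compose` only needs the compositional interface, not referential transparency.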

https://en.wikipedia.org/wiki/Abstract_nonsense

https://math.stackexchange.com/questions/823289/abstract-non...

Sometimes the proof in category theory is trivial, but we have no lower-dimensional or concrete intuition as to why it is true. This whole state of affairs is called abstract nonsense.


I think that CT is more akin to a different language for mathematics than a solid set of axioms from which you can prove things. The most fact-y proof I've personally seen was that you can't extend the usual definition of functions in set theory to work with parametric polymorphism (not just that some constructions won't work, but that there isn't one at all).

Well, group theory is a special case of category theory. A group is a one object category where all morphisms are invertible. You do group theory long enough and it leads you to start thinking about groupoids and monoids and categories more generally as well.

Sure, category theory can't prove the unsolvability of the quintic. But did you know that a monad is really just a monoid object in the monoidal category of endofunctors on the category of types of your favorite language?
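Spelled out for one concrete monad, the joke-definition is checkable. A hypothetical Python encoding of Maybe (None or a ("Just", x) pair), with unit as the monoid unit and join as the monoid multiplication:

```python
# Maybe as a monoid object: unit (eta) and join (mu) on the endofunctor T,
# where T a is either None or ("Just", x).

def unit(x):        # eta : a -> T a
    return ("Just", x)

def fmap(f, mx):    # the functor action of T
    return None if mx is None else ("Just", f(mx[1]))

def join(mmx):      # mu : T (T a) -> T a
    return None if mmx is None else mmx[1]

mx = unit(42)
# Unit laws: mu . eta = id = mu . T(eta)
assert join(unit(mx)) == mx
assert join(fmap(unit, mx)) == mx
# Associativity: mu . mu = mu . T(mu), checked on one T^3 value
mmmx = unit(unit(mx))
assert join(join(mmmx)) == join(fmap(join, mmmx))
```

These are exactly the monoid laws, with composition of the endofunctor with itself playing the role of the monoid's binary operation.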

Isn't that just the definition?

I think they're making a joke

Phil?

One of the most striking things is that cartesian products of objects do not correspond to set-cartesian products. This to me was mind-blowing when studying schemes.
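A small, testable instance of products going off-script (hypothetical Python, my names): in the category whose objects are positive integers, with one arrow m → n whenever m divides n, the categorical product of a and b is gcd(a, b), not any kind of pair.

```python
# Categorical product in the divisibility category: the projections are
# "p divides a" and "p divides b"; the universal property says every
# common divisor factors through p.
from math import gcd

def divides(m, n):
    return n % m == 0

def is_product(p, a, b, universe):
    if not (divides(p, a) and divides(p, b)):   # both projections must exist
        return False
    return all(divides(c, p) for c in universe
               if divides(c, a) and divides(c, b))

universe = range(1, 61)
assert is_product(gcd(12, 18), 12, 18, universe)   # 6 is the product of 12 and 18
assert not is_product(3, 12, 18, universe)         # a mere common divisor is not
```

With schemes the discrepancy is more dramatic (the fiber product's underlying set is not the set-theoretic fiber product), but the mechanism is the same: the product is whatever the universal property says it is.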

Just Yoneda Lemma. In fact it feels like the theory just restates Yoneda Lemma over and over in different ways.

And the number of things you can prove using the Yoneda lemma just shows how powerful category theory is.

How is this useful?

>so distant from daily routine that it seems completely pointless

IMO, this is a problem with how it's taught! Order theory is super useful in programming. The main challenge, beyond breaking past that barrier of perceived "pointlessness," is getting away from the totally ordered / "Comparator" view of the world. Preorders are powerful.

It gives us a different way to think about what "correct" means when we test. For example, state machine transitions can sometimes be viewed as a preorder. And if you can squeeze it into that shape, complicated tests can reduce down to asserting that <= holds. It usually takes a lot of thinking, because it IS far from the daily routine, but by the same rationale, forcing it into your daily routine makes it familiar. It lets you look at tests and go "oh, I bet that condition expression can be modeled as a preorder on [blah]"
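A minimal sketch of that move (hypothetical Python; the workflow and names are invented): generate the preorder as reflexive-transitive reachability over the allowed transitions, and the test becomes a single <= assertion.

```python
# Legal transitions of a toy review workflow.
EDGES = {"draft": {"review"}, "review": {"approved", "draft"}, "approved": set()}

def le(a, b, seen=None):
    """a <= b in the preorder generated by EDGES (reflexive + transitive)."""
    if a == b:
        return True
    seen = seen or {a}
    return any(n not in seen and le(n, b, seen | {n})
               for n in EDGES.get(a, ()))

# "A draft can eventually get approved" and "approval is final", as <=:
assert le("draft", "approved")
assert not le("approved", "draft")
```

Note that `le` is a genuine preorder, not a partial order: "draft" <= "review" and "review" <= "draft" both hold, and that's fine; antisymmetry is exactly the axiom you drop.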


You say pretty obvious, but it took me 2 years during my PhD to be consciously aware of this. And once I did, I immediately knew I wanted to leave my field as soon as I would finish.

I'm just curious. Do you play computer games?

I have played quite a lot of video games in the past yes. But not much anymore.

If I had a euro for every time I started writing a compiler and got lost in the parser weeds, I'd have... at least a couple of euros


Motivational comment to remind OP that his life matters, even and especially in difficult times


You talk like a bite must happen. It doesn't. Source: myself; we've had a dozen dogs. Among them: a Rottweiler, Newfoundlands, a Montagne des Pyrénées, a terrier, and dozens of Chihuahuas and Spitzes


I am on the latest Fedora GNOME, and tab switching between windows randomly gets stuck. It's so annoying that I had to go back to X11, even though it handles high-DPI laptops badly; the alternative was rebooting randomly in the middle of work

