Hacker News | tednoob's comments

It seems it is not just rest but that she had some procedure done as well. https://youtube.com/shorts/ndhu7uo3PrI?si=pPzpDk4lLw3eRf0j


Indeed, she apparently was not getting much better until she got the nerve block procedure a few months ago.


My sister had ME/CFS after she burned out following her second kid. She was never permanently bedridden but sometimes had to spend days resting or recovering after strain. She's not well today, but much better than at her worst. There wasn't really an accepted diagnosis for it when my sister got it, and she had to fight to be recognised as sick.

I do wish Dianna the best recovery and future progress.


It happened to me, though I resigned when I hit burnout during covid. My whole identity was just being good at my job, and then I was no longer that. In part I think some blame is also to be placed on these companies who try to make the employees feel like a tribe or family. Since I've always been alone it was easy to slip into that false sense of belonging.


I'm sorry that happened to you. My own experience with burnout was pretty damning, but oddly, that happened with a career that was far more aligned with who I really am than my current career. There was a click, for me, that made me realize I cannot define myself by what I do for a paycheck and since then, my current career rarely comes up in IRL conversation, contrary to my HN history (which has more to do with my job being tech-related, so it fits in the context of HN comments).

But you touched on something that I struggled with for years: a sense of belonging. Humans are, by nature, fairly tribal. That's both a good and bad thing. However, we as individuals have to be mindful about how much we are acting on our sense of belonging. At the extreme end, when we let our desire to belong to something larger than ourselves call the shots, we tend to get radicalized or fall into religious zealotry. On a more day-to-day level, our sense of belonging can drive us to seek external validation from people who simply will not offer it, which spawns things like discontent and resentment that cause more irrational behavior and damage our self-worth. It's a slippery slope.

What I have found is that being mindful about self-validation helps mitigate that. Reminding myself that I am good enough despite my flaws, that I was not born to toil/be busy/make someone else rich, and that my experiences and perspectives are valuable to me has become a set of tools that help me make decisions about work/tasks that strategically avoid burnout. I never offer too much, and I know my limits very well at this point. The result is that most people see and respect that about me, while the ones that do not will quickly lose interest and move on to find someone they can successfully abuse.


> Since I've always been alone it was easy to slip into that false sense of belonging.

Same thing happened to me. Work was the first place where I felt I actually belonged and knew my own worth. It can be very intoxicating.


Do you now have more of a personality outside work?

I’m going through this now, just resigned due to burnout while being a “rockstar developer” with no life recently.


Is this method used during training? It seems to me there could be a point in only backpropagating when the model is wrong?


The model is always wrong, since it predicts a probability distribution over all possible tokens, but the target has 100% probability for one token and 0 for all others.
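A minimal sketch of this point, with made-up logits over a toy four-token vocabulary: because softmax assigns non-zero probability to every token, the cross-entropy loss against a one-hot target is always strictly positive, so there is always a gradient.

```python
import math

# Hypothetical logits from a model over a toy 4-token vocabulary.
logits = [2.0, 0.5, -1.0, 0.1]
target = 0  # index of the "true" next token (one-hot target)

# Softmax: the model's predicted probability distribution.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Cross-entropy against a one-hot target reduces to -log(p_target).
loss = -math.log(probs[target])

# Softmax gives every token non-zero probability, so p_target < 1
# and the loss is strictly positive, even for a "correct" prediction.
assert all(p > 0 for p in probs)
assert loss > 0
```

The loss only approaches zero asymptotically as the target token's logit dominates; it never actually reaches it.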


I mean, this is implicit in backpropagation: you need to store gradients anyway, but if you get to zero loss then you are just done.


A reverse Turing test. A discussion with a computer to convince it you are a sane human.


This comment really made me chuckle. Well done =)


You have two adversaries. Adversary A has control over your hardware source, and adversary B has control over /dev/urandom. If you XOR them, A and B must cooperate to compromise your random generator. You can combine as many sources of randomness as you want, and each one increases the difficulty for an adversary to defeat your generator.
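A minimal sketch of the idea, with `secrets` and `os.urandom` standing in for the two independent sources: XOR-ing two equal-length byte strings yields a uniformly random result as long as at least one input is uniformly random and independent of the other.

```python
import os
import secrets

def combine(a: bytes, b: bytes) -> bytes:
    # XOR two equal-length byte strings. If either input is uniformly
    # random and independent of the other, the output is uniformly
    # random, so an attacker must control BOTH sources to bias it.
    assert len(a) == len(b)
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-ins for a hardware RNG (adversary A's source) and
# /dev/urandom (adversary B's source).
hw_bytes = secrets.token_bytes(32)
urandom_bytes = os.urandom(32)
key = combine(hw_bytes, urandom_bytes)
```

The caveat is independence: if one source can observe the other's output, it can cancel it out, which is why real kernels mix sources through a hash or cipher rather than a bare XOR.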


Do you have to know the target? Correct for the bias you have, then question what it is upstream that is creating that bias.


How do we correct for the bias if we don't know what the unbiased state would look like?


> How do we correct for the bias if we don't know what the unbiased state would look like?

The goal isn't to choose some specific gender balance and then take whatever steps necessary to produce it, it's to eliminate unfair bias. The way to do that is to find it, not indirectly by looking at ratios, but directly by actually finding it.

High school teachers who say things like "women are no good at computers" to young women should be reprimanded etc. Poll women who didn't go into tech and ask them why not, and if any of the reasons are unjust then change them.

If no one can find anything like that then the gender balance at that point is what it's supposed to be. We're obviously not there yet, but the way we know is because we keep finding things like that, not because the gender balance is uneven.


I agree. But I notice that most discussions of sexism in technology companies focus on the outcomes: gender ratios and pay gaps. It would be interesting to try to measure the bias directly. Hide a bunch of microphones in offices and see if the number of sexist comments is larger at Google than it is at a law firm or a hospital.

Perhaps somebody has already done this. What would be the correct search terms?


> But I notice that most discussions on sexism in technology companies focus on the outcomes; gender ratios and pay gaps.

It's a specific instance of the more general manage-by-metrics disease. The thing you actually want is hard to measure, but it correlates with something that is easy to measure. So instead of doing the hard work to understand what is actually happening in detail, they measure the easy thing and optimize for that instead. Even though doing that frequently breaks the original correlation.

The result is the bureaucracy edition of a paperclip maximizer. You get what you measure instead of what you really want. Or bang your head against the wall, if the thing you measured is actually stickier than the real problem.

> It would be interesting to try to measure the bias directly. Hide a bunch of microphones in offices and see if the number of sexist comments is larger at Google than it is at a law firm or a hospital.

To some extent this is just the same disease. Is sexism supposed to be alright if it turns out there are equally large amounts in both places? Should we be satisfied that it's the root of the problem if there is very little at Google but even less somewhere else?

Stop trying to measure things against other things and just consider them in their own right. Sexism is bad regardless of how common it is. You don't fight it because there is more of it over here than over there, you fight it everywhere because it exists when it shouldn't exist.


Sure, but knowing where it was most prevalent might give us information about how to promote a better culture.


Frankly I do not see how 50/50, or any fixed number, can be a representation of the unbiased state. The unbiased state is unknown.

If you look at what the orchestra did and remove factors you know are irrelevant for the applicants then you should move towards a more unbiased state independent of what the number actually is. The problem is to find out what bias you have. One way is to compare the people who apply with the people you hire. You can see what traits you select on, then you can decide if you think those traits are good or bad.


You ensure a fair process (without attempting to overcompensate/correct for other processes out of your control), and whatever comes out is the unbiased state.

You can't escape the fact that (assuming any innate differences whatsoever) equality of opportunity will mean unequal outcomes, and equality of outcomes requires unequal opportunities.


Google are countering an existing bias by hiring people contrary to the bias. The orchestra is removing the bias.

Is it possible for Google to remove bias? Isn't the real problem further upstream? I bet that if you look at all applicants applying for jobs at Google you will see the same skew as Google have in their offices already.


I think I would say that I have some regrets about not buying/mining. When I first heard about bitcoin I thought it was pretty neat, but I did not want to support it because I thought it was such a waste of exergy.


Isn't it better to assume nothing is safe for children and to leave the internet as the wild wild west that it is? Have the Government set up their own CA to sign the stuff they approve of. A child safe web browser would just disregard anything that doesn't have a valid certificate.


That raises some interesting ideas.

