Hacker News | _hao's comments

It's sad to see that the sane opinion is so heavily downvoted.

LSP as a protocol is fine, but the actual technical implementation over JSON-RPC is braindead. Only web devs who don't know anything about native code could devise such an abomination. What happened to plugins and DLLs?


I love C++ for the power it gives me, but boy do I hate reading C++ code. I know most of these things are for historical reasons and/or done because of parser compatibilities etc. but it's still a pain.


I used to live and breathe C++ in the early 2000s, but haven't touched it since. I can't make sense of modern C++.


> I can't make sense of modern C++

Much of it is about making metaprogramming easier to write and to read.

No more enable_if kludges now that we have if constexpr (and concepts for more specific cases); using concepts also communicates intent better (e.g. template<typename I> can become template<std::integral I> if you want the template to be used only with integer types, and so on).
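To make that concrete, here is a minimal sketch of both mechanisms (the function names are made up for illustration; requires C++20):

    #include <cassert>
    #include <concepts>
    #include <string>
    #include <type_traits>

    // C++17 if constexpr replaces many enable_if overload tricks:
    // one template body, branches resolved at compile time.
    template <typename T>
    std::string describe(T) {
        if constexpr (std::is_integral_v<T>)
            return "integral";
        else
            return "other";
    }

    // C++20 concepts put the constraint in the signature itself:
    // only integer types are accepted.
    template <std::integral I>
    I twice(I x) { return x * 2; }

    int main() {
        assert(describe(1) == "integral");
        assert(describe(1.5) == "other");
        assert(twice(21) == 42);
        // twice(1.5); // would not compile: double does not satisfy std::integral
    }

Part of the appeal is that calling the constrained version with a wrong type fails with a short "constraint not satisfied" message instead of a wall of SFINAE errors.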


Thankfully, you can still write C++ just fine without the "modern" stuff and have not only readable code, but also sane compile times. The notion, explicitly mentioned in the article, that all this insane verbosity also adds 5 seconds to your build for a single executor invocation is just crazy to me (it is far longer than my entire build for most projects).


I am confused. Out of curiosity, what do you mean by 5 seconds being far longer than your entire build for most projects? That sounds crazy low.


It's not crazy; it's just what happens if you write mostly C with some conveniences where they actually make sense, instead of "modern C++". I generally write very performance-sensitive code, so it's naturally fairly low on abstraction, but most of my projects take between one and two seconds to build (that's a complete rebuild with a unity build; I don't do incremental builds). Those that involve CUDA take a bit longer because nvcc is very slow, but I generally build the kernels separately from (and in parallel with) the rest of the code and just link them together at the end.


Sure, C++ is heavy to compile, there's simply more for the compiler to do, but a repository that builds in under 5 seconds is at the very low tail end, so making a point about someone bearing with a 5-second-longer build time is sort of moot.

I wrote a lot of plain C and a lot of C++ (cumulatively probably close to a million lines of code) and I can't remember any C code that would compile in such a short time unless it was library code or some trivial example.


You may want to watch some of Herb Sutter's videos. He is one of the few sane people left in the upper echelon of C++ supporters.


Only if by power you mean performance. Otherwise C++ is not a very "powerful" language.

I'd like to see an example of a task that can be done with less verbosity in C++ than in, say, Python, using only the standard library.


    ++foo; // Increment value of foo by one

    foo += 1 # Increment value of foo by one


Something more complicated, maybe? Like, say, parsing arguments from a command line?
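For concreteness, a minimal sketch of one way to do that with only the C++ standard library (all names hypothetical, no validation or error handling):

    #include <cassert>
    #include <string>
    #include <utility>
    #include <vector>

    // Collects "--key=value" / "--flag" options and bare positional arguments.
    struct Args {
        std::vector<std::pair<std::string, std::string>> options;
        std::vector<std::string> positional;
    };

    Args parse_args(const std::vector<std::string>& argv) {
        Args out;
        for (const std::string& a : argv) {
            if (a.rfind("--", 0) == 0) {          // starts with "--"
                auto eq = a.find('=');
                if (eq != std::string::npos)
                    out.options.emplace_back(a.substr(2, eq - 2), a.substr(eq + 1));
                else
                    out.options.emplace_back(a.substr(2), "");
            } else {
                out.positional.push_back(a);
            }
        }
        return out;
    }

    int main() {
        Args a = parse_args({"--verbose", "--out=log.txt", "input.c"});
        assert(a.options.size() == 2);
        assert(a.options[1].first == "out" && a.options[1].second == "log.txt");
        assert(a.positional.size() == 1 && a.positional[0] == "input.c");
    }

Python's argparse does considerably more (help text, type coercion, errors) in fewer lines, which is presumably the point being made.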


If you want to see everything and pick and choose what's interesting use RSS to get everything - https://news.ycombinator.com/rss


Oh look, another piece of shit AI slop I won't use. Next!


Two intelligence assets talking to each other. Both have quite similar backgrounds with dubious credentials. A history of lying and obfuscation. I wouldn't trust anything Fridman or Durov say.


I don't know how anyone sits through ten minutes of Lex speaking let alone multiple hours.


It's a steep learning curve, I concur.

Try to sit through the 5 hours of interview with Carmack if you are a coder, or the interview with Aella, if you are more into humanities.

These two are his crown jewels, IMO.


Awwww, so cute. Here's a cookie.


Subscription to Math Academy might be more suitable for that.


Red flags of Math Academy:

- Centred around AI

- Seems geared around edutech (which is what I gather from the site)

Green flags for Napkin:

- Covers advanced undergraduate and graduate topics

- Encourages a pencil-and-paper way of learning (took me way too long to learn this is the best approach)


> Centred around AI

Where do you see it centered around AI? I have used it a lot and have not touched a single subject involving AI.

> - Seems geared around edutech (which is what I gather from the site)

What is edutech and why is it unsuitable?

Finally, have you _used_ MathAcademy at all?


> Where do you see it centered around AI?

From https://www.mathacademy.com/how-it-works:

> Math Academy is an AI-powered, fully-automated online math-learning platform. Math Academy meets each student where they are via an adaptive diagnostic assessment and introduces and reinforces concepts based on each student’s individual strengths and weaknesses.

> What is edutech and why is it unsuitable?

I don't want a computer in the loop when I learn math, plain and simple. My preferred style of learning is instructor-led, with a mix of Socratic method and hand-holding. Barring that, reading texts and using pen and paper.

> Finally, have you _used_ MathAcademy at all?

Nope, doesn't look like my cup of tea.


As far as I can tell, most of its value comes from having a reasonably thorough dependency tree of math topics and corresponding exercises (which can be solved with pen and paper) and describing it as "AI" is how you get investors to fund a math textbook.

See also How Math Academy Creates its Knowledge Graph https://www.justinmath.com/how-math-academy-creates-its-know... "We do it manually, by hand."


The "AI" is an expert system, yes, which calibrates to your ability to answer the questions it throws at you. The questions are all human-written. I had your initial scepticism as well; I can reassure you that the AI is not an LLM. Also, the guy who built it, Justin Skycak, has put a lot of thought into its pedagogy.


My experience with MathAcademy is very positive. So is my experience using ChatGPT 5 as a math teacher in learning mode. I'm as fed up with AI slop as most people, but for me this is a domain where it excels.


I'm pretty sure the author hasn't actually followed his own advice. Also, the writing style was atrocious, so I wouldn't trust anything said. It seems like the type of post he wrote for himself (nothing wrong with that), but I don't feel that asking a 20-25-year-old to have a 5-year plan, which is basically 1/4 of their conscious life, is useful or achievable for most.


I think, as with everything related to learning, if you're conscientious and studious this will be a major boost (no idea, but I plan on trying it out tonight on some math I've been studying). And likewise, if you just use it to do your homework without putting in the effort, you won't see any benefit, or will actively degrade.


Bullshit. Tabs are the only sane choice and I don't care what anyone else says.


Until that same AI starts shilling ads and certain viewpoints peddled by its owners in the output... This will happen 100% (the ads, that is; the other bit has already happened). The economics of all of these models don't work as is. There will be a major squeeze down the line.


Some of us have dipped our toes into local LLMs. To be sure, the ones I can run on my hardware always pale in comparison to the online ones. But perhaps in time the ones you can run locally will be good enough.

Or perhaps an Apple or Kagi will host an LLM with no built-in monetization skewing its answers.


You can run the model, but someone with vastly bigger resources needs to train it.


Sure. Hopefully decent models from before ad injection will still be around.

