...because the opening line of the blog post says he's been "building websites with LLMs", and then attempts to cutely redefine that abbreviation as "Lots of Little htMl pages" in a parenthetical.
It's, um. Not the best kind of communication, and very easily leads to this kind of misunderstanding.
JavaScript doesn't affect screen readers at all unless you dynamically add content without the proper ARIA roles. It is trivial to correct.
As I just said, users who explicitly disable JavaScript cannot even use Google Search. Why should I accommodate those users when even Google refuses to do so? They are actively choosing to have a limited web experience. The vast majority of the internet is completely broken for them.
> A spotty connection hasn't loaded the dependencies correctly

Either they load or they don't. How would the dependencies load "incorrectly"?
Let's say you have 5-7 dependencies to load, but 3 of them timed out because your train entered a tunnel. Your app ends up in an incorrect state, fails silently, and the UX degrades unpredictably. This is where conversion often drops visibly, and the reason SSR is now a go-to solution for any marketing website.
Why am I loading dependencies from 5-7 places? Why is my website not using a bundler if it has so many varied dependencies? Why do we not expect the user to understand that they are in a tunnel without internet?
Regardless, this isn't really restricted to the usage of JavaScript. The website would likely have pretty bad UX if only half of the CSS loaded correctly, but no one programs defensively around it being absent.
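To make the "3 of 7 dependencies timed out" scenario concrete: a page doesn't have to fail silently when some of its loads fail. A minimal sketch, assuming the loaders are stand-ins for real script or fetch calls (the names and the dependency list here are hypothetical):

```javascript
// Load several independent dependencies and detect partial failure
// instead of letting the app break silently. Each "loader" stands in
// for a real script/fetch call.
async function loadAll(loaders) {
  const results = await Promise.allSettled(loaders.map(fn => fn()));
  const failed = results
    .map((r, i) => (r.status === "rejected" ? i : -1))
    .filter(i => i !== -1);
  return { ok: failed.length === 0, failed };
}

// Simulated dependencies: two succeed, one times out (the tunnel).
const deps = [
  () => Promise.resolve("app"),
  () => Promise.reject(new Error("timeout")),
  () => Promise.resolve("chatbot"),
];

loadAll(deps).then(({ ok, failed }) => {
  if (!ok) {
    // Degrade gracefully: disable the features that failed, tell the user.
    console.log("failed dependency indexes:", failed);
  }
});
```

`Promise.allSettled` never rejects, so the app always gets a full picture of what loaded and what didn't, rather than ending up in an unknown state.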
Have you ever developed an enterprise-scale frontend application optimized for conversion targets? It feels like you have not. You may ship your own code in a bundle, yes. All integrations come on top of that. The chatbot, the tracker, the A/B testing logic, etc. are all loaded separately from your service provider's CDN.
A user opening a web page is not expecting a full-blown app with multi-second loading times. If that happens, they bounce, and you lose revenue. The web is supposed to have a very short time to first contentful paint and a very short time to interactive; the shorter, the better, and less than 0.5s is the goal. It can deliver that, if built properly. Many SPAs and bulky JS apps are built this way for developer convenience, not for end users. The only real use case for an SPA is when you deal with a lot of local data: a spreadsheet, a document or image editor, a diagram tool (but then wasm is probably a better choice).
You may say you are not building an enterprise-grade frontend. But if you are small enough, you don't need an SPA either.
Go on. How do I have no idea what I'm talking about? Why is it okay for a website to break simply because the analytics don't load? Why do you think that's good design? How is my personal, lived experience less valuable than yours?
Is it just that you're ashamed that you have made such poorly designed web apps that can't handle a few broken HTTP calls?
Is it just that you can't simply accept that JavaScript is a requirement for the modern web which is what this entire discussion is hinged upon?
You dismissed A/B testing as unnecessary. That is sufficient for this judgement. A/B tests mostly run on the happy-path scenario of a customer: when an A/B test breaks, the company loses money at light speed.
The loading-related issues overall may eat 0.5-1% of the revenue. That is not something that should be an afterthought.
Lol, okay. I didn't know that every single customer was going to go through a tunnel as they loaded the page.
I didn't dismiss A/B testing. I'm just saying that, if the analytics don't load on the client, you should already have A loaded and ready to render. It's literally just a matter of a try/catch, and you shouldn't be waiting to load this stuff on the client-side anyways if this is truly supposed to be the "Happy Path".
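The "try/catch with A loaded and ready" point can be sketched in a few lines. This is a hedged illustration, not any particular A/B product's API; `fetchVariant` and the timeout value are hypothetical:

```javascript
// Default to variant A; only switch if the A/B service loads and
// answers in time. fetchVariant is a hypothetical stand-in for the
// vendor's client call.
async function chooseVariant(fetchVariant, timeoutMs = 300) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error("ab timeout")), timeoutMs);
  });
  try {
    const variant = await Promise.race([fetchVariant(), timeout]);
    return variant === "B" ? "B" : "A";
  } catch {
    // Analytics/AB script failed or timed out: fall back to A,
    // never a blank page.
    return "A";
  } finally {
    clearTimeout(timer);
  }
}
```

The page renders A in every failure mode; a working A/B service only ever upgrades the experience.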
Yes, I know that legacy software like Google Tag Manager requires client-side integration, but I would argue that is an orthogonal concern. You don't need to use that for your A/B testing. It's pretty easy to integrate this stuff into SSR-- especially if you stream in the HTML. This is why cookies exist.
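The cookie approach amounts to a few lines server-side: decide the variant before any HTML is streamed and pin it to a cookie. A minimal sketch; the cookie name and the toy bucketing function are illustrative assumptions, not any specific product's scheme:

```javascript
// Server-side A/B assignment, pinned to a cookie so the user keeps
// the same variant across requests. The bucketing here is a toy
// deterministic hash; a real setup would hash the user id properly.
function assignVariant(cookieHeader, userId) {
  // Reuse an existing assignment if the cookie is present...
  const match = /(?:^|;\s*)ab_variant=([AB])/.exec(cookieHeader || "");
  if (match) return { variant: match[1], setCookie: null };
  // ...otherwise assign deterministically from the user id.
  let sum = 0;
  for (const ch of userId) sum = (sum + ch.charCodeAt(0)) % 2;
  const variant = sum === 0 ? "A" : "B";
  return {
    variant,
    setCookie: `ab_variant=${variant}; Path=/; Max-Age=2592000`,
  };
}
```

Because the variant is known before rendering starts, the server can stream the right HTML immediately; no client-side script has to load before the page is usable.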
And, again, none of this changes the central concept of this comment thread: JavaScript is necessary for the modern web experience.
Literally none of those things are necessary for a working website. If your site breaks when your analytics don't load, then that's just horrible design at any scale.
A normal person would immediately think "dang, page didn't load before I entered the tunnel. Guess I'll wait til I'm out again and refresh".
And if they're deliberately going somewhere where there's no signal for an extended period of time, and really want it to work, they'll ensure they've loaded everything before doing so.
And I say this as someone who is developing a PWA for people with low-end phones and very inconsistent and/or slow connections. I'm very cognizant of and empathetic to their situation.
Anecdotal evidence does not beat statistics and user research. Bounce rate is inversely correlated with loading speed. People with low intent do not refresh; they simply don't come back, and look elsewhere or just move on. Telling you this as someone who built their first commercial website in 1999 and was CTO of a hyperscaler B2C startup. Let's not measure the length of credentials.
To clarify, you're saying we should be jumping through convoluted hoops - full page navigation + js to rewrite history, all so that you can avoid a very minimal amount of js to show/hide a nav menu - for low intent people who are frequently entering tunnels?
Something like Datastar would enable this with like two html attributes, and only require 10kb of js (and would also allow for endless other things via declarative html).
> I'm just tired of being downvoted every single time I mention that JavaScript is necessary on the modern web
Downvotes should give you a hint that the few users who know what JavaScript is don't like it, and that the rest, if they learned it, most likely wouldn't like it either. Your attitude shows that you don't care.
My attitude shows that JavaScript is necessary for the modern web experience! No one has successfully argued against this yet-- nor have they even really tried! You're all just mad about my tone without even discussing the content as if this was a kindergarten class. This is absurd.
Let me get this right, you're saying that people on HackerNews don't know about JavaScript-- one of the most popular programming languages in the world?
> My attitude shows that JavaScript is necessary for the modern web experience! No one has successfully argued against this yet-- nor have they even really tried!
Exactly! I agree with you 100%! I, and many others, don't like the modern web experience and JS is the foundation that makes it all possible.
Yeah, and I don't like paying taxes or many other aspects of modern society, but I don't reflexively downvote anyone who mentions that you need to pay taxes to participate in modern society.
Archibald is anti-AI. 70+% of his public statements have demonstrated that.
He is more or less aligned with the current most common sentiment in the west which is largely publicly against AI.
But realistically it's just slow adaptation, network effects, etc.
To give an example: before MLB rolled out the Automated Ball Strike system this year, maybe 65+% of the sentiment in discussions about it last year was negative, or in some cases just neutral.
Now that it has rolled out, 95% of the sentiment online about ABS is positive. The main comment by far is, why didn't they do this before, and why don't they do it automatically on all pitches now.
There are certain cognitive and informational flow limitations in society that will cause this to be delayed, just like all major technological advancements.
But once it rolls out, the perspective you hear online will be about digital sovereignty/personal data autonomy, now we aren't required to send our data to an external provider for AI, why wasn't this available before. People will probably assume it was blocked because it reduced a major source of data for advertising or something.
And overall AI and robotics in the future will be seen as the greatest enabling factor for increased equality in society.
It's really just this underlying dislike of and disrespect for technology that much of the western public has. Which may turn out to be one of the reasons that we lose our de facto leadership position in the world.
You're a politician. The sentiment leans anti in this cultural context at this time, and so do your statements overall, as becomes clear if we look at this one and the rest and tally each as positive or negative. Underneath, you are more anti-AI than neutral. So your reply may have been technically true, but it was deliberately misleading.
But you haven't really made a technical argument because your objection is not really technical. It's a type of politics.
It's obviously extremely useful to have a simple API for accessing an LLM. It needs permissions, like most things, plus the ability to limit download sizes and, if desired, to restrict or block use of specific external services.
But anyway people will just fall back to a slightly worse alternative like a wrapper around WebLLM (that wraps WebGPU).
It's probably not politically feasible for you to take a different stance anyway.
>To give an example, before the MLB rolled out the Automated Ball Strike system this year, last year maybe 65+% of the sentiment in discussions about it was negative or in some cases just neutral.
MLB's ABS does not use AI for its ball tracking. And it has specific payoffs particular to its context, four years of testing, and well-defined limits on use cases that don't necessarily generalize to the issues surrounding AI and its tradeoffs.
They've been slowly replacing the flip-disc displays on the buses where I live with LEDs and LCD panels, which has been such a shame. There is a beautiful mechanical satisfaction to a panel of flip-discs inverting, and I genuinely find them easier to read.
The old panels had diode issues. It wasn’t the mechanism failing — the simplicity of electromagnets means they last an insanely long time, significantly longer than an LED. The diodes were just cheap and undersized. If you have a stuck disc on an old board, 99% chance you just need to replace the diode. If it still flips but gets stuck on one side, a pin has gotten sticky and needs graphite applied.
The Luminator MAX 3000 is an interesting hybrid between a flip dot display and an LED display. I find it very pleasing to the eye and easy to read, particularly at night.
In front of the flip dots is a frame that has a mini-LED that faces and front-lights each flip dot. This gives the appearance that each flip dot is glowing.
Yes, even the ones that have an LED behind each disk which are on in the dark. This display [1] is the same but in the dark [2] you see the LEDs instead.
The LED / LCD displays are probably lighter (less heavy), and someone figured they can save 0.001 gallons of diesel a year fleet wide if they replace displays.
You've confabulated a reason why they replaced them, linked it to an initiative, then complained about them doing it, all in two sentences. A gold medal in mental gymnastics is warranted here!
I use mine for all sorts. I volunteer at a second-hand shop so use it to set up remotes for donated media devices, I've used it to run scripts to apply the same changes to many computers that aren't on a group policy via BadUSB, I've used it for toys-to-life games, and very much more. There are plenty of genuine uses if you're cluey.