Zardoz84's comments | Hacker News

Well... The Russian Spectrum clones had quite a successful career. And they made a lot of improvements over the original Sinclair and Amstrad designs. The Pentagon and the Scorpion with extra RAM, or the ATM, come from pure Russian ingenuity.

Don't forget the original Command & Conquer

I remember compiling the Linux kernel on SuSE 6.3 on an AMD 486 DX5 133 MHz... good times, and I never forgot to run "make mrproper" first.
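For context, the ritual on a 2.2-era kernel (the series SuSE 6.3 shipped) went roughly like this; a sketch from memory, so take the exact targets with a grain of salt:

    make mrproper       # scrub the whole tree, old .config included
    make menuconfig     # pick drivers and options
    make dep            # build the dependency info (needed before 2.6)
    make bzImage        # the kernel image itself
    make modules && make modules_install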

Have you tried Linux again recently?

I had one of these back in the day. A very fine 486.

I agree with you. Also, there are people (like me) who prefer small commits (that don't break stuff) over huge mega commits. If I do make small broken/WIP commits, they only live on my working branch, and I do an interactive rebase to squash them into good cohesive commits, roughly like the sketch below.
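Something like this (just a sketch; the branch names are placeholders):

    # messy WIP commits stay on my working branch only
    git switch -c feature/foo
    git commit -m "wip: first attempt"
    git commit -m "wip: fix the tests"

    # before merging, squash them into cohesive commits
    git rebase -i main       # mark the wip commits as "squash" or "fixup"
    git switch main
    git merge --no-ff feature/foo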

The past was bad. But the present is far worse. Tell that to the people who have been disappeared into the ICE concentration camps. Or to any trans person in any bad state.

Windows 3.1, with the appropriate drivers and a modern SVGA card, had accelerated 2D graphics. Accelerated GUIs don't even need a GPU or 3D.

What does "GPU" mean here? Previous uses of the term seemed to imply "dedicated hardware for improving rendering performance" which the SVGA stuff would seem to fall squarely under.

The term GPU was first coined by Sony for the PlayStation with its 3D capabilities, and has been associated with 3D rendering since. In some products it stood for Geometry Processing Unit, again referring to 3D. Purely 2D graphics coprocessors generally don’t fall under what is considered a GPU.

It has been associated with 3D rendering, but given that things like the S3 86C911 are listed on the Wikipedia GPU page, saying "Accelerated GUIs don't need GPU" feels like attempting to win an argument by insisting on a term definition that is significantly divergent from standard vulgar usage [1], which doesn't provide any insight to the problem originally being discussed.

[1] Maybe I've just been blindly ignorant for 30 years, but as far as I could tell, 'GPU' seemed to emerge as a more Huffman-efficient encoding for the same thing we were calling a 'video card'


I don’t agree with what you state as the vulgar usage. “Graphics card” was the standard term for a long time, even after they generally carried a (3D) GPU. Maybe up to around 2010 or so? There was no time when 2D-only graphics cards were called GPUs, and you didn’t consciously buy a discrete GPU if you weren’t interested in (3D) games or similar applications.

In the context of the discussion, the point is that you don’t need high-powered graphics hardware to achieve a fast GUI for most types of applications that WPF would be used for. WPF being slow was due to architectural or implementation choices.


That's the real takeaway - WPF should have degraded gracefully (read: full-speed performance without the bling), but it didn't.

We used to call those things a "Video Card", which you put into your computer to get a video signal out.

Back in the day there was a card called the S3 Virge, which we affectionately called a 3D decelerator card, because of its lacklustre 3D performance.


Most people consider GPU to mean "3D accelerator" though technically it refers to any coprocessor that can do work "for" the main system at the same time.

GPU-accelerated GUI usually refers to using the texture mapping capabilities of a 3D accelerator for "2D" GUI work.


I remember these disks from my Spectrum +3. Indeed tougher and more resistant than the 3.5" ones. Sad that the format ended up on the losing side and never evolved beyond the 128 KB (or was it 256 KB?) it could store on a single side.

