This will be an interesting one to watch. There's a huge number of tiny Linux distros, but most of them couldn't compete with something released by the Raspberry Pi Foundation.
I think their biggest competitor is lubuntu. It's just a stripped down Ubuntu with the LXDE desktop. PIXEL also uses LXDE, and it looks like they've made a few design tweaks to the file manager. They both have Chromium as the default browser. Right now I think lubuntu supports way more machines.
I'm wondering why they didn't just use lubuntu, or just fork it and make a few changes. I've been using lubuntu on a small media PC, and I've been very happy with it.
> "I'm wondering why they didn't just use lubuntu"
They could have, and that might have been a great idea, although not as headline-grabbing as "we made a new OS!". I think Debian is probably the slightly better base for Pis, simply because Raspbian has been built on Debian since just after the first Pis came out, and Debian has historically supported more architectures, so from a stability standpoint it makes more sense to build on Debian than on Ubuntu. I recall the creators of Raspbian saying it was quite difficult as it was to port Debian to the RPi 1 due to its particular architecture, so starting from Lubuntu might have been a little too difficult.
Ubuntu dropped ARMv6 a long while ago and refuses to consider bringing it back, so it was out of the running as a Raspberry Pi OS, back when just the Pi1 was around.
Ubuntu will run just fine on the Pi2 and 3, just not on the 1 or 0. Pixel is a mildly-tweaked version of LXDE running on Raspbian, which is a mildly-tweaked version of Debian. That's problematic, when part of their brand is providing a good environment for learning (which is helped by keeping the OS as similar as possible between different versions of the hardware).
Lubuntu is worth mentioning as something that has been breathing life into old PCs while being noobie-friendly and with the access to current ubuntu software repos.
The next step above Lubuntu is Xubuntu. If it will fit, use it. I buy old ASUS subnotebooks on eBay for about $40 and put Xubuntu on them when I need a little machine for something.
Drivers can be a problem. However, [UXL]buntu finally fixed the years-old bug where the cursor goes away on devices using built-in Intel graphics. (You could make the cursor come back with Ctrl-Alt-F1 then Ctrl-Alt-F7, which switches the display to text mode and back.) That bug prevented anyone not heavily into Linux from using the thing.
I've used XFCE for years and prefer it, even on my well-equipped machines (4-core laptop w/ 32 GB RAM and a new 16-core workstation w/ 128 GB RAM).
It looks "nice enough" and is still flexible enough that I can customize it exactly how I want. I really liked Gnome back in the "good ol' days" (2.x) but I can't stand to use it anymore.
You made me curious.. 128GB RAM on Xubuntu!
Why? Which usage would require so much memory?
Found 16cores quite a lot as well. Really wondering what you do?
Not the person you're replying to, but we have lots of 64 GB workstations at work for database and BI work, such as testing large queries in Apache Drill. With the right query, you can use up the 64 GB quite quickly. Our servers run 384 GB and have 16C/32T.
> Does chromium have a smaller memory footprint than Chrome?
If it does, it's negligible once you disable Google apps, add-ons and Flash.
Had the same experience: my old AMD Sempron can't handle Chromium, but Firefox runs just fine. That machine has 2 GB of DDR RAM, but it's the almost nonexistent CPU cache that kills it.
Puppy Linux has already been breathing new life into old PCs. I have an IBM ThinkPad just like the one featured in the article running Puppy Linux.
It's sad that they're using the RPI brand to distort reality. Even though in a way the RPI project was already twisting things (GPIO + python isn't computing education but I'm nitpicking).
I think that may be looking at it in a different way that doesn't necessarily reflect the impact of all this, in the UK at least.
The way many of my generation were taught IT was Office (with maybe some macros), a largely useless high-level overview of networking, an out-of-date comparison of "client-server", "mainframe" and "peer-to-peer" computing, etc.
It was dry. Unless you already had an interest in the field, it was boring.
It taught you enough about Microsoft Office to do your other school work, it taught you enough about the web and email to get by, and it taught outdated terminology that you would unlearn in the real world.
It didn't teach troubleshooting of any kind. It wasn't engaging.
Many of us were already ahead of the teachers. If the teacher had a problem, it was usually a student that could give the answer. (I only had one IT teacher who wasn't below the level of a teenager with an active interest in IT.)
But the worst thing was that it didn't get people interested and didn't teach the mindsets required. The underlying logic. The basics of troubleshooting. Binary exclusion. Shit, how to deal with a paper jam!
Nowadays everyone has a smartphone, can follow a YouTube tutorial, can Google stuff, before they're in their teens.
The Pi and the teaching around it gets kids exposed to the lower levels of hardware (or at least a simplified version of them). It's interesting and engaging.
There's a far bigger pay-off than "my spreadsheet looks nice". A blinking light gives a greater sense of accomplishment than a ".ppt" file.
The Pi gives the tools and motivation to actually learn the stuff you are talking about. Hook 'em with the fun stuff while still being useful.
Most kids won't need to know "ls" or "mount". They're never going to "modprobe" anything.
They are already taught the very high-level basics of the Internet and networking. But they're never going to use variable length subnet masks.
But many more kids will hopefully be interested enough to pursue careers in IT. And those who don't will at least have an appreciation for it, and will have been taught skills and a mindset invaluable in the increasingly digitally connected world in which we live.
Good point. Anything is better than the Office-driven computer classes. The Pi does indeed provide strong educational benefits, improving insight down to the electronics level.
My only complaint, weighted by the fact that the RPi team delivered on features, quality and cost while so many ventures failed to even finish prototypes, is that the SoC is a monster. No kid will ever use it to go further than Python and GPIO. A simple Forth CPU would be as good for electronics, but would also teach some mathematical programming ideas (recursion, trees): basically the whole of computing fundamentals.
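To give a concrete flavor of what I mean by those fundamentals: recursion over a tree fits in a few lines of any language. An illustrative sketch in Python (nothing Pi-specific, all names are mine):

```python
# A tree as (value, [children]) tuples; the recursion is the whole algorithm.
def tree_sum(node):
    """Recursively sum every value in the tree."""
    value, children = node
    return value + sum(tree_sum(child) for child in children)

# A small tree:  1 -> {2, 3 -> {4}}
tree = (1, [(2, []), (3, [(4, [])])])
print(tree_sum(tree))  # 10
```

That's the kind of idea a minimal Forth-style machine pushes you toward, while GPIO blinking mostly doesn't.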
SSDs are what breathe new life into old PCs. And they are quite cheap. My wife's laptop turned from extremely annoying to decent, almost pleasant to use.
Think that's a different class of old PC. They're talking about computers with 512MB of RAM, you're getting up towards 4GB before spending money on an SSD is going to be a sensible I'd say.
The original article uses a ThinkPad X40 as the target machine: a 1.8-inch Hitachi drive with a PATA interface, and 256 or 512 MB of RAM soldered to the board with a socket for one further RAM stick. This is a 12-year-old design. Probably best used with what it has rather than spending any money on upgrades. A good test target for a 'light' OS.
A ThinkPad X60 or later (dual core) with 2 GB RAM can run a 'full fat' Linux (e.g. Ubuntu or CentOS/Fedora) OK, if not slickly, in my experience (and yes, I did once compile a kernel from source on an X61s; it took a few hours). An SSD makes a difference in that kind of machine.
Your comment about the kernel taking a few hours to compile made me chuckle. I went through a Gentoo phase. Nothing like compiling your compiler to compile your compiler to compile your system. Took days.
And before that I had a machine with 24 MB RAM and a terrible habit of recompiling the kernel or running BSD with ports.
At least yours finished. Mine would run just long enough to waste most of my day, only to error out because one lib wanted python-3.2.785.2.r58 but could only find python-3.2.1435.214.r3
I was using Slackware around the time that distros were switching from xfree86 to x.org. Not knowing any better, I was trying to run it as a bleeding edge system. I think I took a week or two manually upgrading each library one at a time to make the jump to the new X server.
Apparently I didn't learn my lesson; the next distro I used was Gentoo.
An SSD and stock GNOME 3 Debian on my X61s (2 GB RAM) with its Atheros wifi card goes very well, so I'm hardly surprised that your machines work well with a less demanding desktop environment, and it's good that you find them of use.
Care to share the output of pstree -l -A from a fresh session? That would tell us what processes are running by default (cups etc.).
I've found that an SSD is significantly better for most "store bought" computers/laptops.
They advertise "gigahertz", RAM, drive space, and sometimes screen size.
So everything else is left to the cheapest possible part. Updating the 5400 RPM HDD to an SSD has completely changed every single laptop a family member has brought me as "too slow to use".
I've also found it's damn near impossible to buy a cheap laptop with an SSD.
This should definitely be investigated before investing time getting an alternative OS up and running, along with the troubleshooting and training that comes with it.
Upgrade to the latest CPU for the platform; for $25 you could gain an extra 1+ GHz or more cores.
Max out the RAM; for DDR2 systems this is usually 4-8 GB, and 2 GB sticks can easily be had for under $10 each.
Replace the hard drive with an SSD, especially if it's a low-RPM drive. 128 GB SSDs can be had for $35 on sale.
The above upgrades will easily boost performance on whatever OS is already running on the hardware, and total install time should be 1-2 hours.
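Before ordering parts, it's worth checking what the box already has. A rough sketch of the kind of checks I mean, in Python (Linux-specific paths; illustrative only, not a polished tool):

```python
# Check installed RAM and whether the disk spins before buying upgrades.
from pathlib import Path

def total_ram_gb(meminfo_text):
    """Parse the 'MemTotal: NNN kB' line from /proc/meminfo into GB."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kb = int(line.split()[1])
            return kb / (1024 * 1024)
    raise ValueError("MemTotal not found")

def is_rotational(device="sda"):
    """True for a spinning disk (SSD upgrade candidate), False for an SSD."""
    flag = Path(f"/sys/block/{device}/queue/rotational").read_text().strip()
    return flag == "1"

# On a real machine: total_ram_gb(Path("/proc/meminfo").read_text())
print(total_ram_gb("MemTotal:  4194304 kB"))  # 4.0
```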
There are a lot of very respectable laptops available (off-lease and such) for relatively cheap.
I picked up a ThinkPad T420 a few months ago on eBay for, IIRC, $180. For $250 all in, I had a nice i5 laptop w/ 16 GB of RAM and a 120 GB SSD. I didn't need it and don't really use it for anything but it's great when I need a spare or somebody else wants to use it. It could easily serve as a great primary machine for an average user (i.e. e-mail and Facebook / web browsing), though.
Unless the system is too old to contain new drivers, or you're switching from x86 to arm, you shouldn't have a problem. I've done this multiple times, including a gradual upgrade from Ubuntu 8.04 to 16.04.
I am actually more worried about security with modern Intel processors than with old ones.
Intel Management Engine makes me very uncomfortable in trusting my machine.
With 512 MB, there's hardly an Internet browsing experience here. The biggest reason old hardware has gone out of use (relatively quickly) is browsers and their need for ever more memory. So even if you have a computer with 2-4 GB lying around and you attempt to revive it after seeing this news, it won't be long before you give up on the idea altogether.
Even better to have other hardware doing the ad plucking instead of making it run tons of rules against each site. Something like pfsense has plugins to do this, or run all the network traffic through privoxy on your home network, and it'll also block tons of in-app ads on all the mobile devices using wifi downstream of that.
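For the privoxy route, the core of it is a couple of lines in the config on whatever box your LAN traffic passes through; something like this (Debian-ish default paths assumed, treat this as a sketch):

```
# /etc/privoxy/config (sketch)
# Listen on the LAN-facing interface instead of localhost only
listen-address  192.168.1.1:8118
# Stock blocking rules shipped with privoxy
actionsfile     default.action
```

Clients then point their proxy settings (or a transparent firewall redirect) at port 8118, and the old machine never has to run the filtering itself.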
My main home computer is a 2 GB netbook with an Atom CPU. Not only do I use it to browse the internet on a daily basis, it's also what I used to develop an award-winning, AI-powered content classification webapp.
I used to do it because I had no choice (the alternatives were way more expensive than I could afford).
A lot of Chromebooks are still shipping with 2GB of RAM and they do a perfectly sufficient job of running the "Internet" for most everyday users. I would argue your lower limit of a reasonable Internet browsing experience is higher than average because of your technical background. 512MB would be limiting for sure, but a machine with 2GB of RAM running a Linux desktop, especially with an SSD if possible, would make a fine everyday laptop.
*edit: Forgot to mention also that ad blocking aids tremendously in making the internet more usable on underpowered machines.
For £35 you can pick up a Windows 10 32-bit tablet with wifi, Bluetooth, HDMI out, USB OTG, 1GB RAM, a MicroSD slot, 32GB eMMC, Intel Atom CPU and a year of Office 365 personal. They're excellent value for money.
2 GB of RAM plus swap is capable of running even 64-bit Ubuntu comfortably. Take away the swap and browsing becomes a problem; I wouldn't suggest an SSD for that reason.
Well, I'd argue that Internet browsing with 2 GB is only suitable for reading news and email. That's the common denominator for the layman user, who wouldn't be using Linux or a Pi to begin with. However, this thread is not about Chromebooks (and I agree with you there about Chromebooks; my wife uses one for email and YouTube, and it works perfectly fine).
There's a big difference between the Chromebook and the Linux/Pi world of hardware/software experience for any person. You can throw a Chromebook at anyone and you won't have to care about it. But try imagining giving a Linux-type OS to just anyone... we know what happens next.
This is one reason why tablets (particularly starting with iPad) became popular among average crowd.
What happens next? I've given Ubuntu laptops (2010 laptops) to my dad, mom, girlfriend and sister, and they're liking it a lot, especially the fact that they don't have to worry about malware. They're fine with it, it's not like people are born with an innate understanding of the Windows UI and are incapable of learning anything else.
Curious, for the "Ubuntu for Grandma" box, do you use the default partition plan? Because I find that stock Ubuntu has the infuriating habit of filling the boot partition with linux kernels and then freaking out unhelpfully during later updates about the lack of space.
Tried Ubuntu for my great-grandpa. Having to redownload 100s of MB because of prebundled DE was stupid (was coming from SuSe/KDE). Also the upgrade story was not so good at that time, switched him to Debian and never looked back.
Argue all you want, I do all my work and browsing in virtual machines and those usually just have between 1 and 2 Gb RAM and 1 CPU core. Works just great.
2-4GB is not enough for you? And the brand spanking new 2016 Macbook Pros default to 8GB and 2-4 is not enough on an old computer? I routinely browse the web in a 2GB Linux VM with Firefox under Windows, with less than 1GB in use on htop. It's easy to run even Ubuntu with under 200MB of RAM usage if you forgo Unity, and 500 MB with it.
What hobbles the web on older hardware is not "2-4GB" but very slow processors (having to handle all that advertising Javascript, over-animated UIs, and bloated front-end frameworks).
> What hobbles the web on older hardware is not "2-4GB" but very slow processors (having to handle all that advertising Javascript, over-animated UIs, and bloated front-end frameworks).
^ This. As someone else said in this thread, JavaScript is the new Flash of the web. At least in the Flash days, many web designers were decent enough to include HTML-only versions. When I'm using my old laptop with JS turned off, many pages become inaccessible, and web developers don't even realize that so many people in the world can't afford to buy new computers.
512MB is certainly not going to be sufficient for huge web apps and the like, but is plenty for "real oldschool-style web pages" containing mostly text and images.
As I write this my system is consuming 440MB of 4GB, and I have many other (mainly source code) files open which I'm also working on.
512mb - 1024mb of ram is standard in a ton of cheap Android phones and tablets, developers that are mindful of poor connectivity and hardware can still create performant experiences.
512 MB on Android is unusable. Such devices haven't been released in years (maybe no-name Chinese brands still make them). Mobile platforms are also extremely honed for the mobile ecosystem, which cannot be said of desktop OSes.
How many people buy Pi to browse the web? The purpose of the project seems to be
1) Find something to do with a Pi
2) Instead of buying a Pi, run Pi infrastructure software on an old laptop instead
If browsing doesn't work on an old laptop (although it does, but for the sake of argument...) then the bug is observed after step #2 but the actual failure location was in step #1, when "I wanna browse the web" somehow led to using a Pi to do it.
OLPC was meant to come with a display and a keyboard baked in, all in a (fully open) package that you could chuck across the room; a "View Source" button to let you inspect and edit the code to all your apps; UI making it easy to set up ad hoc networks to share/collaborate with anyone around you; and to put this all into the hands of children across the world, not just first world schoolchildren and the offspring of upper-middle class tinkerers.
These are just the main points from OLPC's value proposition; we're not even delving into the obscure or nuanced stuff yet. How does Raspberry Pi's success measure up on any one of these, let alone all of them?
That is a nice interview. Eben Upton understood something pretty basic about manufacturing that the OLPC people aggressively ignored: economies of scale.
I can make 10,000 Raspberry Pis for the same unit price I can make 1 million Raspberry Pis. There's a curve, of course, but that curve for the Raspberry Pi flattens very quickly.
[...]
I'm only seeing this from the outside, but I think the main difficulty OLPC encountered is that their minimum economic quantity was very high, so they had to go out and get these big government orders in order to build enough units to hit their cost targets. I don't know what the minimum economic quantity for OLPC was, but it was probably hundreds of thousands, maybe millions, and they struggled to achieve that.
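The "curve" being described is essentially a fixed up-front cost (tooling, NRE) amortized over volume. A toy model in Python, with all numbers invented for illustration and nothing to do with actual Pi or OLPC economics:

```python
def unit_cost(volume, fixed=1_000_000, per_unit=20.0):
    """Toy model: fixed setup cost spread over volume, plus per-unit cost."""
    return per_unit + fixed / volume

print(unit_cost(10_000))     # 120.0
print(unit_cost(1_000_000))  # 21.0
```

The per-unit term dominates quickly, which is why the curve flattens; the trouble starts when your fixed costs are so large that only government-scale orders get you onto the flat part.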
When it was early days for OLPC and they were receiving lots of positive press attention, I saw Nicholas Negroponte argue strenuously against selling directly to industry or the public as a distraction that was beneath the dignity of the project. At that moment, it was obvious they had no hope of hitting their $100/laptop target.
Yes - it was the classic "big bang" project, the only options were massive worldwide success or total obscurity. It relied on top-down funding to push it to the intended users. Whereas Pi bootstrapped from tiny initial funding.
The main problem of OLPC was Segway-level hubris, in my opinion.
RPi does basically nothing that OLPC was intended to achieve, though it does a lot that OLPC wasn't intended to. OLPC, unsurprisingly, did a lot of what OLPC was intended to.
RPi did some of the things computing enthusiasts wanted OLPC to do that weren't actually OLPC goals.
No specific allegations. I just find it odd that Broadcom apparently still sells these chips at a low price only (?) to the Raspberry Pi Foundation, just because of a personal relationship: someone who worked at Broadcom then created the Foundation.
What does Broadcom get out of this exclusive relationship? (Is it still exclusive?) It just smelled odd when I first heard about this way back in 2012 or so when it first launched. Four years later there's lots of volume, but the same arrangement is still going strong. I find this peculiar.
The arrangement is unusual, but I wonder why you're using words like "odd" and "peculiar" to describe it, as if Broadcom has ulterior motives.
Broadcom sells parts cheaply or at cost and in return gets some free advertising and the appearance of being altruistic. It all seems fairly uncomplicated.
>"And more importantly, what do I (as a RPi user) lose out?"
If nothing else, you miss out from the benefits of stronger competition. The Odroid-W is one example of a product that appeared to be cancelled because of the Raspberry Pi Foundation's close ties with Broadcom:
The more lightweight OSes, the better as far as I'm concerned. I've tried all the usual suspects over the years and each has pros and cons. I'm sure this fits the use case for some people out there that others don't.
Just installed it on a Zero I had laying around and I'm surprised by the general responsiveness of the UI. Of course, now that I played with it for five minutes I'll turn off the GUI and go direct to CLI like every other Pi I have. Trying to get weewx up and going on my Zero.
I installed PIXEL on my Raspberry Pi a few days ago. It looked really good, and I was impressed how clearly everything comes out over the composite video output.
I bought an old PC with an Intel Atom 330 @ 1.6 GHz. I equipped it immediately with a SSD and 4 GB RAM but found out that the second bottleneck is really the CPU. It's most obvious when using Firefox. Opening a tab takes a second or two. Even scrolling and clicking a link is a bit exasperating.
DietPI (.com) already runs on a lot of embedded platforms and is incredibly fast (despite its terrible webpage). Sadly there is no x86 image and it seems it will never be due to excessive hardware fragmentation, but the VM image is so fast that it makes one wonder how good it would be if run natively. It has become my choice for very small systems, followed by Armbian for ones with less constraints.
Not sure I like the name though. I can see people newer to the ecosystem and other consumers getting confused between this OS and Google's line of Pixel devices.
I'm looking at https://www.neverware.com/#introtext-3 to breathe new life into my older laptop. Anyone use it yet? I'm curious if it is as slim as ChromeOS.