I know that many parts of the world, including Asia, learn English; that's why I think it should be used as the lingua franca, and that's why I said that I don't care which language it is, as long as it doesn't have weird ligatures or a large number of characters.
For what it's worth, I'd remove all characters that are not from that "computer language", including the German ones.
This is more than programmer convenience; this is about system simplicity, absolute correctness (too many bugs because of Unicode and complex text layout), and above all about _forcing_ people to use said lingua franca to become comfortable speaking it.
If you want your own writing system, fine, but put in the effort and build it yourself, apart from the common bug-free code base that the world shares. It should be less buggy overall than a system that tries to speak every language and support every writing system.
> This is more than programmer convenience; this is about system simplicity, absolute correctness (too many bugs because of Unicode and complex text layout), and above all about _forcing_ people to use said lingua franca to become comfortable speaking it.
The reason the writing systems in computers are simple is that Latin characters are simple, and the majority of early computer users (most of whom were English speakers) didn't feel the need for sophisticated systems.
I (often) find that complexity within Western culture is treated as warranted, while complexity outside Western culture is not.
While not a great analogy, think of TTS. English TTS is (at least from what I have heard) not that easy (encodable in logic, but not as simple as mapping characters to audio). That resulted in some sophisticated TTS systems that can handle lots of different phonetic structures.
Compare this to Hangul (the Korean writing system), where every character is composed of several mini-characters (jamo) that have a unique mapping to audio. Basically, for a minimal viable product you can just convert text to decomposed form (NFD) and map the code points to audio.
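To make the NFD point concrete, here is a minimal sketch in Python. NFD normalization splits each precomposed Hangul syllable block into its constituent jamo code points; the audio lookup step is left as a hypothetical placeholder.

```python
import unicodedata

def syllable_to_jamo(text):
    """Decompose Hangul syllable blocks into individual jamo.

    NFD turns e.g. "한" (U+D55C) into three conjoining jamo
    (U+1112, U+1161, U+11AB). A real TTS MVP would then look up
    each jamo in an audio table -- that table is hypothetical here.
    """
    return list(unicodedata.normalize("NFD", text))

jamo = syllable_to_jamo("한글")
print(len(jamo))  # → 6 ("한" and "글" each decompose into 3 jamo)
```

The inverse (NFC) recomposes the jamo back into syllable blocks, which is why the decomposition is lossless.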
Now, let's say that some company (an arbitrary choice, Facebook) decided to make a global TTS product, first in Korea. They just decided to map the code points to audio and post-process the audio. This TTS architecture obviously can't handle English without some major architectural changes. Then Facebook decides that this is a problem with English, whose phonetic structure is unnecessarily complex, and won't provide a TTS service to English users.
This is similar to what CJK users face: people just won't make products that work reliably with CJK.
As I already argued in a different comment, I find Hangul to be more elegant than Latin, even though I think that the syllabic block notation would add unnecessary complexity to an implementation.
The thing with Korean is that "nobody" speaks it. Had early computers been developed in Korea, and were everybody speaking Korean, I would argue now that everybody should just learn Korean. But tough luck, it didn't happen. Our best shot at a lingua franca and simple computer systems (simple in the sense of as little complexity as possible) is with English.
I can understand (but don't fully agree with) a claim that F/OSS software should be primarily in English to maximize its community, as many programmers already speak English to some extent out of necessity anyway. But forcing English on ordinary people who don't use English? Plainly absurd.
For starters, while young people may feel more comfortable with occasional English, older people don't, and your proposal will make them much more uneasy. Smartphone penetration rates around the world [1] can easily reach 50% in many countries, and that would include many people unable to understand English---I live in Korea, and older people are in no way technologically inferior to younger people in that regard. That becomes impossible when you force them to use English.
> I live in Korea, and older people are in no way technologically inferior to younger people in that regard.
You guys truly are wonderful unicorns. How do you do it? (serious question in that regard)
On topic, I think the ultimate solution is to have whatever language input/output be essentially a placeholder, with some general "switching" system that swaps in whatever language the user prefers. It should be able to switch on the fly, totally independently of application state. (Same thing for numbers: outputting decimal or hexadecimal should happen on the fly---same model value, but a different user view.)
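The number case is the easiest to sketch. In the model/view split described above, the application state stores only the value; the rendering base (standing in for "which language") is chosen at display time and can change without touching the state. The class and method names here are illustrative, not from any real toolkit.

```python
class NumberView:
    """Model/view split: the stored value never changes when the
    presentation switches between decimal and hexadecimal."""

    def __init__(self, value):
        self.value = value  # model: base-agnostic application state

    def render(self, base="dec"):
        # view: decided on the fly, independent of the model
        return hex(self.value) if base == "hex" else str(self.value)

n = NumberView(255)
print(n.render("dec"))  # → 255
print(n.render("hex"))  # → 0xff
```

A language-switching system would work the same way: store a message key in the model and resolve it against the user's preferred locale only at render time.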
Commercial OSes tend to be pretty invasive these days, but the one area where we need more from them, and better tooling, is definitely text I/O.
Adults do learn languages well when required, sure. I question whether they should have to. Making an additional [1] billion-plus people learn a new language is not cost-effective compared to a (comparatively) small number of software engineers wrestling with i18n (haha). It is sad that English knowledge is highly valued in, for example, several non-English [EDIT] workplaces even when it is not required at all.
> Hangul would also be a nice lingua franca writing system, too bad its very local
By the way, Hangul was tailor-made for Korean, with its relatively simple CGVC syllable structure. I doubt it can be generalized much---Hangul is important because it is one of the first featural alphabets, not because it is a universal (or even adaptable) featural alphabet, which it isn't.
[1] English has about 2 billion speakers (400M native + 750M L2 + 700M foreign), so you need another billion or two speakers to make your proposal real.
[EDIT] was "CJK", but I'm not sure about Chinese and Japanese.
Instead of the total number of speakers, the better metric would be the diversity of the speakers.
A language is a better candidate for a lingua franca if it's spoken everywhere a little rather than somewhere a lot, e.g. Mandarin.
And this is not about cost, but about bugs in critical infrastructure.
I'd get rid of smartphones and Unicode in a heartbeat if it meant bug-free command-line applications, no malware, and medical and infrastructure systems that don't crash.
I think it's great when non-English workplaces require English. It gives people an incentive, because humans are a lazy species. And once you've learned it, you can use it on your next vacation, in your next online encounter, and who knows where else.
Having computers do only English would have given a similar incentive, and I'm sad we didn't use that opportunity.
English is the lingua franca, but not more than that. Most communication is still done and will always be done in local languages, and computers should work for them.