"My dad was abusive and killed himself and I'm happier now."
If the child is better off with their dad having killed himself, that is a failure of society to protect a child who was being abused, or possibly the child has inherited some sociopathic tendencies.
For most people, the suicide of a parent is traumatic, regardless of whether there was abuse.
One has to wonder about the behind-the-scenes heuristics involved in taking a chance on distributing a backdoored version sideloaded into the app stores. One also wonders whether the encryption or the app itself could be compromised more generally (even if the source is vetted and the distributed builds are verified).
Perhaps of most interest, though, would be how many phones are compromised by other means, giving access to protesters' Signal communications anyway.
And also metadata must still fly around anyway, no?
Signal does a pretty good job at minimizing the metadata it has access to. For example, the app can tell you who of your contacts has Signal installed but the Signal service itself never gets to see your contacts (https://signal.org/blog/private-contact-discovery/).
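To illustrate why the contact-discovery problem is hard, here is a toy sketch of the naive hash-based approach the linked blog post argues against. Everything here (the phone numbers, the `discover` helper) is hypothetical; Signal's actual system uses secure enclaves and oblivious lookups precisely because plain hashes of phone numbers are trivially brute-forceable given the small keyspace.

```python
import hashlib

def hash_number(phone: str) -> str:
    """Hash a phone number. In a real deployment this alone is not
    private: the space of valid phone numbers is small enough that a
    server can precompute hashes for all of them."""
    return hashlib.sha256(phone.encode()).hexdigest()

# Server-side: set of hashed numbers of registered users (toy data).
registered = {hash_number(n) for n in ["+15551230001", "+15551230002"]}

def discover(contacts: list[str]) -> list[str]:
    """Return which of the client's contacts are registered.
    In this naive scheme the server still learns the hashes, and
    can therefore recover the numbers by brute force."""
    return [c for c in contacts if hash_number(c) in registered]

print(discover(["+15551230001", "+15559999999"]))  # → ['+15551230001']
```

This is why the blog post matters: the interesting engineering is in making the lookup itself oblivious to the server, not in the hashing.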
The problem is that in many countries, one's phone number is already killer metadata: it is linked to your identity, because you cannot purchase a SIM card without showing ID (a copy of which is made and sent to the authorities). Consequently, a repressive state can determine which of its citizens has installed Signal, and merely using an app known for privacy might already be grounds for persecution.
Apparently Signal is working on identifiers different from a user's phone number, but it is not clear how many people will actually take advantage of this feature.
Presumably those who need to will use that feature. The value is still there: the only way for someone to find out whether you have Signal remains the same, brute force, and if people who need to keep their Signal use private are using an identifier not tied to their identity, brute forcing will not be useful.
The uncertainty over how many would use it is likely why it's been back-burnered for so long, but it shouldn't affect its effectiveness. I realize you may not have been implying that it would, though.
Signal absolutely could do better in minimizing metadata by simply not requiring a phone number. Despite this obvious, huge, and dangerous shortcoming, I have never seen a single explanation of why Signal needs a phone number for signup.
They give an explanation literally every single time this subject is brought up. But of course, on the Internet there's someone who, against all possible odds, manages to completely ignore years and years of the reasoning being linked to or given by a person at Signal in every single thread on Signal anywhere on the Internet. But what can you do?
The typical answer is that a secure app is useless if no one actually uses it, and the use of phone numbers is an unfortunate tradeoff that had to be made to allow the general public to easily sign up for Signal and find their friends automatically from their phone's contacts.
Often this answer is accompanied by pure sarcasm: if you are concerned about this feature, you are told that Signal is not for you and that "you can go play at being a spy and sharing a secret decoder ring with your friends", which is how these people regard PGP. I wish those Signal advocates could lay off the sarcasm; it just makes the project look bad.
One is their private contact discovery system[1], and two, they were trying to promote Signal as a default messenger with iMessage-like automatic encryption upgrading: the goal being to enable people to adopt it even if all their friends hadn't converted yet.
Some leaks that turned up on my end: Dr. Dobb's and New Relic, although the leaks occurred from parties to whom these sites had provided my data, including at least unique email addresses.
1) Build a neural network in which a consciousness that experiences and expresses pleasure and pain emerges from the neurons' physical properties (in other words, not a contrived simulation), yet is fundamentally different from the DNA/carbon systems we are built on (artificially designed and constructed versus organically conceived). If you can ask a computer whether it experiences pleasure or pain, it needs to be designed to answer without that response being explicitly programmed in a contrived fashion.
Or:
2) Augmentation: Integrate human (or, for instance, primate) nervous systems with artificial intelligence such that the experience of the AI exceeds the capacity of the organic host to differentiate between conscious reality and dreaming, but is still distilled down in a way that allows the human host a sufficiently symbiotic interaction with the connected AI in the processing of pain and pleasure. The feedback loops of the human host's pain/pleasure experience would govern the holistic experience of the connected AI, and the human would experience the conscious aspect. You might not say the AI is conscious, but the human host would have an intimate sense of the AI being part of an overall consciousness. (Note: one must prevent the development of immortality technology for nervous systems, to avoid testing the halting problem on sentient beings.)
Pain is not real and is just the result of signals sent through our nerves to our brain. It has no real meaning other than being a "warning sign" to our brains.
You cannot use pain as a measure of consciousness either, because some humans cannot experience pain at all.
Pain is real when a person experiences it. It’s part of the Hard Problem, which separates the study of the signals from objective description of the inner experience.
A human who is wired not to experience pain probably has the brain capacity to experience it, with the appropriate modifications. We do agree that all experience is perfectly correlated with physical states and transitions, so it's conceivable to arrange organic matter in a way that lets a conscious entity experience pain that is real (to them). We may not have this technology, and it seems far off for now. But we are scratching the surface.
A monk who can rewire the experience of pain (e.g. while burning to death) still has something meaningful to communicate about their conscious experience beyond the pain receptors transmitting info to their brain. But perhaps if one’s arrangement of brain/nervous system matter isn’t so free as to be trained to overcome or modify instinctual pain response (e.g. brain in a vat), anyone can be forced to experience pain.
Perhaps one can track those who pushed bitcoin early on: those who were connected to founders of Silicon Valley properties we all know and love. The name suggests Japan and Hawaii are obvious nexuses to investigate. Punahou grads, Japanese connected CIA folks perhaps?
Why wouldn't the WannaCry malware writers register the domain first? It should be possible to simply update the name servers or DNS records should the kill switch need to be engaged.
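For context, the widely reported WannaCry kill switch worked by attempting to resolve a hardcoded domain and standing down if the lookup succeeded. A minimal sketch of that check, using a hypothetical placeholder domain (the `.invalid` TLD is reserved and never resolves):

```python
import socket

# Hypothetical placeholder; the real malware used a hardcoded
# gibberish domain that a researcher later registered.
KILL_SWITCH_DOMAIN = "example-kill-switch.invalid"

def kill_switch_active(domain: str) -> bool:
    """The WannaCry pattern: if the domain resolves, stand down."""
    try:
        socket.gethostbyname(domain)
        return True   # domain resolves -> kill switch engaged
    except socket.gaierror:
        return False  # NXDOMAIN / lookup failure -> keep running

print(kill_switch_active(KILL_SWITCH_DOMAIN))
```

The commenter's point stands: had the authors registered the domain themselves and left it pointing nowhere, they could have engaged the switch on demand by adding an A record, rather than leaving it available for a third party to register.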
It hasn't worked yet
I still fear death
And don't see that changing
Which is a conundrum especially now, being a middle-aged, burned-out white male American college dropout, homeless, with a ruined reputation
I just want to be left alone not asking help
I gave my small fortune away about seven figures
I live in a vehicle and dream of being dead day in and day out
Wanted out for a long long time
I'm certainly not alone in this feeling but most rational folks will at least take care of themselves and not entirely sabotage their future
Too many unresolved incidents
Bullies in life: childhood, university, adulthood, workplace
No faith in the species
No desire to be alive
I'll probably be gone from suicide at some point before retirement
I think the species is shit
And this naturally is in total conflict with the will of most people
Just want to be left alone so I can finish the job already
Far too many unresolved incidents
Tech makes life worse in many ways by the way
For example I can't escape my ruined reputation one google search away
That will outlive me by a long shot