
The title really is the thesis, but the author backs it up only with opinion.

Like many of these articles, the author seems to apply all these adjectives that impute intent. No technology is "inherently oppressive" nor is any technology "ruthless" because machines themselves have no intent. I believe this anthropomorphic sleight of hand is deliberate because the author does get around to changing the topic to "machine-assisted ruthlessness."

He might as well say that "tools are inherently oppressive," because any tool can be used by oppressive humans to enforce their oppression or ruthlessness. This is the same as opining that knives are inherently violent. The argument is illogical, an appeal to emotion.



Author here.

This misses the point that computers are a technology. Like any technology, the choice to adopt a technology is always made by a human actor with some knowledge of the relevant tradeoffs. Ergo, wilful adoption of a technology which (necessarily) possesses the attributes I describe can be taken as tacit consent to those attributes and the collateral damage they cause. There is a human actor hiding behind every life ruined by "computer says no".

Saying that no technology is inherently oppressive is strange to me. Are prisons not by design an intentionally oppressive technology?

The human perception of agency in inanimate objects is curiously variable. To my understanding, this is a factor in why some people are more religious and some less so. Since my article is intrinsically written from the capacity to perceive agency in an inanimate object, my guess is that it's less compelling to people who aren't wired this way. This is not a criticism of anyone, just an observation of an interesting human neurological variability. Other constructions of the same argument are of course possible.

In any case, the point here is that a human actor decides to adopt computers as a technology; and that that act generally replaces a human interaction which was previously social, human-to-human, and therefore on some level adaptive and modulated by empathy, whereas a human-computer interaction is inherently uncaring. The adoption of computer technology in society has led to a generalised trend of replacing human-human (social) interactions with human-computer or human-computer-human interactions, which generally remove all opportunities for adaptation or empathic variation.
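The "computer says no" dynamic can be made concrete with a minimal sketch (a hypothetical eligibility rule, not anything from the article): once a decision is encoded as a fixed rule, the program applies it exactly, with no channel for context, discretion, or appeal.

```python
# Hypothetical benefits-eligibility rule, hard-coded. A human caseworker
# could weigh circumstances at the margin; the rule cannot.

def benefits_eligible(income: int, threshold: int = 20_000) -> bool:
    # The comparison is applied uniformly, with no empathic variation.
    return income < threshold

print(benefits_eligible(19_999))  # True
print(benefits_eligible(20_000))  # False: one unit over, no discretion
```

The point is not that such code is malicious, but that the adopter has replaced an adaptive social interaction with an interaction that structurally cannot adapt.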

This leads to my view that computers as a technology in society are inherently oppressive.

Though I suppose one possible qualification would be that this possibly is only the case when applied as a technology by a different party to the party which will be subject to it. A computer in someone's home by their free choice, which is purely controlled by them (which is therefore not any modern computer, I should note) seems like an exception.


My position is that the technology does not possess the attributes you claim it does.

Prisons are not a technology. Prisons are buildings that use various technologies: optics, metallurgy, concrete, electronics, and so on. But prisons are an application.

Adoption of computers, and the decision to use them to oppress or to free, is entirely within the purview of the adopter. That is, the only people to blame are the ones who set the technology to a specific purpose: those who apply it.

Your essay confuses application with existence, and I don't think that makes sense. A car exists. It can be used to drive kids to school, or it can be used to mow down pedestrians. There is nothing inherent in the technology with relation to intent. I think the same is true of computers.

Potential application is not the same as intent.


Prisons are a technology as I define it, as is writing, agriculture, etc.


I disagree that prisons are a technology, as that doesn't fit the term.

However, even if we were to consider prisons a technology, there are good applications and bad applications. Prisons may be used to keep violent predators away from the civil society they might harm. Prisons may also be used as a deterrent against crime, which likewise benefits civil society. That prisons are used as profit centers by corporations which also enjoy regulatory capture is an evil of the people running those corporations and of the corrupt officials who look the other way to line their pockets. But neither the good nor the evil has anything to do with the concrete, steel, monitoring systems, and plumbing of a prison.

Humans are good or evil, or sometimes a complicated mixture of both. Shifting the animus to the inanimate shifts the responsibility, and I doubt even clear-thinking religious people would be on-board with that. (I'm not, and I'm religious.)

Action, intent, and agency are human things. Ascribing them to technology hearkens back to animism, not rationalism. The spirit of the river made it jump its banks and flood because the river had an ill temper. The computer was ruthless when it calculated Bob's pay. That's not how our universe works.

I agree with the premise of "machine-assisted ruthlessness." I simply disagree with the notion that the ruthlessness or oppressiveness is inherent to the tech.


I would absolutely consider prisons a technology, most obviously because we can see human societies where prisons are not a viable technology. If a member of a more primitive tribe starts killing people, the pragmatic solution is to kill them, not incarcerate them.

Incarceration imposes technological prerequisites (e.g. can you build buildings strong enough to hold people involuntarily?) and high or extremely high logistical costs (see the cost of housing a prisoner for a year in the UK). You ultimately also need guards, and the spare labour in society to allocate to that task instead of potentially more important ones, like obtaining food. Whether that is feasible is in turn determined by human productivity in the various fields of production; in the case of food, by the availability of agricultural technology (around 50% of the UK workforce used to work in agriculture, versus only about 2% now, to my recollection). Incarceration on the scale we see today is a relatively recent phenomenon.

Obviously such technology can be good or bad.

In my view the IT community falls into the trap of a narrow definition of the term, which has now been supplanted by an even narrower definition where technology just means "IT".

See the other thread above for my thoughts on the latter part. The adoption of a technology is done by the decision of a human. Nobody's claiming that computers have imposed themselves on society...


All "technology" (using the broad definition) is a manifestation of human ideas and the evolution of those ideas over time. Human thought preceded the creation of prisons or the repurposing of existing structures as prisons. And even the idea of a "prison" has to be collapsed into a broader category of "secure building", which is a technology that emerged because of just how useful it is to have a robust place to live in and avoid the elements, intruders, etc.

To frame prisons as "inherently oppressive" is to pretend that a prison is not just another building with the label "prison" on the outside built to a certain specification of security. Much like a church is just a building where people gather to practice their faith. I realize that these buildings have some unique features that are optimized for their particular purpose, but at each level you drill down, those technologies almost universally co-emerged because of other human needs.

These buildings do not have intrinsic essences, nor were the underlying techniques of engineering used to build them solely developed for the purpose of incarcerating people or praying to various gods.

> The adoption of a technology is done by the decision of a human

And so is the inception of the original idea that led to the establishment of the technology. The tools we build are a manifestation of collective human consciousness at any given point in time. That goes not just for the ways we use technology, but also for the technologies we invent and decide to build in the first place.

The call is coming from inside the house. The thing we need to be focusing on is our own nature, and how we tend to oppress each other. This is a problem not solved by assigning technology with properties like "oppressive". Technology is not oppressive. People are.

> Nobody's claiming that computers have imposed themselves on society...

Oppression is generally something that humans do to other humans (I agree that they use tools along the way). By saying that computers are inherently oppressive, you're implying that computers are somehow imposing themselves on society. If a human has to be involved for the oppression to occur, the computer is by definition not inherently oppressive.


> Obviously such technology can be good or bad.

I don't think I'm presenting an "IT perspective"; if anything, your definition broadens the notion of what technology is to include products and applications. My objections to that conclusion are philosophical, not stylistic.



