Wednesday, January 30, 2008

Enter the Medium

I would be interested to know how much McLuhan influenced Burgess's A Clockwork Orange (1962). The Ludovico treatment depicted in the book and in the Kubrick adaptation (1971) seems to have been directly influenced by McLuhan, not to mention Alex's horrorshow, devotchka-like manipulation of Nadsat.

The Ludovico treatment:

Returning to McLuhan, several of his theories intrigued me (and were echoed in later readings). McLuhan seems to want to assign morals to inanimate objects, which strikes me as odd (and disagreeable): inanimate objects are by nature amoral, entirely without moral leanings, because they are incapable of deciding how they are used. People have to use them for good or evil.

This seems to be the main problem with cyborg theory in general: it wants to assign moral judgment to inanimate objects. The Terminator movies were fun, but we lack the ability to create a computer that is self-aware. We could make a computer that might pass the Turing test, which means a computer has been programmed to emulate human moral judgment, but the machine would be nothing more than a reflection of the moral underpinnings of its programmer. Perhaps the closest we will ever come to creating machines with moral judgment is the creation of a cybernetic organism: merging a human being with a machine.

Cypoultry:

Again, however, the machine itself is still an amoral object, but it is directly controlled by its human brain, which essentially takes the place of the programmer. Thus the medium is still not the message; rather, it is a very direct carrier of the message.

Dr. Ludovico with Cypoultry:

6 comments:

Doc Mara said...

So, I could infer that McLuhan was the most intriguing theorist to you? Will you be doing deeper research into his theory that "the medium is the message"?

Doc Mara said...

> This seems to be the main problem with cyborg theory in general: it wants to assign moral judgment to inanimate objects.

Actually, cyborg theory discusses the merging of the human with both animal and non-living systems. Animation is usually assumed to be part of both mergers.

> The Terminator movies were fun but we lack the ability to ever create a computer that is self-aware.

Hard to project ourselves to the end of time to make this judgment. Seems like a bit of a stretch to assume either way at this point.

> We could make a computer that might pass the Turing test,

We have already done so. Quite a long time ago, actually.

> which means that a computer has been programmed to emulate human moral judgment,

Not exactly. You could ask questions about this. You could be very crafty and give the computer a set of instructions that renders a variously moral/amoral character picture. Multiple personality disorder, anyone?

> but the machine would be nothing more than a reflection of the moral underpinnings of its programmer.

Not really. Computers do many things we cannot do and would not necessarily have them do. In fact, we are using them to solve problems and to render judgments that we cannot accomplish without them. You might argue that because of our necessity in their formulation and maintenance, they are really just extending OUR moral judgment, which actually makes them an extension of us, and cyborg. It is troubling, because without gut bacteria we would not make it to sexual maturity and would die out as a species. Does that make us extensions of gut bacterial moral judgment? Troubling.

> Perhaps the closest we will ever come to creating machines that have moral judgment is in the creation of a cybernetic organism—merging a human being with a machine.

Many in posthumanism believe that we have ALWAYS been posthuman. The concept of moral judgment only exists the moment we create an inside/outside dichotomy, enabled by manipulation of and chaining with "other" organisms and mechanisms. I'm not saying I agree, as I don't have all that much time to contemplate it. It just sounds pretty convincing when you consider how heavily the development of the concept of subjectivity depends upon hiding our dependency on connections to all sorts of what actor-network theorists call "actants."

But I DO like the cyber-poultry.

Jennie said...

When considering the concept of moral judgment and a computer's ability to be moral or self-reflective, I think it is important to consider the socially constructed nature of morality. Morality changes from place to place, depending on the people who live there and what they believe is acceptable. What is moral behavior in Fargo, ND may be completely immoral in Africa. Therefore, it takes self-reflectivity to determine what is moral in a given situation. Since computers can follow a program but cannot read social conventions unless they have been programmed to, they are not capable of the human processes we often expect of them.

Sportet said...

If we have ALWAYS been posthuman, doesn't that make posthuman just plain human? If so, should we just stop using the word?

Kat D said...

I, as well, very much like the cyber-poultry. I think it's interesting that we tend to get stuck on this idea that McLuhan needed to give morals to inanimate objects. I don't see why (in the context of our understanding of them) they don't have moral implications. It's our interactions with media (and the messages contained therein) that allow us to make almost any moralistic judgments at all. Computers can contribute to our ethics and continue to shape them as we contextualize ourselves within that medium.