Sunday, February 19, 2006

It's Gary's Turn to Lead the Discussion!

This week, Gary leads the class. He's going to provide some interesting insights into how these ideas work through AI and current cultural issues....

If you want to visit the links Gary provides, you can find them in the comment area of the last post (questions for 2/6/06)...when I transferred the material, the links didn't take. My bad.

Anyway, post on! These give us excellent food for thought for Monday night's class.


1) Considering how “generalized surveillance” has led to (our) “disciplinary society,” (365) what are some instances in which we can observe the effect of Panopticism – a “state of conscious and permanent visibility that assures the automatic functioning of power” (360) – at present? (Within the current context of the Patriot Act and wiretapping in the U.S., recall Foucault’s point about how the Panoptic discipline-mechanism operates separate from law (370) and the ostensible limits on power.)

--or--

Bentham imagined “a network of mechanisms” throughout society in place of the prison (364) -- what can we conclude about how we elect to use network technologies that are potentially (and often essentially) Panoptic, while others employ similar devices and methods for counter and inverse surveillance?

(See also the idea of sousveillance)

2) Elliott mentions Kramer’s label of “cosmetic psychopharmacology” (376) regarding the recent trend in medication (e.g. Prozac). With the combination of the capitalist healthcare industry and our consumerist society, will ethical issues influence/affect our attitudes and uses of enhancement treatments/technology? Can ethics even play a significant role (in opposing enhancement tech.) when “cultural complicity” (374) and “authenticity” (377) are ideological critiques in the first place?

3) Here is a link to the ELIZA program that Joseph Weizenbaum developed in 1966, a primitive A.I. simulation of a Rogerian therapist. The Jabberwacky chatterbots named “George” and “Joan” yield much more interesting and more fluid conversations.
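Weizenbaum's program has no understanding of what it is told; it works by simple keyword matching plus pronoun "reflection," echoing the user's own words back as a question. A minimal sketch of that technique (the rules and responses below are invented for illustration, not Weizenbaum's original script):

```python
import re

# A few illustrative ELIZA-style rules: regex pattern -> response template.
# These are simplified stand-ins for Weizenbaum's much larger rule script.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

# Swap first- and second-person words so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

def reflect(fragment: str) -> str:
    """Flip pronouns in the captured fragment of the user's sentence."""
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def respond(sentence: str) -> str:
    """Return the first matching rule's response, else a stock prompt."""
    for pattern, template in RULES:
        match = pattern.match(sentence.strip())
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # default non-directive Rogerian prompt

print(respond("I am worried about my exams"))
# -> How long have you been worried about your exams?
```

The trick is that the Rogerian persona makes the lack of understanding invisible: a therapist who only reflects your statements back never has to know anything, which is exactly why Weizenbaum chose that frame.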

From reading both the Dreyfus brothers’ skeptical criticism and Kurzweil’s idealistic view, will A.I. remain limited even as “expert systems” (407-8), or will progress eventually result in A.I. having self-awareness and spirituality (393)? How do you predict A.I. will develop, and more importantly, how will it function in (or as part of) our society? Can we reasonably presume that the future will be neither utopic nor dystopic (a la The Matrix) regarding this issue?

4) Our relationships with Artificial Intelligence may be significant in future years (more like those depicted by the film A.I.), but presently we don’t regard the computer as a “subjective” agent who “does things to us” (423), for example. Though Turkle published this article in 2002 (source), which of her ideas seem outdated and/or irrelevant, not necessarily due to time but more so due to our current socio-techno trends and views?

(Some readers may even question/challenge Turkle’s ideas fundamentally, on the basis of her employing a theory (psychoanalysis) that seems obsolete for this context, after all. I gave her an honest read, but is there anything applicable to “computer culture” that we can salvage?)

5) Outside of his scientific context, how does Ihde’s idea of “technoconstruction” (485, plus Kaplan’s intro on 432) apply to our uses of technology (mainly personal computers) regarding writing, information, media, communication, community, society, etc. (i.e. within a social sciences/humanities context instead)? Could his term be a useful label for a digital/network paradigm or episteme (knowledge/way of thinking) in our present age?

3 Comments:

Anonymous Anonymous said...

1) It’s an interesting dilemma. How much freedom are we willing to give up to be safe? The windows into our lives multiply as technological progress speeds forward. Eventually, we’ll have to draw a line somewhere and say this is enough. Is all public surveillance okay? Certainly, most people would be against private surveillance (being spied on in the privacy of your home). Does that extend to the digital world? You're in your home, but still interacting with a public network.

Everyone draws the line in different places. I’m used to seeing cameras everywhere I go outside so it doesn’t really bother me. I don’t mind them being in school hallways but wouldn’t want them in the classroom. I don’t care about seeing them in the hallways of a business complex but wouldn’t want a camera staring at me as I sit at my desk. Will future generations care about any of these things? Will they just accept total access to their personal lives without a second thought? It seems to me that we’re headed down that road.

2) I think it depends on the situation. Sometimes ethics stop these enhancement treatments and sometimes they don’t. It should also be noted that just because one side is thought of as ethical by some or even the majority, does not automatically put them in the right. As times change, so does morality. I don’t think there’s one sweeping argument that can be made. With any new technology, there will always be those who try to stop its progress, for the right or wrong reasons. There will always be early adopters who make genuine good use of the technology. There will always be those who abuse it. It seems to me that the one thing that remains constant is the acceptance of the younger generations. Things my parents disapprove of, I see no problem with. I can’t imagine having the ability to “build” my own baby. But maybe that’s something my grandkids won’t think twice about.

3) Even though it’s hard for me to imagine, I’m sure we will get to a point where A.I. is a reality. It’s difficult to predict where this new technology will lead us. Will humans continue to move up the ladder in the working world while we let the lesser machines do all our dirty work? Will we let this intelligence take nearly complete control over our lives? If A.I. does advance to the point where it can truly learn on its own and make its own decisions, doesn’t that make it a life form deserving of rights? Will we see ourselves become ruthless oppressors as in so many cheesy sci-fi movies? I wonder if we will come to a point where we’ve become far too reliant on the intelligence of our creations. But that could just be my fear of the unknown talking.

5) I don’t fully understand the question. It has to do with making visible to us what we once had little to no access to. In writing, we have access to scholarly work from all over the world. We can also see the work of many individuals whose voices would not have been heard otherwise. We can make our own words visible to the world. We are exposed to new ideas, cultures and people. As our ability to communicate increases, so does our awareness and exposure to different facets of the world we live in.

7:35 PM  
Blogger Bill said...

This comment has been removed by a blog administrator.

4:41 PM  
Blogger Bill said...

1. Given the McCarthy Era onward, I would think our society has had a panoptic expectation of the government. Whether or not the government is violating the law, or even ethical norms, depends on the situation.
Some instances where we can observe the effect would simply be in society's general acceptance of government monitoring. The laws of the land have been bent, sometimes illegally, by our ask-for-forgiveness-later government in order to "protect the nation." But the public's response was one of stifled objection; people don't want Big Brother listening, but they do want his protection.

2. As far as "psychopharmacology" goes, I think the trend is far from dying out by itself. Until we hit an economic slump or the trend proves harmful, I don't think we'll see a retreat from this movement, at least on a large scale.
And as far as ethics apply to the pressure to conform to the cultural norm, or to fix what nature got wrong, I think ethics still play a large role. There is a big difference between getting a nose job because you look like the wicked witch versus having a slightly larger nose than normal. The matter can be subjective (maybe the slightly larger nose has ruined this person's life), but I think motive plays a large role in whether you "should" alter your body.
Specifically looking at authenticity, I’ve personally always been a little leery of the "ghost in the wrong machine" concept. Though many people opt for reassignment surgery, I think that we’re treating a psychological issue with cosmetic surgery. Many people do feel that they’ve been placed in the wrong suit, but is it right to alter that person’s body so unnaturally to accommodate feelings?

3. Throwing out the concept of AI taking over in a Matrix/Terminator fashion, I think AI still won’t see the level of self-actualization that Hollywood throws on it. As far as its role, I think AI will replace a lot of tedious and expensive functions, but never fully. I think that society has too little faith in machines to place all our trust in them.

4. Personally, I think Turkle’s views on the computer enhancing our psyche apply to a very small percentage of the population (certainly not the majority). I think she was running with a trend in a small population and projecting it onto the future of computer use.

4:41 PM  
