Category: Web, HTML, Tech

Some quotes by Joseph Weizenbaum

The salvation of the world depends only on the individual whose world it is. At least, every individual must act as if the whole future of the world, of humanity itself, depends on him. Anything less is a shirking of responsibility and is itself a dehumanizing force, for anything less encourages the individual to look upon himself as a mere actor in a drama written by anonymous agents, as less than a whole person, and that is the beginning of passivity and aimlessness.
-- Joseph Weizenbaum

I have spoken here of what ought and ought not to be done, of what is morally repugnant, and of what is dangerous. I am, of course, aware of the fact that these judgements of mine have themselves no moral force except on myself. Nor, as I have already said, do I have any intention of telling other people what tasks they should and should not undertake. I urge them only to consider the consequences of what they do do. And here I mean not only, not even primarily, the direct consequences of their actions on the world about them. I mean rather the consequences on themselves, as they construct their rationalizations, as they repress the truths that urge them to different courses, and as they chip away at their own autonomy. That so many people ask what they must do is a sign that the order of being and doing has become inverted. Those who know who and what they are do not need to ask what they should do. And those who must ask will not be able to stop asking until they begin to look inside themselves. It is everyone's task to show by example what questions one can ask of oneself, and to show that one can live with the few answers there are.
-- Joseph Weizenbaum, "Computer Power and Human Reason: From Judgment to Calculation" (1976)
I cannot tell why the spokesmen I have cited want the developments I forecast to become true. Some of them have told me that they work on them for the morally bankrupt reason that "If we don't do it, someone else will." They fear that evil people will develop superintelligent machines and use them to oppress mankind, and that the only defense against these enemy machines will be superintelligent machines controlled by us, that is, by well-intentioned people. Others reveal that they have abdicated their autonomy by appealing to the "principle" of technological inevitability. But, finally, all I can say with assurance is that these people are not stupid. All the rest is mystery.
-- Joseph Weizenbaum

These men were able to give the counsel they gave because they were operating at an enormous psychological distance from the people who would be maimed and killed by the weapons systems that would result from the ideas they communicated to their sponsors. The lesson, therefore, is that the scientist and technologist must, by acts of will and of the imagination, actively strive to reduce such psychological distances, to counter the forces that tend to remove him from the consequences of his actions. He must -- it is as simple as this -- think of what he is actually doing. He must learn to listen to his own inner voice. He must learn to say "No!"

Finally, it is the act itself that matters. When instrumental reason is the sole guide to action, the acts it justifies are robbed of their inherent meanings and thus exist in an ethical vacuum. I recently heard an officer of a great university publicly defend an important policy decision he had made, one that many of the university's students and faculty opposed on moral grounds, with the words: "We could have taken a moral stand, but what good would that have done?" But the moral good of a moral act inheres in the act itself. That is why an act can itself ennoble or corrupt the person who performs it. The victory of instrumental reason in our time has brought about the virtual disappearance of this insight and thus perforce the delegitimation of the very idea of nobility.
-- Joseph Weizenbaum, "Computer Power and Human Reason: From Judgment To Calculation" (1976)
People have a series of rationalizations. People say for example that science and technology have their own logic, that they are in fact autonomous. This particular rationalization is profoundly false. It is not true that science marches on in defiance of human will, independent of human will, that just is not the case. But it is comfortable, as I said: it leads to the position that "if I don't do it, someone else will."

Of course if one takes that as an ethical principle then obviously it can serve as a license to do anything at all. "People will be murdered; if I don't do it, someone else will." "Women will be raped; if I don't do it, someone else will." That is just a license for violence.

Other people say, and I think this is a widely used rationalization, that fundamentally the tools we work on are "mere" tools. This means that whether they get used for good or evil depends on the person who ultimately buys them and so on.

There's nothing bad about working in computer vision, for example. Computer vision may very well some day be used to heal people who would otherwise die. Of course, it could also be used to guide missiles, cruise missiles for example, to their destination, and all that. You see, the technology itself is neutral and value-free and it just depends how one uses it. And besides -- consistent with that -- we can't know, we scientists cannot know how it is going to be used. So therefore we have no responsibility.

Well, that is false. It is true that a computer, for example, can be used for good or evil. It is true that a helicopter can be used as a gunship and it can also be used to rescue people from a mountain pass. And if the question arises of how a specific device is going to be used, in what I call an abstract ideal society, then one might very well say one cannot know.

But we live in a concrete society, [and] with concrete social and historical circumstances and political realities in this society, it is perfectly obvious that when something like a computer is invented, then it is going to be adopted for military purposes. It follows from the concrete realities in which we live; it does not follow from pure logic. But we're not living in an abstract society, we're living in the society in which we in fact live.

If you look at the enormous fruits of human genius that mankind has developed in the last 50 years, atomic energy and rocketry and flying to the moon and coherent light, and it goes on and on and on -- and then it turns out that every one of these triumphs is used primarily in military terms. So it is not reasonable for a scientist or technologist to insist that he or she does not know -- or cannot know -- how it is going to be used.

