Stanford Review, Volume XXIV, Issue 5 (June 5, 2000)

Ethics and the Academy
Flattery and Privacy: Persuasive Technology
By David Myszewski
Staff Writer

Technology has always changed human lives, but only recently has it become a means of deliberately changing human behavior. From alcoholism to teenage pregnancy, a new class of technology -- "persuasive technology" -- aims to change behavior for the better.

Dr. B.J. Fogg, director of Stanford's Persuasive Technology Lab, defines persuasive technology as "a computing system, device, or application intentionally designed to change a person's attitudes or behavior in a predetermined way."

While something like a television commercial or advertisement on a web site may persuade, it does not qualify as persuasive technology. Persuasive technology is much more powerful than that.

Daniel Berdichevsky, associate managing director of VentureNova and executive director of DemiDec Resources, whose educational background includes both Stanford and Harvard, distinguishes between technology that merely has the effect of persuading and technology specifically designed to persuade dynamically:

"Persuasive tools such as the megaphone (and the printing press) also date back several decades (or centuries). But persuasive technologies with 'smarts' -- dynamic, personal, enabled by computers -- are much newer on the scene. Apple's smiley-face was an early persuasive technology, as were other aspects of the Mac OS."

While a technology is clearly different from one human persuading another, Berdichevsky and Erik Neuenschwander, a master's student in philosophy at Stanford and associate manager of the Stanford Persuasive Technology Lab, believe the methods are virtually the same.

"The methods employed by persuasive technology are essentially similar to those employed by persuasive people. Humans can persuade through flattery. Work has shown that computers can flatter too. Humans can persuade through conditioning, by rewarding and punishing desirable and undesirable behaviors. Again, so can computers."

One of the better-known persuasive technologies is a program called "Baby Think It Over." The program is designed for educational and medical professionals to help young adults make responsible choices about parenting.

The program uses a computerized baby doll, approximately the size and weight of a real infant, which cries at random intervals to simulate the demands of parenthood. While the "baby" resembles the "egg babies" used in school health and home economics classes, it is far more realistic and persuasive.
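
The article does not describe the device's internals, but the doll can be imagined as a loop that raises a "cry" event at random intervals and records whether it receives timely care. A hypothetical Python sketch, with all timings and sensor logic invented here:

    import random
    import time

    def run_simulation(duration_seconds=5):
        """Cry at random intervals and log whether the caregiver responds."""
        deadline = time.time() + duration_seconds
        while time.time() < deadline:
            time.sleep(random.uniform(0.5, 1.5))   # random quiet interval
            print("The baby is crying! Respond now.")
            responded = random.random() < 0.8      # stand-in for a handling sensor
            if not responded:
                print("Missed care event logged.")

    run_simulation()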

Programs like this only hint at the capabilities of persuasive technology, yet there are inherent limitations to using technology to change an individual. As technologically advanced as computers are today, there are still many "human" things they simply cannot accomplish.

Mr. Berdichevsky explains his and Mr. Neuenschwander's position on the subject: "Humans are more effective in situations that require a great deal of empathy -- directly talking someone into changing an opinion, for instance.

"We believe technologies can encourage changes in behaviors more easily than in attitudes, and persuasive technologies often work gradually over a period of time -- as when they leverage the principles of conditioning."

Simple persuasive technologies like "Baby Think It Over" present few drawbacks, but more advanced persuasive technology has the potential to open the door to frightening risks.

Imagine an application that accesses a database on the Internet to dynamically determine the next step in an individual's persuasion. Berdichevsky notes the power of such technology: "Persuasive technologies leveraging online databases are likely to be very effective, because they'll be able to customize their persuasive methods and messages for individual users."
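
A minimal sketch makes the mechanism concrete. The profile fields and tailoring rule below are invented for illustration, and the online database is simulated by a local function; in a real deployment, the entire profile would travel over the network:

    profile = {
        "user_id": "u123",
        "habits": ["skips breakfast", "late nights"],
        "responds_to": "flattery",   # one entry in a psychological profile
    }

    def remote_next_message(profile):
        """Stand-in for the online database's tailoring logic."""
        if profile["responds_to"] == "flattery":
            return "You're doing brilliantly! One small change would be perfect."
        return "Your current habits carry real risks. Change one today."

    print(remote_next_message(profile))

Everything the tailoring function needs here, a real server would need too -- which is why the psychological profile itself becomes the payload.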

If such technology were used, the possible threats would be startling. An application could send your information to a server so that the server could act on it and determine an appropriate response. Suddenly, your personal information is being transmitted where others might view it.

Only this time it's not your credit card number, your address, or your home phone number. This time it's your entire psychological profile.

In short, privacy is of the utmost concern. Anyone who develops such a persuasive technology should keep privacy as a primary goal.

Mr. Berdichevsky hopes that developers will not rush a technology to completion without privacy in mind. "I do hope that privacy concerns slow the creation of these technologies, so that we can [take] a deep breath and develop them in a way respectful of privacy concerns -- for instance, by implementing 'opt in' instead of 'opt out' systems whenever possible." Yet he believes the technology will be created regardless of those concerns: "History suggests, however, that while privacy concerns will flare up now and then (as they did with regard to the ID numbers on Pentium chips), for the most part, technologies will be developed without much attention to them."
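
The difference between the two consent models he mentions comes down to a single default value. A hypothetical Python sketch, with the setting name invented here:

    from dataclasses import dataclass

    @dataclass
    class OptInSettings:
        share_profile: bool = False   # nothing is shared until the user agrees

    @dataclass
    class OptOutSettings:
        share_profile: bool = True    # sharing happens unless the user objects

    print(OptInSettings().share_profile)   # False: private by default
    print(OptOutSettings().share_profile)  # True: exposed by default

Under the opt-in default, a forgotten checkbox costs the company data; under opt-out, it costs the user privacy.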

If that is the case and the technology advances regardless of the potential harms, some ethical issues emerge. The questions "What if the technology changes someone's behavior for the worse?" and "Who is ultimately responsible if something goes wrong?" are only the beginning of what Berdichevsky and Neuenschwander call "the ethics of persuasive technology," a subject on which they have written a seventeen-page paper.

In one example, they consider "a slot machine with a compelling multimedia narrative." If such a device were to cause someone to spend all of his or her money on it, they argue, the responsibility would not rest with the gambling machine itself.

Instead, the "responsibility for the computerized slot machine's motivations, methods, and outcomes falls squarely on its creators and purchasers, while responsibility for the gambler's choice to gamble distributes on both these parties and the gambler him or herself--just as if it were a human being doing the persuading."

In other words, one would not blame a persuasive technology itself for a problem any more than one would blame a web browser for sending credit card information to an insecure site. Both the user and the developer of the program are at fault.

But Berdichevsky and Neuenschwander assert that creators should be held responsible only for "reasonably predictable outcomes."

The final question one has to ask about persuasive technology is "Just how truthful does the technology have to be?" As humans, we expect people who are trying to persuade us to twist the truth or omit potentially important details. We expect used-car salesmen and telemarketers to engage in exactly such behavior.

Berdichevsky and Neuenschwander are firm about this issue. "Computers . . . can lie with equanimity. Moreover, people tend to trust in the information that computers deliver to them," they argue.

"We have no reason to believe that a device monitoring our heart rate will deliberately misreport it. Imagine, however, a scale meant to encourage healthier dieting habits. It might be programmed to tell a teenage girl she weighs less than she actually does, in order to minimize the chance of her developing an eating disorder."

Because it is important that humans trust computers, not only for persuasive technology but for other applications in society, they believe that developers of persuasive technology should avoid any such deception.

Like all technology, the degree to which persuasive technology affects humans positively or negatively is not up to the technology itself; it is human beings who are ultimately responsible.
