Government panel updating rules for online research

Tackling privacy, protection issues in regulations from Internet's infancy


WASHINGTON -- When is an avatar a person? When is a chat room a public forum?

If you think these are questions for a computer geek convention, think again. They're being debated by government officials and leaders of top research institutions who are updating federal rules that protect research volunteers from harm and exploitation.

The regulations were last updated in 1991 just as the very first websites launched during the Internet's infancy. Technological and scientific changes since then have been astounding, and government regulations haven't kept up, even as researchers are going online to recruit study participants, conduct interviews and scrape social networking sites for data.

Now, an advisory panel of the Office of Human Research Protections is drafting updates to rules that guide research protocol at universities, hospitals and other institutions that receive federal funding for studies involving human subjects. Review boards at each institution approve studies, ensuring research methods comply with federal regulations aimed at protecting volunteer subjects.

For online research, the panel likely will recommend guidelines, not hard-and-fast rules, said committee chairwoman Barbara Bierer, a professor at Harvard Medical School and vice president of research at Boston's Brigham and Women's Hospital.

The way research is conducted in the ever-changing Internet age raises questions at every stage -- even before would-be volunteers decide whether to participate.

It starts with data collection by search engines. If a patient looks for clinical trials on kidney disease, Google might identify her with the illness and target her for dialysis advertisements, advisory committee members said during a recent meeting in Washington.

Some studies are even being conducted in virtual online worlds, where users -- including researchers themselves -- create avatars, or online personas, that interact with each other in make-believe settings. The National Institutes of Health has funded such projects on the website Second Life. In one, researchers created a virtual island where African-American women with diabetes could interact with each other and learn about their disease.

"Are those avatars human subjects?" asked Dean R. Gallant, a member of the advisory committee and assistant dean for research policy and administration at Harvard University. "Is a chat room or a social networking site considered public? Does it matter if you need a password to join the venue? Does it matter if it's moderated?"

At first blush, studies of pretend people in pretend worlds appear to be low-risk. But there have been reports of simulated rape in virtual worlds, and that can have psychological and emotional repercussions for the creators of the victimized avatars, said Elizabeth A. Buchanan, director of the Center for Applied Ethics at the University of Wisconsin-Stout, who is helping the committee create guidelines for online research.

"The incidents affected real individuals and entire online communities," she said. "People invest very heavily in their online persona, who they want to be online, and it definitely does impact the real world."

In another example, a New York psychologist wanted to study how disabled women interact with each other online so he pretended to be one of them. Along the way, he established friendships with several women in an online forum.

"He used deception to present himself this way, and that caused very intense psychological harm to these women," Ms. Buchanan said in an telephone interview.

Meanwhile, some academics say that avatars are representations, not people, and therefore not subject to human research protections.

"It gets a little blurry. There's not a clear consensus," Ms. Buchanan said. "We may not know if an avatar is being driven by a human being or if it has been computer-generated. You really need to look at the context and particulars. If an individual is using the avatar as a representation of himself, then we're closer to the side of a human subject."

Technology presents many challenges to both researchers and the institutional boards that approve their studies.

Both are concerned that research subjects -- including minors -- may misrepresent themselves online in order to qualify for studies, Mr. Gallant said. They worry, too, that without face-to-face contact, researchers won't be able to tell whether a subject has truly understood the risks before agreeing to participate.

Technology also is raising growing concerns about privacy in a world where it's possible to take raw study data, plug it into an Internet search engine, and be led back to a research subject who had been promised confidentiality. Those concerns grew last month when Facebook said it was considering providing researchers limited access to aggregated data it has compiled about its nearly 1 billion users.

Institutional review boards have been engaged in heated and often confusing discussions over how to apply real-world regulations in cyberspace.

"It would be good to have some federal guidance from the federal level that would help fill in some gaps and provide some consistent language and some consistent standards," Ms. Buchanan said.


Washington Bureau Chief Tracie Mauriello: 703-996-9292 or tmauriello@post-gazette.com.
