Crowdsourcing speeds medical research

July 15, 2013

A crowdsourced study located more than 1,400 automated external defibrillators in Philadelphia (credit: University of Pennsylvania)

“Human computing power” harnessed from ordinary citizens across the world has the potential to accelerate the pace of health care research of all kinds, a team from the Perelman School of Medicine at the University of Pennsylvania has found.

In fact, crowdsourcing — a research method that allows investigators to engage thousands of people to provide either data or data analysis — could even improve the quality of research while reducing the costs, they suggest.

The Penn researchers previously used crowdsourcing, in the MyHeartMap Challenge, to locate and catalog more than 1,400 lifesaving automated external defibrillators (AEDs) in public places across Philadelphia. They hope to replicate the study in other major U.S. cities.

Now they have used crowdsourcing again, this time to perform a literature search for health and medical research articles via two free websites, Yahoo! Answers and Quora, collecting and analyzing 21 health-related studies. The studies collectively engaged more than 136,000 people and ranged in focus from tracking H1N1 influenza outbreaks in near real time to classifying different types of polyps in the colon.

Guidelines for using crowdsourcing in medical studies

“Studies we reviewed showed that the crowd can be very successful, such as solving novel complex protein structure problems or identifying malaria-infected red blood cells with a similar accuracy as a medical professional,” said the study’s first author, Benjamin Ranard, a medical student in the Perelman School of Medicine.

The research team found that the studies centered on four main categories of tasks: problem solving, data processing, surveillance/monitoring, and surveying.

However, they found considerable variability in the amount and type of data reported about the crowd and the experimental setup, which would make it difficult for other researchers to replicate the studies or adapt them for their own research.

For instance, the articles rarely reported demographic data about the participating crowd, including information standard to most clinical trials such as cohort size, age, gender, and geographic location. The authors also noted that the limited number of studies they found is surprising given the potential benefits of the approach.

The authors recommend that other health and medical investigators look at their own research projects and consider involving the public through crowdsourcing. Whenever research requires human processing that computers alone cannot do, such as visually sorting pictures or other data, they say there is potential to involve the crowd.

Crowdsourcing can also be used to take advantage of problem-solving skills members of the public may have (such as solving three-dimensional puzzles), or to employ the crowd as human sensors reporting data about the environment (for example, reporting cases of influenza-like symptoms). A brief sketch of how crowd-supplied labels might be combined appears below.
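To illustrate the "data processing" category in concrete terms, here is a minimal sketch of one common way researchers combine crowd input: several volunteers label each image, and the majority-vote label is kept along with the level of agreement. The image IDs, label names, and three-votes-per-image setup are hypothetical examples for illustration, not details taken from the Penn study.

```python
from collections import Counter

def aggregate_crowd_labels(labels_by_image):
    """Return the majority-vote label and agreement fraction for each image."""
    results = {}
    for image_id, labels in labels_by_image.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]  # most frequent label and its count
        results[image_id] = (label, round(votes / len(labels), 2))
    return results

# Hypothetical crowd responses: three volunteers classify each polyp image.
crowd_labels = {
    "polyp_001": ["adenomatous", "adenomatous", "hyperplastic"],
    "polyp_002": ["hyperplastic", "hyperplastic", "hyperplastic"],
}

print(aggregate_crowd_labels(crowd_labels))
# {'polyp_001': ('adenomatous', 0.67), 'polyp_002': ('hyperplastic', 1.0)}
```

In practice, studies of this kind often weight votes by each participant's past accuracy or send low-agreement items to an expert, but simple majority voting conveys the basic idea of turning many lay judgments into a usable dataset.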

They call for continued study of the scope of crowdsourcing to determine where crowd-generated data might be as useful as traditionally collected data.

To further explore the power of crowdsourcing and other research approaches via social media, the study’s senior author, Raina Merchant, MD, an assistant professor of Emergency Medicine at Penn, was recently appointed director of the Social Media Lab at the Penn Medicine Center for Health Care Innovation. In this role, she will lead a program exploring ways in which new communication channels can enhance Penn’s ability to understand and improve the health and health care of patients and other populations.

The study was funded in part by the National Institutes of Health.