Thomson Reuters’ list of most-cited scientists for 2014 featured 32 researchers affiliated with Duke.

The list names 3,215 researchers worldwide who rank in the top one percent of the most highly cited scientists in their respective fields. Compared to Thomson Reuters’ Highly Cited Researchers 2001, the new list adopts a revised methodology while continuing to evaluate leading scientific minds across 21 disciplines.

“It’s a recognition of excelling, and it’s very gratifying,” said Jennifer West, Fitzpatrick Family University professor of engineering and a Reuters most highly-cited scientist. “It also says a lot about the students and post-docs who have worked in the lab over the years.”

Duke’s position on the list is best evaluated relative to its peer institutions, noted Philip Benfey, Paul Kramer professor at the Institute for Genome Sciences and Policy, director of the Center for Systems Biology and one of three Duke faculty in the Plant and Animal Science category on the Reuters list.

Harvard University and the Massachusetts Institute of Technology lead the list with 160 and 94 affiliated researchers, respectively, counting both primary and secondary affiliations. Duke, with 32 research affiliates on the list, has roughly the same number as Columbia University, Northwestern University, Princeton University and Johns Hopkins University.

'Revolutionary impacts'

Citation counts are frequently used to evaluate an author's research impact, said Robert Jackson, chair of global environmental change in the Earth and Ocean Sciences Division of the Nicholas School of the Environment and professor of biology. Jackson's name appears on both the 2001 and 2014 lists.

“Citations aren’t the only metric by any means that’s important, but work that’s highly cited suggests that it captures the community’s interest, and scientists want their research to be useful,” he said.

Campbell Harvey, J. Paul Sticht professor in the Fuqua School of Business, was on Reuters’ list in 2001 as well as this year's list. He noted that an article is often cited widely by peers in the field because of its revolutionary impact.

“If you are doing something that changes the way people think about an important problem, it will likely get lots of citations,” he said.

For example, Harvey’s article “The Economic Implications of Corporate Financial Reporting” received 2,339 citations, according to Google Scholar. Harvey said this is because his article changed the way people think about accounting.

The topic of research is also an important factor in determining whether a publication will be frequently cited, said Mary Story, professor of community and family medicine and global health and associate director for academic programs at the Duke Global Health Institute. Story said her 2001 research article on obesity attracted attention because of rising awareness of public health.

“There has been so much interest since then in looking at not just behavioral changes, but how the environment contributes to unhealthy eating and lack of physical activity,” she said. “Our article was one of the first reviews in the academic field that really looked at how the environment contributes to obesity.”

Changes in methodology

Reuters made an important change in how authors are chosen by breaking down the selection criteria by research area.

“One of the issues that came up [in 2001] was that in different disciplines, there are different conventions of practices on citations,” West said. “By breaking it down this time, they are able to look at the top people from each of these areas, and make sure that people in certain disciplines aren’t missed just because of certain conventions in their field.”

West added that the Reuters list seems credible given its ranking method.

“Anything where you are looking at people based on some quantifiable data, rather than on the general impression of other people, it’s always a good thing,” West said.

Although there are separate criteria for different areas of research, the list analyzes each academic field separately and does not account for work done by researchers across several disciplines. West added that the broadness of the categories might result in researchers in certain areas being overlooked.

“For example, their category ‘Engineering’ is not divided into different disciplines of engineering, and it may not be capturing all the people within the school of engineering adequately,” West said.

Reuters has also revised its analysis to better balance the list toward younger researchers, focusing on achievements within a 12-year period from 2001 to 2012 rather than on cumulative publications. This gives more exposure to young researchers, as opposed to older professors who have accumulated more citations over time.

“There is something to be said about the impact of a long-term career,” West said. “I can certainly see posting the ranking calculated both ways. One is over a short period of time, and one is the cumulative record of the entire career, and both of those are important.”

West also noted that analyzing a shorter period of time enables people to see who the "up and comers" are.

The new methodology seems particularly effective at highlighting articles that make more waves, noted Bryan Cullen, James B. Duke professor of molecular genetics and microbiology and director of the Center for Virology.

“Obviously, just being around long enough to accumulate enough papers is not the main idea,” he said. “The new format really emphasizes what they call the ‘high-impact manuscripts.’ I think that’s very sensible.”

Cullen, along with many Duke researchers whose names appear on both the 2001 and 2014 lists, belongs to a group of more senior faculty.

“I don’t know if it made a huge difference—I am 62 and still on the list 13 years later,” he joked.