Various approaches to disseminating research

I’ve written several times over the past year or so about the valuable role dissemination plays in the research world.  Of course, in a world dominated by citation as a metric, it is often difficult to encourage researchers to care about anything else.

That’s not to say that there aren’t attempts to change that.  Altmetric, for instance, is a service launched with the intention of pooling together various metrics to give a more rounded perspective on the success of research.  It pulls in things like social media mentions, Mendeley usage and, of course, citations.

This kind of thing matters because, traditionally, academics have had a pretty weak relationship with social media.  Dissemination of findings is often an afterthought, with everything focused on publication in as prestigious a journal as possible.

A recent study underlined the challenge, finding that few healthcare researchers bother with any kind of social media-based dissemination of their results.

The study suggested that a big hurdle to greater social media usage is how researchers’ institutions perceive it.  As long as institutions see social media as a platform with little real academic merit, it will be a challenge to get academics using it to engage more with their stakeholders.

This has created a gap in the market for services such as Faculty of 1000, a venerable veteran of the industry that presented at the recent Health 2.0 conference.  They offer a human curation service that aims to identify and evaluate significant papers in the biomedical world.  A bit like this site (in a way), but focusing on medicine.  The faculty in question is a panel of experts across the various subject domains of this broad area.

Which is great, except they rather miss the point by putting their recommendations and commentary behind a paywall.

So news of a new tool developed by the Centre of Research Excellence in Rural and Remote Primary Health Care (I know!) sounds very much of interest.  The tool looks at the wider impact of each paper, including media exposure, conference presentations and actual applications of the research itself.

The centre, a collaboration between various institutions in Australia, hopes that the tool will give researchers a more rounded understanding of just how good a paper is.

“The centre started with a very simple database that grew into a tool that can handle complex data. It can now record an incredible amount of detail, including the traditional journal articles, books and conference presentations, as well as stakeholder presentations, media contact, and evidence of uptake or use of research,” the team say.

The indicators of impact used by the tool are organized by domain, covering academic impact, policy impact, service delivery and society at large.  The team also claim to monitor whether each impact was initiated by the institution, the researcher or an external entity.

In a slightly more open approach, they’re making their template available to the research community, so hopefully it can be built upon and developed further.

One thing missing from the database, however, is social media, which, if a recent study is to be believed, is something of an oversight.

The study, published by researchers from the University of Wisconsin-Madison, found a direct link between how active researchers were on social media and the size of their h-index.

The h-index, if you’re not familiar with it, measures both a researcher’s productivity and the citation impact of their work, so it’s a decent proxy to use.
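Concretely, a researcher’s h-index is the largest number h such that they have h papers each cited at least h times.  A minimal sketch of the calculation (the function name and sample citation counts here are illustrative, not from the study):

```python
def h_index(citations):
    """Return the largest h such that at least h of the given
    papers have h or more citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers
# have at least four citations, so the h-index is 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note that the metric rewards a body of consistently cited work: a single blockbuster paper with hundreds of citations still only contributes one unit towards h.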

“If you talk to reporters and you tweet about your research, your work is more likely to be cited than people who do one or the other,” the researchers say.

“What this shows us is that sharing your science with the public is not hurting the science by stealing time,” they continue. “If the goal is to encourage people, ultimately to be productive scientists, and if directors of labs are discouraging people from engaging in this activity, they’re actually hurting the science itself. Because people who do this are cited more often in scientific journals, they’re making science accessible to broader audiences at the same time.”

So really, whilst curation services such as Faculty of 1000 are no doubt useful, the study suggests that there is nothing better than actually engaging with people one to one.

