Social media has changed market research, making real-time “consumer research” possible. With the growing popularity of #altmetrics (or, less Twitter-like: alt-metrics) it is also starting to make inroads into measuring (academic) research impact (N.B. for those in the UK: this is not the same as the REF impact – but hey, we only have a few words to choose from!). It addresses a real problem: measuring research impact has traditionally been extremely slow. Popular measures rely on citation counting (such as average citations per year): the h-index, for example, calculates citation impact for individual authors, ISI Web of Knowledge for journals, and so on. The logic is that if something gets cited it is good. The main problem: getting cited takes time, often years (there are some good arguments against this logic, but by and large I’d say it’s probably a good proxy – given enough time).
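For the curious: the h-index mentioned above is simple to compute from a list of per-paper citation counts – an author has index h if h of their papers have at least h citations each. A minimal sketch in Python (the citation counts are made up for illustration):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has enough citations for rank h
        else:
            break
    return h

# Example: an author with five papers cited 25, 8, 5, 3 and 3 times
print(h_index([25, 8, 5, 3, 3]))  # → 3 (three papers with ≥ 3 citations)
```

Note how the metric rewards sustained citation across many papers rather than one highly cited outlier – which is exactly why it takes years to move.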
With more and more online resources available, alt-metrics applies social networking tools to research. Alt-metrics isn’t a single number, but rather a selection of tools that help create impact and awareness, and promise to advance research. Here are a few useful tools for researchers today:
SSRN has long been a popular tool for “working papers”. It provides views and download data for submitted papers (for example, see this one) – which gives at least a rudimentary estimate of how popular a paper is (or may be).
Another alt-metric is built on a similar premise as social bookmarking – a bit like an academic version of Digg, if you want. A main advantage over traditional citations is the immediacy of the measure (well, relative immediacy anyway). Readermeter.org, for example, uses Mendeley data to calculate bookmark rankings – assuming that bookmarking is somewhat similar to an expression of esteem, or even as good as a virtual citation.
Peerevaluation is a different alt-metric tool, aiming to speed up the peer-review process. The idea is that rather than waiting for weeks on end to get two or three peer reviews, other website users can review papers more quickly and efficiently. In a way, it’s like crowd-sourcing the peer-review process (which must be welcomed by anyone who has ever encountered an unhelpful reviewer – and who hasn’t encountered one of those?). Let’s hope the site catches on – at the moment I seem to be the only marketing guy there (please join me!).
Existing and yet-to-be-developed tools are all potentially useful to increase the impact of good research and slowly move away from the sometimes arbitrary and easily manipulated measures used today (such as the Journal Impact Factor – see here), although it will probably take time for alt-metrics to become recognized as a real measure of quality.