The most plausible explanation is that ResearchGate adjusted the algorithm – but without any hints as to why or how that happened, researchers are left guessing. Metrics are a powerful tool to evaluate science and move it forward, and with metrics becoming ubiquitous in research assessment, as evidenced in the recent HEFCE report “The Metric Tide”, we are poised to see the formulation of many more. ResearchGate itself states that it is “committed to giving you insights into how people read, recommend, and cite your work”. As a result, the JIF is rather problematic when evaluating journals; when it comes to single contributions, it is even more questionable.
ResearchGate does present its users with a breakdown of the individual parts of the score – publications, questions, answers, and followers (also shown as a pie chart) – and of the extent to which each part contributes to the total. Anonymous downvoting has been criticised in the past, as it often happens without explanation. The lawsuit makes a symbolic claim for damages, yet it will probably change the site’s conduct, a representative says. Announcing its new metric, ResearchGate wrote: “While no single metric can give you the full picture of the impact your research is having, we believe our new Research Interest score can help you complete the picture.”
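Because ResearchGate does not disclose how these parts are weighted, any formula here is necessarily hypothetical. The sketch below (with made-up weights; the component names simply mirror the pie-chart categories above) merely illustrates what a transparent composite score with a verifiable breakdown could look like:

```python
# Purely hypothetical: ResearchGate does not disclose its formula or weights.
# This sketch only shows what a transparent composite score with a
# pie-chart-style breakdown could look like.

# Made-up weights for illustration; the real RG Score weights are unknown.
WEIGHTS = {"publications": 0.5, "questions": 0.15, "answers": 0.25, "followers": 0.1}

def composite_score(counts):
    """Weighted sum of activity counts (weights are assumptions)."""
    return sum(WEIGHTS[part] * counts.get(part, 0) for part in WEIGHTS)

def breakdown(counts):
    """Fractional contribution of each part to the total, as in a pie chart."""
    total = composite_score(counts)
    if total == 0:
        return {part: 0.0 for part in WEIGHTS}
    return {part: WEIGHTS[part] * counts.get(part, 0) / total for part in WEIGHTS}

profile = {"publications": 20, "questions": 4, "answers": 10, "followers": 50}
print(composite_score(profile))
print(breakdown(profile))
```

If the weights were published like this, users could verify the pie chart themselves and detect when the algorithm changes over time – precisely what the current opacity prevents.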
Our analysis shows that the RG Score incorporates the Journal Impact Factor to evaluate individual researchers.
One of the focal points of ResearchGate’s e-mails is a researcher’s latest ResearchGate Score (RG Score). There are a few bits of profile data which could conceivably add to the score; at the time of our investigation, these included “impact points” (calculated using the impact factors of the journals an individual has published in), “downloads”, “views”, “questions”, “answers”, “followers”, and “following”. Impact points proved to be very relevant: for this exploratory sample at least, they accounted for a large proportion of the variation in the data (68%). Looking at the pie charts of RG Score breakdowns, academics who have an RG Score on their profile can therefore be thought of as comprising several subgroups, including those whose score depends solely on their publications. Considering other research outputs (e.g. data, slides) is definitely a step in the right direction, and the idea of taking interactions into account when thinking about academic reputation has some merit. That in itself is not necessarily a bad thing. The problem with the way that ResearchGate handles this process is that it is not transparent and that there is no way to reconstruct it. This makes it impossible to compare the RG Score over time, further limiting its usefulness. Furthermore, this intransparency makes it very hard for outsiders to detect gaming of the system.

Note: This article gives the views of the author, and not the position of the LSE Impact blog, nor of the London School of Economics.
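To illustrate what “accounted for 68% of the variation” means: the squared Pearson correlation (R²) measures the proportion of variance in one variable explained by a linear fit on another. The numbers below are invented for illustration, not the study’s actual sample:

```python
# Toy illustration of "variance explained" (R^2). The data points are
# invented; the article's exploratory sample reported ~68%.

def r_squared(xs, ys):
    """Squared Pearson correlation: the share of variance in ys
    explained by a linear fit on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return (cov * cov) / (var_x * var_y)

# Hypothetical profiles: impact points vs. overall RG Score.
impact_points = [1.2, 3.4, 5.0, 7.8, 10.1, 12.5]
rg_scores = [4.0, 8.5, 9.0, 15.2, 14.8, 21.0]
print(r_squared(impact_points, rg_scores))
```

An R² of 0.68 would mean that roughly two-thirds of the differences in RG Scores across the sample can be predicted from impact points alone – which is why the JIF’s influence on the score matters so much.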
With such high aims, it seemed appropriate to take a closer look at the RG Score and to evaluate its capability as a measure of scientific reputation. But even in disciplines that communicate in journals, there is high variation in the average number of citations, which is not accounted for in the JIF. When researchers read, recommend, or cite a research item, its Research Interest goes up. Evidence suggests that academics who use ResearchGate tend to view it as an online business card or curriculum vitae, rather than a site for active interaction with others. In the meantime, coalition members Elsevier and the American Chemical Society have filed a lawsuit in an attempt to prevent copyrighted material from appearing on ResearchGate in future. Because anonymous downvoting often happens without explanation, online networks such as Reddit have started to moderate downvotes.