The Yelp of Science?

TraceIndex Is a Reproducibility Rating Platform That Identifies Which Research Authors Are Willing to Share Data

There is a huge problem with identifying reproducible published research. It leads to time wasted on dead-end projects, squandered funding, and unwarranted promotions, and in turn it slows the discovery of new cures, therapies, and technological advances. Researchers are concerned, but they have no effective way to voice that concern in a form that helps identify reproducible science and move research forward.

TraceIndex addresses this problem by helping to identify which scientists are open to sharing and which are not. The platform makes it easy for researchers to leave a five-star rating and review of their experience when attempting to obtain additional information about a published research article. This not only lets those planning new research steer toward more promising work; it also gives researchers who want to be heard a voice. Any rating can be exported to a social media platform with the click of a button.



As we all know, AI is a genuine asset, but AI cannot force scientists to disclose datasets or answer intricate questions about their research. Every new researcher, R&D department, and pharmaceutical company seeking this information for its own purposes needs to know which existing research can be built upon. In short, new research scientists need answers that bring them closer to their specific goals.

TraceIndex addresses this need, which is also the first stage of reproducibility: identifying which scientists are willing to 'play ball.' It flags which scientists (including authors, articles, and affiliated institutions) are pro-development, pro-full-access, and pro-assistance, and which are not. This requires a human approach. The only way to know whether a scientist will assist with questions about his or her published work is to ask, and most published articles include an email address for exactly that purpose. Research students and other readers often do attempt to contact authors. TraceIndex will act as an output repository, archiving the results of those contact attempts, and will encourage users to leave ratings and reviews of their experiences trying to obtain additional information on published articles. By gathering this data, users and visitors to TraceIndex won't need to sift through millions of dead-end articles; at the same time, they will be able to see which assistance-friendly research can help take their work to the next level. We estimate that this will cut the research times of students, R&D departments, and pharmaceutical companies by anywhere from 20 to 30%.

The creator of TraceIndex is a former employee of Springer Nature, a well-known science journal publisher. He came up with the idea after losing his father to cancer, followed by his own cancer scare years later. He felt he could have a greater impact on research by stepping away from the traditional style of consuming science research articles. TraceIndex acts as a voice for researchers who are attempting to reproduce and/or build on existing published scientific work.

TraceIndex asks visitors initial reproducibility questions and allows commenting associated with any publicly published article. The output is a one-to-five-star rating contributed by the user, reflecting his or her experience while trying to obtain datasets or get article-related questions answered. The five-star rating appears on the TraceIndex platform and can be crawled by Google spiders so that it shows up in Google search results. Search results that display five-star ratings are eye-catching and difficult to dismiss as opinion. This is TraceIndex's main function.
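Star ratings typically reach Google search results through schema.org structured data embedded in the page. The sketch below shows the kind of JSON-LD markup a platform like TraceIndex could emit for a rated article; the function name and all field values are illustrative assumptions, not TraceIndex's actual implementation.

```python
import json

def rating_jsonld(article_title: str, doi: str, avg_rating: float, review_count: int) -> str:
    """Build schema.org JSON-LD so a page's star rating can surface
    as a rich result in search engines. All values are illustrative."""
    data = {
        "@context": "https://schema.org",
        "@type": "ScholarlyArticle",
        "name": article_title,
        "identifier": doi,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": avg_rating,   # mean of user-submitted 1-5 star ratings
            "reviewCount": review_count,
            "bestRating": 5,
            "worstRating": 1,
        },
    }
    return json.dumps(data, indent=2)

# Hypothetical article with 17 reviews averaging 4.2 stars.
print(rating_jsonld("An Example Study", "10.1000/xyz123", 4.2, 17))
```

The resulting string would be placed in a `<script type="application/ld+json">` tag on the article's rating page, where search-engine crawlers can pick it up.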

Users can come to the site and type in any article's DOI. Using algorithms and the APIs of various publishers, including Springer Nature, TraceIndex prefills the title, correspondence information, and lead affiliation (i.e., the university). If the information doesn't prefill, the platform lets the user add it before rating their experience. The platform is intuitive and makes rating your experience of corresponding with authors easy; signing up and leaving a rating takes only a few seconds. Ratings and comments are reviewed by staff before going live to keep spam off the platform. Already a hit with open-source advocates and scientists, the platform is sure to add value to research.
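The DOI prefill step described above can be sketched against a Crossref-style metadata record (the public Crossref REST API returns one for `https://api.crossref.org/works/{DOI}`). This is a minimal illustration, not TraceIndex's actual code: the function name is hypothetical, the sample record is fabricated, and missing fields fall back to `None` so the user can fill them in by hand, as the article describes.

```python
def prefill_from_metadata(record: dict) -> dict:
    """Extract prefill fields from a Crossref-style works record.
    Fields that are absent come back as None for the user to supply."""
    msg = record.get("message", {})
    titles = msg.get("title") or []
    authors = msg.get("author") or []

    # Lead affiliation: first named affiliation found among the authors.
    affiliation = None
    for author in authors:
        names = [a.get("name") for a in author.get("affiliation", []) if a.get("name")]
        if names:
            affiliation = names[0]
            break

    first_author = authors[0] if authors else {}
    full_name = " ".join(
        part for part in (first_author.get("given"), first_author.get("family")) if part
    )
    return {
        "title": titles[0] if titles else None,
        "corresponding_author": full_name or None,
        "lead_affiliation": affiliation,
    }

# Fabricated sample mimicking the shape of a Crossref works response.
sample = {
    "message": {
        "title": ["An Example Study"],
        "author": [{"given": "Ada", "family": "Lovelace",
                    "affiliation": [{"name": "Example University"}]}],
    }
}
print(prefill_from_metadata(sample))
```

Note that metadata registries often omit contact emails, which is exactly why the platform falls back to user-supplied information when a field can't be prefilled.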



Unless otherwise indicated, content hosted on OpenUP Hub is licensed under an Attribution 4.0 International (CC BY 4.0).