The data and metric functionality provided by abstract-and-citation database firms enables us to track the influence of SISSA authors and to demonstrate the relevance of their scientific activities to public policy and to research funders.
Citation counts are pulled from different database providers, and impact is measured in numerous ways, so the metrics vary depending on which tool is used: the datasets overlap from tool to tool, but each one also has its own unique content.
There are five primary tools that offer cross-disciplinary citation tracking to facilitate research impact analysis: Web of Science (Clarivate Analytics), Scopus (Elsevier), Google Scholar (Google), PlumX Metrics, and the relative newcomer Dimensions (Digital Science). The content within the five tools varies quite considerably, and each has its own slightly different suite of metrics. Web of Science and Scopus are both proprietary, while Google Scholar, like other Google tools, is publicly accessible for free.
The citation count for an article in Web of Science will differ from the count for the same article in Scopus, Google Scholar or PlumX Metrics. This can raise problems for researchers, funders and publishers when demonstrating the relevance of their activities. To better understand the differences in citation counts from tool to tool, we need to look at the content-selection approaches of the big players in the bibliometric field. Web of Science has the largest bibliographic database, having originated from the Institute for Scientific Information (ISI), created by E. Garfield in the 1960s. Today Web of Science describes itself as "the world's most trusted publisher-independent global citation database", and its core collection contains 1.5 billion cited references dating back to 1900 within 74.8 million records (according to its website). WoS states that it applies 28 quality criteria to ensure new content meets standards of rigour, plus four impact criteria. Web of Science and Scopus both evaluate content largely at the journal level. Scopus was created by Elsevier in 2004; it focuses on journal and proceedings content selected for its international scope, and it requires English-language abstracts and full text in the Roman alphabet. "Scopus indexes content from 24,600 active titles and 5,000 publishers which is rigorously vetted and selected by an independent review board, and uses a rich underlying metadata architecture to connect people, published ideas and institutions" (quoted from the Scopus web page).
Dimensions "harvests" data from reliable but openly available indexes and databases; its content comes from sources such as Crossref and PubMed, using linked data that covers not just peer-reviewed journal articles but also other outputs related to the research. Content includes datasets, supporting grants, patents, policy documents and clinical trials. Google Scholar and Dimensions do not use human evaluators.
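Openly available sources of this kind can also be queried directly. As a minimal sketch (not an official SISSA or Dimensions workflow), the public Crossref REST API reports a work's citation count in its `is-referenced-by-count` field; the helper below builds the request URL for a DOI and extracts that field from the JSON response. The sample response shown is abbreviated and its values are hypothetical.

```python
import json
from urllib.request import urlopen

CROSSREF_API = "https://api.crossref.org/works/"

def crossref_citation_count(doi: str) -> int:
    """Fetch a work's citation count from the public Crossref REST API."""
    with urlopen(CROSSREF_API + doi) as resp:
        return extract_citation_count(json.load(resp))

def extract_citation_count(record: dict) -> int:
    """Pull 'is-referenced-by-count' out of a Crossref /works response."""
    return record["message"]["is-referenced-by-count"]

# Abbreviated example of the JSON shape Crossref returns (values invented):
sample = {"message": {"DOI": "10.1000/example", "is-referenced-by-count": 42}}
print(extract_citation_count(sample))  # prints 42
```

Because Crossref is open, counts obtained this way can be compared against the figures shown by the proprietary tools, with the caveat (as noted above) that each database indexes a different body of content.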
SISSA researchers and administrators are under continued pressure to demonstrate the relevance of their activities, and many SISSA scientists and funders assess that relevance through research impact metrics. Where can you find the impact metrics of papers published by SISSA authors? In Scopus, in WoS, or in SISSA Digital Library-IRIS. Each peer-reviewed scientific article by SISSA authors stored in SISSA Digital Library-IRIS carries a citation count from Scopus and Web of Science, together with a graph displaying this metric information. Scopus, WoS and IRIS-SISSA Digital Library also include a special field that counts citations in PubMed for biomedical and neuroscience articles only. Only the SISSA PhD theses and the books by SISSA authors archived in the SISSA Digital Library lack these citation counts.
Another bibliographic citation tool is PlumX Metrics.
PlumX Metrics provides insights into the ways people interact with individual pieces of research output (articles, conference proceedings, book chapters, and many more) in the online environment. Where can you find it? This citation tool, too, is present in SISSA Digital Library-IRIS, although not on all journal articles stored in that database.
PlumX Metrics also appears on the results page of a search in the SISSA Discovery Service, at the bottom of each entry in the displayed list.
This tool collects metrics about the impact of the resource you are searching for. It provides a more complete picture and answers questions about scientific impact for a large share of journal articles and books (including e-books).
PlumX, named Library Journal's most ambitious database in 2013, gathers metrics across five categories: usage, mentions, captures, social media and citations.
Metrics are gathered around what Plum Analytics calls artifacts, providing more information than journal-level measures alone. PlumX Metrics gives explicit information about where the citations come from.
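The artifact-centred model described above can be sketched as a simple data structure. This is an illustrative model only, not Plum Analytics' actual schema: each artifact carries a tally for each of the five categories named in the text.

```python
from dataclasses import dataclass, field

# The five PlumX metric categories named in the text.
PLUMX_CATEGORIES = ("usage", "mentions", "captures", "social media", "citations")

@dataclass
class Artifact:
    """A research output ('artifact' in Plum Analytics' terminology)
    with a per-category metric tally. Hypothetical model for illustration."""
    title: str
    metrics: dict = field(
        default_factory=lambda: {c: 0 for c in PLUMX_CATEGORIES}
    )

    def total(self) -> int:
        """Sum of all metric categories for this artifact."""
        return sum(self.metrics.values())

paper = Artifact("Example article")
paper.metrics["citations"] = 12
paper.metrics["captures"] = 30
print(paper.total())  # prints 42
```

Keeping the categories separate is what lets PlumX report not just a single number but also where each interaction came from.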
Metrics for Preprints
Metrics for preprints, and rankings of the citing data for arXiv articles, are made available via Semantic Scholar, INSPIRE-HEP, and NASA ADS. This helps arXiv users discover influential preprints, makes it easy for journal editors to identify preprints worth citing in review articles, and enables bibliometric work to track macro-trends in physics research, such as the identification of newly emerging sub-domains in physics.
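Of these services, Semantic Scholar exposes a free Graph API that returns a `citationCount` for a paper addressed by its arXiv identifier. The sketch below shows, under that assumption about the endpoint and field names, how one might fetch counts and then rank a set of preprints by citations; the preprint titles and counts in the example are invented.

```python
import json
from urllib.request import urlopen

# Endpoint shape per Semantic Scholar's public Graph API.
S2_API = "https://api.semanticscholar.org/graph/v1/paper/"

def fetch_citation_count(arxiv_id: str) -> int:
    """Fetch a preprint's citation count from the Semantic Scholar Graph API."""
    url = f"{S2_API}arXiv:{arxiv_id}?fields=citationCount"
    with urlopen(url) as resp:
        return json.load(resp)["citationCount"]

def rank_preprints(records: list) -> list:
    """Rank preprint records (each carrying a 'citationCount') most-cited first."""
    return sorted(records, key=lambda r: r["citationCount"], reverse=True)

# Hypothetical records, shaped like Graph API responses:
preprints = [
    {"title": "A", "citationCount": 12},
    {"title": "B", "citationCount": 340},
    {"title": "C", "citationCount": 57},
]
print([p["title"] for p in rank_preprints(preprints)])  # prints ['B', 'C', 'A']
```

A ranking like this is exactly the kind of operation that would let an arXiv user surface the most influential preprints in an emerging sub-domain.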