Ranking of Journals, Institutions, and Countries
Discipline and Domain Analysis
Databases and Datasets for Evaluation
Evaluation Tools and Indicators
Visibility and Impact
Scientific Collaboration and Cooperation Analysis
National Evaluation Systems
Open Access and Open Publishing
New Indices for Evaluation
Data Science and Digital Repositories
e-Science in the Cloud
Track 1: Open Science
Science relies on the sharing of ideas, research results, data, and methods. Quite often, scientific
discoveries and innovations fail to reach society because publications remain closed. Since science
builds on existing knowledge, that knowledge must reach everyone, and Open Science drives this
exchange. This track addresses the associated challenges and issues and paves the way to develop
and nurture open science.
Track 2: Peer Review
Peer review is the process by which experts critically examine research contributions using
standard metrics. Peer review helps maintain and enhance quality by detecting weaknesses and
errors in specific works. Reviews inform decisions on publication, research grants, employment,
promotion, and tenure. Peer review promotes accountability and improves the quality of work.
Track 3: Content and Text mining based metrics
This track invites works that use numeric indices to process unstructured (textual) information
and that build data mining (statistical and machine learning) algorithms. Information can be
extracted to derive summaries for the words contained in the documents, or to compute summaries
for the documents based on the words contained in them. One can analyze words and clusters of
words used in documents, or analyze documents to determine similarities between them or how they
relate to other variables. Such text mining studies "turn text into numbers" (meaningful indices),
which can then be incorporated into other analyses.
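As a minimal illustration of "turning text into numbers," the sketch below represents each document as a bag-of-words count vector and compares documents by cosine similarity. This is one common baseline, not a method prescribed by the track; the example documents and function names are hypothetical.

```python
import math
from collections import Counter

def vectorize(text):
    """Turn a text into a bag-of-words count vector."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two word-count vectors (0.0 to 1.0)."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

doc1 = vectorize("open science relies on sharing research data")
doc2 = vectorize("sharing research data enables open science")
doc3 = vectorize("peer review examines research contributions")

# Documents with more overlapping vocabulary score higher.
print(cosine_similarity(doc1, doc2))
print(cosine_similarity(doc1, doc3))
```

Richer pipelines would add tokenization, stop-word removal, and TF-IDF weighting, but the principle is the same: documents become vectors, and vector arithmetic yields the indices.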
Track 4: Technology Transfer Metrics
Technology transfer extends the reach of research outcomes to a wider audience. Technology transfer measures are of interest to practitioners, program managers, and policymakers in the evaluation of technology transfer programs. Building metrics can lead to better measurement of effectiveness, efficiency, and return on investment, and the introduction of newer metrics will help identify benchmarking tools. This track will discuss currently existing metrics as well as research on the development of new metrics and their applications.
Chair: Mohammad Hassanzadeh, Tarbiat Modares University, Iran
Track 5: Open Science Metrics
The primary goal of this track is to construct, identify, and specify relevant metrics and indicators for open science. Measuring the extent of openness in science, and creating or using metrics for it, is important. Measuring open science helps realize innovation: we can see how collaboration and knowledge sharing occur and how institutions support them. Openness also yields enormous amounts of data and knowledge. This track gauges openness in science using available data.
Track 6: Unique Identifiers
Traditional discovery methods are insufficient in an environment overwhelmed by the high volume of researchers, institutions, research publications, and data. Unique identifiers play a major role in distinguishing people, works, and institutions and in optimizing their impact and discoverability. Especially in the e-world, they help reduce errors, omissions, and duplication; ensure the comprehensiveness of information; and make sure the digital footprints of interactions and networking across different platforms, databases, and social networks are captured and preserved. Several unique identification systems have been introduced recently, such as DOI, ORCID, ROR, ISNI, and DataCite. Through this track, we propose to discuss analyses of data or case studies of different unique identifiers and their impact on research metrics and evaluation.
Co-Chairs: J.K. Vijayakumar, King Abdullah University of Science and Technology, Saudi Arabia
All submitted papers will undergo double-blind, participative peer review by at least three
reviewers. The review process will determine which papers are accepted for presentation at the
international conference. Authors of accepted papers who register for the conference will have
access to the evaluations and feedback provided by the reviewers who recommended acceptance,
so they can improve the final versions of their papers accordingly.
Mentoring support is available to young researchers and authors from developing countries.
Post-conference, modified versions of the papers will be published in the following journals:
1. Journal of Digital Information Management
2. Journal of Contemporary Eastern Asia
3. Special Section in Research Evaluation
4. Journal of Scientometric Research
5. Malaysian Journal of Library & Information Science
6. International Journal of Computational Linguistics Research
| Full Paper Submission               | October 20, 2019     |
| Notification of Acceptance/Rejection | November 05, 2019   |
| Registration Due                    | November 25, 2019    |
| Camera Ready Due                    | November 25, 2019    |
| Workshops/Tutorials/Demos           | December 03, 2019    |
| Main Conference                     | December 02-04, 2019 |