
First International Conference on Science & Technology Metrics
Bangkok, Thailand
December 02-04, 2019
Building metrics is key to measuring science and technology and to translating those measures into better practice. A metric is not confined to drawing numbers and analysing them: selecting a metric depends on synthesising multiple perspectives and evidence-based data. Producing benchmarks and standards for science and technology evaluation, and implementing them with real data, mark a full-fledged system. Recently, new metrics-based models have emerged that aim to reflect the growth of science and technology.
The Science and Technology Metrics (STMet 2019) conference is concerned with how different aspects of science and technology can be assessed with different metrics and approaches. Besides the standard conference track and the Doctoral Symposium, STMet will host a number of workshops and tutorials related to the theme of the conference. The purpose of the workshops is to provide participants with a friendly, interactive atmosphere for presenting novel ideas and discussing their applications of metrics in the science and technology system. The goal of the tutorials is to enable participants to familiarise themselves with both theoretical and practical aspects of metrics and their application.
The conference consists of invited talks, presentations, tutorials, workshops and discussions, with tracks on specialized topics. STMet addresses the themes below, among others:
      • Citation-based Metrics
      • Ranking Systems
      • Discipline and Domain Analysis
      • Databases and Datasets for Evaluation
      • Evaluation Tools and Indicators
      • Web-based Metrics
      • Visibility and Impact
      • Internationalization
      • Text-based Metrics
      • Innovation Indicators
      • Scientific Collaboration and Cooperation Analysis
      • National Evaluation Systems
      • New Indices for Evaluation
      • Altmetrics
      • Scientific Visualization
      • Scientific Publications

Track 1: Open Science
Science relies on the sharing of ideas, research results, data and methods. Quite often, scientific discoveries and innovations fail to reach society because publications are closed. Since science is built on existing knowledge, that knowledge needs to reach everyone, and Open Science leads this knowledge transaction. This track addresses the related challenges and issues and paves the way to develop and nurture open science.

Track 2: Peer Review
Peer review is the process by which experts critically examine research contributions using standard metrics. It helps maintain and enhance quality directly, by detecting weaknesses and errors in specific works, and its outcomes inform decisions on publication, research grants, employment, promotion, and tenure. Peer review promotes accountability and improves the quality of work.

Track 3: Content and Text Mining Based Metrics
This track presents work that uses numeric indices to process unstructured (textual) information and that builds on data mining (statistical and machine learning) algorithms. Information can be extracted to derive summaries for the words contained in documents, or to compute summaries for the documents based on the words they contain. One can analyse individual words or clusters of words used in documents, and one can analyse documents to determine their similarity to each other or their relation to other variables. Such text-mining studies "turn text into numbers" (meaningful indices), which can then be incorporated in other analyses.
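As an illustration of the "turn text into numbers" idea, the following is a minimal sketch of one classic approach, TF-IDF weighting with cosine similarity, written in plain Python. The function names and the example corpus are invented for this illustration, not taken from any conference system.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Turn a list of raw text documents into sparse TF-IDF vectors (dicts)."""
    tokenised = [doc.lower().split() for doc in docs]
    # Document frequency: in how many documents does each term appear?
    df = Counter()
    for tokens in tokenised:
        df.update(set(tokens))
    n = len(docs)
    vectors = []
    for tokens in tokenised:
        tf = Counter(tokens)
        # Term frequency scaled by inverse document frequency;
        # terms occurring in every document receive weight 0.
        vec = {term: (count / len(tokens)) * math.log(n / df[term])
               for term, count in tf.items()}
        vectors.append(vec)
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors represented as dicts."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

docs = ["open science metrics", "science metrics evaluation", "peer review process"]
vecs = tfidf_vectors(docs)
# Documents 0 and 1 share the terms "science" and "metrics", so their
# similarity is positive; documents 0 and 2 share no terms and score 0.0.
```

The resulting indices can then feed into clustering, ranking, or any of the other analyses this track covers.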

All submitted papers will undergo double-blind, participative peer review by at least three reviewers. The review process will select the papers to be accepted for presentation at the international conference. Authors of accepted papers who register for the conference will have access to the evaluations and any feedback provided by the reviewers who recommended acceptance, so that they can improve the final versions of their papers accordingly.

Mentoring support is available to young researchers and to authors from developing countries.

Track 4: Technology Transfer Metrics

Technology transfer brings research outcomes to a wider audience. Technology transfer measures are of interest to practitioners, program managers, and policymakers evaluating technology transfer programs. Building metrics can lead to better measurement of effectiveness, efficiency, and return on investment, and newer metrics can help identify benchmarking tools. This track will discuss the currently existing metrics, research on the development of new metrics, and their applications.

Chair: Mohammad Hassanzadeh, Tarbiat Modares University, Iran

Track 5: Open Science Metrics

The primary goal of this track is to construct, identify, and specify relevant metrics and indicators for open science. Measuring the extent of openness in science, and creating or using metrics for it, is important: measuring open science helps to realize innovation, shows how collaboration and knowledge sharing occur, and reveals how institutions support them. Openness also yields enormous amounts of data and knowledge. This track gauges openness in science using available data.

Revised post-conference versions of the papers will be published in the following journals.

1. Journal of Digital Information Management
2. International Journal of Computational Linguistics
3. Journal of Contemporary Eastern Asia

Important Dates

Full Paper Submission September 15, 2019
Notification of Acceptance/Rejection October 15, 2019
Registration Due November 20, 2019
Camera Ready Due November 20, 2019
Workshops/Tutorials/Demos December 03, 2019
Main conference December 02-04, 2019