Analysing Scholarly Communication Metadata of Computer Science Events

From Openresearch
Revision as of 15:19, 28 June 2018 by Said (talk | contribs) (Created page with "{{Paper |Title=Analysing Scholarly Communication Metadata of Computer Science Events |Authors=Said Fathalla, Sahar Vahdati, Christoph Lange, Sören Auer, |Series=TPDL |Year=20...")
Bibliographical Metadata
Keywords: Scientific Events, Scholarly Communication, Semantic Publishing, Metadata Analysis
Year: 2017
Authors: Said Fathalla, Sahar Vahdati, Christoph Lange, Sören Auer
Venue: TPDL
Content Metadata
Problem: No data available now.
Approach: No data available now.
Implementation: No data available now.
Evaluation: No data available now.

Abstract

Over the past 30 years, we have observed the impact of the ubiquitous availability of the Internet, email, and web-based services on scholarly communication. The preparation of manuscripts as well as the organization of conferences, from submission to peer review to publication, have become considerably easier and more efficient. A key question now is: what were the measurable effects on scholarly communication in computer science? Of particular interest are the following questions: Did the number of submissions to conferences increase? How did the selection processes change? Is there a proliferation of publications? We shed light on some of these questions by analyzing comprehensive scholarly communication metadata from a large number of computer science conferences of the last 30 years. Our transferable analysis methodology is based on descriptive statistics analysis as well as exploratory data analysis and uses crowd-sourced, semantically represented scholarly communication metadata from OpenResearch.org.
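The abstract describes a methodology based on descriptive statistics over event metadata such as submission and acceptance counts. As a minimal illustrative sketch of that kind of analysis, with entirely made-up figures rather than actual OpenResearch.org data, per-year acceptance rates for one hypothetical conference series could be derived like this:

```python
# Hypothetical sketch of the descriptive-statistics step described in the
# abstract: deriving per-year acceptance rates from submission/acceptance
# counts. All numbers below are illustrative, not from OpenResearch.org.

# (year, submissions, accepted) records for one hypothetical conference series
records = [
    (1990, 120, 48),
    (2000, 240, 72),
    (2010, 360, 90),
]

def acceptance_rate(submitted: int, accepted: int) -> float:
    """Fraction of submitted papers that were accepted."""
    return accepted / submitted

# Map each year to its acceptance rate
rates = {year: acceptance_rate(s, a) for year, s, a in records}

for year, rate in sorted(rates.items()):
    print(f"{year}: {rate:.0%}")
```

On this toy data the rate falls from 40% to 25% while submissions triple, mirroring the trend the paper reports for top conferences.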

Conclusion

In summary, we made the following observations: With the number of submissions to the top conferences having tripled on average over the last three decades, acceptance rates are going down slightly. Most of those conferences that are A- or A*-rated today have a long continuity. Geographical distribution is not generally relevant; some good conferences take place in the same location, while others cycle between continents. Good conferences always take place around the same time of year, which might mean that the community has got used to them being important events. Some topics have attracted increasing interest recently, e.g., database topics thanks to the 'big data' trend. This might be confirmed by further investigations into more recent, emerging events in such fields.

Future work

In further research, we aim to expand the analysis to other fields of science and to smaller events. It would also be interesting to assess the impact of digitisation on further means of scholarly communication, such as journals (which are more important in fields other than computer science), workshops, funding calls and proposal applications, as well as awards. Although large parts of our analysis methodology are already automated, we plan to further optimise the process so that analyses can be generated almost instantly from the OpenResearch data basis.

Approach

Positive Aspects: No data available now.

Negative Aspects: No data available now.

Limitations: No data available now.

Challenges: No data available now.

Proposes Algorithm: No data available now.

Methodology: No data available now.

Requirements: No data available now.

Implementations

Download-page: No data available now.

Access API: No data available now.

Information Representation: No data available now.

Data Catalogue: No data available now.

Runs on OS: No data available now.

Vendor: No data available now.

Uses Framework: No data available now.

Has Documentation URL: No data available now.

Programming Language: No data available now.

Version: No data available now.

Platform: No data available now.

Toolbox: No data available now.

GUI: No

Research Problem

Subproblem of: No data available now.

RelatedProblem: No data available now.

Motivation: No data available now.

Evaluation

Experiment Setup: No data available now.

Evaluation Method: No data available now.

Hypothesis: No data available now.

Description: No data available now.

Dimensions: No data available now.

Benchmark used: No data available now.

Results: No data available now.