[CFP] ESWC 2019 Call for Challenges


Agnieszka.Lawrynowicz
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
The Sixteenth Extended Semantic Web Conference (ESWC 2019)
2nd June - 6th June 2019, Portorož, Slovenia
 
Twitter: @eswc_conf 

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

* * * * * * * 
Overview
* * * * * * * 

The ESWC 2019 conference is looking for organizers to define challenges to which participants can submit solutions.

The purpose of challenges is to showcase the maturity of state-of-the-art methods and tools on tasks common to the Semantic Web community and adjacent disciplines, in a controlled setting involving rigorous evaluation.

Semantic Web Challenges are an official track of the conference, ensuring significant visibility for the challenges as well as participants. Challenge participants are asked to present their submissions as well as provide a paper describing their work. These papers undergo a peer review by experts and will be published in the challenge proceedings.

In addition to the publication of proceedings, challenges at ESWC 2019 will benefit from high visibility and direct access to the ESWC audience and community.

* * * * * * * * * * * * * * *
Challenge Proposals
* * * * * * * * * * * * * * *

Challenge organizers are encouraged to submit proposals adhering to the following criteria:
- Challenges consist of one or multiple tasks. If multiple tasks are provided, they should preferably be independent so that participants may choose which to participate in.
- At least one task involves semantics in data. The task(s) should relate to the Semantic Web, but organizers are highly encouraged to consider tasks that involve closely related communities, such as NLP, Recommender Systems, Machine Learning, or Information Retrieval.
- Task descriptions appeal to a wider audience. We encourage challenge organizers to propose at least one basic task that can be addressed by a larger audience from their community.
- Clear and rigorous definition of the tasks and evaluation metrics. For each task, define a deterministic and objective way to verify to what extent it has been achieved, if possible using at least two objective criteria/metrics (such as precision or recall). The best way is usually to provide detailed examples of input data and expected output. The examples should leave no ambiguity about whether the task is done.
- Valid dataset (if applicable). When proposing the challenge, provide details on the datasets used and on the way they are or will be created. If accepted, you need to provide the datasets together with their provenance. You must make sure that you have the right to use/publish the datasets and state their license.
- If training and evaluation datasets are necessary to complete the task, results must only be available for the training dataset. Challenge organizers need to make certain that the correct results for the evaluation dataset have not previously been published or made available to the participants.

Criteria for evaluating and selecting submitted challenge proposals are:

- Potential number of interested participants
- Rigor and transparency of the evaluation procedure
- Relevance to the Semantic Web community
- Endorsements (e.g., from researchers working on the task, from industry players interested in results, from future participants)

* * * * * * * * * * * *
Important Dates
* * * * * * * * * * * * 

- Challenges proposals due: December 15th, 2018
- Notification of acceptance: January 7th, 2019
- Training data available and challenge calls for papers sent: February 7th, 2019
- Challenge paper submission deadline (8-page document): March 31st, 2019
- Notifications sent to participants and invitations to submit task results: April 15th, 2019
- Camera-ready papers for the conference (8-page document): April 30th, 2019

* * * * * * * * * * * * * *
Submission Details
* * * * * * * * * * * * * *

The challenge proposals should contain at least the following elements:

- A summarizing description of the challenge and its tasks
- How the training/testing data will be built and/or procured
- The evaluation methodology to be used, including clear evaluation criteria and the exact way in which they will be measured. Who will perform the evaluation and how will transparency be assured?
- The anticipated availability of the necessary resources to the participants
- The list of challenge committee members who will evaluate the challenge papers (please indicate which of the listed members have already accepted the role)

In case of doubt, feel free to send us your challenge proposal drafts as early as possible – the challenge chairs will provide feedback and answers to any questions you may have.

Please submit proposals via EasyChair <https://easychair.org/conferences/?conf=eswc2019challenges>.

_______________________________________________
protege-user mailing list
[hidden email]
https://mailman.stanford.edu/mailman/listinfo/protege-user