News

Frequently Asked Questions


Q: Can you give detailed information about the certificates?

A: The top 3 teams will receive certificates from NLPCC and CCF-NLP. For more information, please visit the official NLPCC website: NLPCC Website


Q: Can you give detailed information about paper submission?

A: The top 3 teams will be invited to submit papers to the NLPCC conference. The submitted papers will undergo a review process, and if they meet NLPCC's publication acceptance standards, they will be published in the Shared Task Track of NLPCC 2023.


Q: Can you explain the rule "prohibit the use of large models with more than 1 billion parameters for training and prediction"?

A: This means that the parameter count of any single pre-trained language model used in your system cannot exceed 1 billion. There is no limit on the parameter count of your own model.


Q: How many labels are there for the polarity of the quadruple?

A: Our annotated dataset includes five labels for polarity: positive, negative, neutral, doubt, and ambiguous. Since instances with the last three labels are rare, we merge them into a single new category, "other", for training and evaluation. Overall, we use three labels in training and evaluation: positive, negative, and other. You can find more details about this process in our released preprocessing and evaluation code.
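The merging step described above can be sketched as below. The label names come from this answer; the function itself is a hypothetical illustration, not the organizers' released preprocessing code.

```python
# Hedged sketch of the polarity-label merging described above:
# neutral, doubt, and ambiguous are collapsed into "other".

MERGE_INTO_OTHER = {"neutral", "doubt", "ambiguous"}

def merge_polarity(label: str) -> str:
    """Map the five annotated polarity labels onto the three
    used for training and evaluation: positive, negative, other."""
    return "other" if label in MERGE_INTO_OTHER else label

labels = ["positive", "doubt", "negative", "neutral", "ambiguous"]
print([merge_polarity(l) for l in labels])
# → ['positive', 'other', 'negative', 'other', 'other']
```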



ORGANIZERS

Bobo Li, Fei Li, Donghong Ji

Language and Cognition Computing Laboratory, Wuhan University

Hao Fei

NeXT++ Research Center, National University of Singapore

Lizi Liao

School of Computing and Information Systems, Singapore Management University

Contact us: