Most Cited Articles

Open access

ISSN: 2666-6510

Graph neural networks: A review of methods and applications

A survey of transformers

Pre-trained models: Past, present and future

PTR: Prompt Tuning with Rules for Text Classification

Data augmentation approaches in natural language processing: A survey

Advances and challenges in conversational recommender systems: A survey

Lawformer: A pre-trained language model for Chinese legal long documents

A comprehensive survey of entity alignment for knowledge graphs

Neural machine translation: A review of methods, resources, and tools

Neural, symbolic and neural-symbolic reasoning on knowledge graphs

Deep learning for fake news detection: A comprehensive survey

CPM: A large-scale generative Chinese Pre-trained language model

GPT understands, too

WuDaoCorpora: A super large-scale Chinese corpora for pre-training language models

Sarcasm detection using news headlines dataset

CPM-2: Large-scale cost-effective pre-trained language models

A survey on heterogeneous information network based recommender systems: Concepts, methods, applications and resources

CokeBERT: Contextual knowledge selection and embedding towards enhanced pre-trained language models

A comprehensive review on resolving ambiguities in natural language processing

Survey: Transformer based video-language pre-training

Extracting Events and Their Relations from Texts: A Survey on Recent Research Progress and Challenges

Learning towards conversational AI: A survey

Network representation learning: A macro and micro view

User behavior modeling for Web search evaluation
