The Importance of Being Parameters: An Intra-Distillation Method for Serious Gains

Por Qué Não Utilizar Alla Språk? Mixed Training with Gradient Optimization in Few-Shot Cross-Lingual Transfer

BERT, mBERT, or BiBERT? A Study on Contextualized Embeddings for Neural Machine Translation

Everything Is All It Takes: A Multipronged Strategy for Zero-Shot Cross-Lingual Information Extraction

VAE based Text Style Transfer with Pivot Words Enhancement Learning

Gradual Fine-Tuning for Low-Resource Domain Adaptation

Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation

Cross-Lingual BERT Contextual Embedding Space Mapping with Isotropic and Isometric Conditions