A differentiable BLEU loss. Analysis and first results

Author
Casas, N.; Fonollosa, José A. R.; Ruiz, M.
Activity type
Conference presentation
Edition name
6th International Conference on Learning Representations
Edition year
2018
Presentation date
2018-05-02
Proceedings
Sixth International Conference on Learning Representations: Monday April 30-Thursday May 03, 2018, Vancouver Convention Center, Vancouver: [proceedings]
Funding project
Autonomous lifelong learning intelligent systems
Tecnologías de aprendizaje profundo aplicadas al procesado de voz y audio
Repository
http://hdl.handle.net/2117/117201
URL
https://openreview.net/forum?id=HkG7hzyvf
Abstract
In natural language generation tasks, like neural machine translation and image captioning, there is usually a mismatch between the optimized loss and the de facto evaluation criterion, namely token-level maximum likelihood and corpus-level BLEU score. This article tries to reduce this gap by defining differentiable computations of the BLEU and GLEU scores. We test this approach on simple tasks, obtaining valuable lessons on its potential applications but also its pitfalls, mainly that these los...
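The abstract only names the idea of a differentiable BLEU/GLEU computation; as a rough, hedged illustration (not the formulation used in the paper), the sketch below builds a BLEU-like differentiable loss in PyTorch from expected n-gram counts taken over the decoder's softmax probabilities. The function names soft_ngram_precision and soft_bleu_loss are made up for this sketch, the brevity penalty is omitted, and a single reference per sentence is assumed.

```python
# Minimal sketch of a differentiable, BLEU-like loss from expected n-gram
# matches. Assumptions: the model emits a softmax distribution over the
# vocabulary at every target position, and there is one reference sentence.
from collections import Counter

import torch


def soft_ngram_precision(probs, ref, n):
    """Expected clipped n-gram precision.

    probs: (T, V) tensor of per-position token probabilities (differentiable).
    ref:   reference sentence as a list of token ids.
    n:     n-gram order.
    """
    T = probs.size(0)
    ref_counts = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    matches = probs.new_zeros(())
    for gram, ref_count in ref_counts.items():
        # Expected count of this n-gram in the hypothesis: sum over start
        # positions of the product of per-position token probabilities.
        expected = probs.new_zeros(())
        for t in range(T - n + 1):
            p = probs.new_ones(())
            for k, tok in enumerate(gram):
                p = p * probs[t + k, tok]
            expected = expected + p
        # Clip against the reference count, as in standard BLEU.
        matches = matches + torch.minimum(expected, probs.new_tensor(float(ref_count)))
    return matches / max(T - n + 1, 1)


def soft_bleu_loss(probs, ref, max_n=2):
    """Negative geometric mean of soft n-gram precisions (orders 1..max_n)."""
    precisions = [soft_ngram_precision(probs, ref, n) for n in range(1, max_n + 1)]
    log_p = sum(torch.log(p + 1e-9) for p in precisions) / max_n
    return -torch.exp(log_p)  # minimizing this pushes the BLEU-like score up


# Usage example with random decoder probabilities and a toy reference.
probs = torch.softmax(torch.randn(5, 100, requires_grad=True), dim=-1)
loss = soft_bleu_loss(probs, ref=[3, 17, 17, 42, 8])
loss.backward()  # gradients flow because every step is differentiable
```

Because every operation above is smooth in the token probabilities, the score can be backpropagated through the decoder outputs, which is the property the paper's loss is built around; the exact construction in the article may differ.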
Citation
Casas, N., Fonollosa, José A. R., Ruiz, M. A differentiable BLEU loss. Analysis and first results. A: International Conference on Learning Representations. "ICLR 2018 Workshop Track: 6th International Conference on Learning Representations: Vancouver Convention Center, Vancouver, BC, Canada: April 30-May 3, 2018". 2018.
Keywords
BLEU, Differentiable, GLEU, NMT, Seq2seq
Research group
IDEAI-UPC Intelligent Data Science and Artificial Intelligence
TALP - Centre de Tecnologies i Aplicacions del Llenguatge i la Parla
VEU - Grup de Tractament de la Parla

Participants