Attending to Characters in Neural Sequence Labeling Models

File: C16-1030.pdf
Description: Published version
Size: 269.31 kB
Format: Adobe PDF
Title: Attending to Characters in Neural Sequence Labeling Models
Authors: Rei, M
Crichton, GKO
Pyysalo, S
Item Type: Conference Paper
Abstract: Sequence labeling architectures use word embeddings for capturing similarity, but suffer when handling previously unseen or rare words. We investigate character-level extensions to such models and propose a novel architecture for combining alternative word representations. By using an attention mechanism, the model is able to dynamically decide how much information to use from a word- or character-level component. We evaluated different architectures on a range of sequence labeling datasets, and character-level extensions were found to improve performance on every benchmark. In addition, the proposed attention-based architecture delivered the best results even with a smaller number of trainable parameters.
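The attention-based combination described in the abstract can be pictured as a per-dimension gate: a sigmoid vector z, computed from both representations, decides how much of the final word vector comes from the word embedding and how much from the character-level component, so rare or unseen words can fall back on characters. The NumPy snippet below is a minimal sketch of that idea, not the paper's exact implementation; the weight matrices W1, W2, W3, the dimensionality d, and the toy inputs are illustrative assumptions.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
d = 8  # embedding dimensionality (illustrative assumption)

# Hypothetical trainable parameters of the gating component.
W1 = rng.normal(scale=0.1, size=(d, d))
W2 = rng.normal(scale=0.1, size=(d, d))
W3 = rng.normal(scale=0.1, size=(d, d))

def combine(word_emb, char_emb):
    # Per-dimension gate in (0, 1): values near 1 keep the word
    # embedding, values near 0 fall back on the character-derived
    # vector (useful for rare or previously unseen words).
    z = sigmoid(W3 @ np.tanh(W1 @ word_emb + W2 @ char_emb))
    return z * word_emb + (1.0 - z) * char_emb

# Toy usage: a known word versus an out-of-vocabulary word whose
# word embedding is all zeros, forcing reliance on characters.
print(combine(rng.normal(size=d), rng.normal(size=d)))
print(combine(np.zeros(d), rng.normal(size=d)))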
Editors: Matsumoto, Y
Prasad, R
Date of Acceptance: 21-Sep-2016
URI: http://hdl.handle.net/10044/1/76305
Publisher: The COLING 2016 Organizing Committee
Start Page: 309
End Page: 318
Journal / Book Title: Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Copyright Statement: Creative Commons CC-BY
Conference Name: The 26th International Conference on Computational Linguistics (COLING 2016)
Place of Publication: Osaka, Japan
Publication Status: Published
Conference Place: Osaka, Japan
Appears in Collections: Computing
Faculty of Engineering