Multimodal attention for neural machine translation
File(s)
1609.03976v1.pdf (870.27 KB)
Author(s)
Caglayan, Ozan
Barrault, Loïc
Bougares, Fethi
Type
Working Paper
Abstract
The attention mechanism is an important part of neural machine translation
(NMT), where it has been reported to produce richer source representations
than the fixed-length encodings of sequence-to-sequence models. Recently, the
effectiveness of attention has also been explored in the context of image
captioning. In this work, we assess the feasibility of a multimodal attention
mechanism that simultaneously focuses on an image and its natural language
description in order to generate a description in another language. We train
several variants of our proposed attention mechanism on the Multi30k
multilingual image captioning dataset. We show that a dedicated attention for
each modality achieves improvements of up to 1.6 BLEU and METEOR points over a
text-only NMT baseline.
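The "dedicated attention for each modality" idea can be illustrated with a minimal NumPy sketch: one attention module attends over source-word annotations, a second attends over image-region features, and the two context vectors are fused before the decoder's next prediction. Note this is an illustrative sketch only — the dot-product scoring, the dimensions, and fusion by concatenation are assumptions for clarity, not the paper's exact formulation (NMT models typically use an MLP scoring function and learned fusion).

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(query, annotations, W):
    # score each annotation against the decoder query
    # (bilinear scoring here, a simplification of MLP scoring)
    scores = annotations @ (W @ query)
    weights = softmax(scores)
    # context = attention-weighted sum of the annotations
    return weights @ annotations, weights

rng = np.random.default_rng(0)
d = 4                                # hidden size (toy value)
txt = rng.normal(size=(7, d))        # 7 source-word annotations
img = rng.normal(size=(9, d))        # 9 image-region features (e.g. a CNN grid)
query = rng.normal(size=d)           # current decoder state

# dedicated (separate) parameters per modality
W_txt = rng.normal(size=(d, d))
W_img = rng.normal(size=(d, d))

ctx_txt, a_txt = attention(query, txt, W_txt)
ctx_img, a_img = attention(query, img, W_img)

# fuse the two modality contexts (concatenation is one common choice)
context = np.concatenate([ctx_txt, ctx_img])
```

Each modality keeps its own attention weights (`a_txt`, `a_img`), so the model can focus on different words and different image regions independently at every decoding step — this separation is what distinguishes dedicated per-modality attention from a single shared attention over a pooled representation.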
Date Issued
2016-09-13
Citation
2016
URI
http://hdl.handle.net/10044/1/74172
URL
http://arxiv.org/abs/1609.03976v1
Publisher
arXiv
Copyright Statement
© 2016 The Authors.
Identifier
http://arxiv.org/abs/1609.03976v1
Subjects
cs.CL
cs.NE
Notes
10 pages, under review COLING 2016
Publication Status
Published