At Facebook, we care about building community across the globe. To do so, we need to break down language barriers using machine translation. One of the biggest challenges is achieving high translation accuracy in the absence of large quantities of parallel corpora. This is especially true for Neural Machine Translation (NMT), which uses models with a large number of parameters.
Facebook is pleased to invite the academic community to respond to this call for research proposals on low-resource Neural Machine Translation. Applicants for the research awards will be expected to contribute to the field of low-resource NMT through research into novel, strongly performing models under low-resource training conditions and/or comparable corpora mining techniques for low-resource language pairs.
Applicants should submit a two-page proposal outlining their intended research and a budget overview of how funding will be used. Applicants must also submit a timeline with quarterly milestones.
Awards will be made in amounts ranging from $20,000 to $40,000 per proposal for projects up to one year in duration, beginning June 2018. Successful proposals will demonstrate innovative and compelling research that has the potential to significantly advance the technology. Award amounts will be determined at the sole discretion of the evaluation committee. Participants must be prepared to show milestone completion and present results at the six-month mark (November 2018). Up to four projects will be awarded.
Representatives from each awarded project will be invited to a workshop with other participants in September 2018, and are expected to attend an evaluation meeting in late November 2018. Opportunities for a second round of funding will be determined at the November meeting. Travel costs to Menlo Park, CA, USA should be included in the proposed budget. Award recipients will be listed on the Facebook Research website and will be encouraged to openly publish any findings from their work, as well as to make any code available as open source.