A Method for Animating Children’s Drawings of the Human Figure
Harrison Jesse Smith, Qingyuan Zheng, Yifei Li, Somya Jain, Jessica K. Hodgins
Conference on Empirical Methods in Natural Language Processing (EMNLP)
This paper examines the encoding of analogy in large-scale pre-trained language models, such as BERT and GPT-2. Existing analogy datasets typically focus on a limited set of analogical relations, with high similarity between the two domains across which the analogy holds. As a more realistic setup, we introduce the Scientific and Creative Analogy dataset (SCAN), a novel analogy dataset containing systematic mappings of multiple attributes and relational structures across dissimilar domains. Using this dataset, we test the analogical reasoning capabilities of several widely used pre-trained language models (LMs). We find that state-of-the-art LMs achieve low performance on these complex analogy tasks, highlighting the challenges still posed by analogy understanding.
target          source        targ_word         src_word        alternatives  analogy_type
atom            solar system  nucleus           sun                           science
atom            solar system  electron          planet                        science
atom            solar system  charge            mass                          science
atom            solar system  attracts          attracts                      science
atom            solar system  revolves          revolves                      science
atom            solar system  electromagnetism  gravity                       science
heat transfer   water flow    transfers         flows                         science
heat transfer   water flow    temperature       pressure                      science
heat transfer   water flow    burner            water tower                   science
heat transfer   water flow    kettle            bucket                        science
heat transfer   water flow    heating           filling                       science
heat transfer   water flow    cooling           emptying                      science
heat transfer   water flow    thermodynamics    hydrodynamics                 science
sounds          waves         wall              shore                         science
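The rows above can be read as (target, source, targ_word, src_word) mappings. As a minimal illustration (not the authors' code; the data structure and prompt wording here are assumptions), such mappings can be turned into cloze-style analogy queries for probing a language model:

```python
# Hypothetical sketch: represent SCAN-style mappings as tuples and
# frame each one as an "A is like B, so C is like D" analogy statement.
# A probing setup might mask the final word and score LM completions.
MAPPINGS = [
    ("atom", "solar system", "nucleus", "sun"),
    ("atom", "solar system", "electron", "planet"),
    ("heat transfer", "water flow", "temperature", "pressure"),
]

def analogy_prompt(target, source, targ_word, src_word):
    """Build one analogy statement from a single mapping row."""
    return (f"If {target} is like {source}, "
            f"then {targ_word} is like {src_word}.")

# Generate a prompt for every mapping in the (toy) dataset.
prompts = [analogy_prompt(*m) for m in MAPPINGS]
print(prompts[0])
```

Masking the final word (e.g., replacing "sun" with a blank or a `[MASK]` token) and comparing the model's prediction against `src_word` is one straightforward way to turn these mappings into an evaluation task.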