Entity Linking via Dual and Cross-Attention Encoders

Oshin Agarwal∗
University of Pennsylvania
[email protected]

Daniel M. Bikel
Google Research
[email protected]

Abstract

Entity Linking has two main open areas of research: 1) generate candidate entities without using alias tables and 2) generate more contextual representations for both mentions and entities. Recently, a solution has been proposed for the former as a dual-encoder entity retrieval system (Gillick et al., 2019) that learns mention and entity representations in the same space, and performs linking by selecting the nearest entity to the mention in this space. In this work, we use this retrieval system solely for generating candidate entities. We then rerank the entities by using a cross-attention encoder over the target mention and each of the candidate entities.

the most likely entity from these candidates. Often, priors and alias tables (a.k.a. candidate tables) are used to generate the set of candidate entities, and work on entity linking has focused on either generating better alias tables or reranking the set of candidate entities generated from alias tables.

Alias tables, however, suffer from many drawbacks. They are based on prior probabilities of a mention referring to an entity, using occurrence counts of mention strings. As a result, they are heavily biased towards the most common entity and do not take into account complex features such as mention context and entity description. Such features can help not only in better disambiguation to existing entities but also when new entities
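The retrieve-then-rerank pipeline described above can be sketched in miniature. This is a toy illustration only: the `encode` function below is a hypothetical letter-frequency stand-in for the paper's learned dual encoder, and the reranker scores context/description word overlap in place of a learned cross-attention encoder. The entity catalog and mention are invented for the example.

```python
import math
import string
from collections import Counter


def encode(text: str) -> dict:
    # Toy stand-in for a learned encoder: a normalized letter-frequency
    # vector, so that similar surface strings land near each other.
    counts = Counter(c for c in text.lower() if c in string.ascii_lowercase)
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {c: v / norm for c, v in counts.items()}


def dot(u: dict, v: dict) -> float:
    return sum(weight * v.get(c, 0.0) for c, weight in u.items())


# Hypothetical entity catalog: name -> short description.
ENTITIES = {
    "Jaguar (animal)": "large wild cat native to the americas",
    "Jaguar Cars": "british luxury car manufacturer",
    "Leopard": "spotted wild cat found in africa and asia",
    "Land Rover": "british maker of four wheel drive vehicles",
}


def retrieve(mention: str, k: int = 2) -> list:
    # Stage 1: nearest-neighbor candidate generation in the shared space
    # (no alias table; in practice k is tiny relative to millions of entities).
    m = encode(mention)
    ranked = sorted(ENTITIES, key=lambda e: dot(m, encode(e)), reverse=True)
    return ranked[:k]


def rerank(context: str, candidates: list) -> list:
    # Stage 2: rescore each (mention context, entity description) pair.
    # Word overlap crudely mimics what cross-attention can exploit.
    ctx = set(context.lower().split())
    return sorted(
        candidates,
        key=lambda e: len(ctx & set(ENTITIES[e].split())),
        reverse=True,
    )


candidates = retrieve("jaguar")
best = rerank("the jaguar prowled through the rainforest of the americas",
              candidates)[0]
```

Here retrieval alone surfaces both "Jaguar" entities but cannot tell them apart from the surface string; only the context-aware reranking step resolves the mention to `Jaguar (animal)`.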