TensorFlow 2.0 Question Answering
Anton Nikolaev, Nick Glaser
ICS 661 - Final Report

1. Introduction

Natural language processing (NLP) is one of the domains where the emergence of deep learning (DL) has had the largest impact, improving performance across almost the entire spectrum of NLP. Among the kinds of problems currently being tackled by DL researchers are reading comprehension and question answering (QA). For our project, we joined a Kaggle competition based on a novel QA dataset provided by Google Research titled Natural Questions (NQ). Our goal was to evaluate the performance of some of the current state-of-the-art NLP architectures on this dataset.

2. Problem and Dataset

The goal of the QA task is essentially two-fold: the algorithm is provided with a text passage and a corresponding question from an arbitrary domain, with the goal of first understanding the question and subsequently providing the appropriate answer (given that one exists) extracted from the text passage at hand.

The dataset itself consists of Google queries taken from actual users and, where applicable, sections from Wikipedia articles containing the answer. In contrast to some other QA datasets, NQ also provides answer candidates for each question as well as a context-level indicator. The candidates contain the indices representing the respective start and end tokens for each answer. The context indicator is a binary value that signals whether a given candidate answer is also contained within another candidate (nested) or whether it is the only candidate containing the specific passage (top-level). This additional information can help improve model accuracy after the training stage, but is not traditionally used during training itself. Overall, the dataset contains about 300,000 training examples as well as just under 8,000 test examples that are ultimately used to evaluate our model's performance on Kaggle. An example question from the training data is given in Appendix A. For more details on the dataset, visit: https://github.com/google-research-datasets/natural-questions
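To make the candidate format concrete, here is a minimal sketch of reading the top-level candidates out of an NQ example. The example is abbreviated from the one shown in Appendix A (only three of its candidates are kept, and "document_text" is omitted):

```python
import json

# A training example in the (abbreviated) Natural Questions format;
# the full example, including "document_text", is shown in Appendix A.
example = json.loads("""
{"example_id": "-1220107454853145579",
 "long_answer_candidates": [
   {"start_token": 18,  "end_token": 136, "top_level": true},
   {"start_token": 19,  "end_token": 30,  "top_level": false},
   {"start_token": 141, "end_token": 211, "top_level": true}]}
""")

# Nested candidates (top_level == false) are contained within another
# candidate; keeping only top-level ones yields non-overlapping spans.
top_level = [c for c in example["long_answer_candidates"] if c["top_level"]]
print([(c["start_token"], c["end_token"]) for c in top_level])
# → [(18, 136), (141, 211)]
```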
For the problem formulation in the Kaggle challenge, the answer can come in four formats: as a long answer, usually about a paragraph in length; as a short answer, no longer than a sentence and as short as a few words; as a yes/no answer, which is treated as a sub-type of the short answer; or, lastly, there could be no answer in the text, which the algorithm also has to correctly identify.

2.1 Metrics

For the competition, correct answers must have the correct answer type selected and, where applicable, must also provide the exactly correct answer span. If the correct answer type is selected but the answer span is incorrect, that is still considered a misclassification.

Overall, the relevant metric is micro-F1 across all answer types. F1 is the harmonic mean of the model's precision and recall. With multiple answer types, we instead take the micro average of the precision and recall metrics over the answer types. So, if we had two answer types and wanted to get the micro-F1, we would compute the following:

micro average precision = (tp1 + tp2) / (tp1 + tp2 + fp1 + fp2)

micro average recall = (tp1 + tp2) / (tp1 + tp2 + fn1 + fn2)

And, thus, finally, the micro-F1 is:

micro-F1 = 2 · (m.a. precision · m.a. recall) / (m.a. precision + m.a. recall)

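The metric can be sketched directly; the per-type counts below are invented purely for illustration:

```python
# Hypothetical (tp, fp, fn) counts for two answer types.
counts = {"short": (30, 10, 20), "long": (50, 10, 20)}

tp = sum(c[0] for c in counts.values())  # 80
fp = sum(c[1] for c in counts.values())  # 20
fn = sum(c[2] for c in counts.values())  # 40

# Micro-averaging pools the counts across types *before*
# computing precision and recall.
precision = tp / (tp + fp)  # 80 / 100 = 0.8
recall = tp / (tp + fn)     # 80 / 120 ≈ 0.667
micro_f1 = 2 * precision * recall / (precision + recall)
print(round(micro_f1, 3))  # → 0.727
```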
3. Background

The Transformer architecture (Figure 1) has two main distinctive features, its bidirectionality and its multi-headed attention mechanism, and in concept is similar to other encoder-decoder networks [1]. While it is beyond the scope of this report to fully explore the architecture, it is worth highlighting how these features contribute to the impact the architecture has had over the past two years. Both bidirectionality and multi-head attention allow the model to establish more complicated dependencies than other commonly used models like RNNs can, and they are thus better suited to modeling language data. RNNs, specifically LSTMs [3], the most commonly used variant of recurrent networks, are strongly limited by the way in which they process the data sequentially, which does not always make sense for language, where relationships between components are often much more complicated. These features, however, come with the consequence that there is no way to support arbitrarily long input sequences; instead, the input sequence length has to be constant, which requires padding or truncating the data. Nevertheless, it is empirically evident that this trade-off is overall beneficial to performance.

Figure 1: Transformer Architecture [1]

While we plan on trying other architectures in the future, thus far our results are limited to BERT [2]. In its essence, BERT, like many of the other models, is just a very deep network with many subsequent Transformer layers and many attention heads per layer. What makes BERT and other large language models so powerful, aside from their incredibly large number of trainable parameters (110M for the small BERT model), is the very scalable initial training procedure. Language models heavily depend on pretraining on large corpora. In the case of BERT, this is achieved in two ways, the first of which is reminiscent of how neural bag-of-words embedding models are trained: the input consists of a paragraph of text with 15% of the words masked, and the model is then required to predict those words from their context.
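The masking step can be sketched as follows. This is a toy whole-word version: BERT actually masks WordPiece tokens and replaces some selections with random or unchanged tokens rather than always using [MASK], details we omit here:

```python
import random

def mask_tokens(tokens, rate=0.15, seed=0):
    """Replace ~rate of the tokens with [MASK]; return the masked
    sequence and the positions the model must predict."""
    rng = random.Random(seed)
    n = max(1, int(len(tokens) * rate))
    positions = sorted(rng.sample(range(len(tokens)), n))
    masked = list(tokens)
    for p in positions:
        masked[p] = "[MASK]"
    return masked, positions

tokens = "the high commission of south africa is located on trafalgar square".split()
masked, positions = mask_tokens(tokens)
# The pretraining objective: predict tokens[p] at each masked position p.
```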
The second pretraining task is known as next-sentence prediction (NSP). In NSP, the model is given a context paragraph and then a potential next sentence. It then has to perform binary classification to indicate whether that sentence truly follows the given context or was simply picked at random from the corpus, with both cases occurring about half of the time. This training task is perhaps more closely related to the QA problem that we are interested in. Due to the semi-supervised nature of these tasks, it is easy to scale them to very large training corpora, which in BERT's case are the Book Corpus (800M words) and all of English Wikipedia (2,500M words).

4. Our Approach

Currently, most QA datasets have their performance leaderboards dominated by Transformer-based models (for a collection of datasets and leaderboards, visit: paperswithcode.com/task/question-answering). We set out to use some of the most prevalent of these architectures and adapt them for the specific formulation of this QA task.

4.1 Model

We downloaded one of the publicly available pretrained BERT models and then adapted and fine-tuned it for the QA task. Because we have to predict both the answer type and the start and end indices of long and short answers, the model needs multiple outputs. Overall, we ended up with three outputs. The first two outputs contain logits corresponding to each token of the input paragraph. These can be understood as values indicating how likely a given token is to be the start or end token of the correct answer to the question; this is explained in more detail in the post-processing section. The last output is a five-unit vector that indicates what type of answer we expect for the input question (short, long, yes, no, no-answer).

4.2 Data Processing

Preprocessing has to be done exactly as detailed in the BERT paper [2], since the input format needs to be consistent between the pretraining and fine-tuning/prediction phases. The most notable preprocessing component is the tokenization of the paragraph, which is done by matching the tokens against a predefined vocabulary with a greedy longest-match-first approach. After tokenization, the sequence is padded or truncated to a length of 512 tokens. During fine-tuning, processing the model outputs is still relatively consistent with what is described in [2]; however, in order to train the answer-type output layer, an additional loss term has to be added to account for the additional outputs.
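The tokenization and length handling can be sketched as follows. The vocabulary here is a toy one (real BERT uses a ~30,000-entry WordPiece vocabulary plus special tokens like [CLS] and [SEP], omitted here):

```python
def wordpiece(word, vocab):
    """Greedy longest-match-first subword tokenization, BERT-style:
    non-initial pieces carry a '##' continuation prefix."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start and (word[start:end] if start == 0
                               else "##" + word[start:end]) not in vocab:
            end -= 1
        if end == start:  # no piece of the vocabulary matched
            return ["[UNK]"]
        pieces.append(word[start:end] if start == 0 else "##" + word[start:end])
        start = end
    return pieces

def pad_or_truncate(ids, length=512, pad_id=0):
    """Force a token-id sequence to the fixed model input length."""
    return (ids + [pad_id] * length)[:length]

vocab = {"high", "commission", "##er", "london"}
print(wordpiece("commissioner", vocab))  # → ['commission', '##er']
```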
Handling the model outputs requires some more work, because the competition was set up differently from common QA tasks, as mentioned above. First, the logits are analyzed. The model is trained to increase the start and end logit scores based on the short answer span. So we take the 20 highest-scoring start and end logits respectively, and then create all possible pairs. From the pool of pairs, we select the viable candidates, meaning the start index comes before the end index, and then sum their logit scores. The highest-scoring candidate overall is then selected as the short answer. Next, the long answer is selected out of the pool of predefined candidates based on two criteria: it must contain the highest-scoring short answer, and it must be a top-level answer, which prevents us from getting multiple matches. Ultimately, which answer type is chosen depends on both the combined start-end logit score as well as the outputs in the answer-type vector. The scores are combined, and whether or not they pass an empirically set threshold determines what answer type is selected.
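The pair-construction step can be sketched as follows; the logits are toy values over a four-token paragraph, and `best_span` is our name for this illustration, not the competition code:

```python
import itertools

def best_span(start_logits, end_logits, top_k=20):
    """Score all pairs from the top-k start and top-k end logits and
    keep the viable (start <= end) pair with the highest summed score."""
    starts = sorted(range(len(start_logits)),
                    key=lambda i: start_logits[i], reverse=True)[:top_k]
    ends = sorted(range(len(end_logits)),
                  key=lambda i: end_logits[i], reverse=True)[:top_k]
    best, best_score = None, float("-inf")
    for s, e in itertools.product(starts, ends):
        if s <= e and start_logits[s] + end_logits[e] > best_score:
            best, best_score = (s, e), start_logits[s] + end_logits[e]
    return best, best_score

start_logits = [0.1, 2.0, 0.3, 0.2]
end_logits   = [0.0, 0.1, 1.5, 0.4]
print(best_span(start_logits, end_logits))  # → ((1, 2), 3.5)
```

Restricting the search to the top-k logits on each side keeps the number of scored pairs at k², rather than quadratic in the 512-token sequence length.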
5. Analysis

So far, the biggest shortcomings of our approach are due to time constraints and will likely be resolved before the end of the competition; however, as of right now, there are several issues with our model. While it is fairly successful at selecting the correct answer type, it does not yet correctly select yes/no answers. Additionally, the answer-type selection is very hacky: the current mechanism is the result of trial and error, and there are likely some improvements to be made.

We have yet to try models beyond BERT. We have trained some models on other QA datasets and plan to fine-tune them on the provided NQ training set, but small compatibility issues as well as compute time have been slowing the project down on that end. Once we have more models fully up and running for the task, we plan not only to score them individually on the task, but also to ensemble their predictions to hopefully create a more robust overall model.

With that being said, our current micro-F1 score is 0.57, which, at the writing of this report, ranks 36th out of approximately 850 contestants. We are confident that the addition of more models; proper answer selection, including proper yes/no-answer selection; and ensembling will help further improve that ranking.

6. Summary

For sophisticated NLP tasks like question answering, one of the biggest takeaways from this project is the realization that the pre- and post-processing are some of the largest parts of the puzzle. Given that most of the major NLP architectures can easily be found on GitHub and pretrained models are readily available, the truly difficult part of the project is finding out how to properly leverage these powerful models for the given task. That requires an intimate understanding of the problem itself, the data, and common pre- and post-processing practices. Properly formatting the data and understanding the model outputs is also very important and can be a large timesink if done incorrectly. Lastly, fine-tuning a pretrained model for the task at hand and taking enough time to set all parameters correctly can yield large performance improvements as well.

Bibliography

[1] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin. Attention Is All You Need, 2017; arXiv:1706.03762.

[2] Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2018; arXiv:1810.04805.

[3] Sepp Hochreiter and Jürgen Schmidhuber. Long Short-Term Memory. Neural Computation, 9(8):1735–1780, 1997.

Appendix A

{"example_id": "-1220107454853145579", "question_text": "who is the south african high commissioner in ", "Document_text": "High Commission of South Africa , London - wikipedia

High Commission of South Africa , London <\/H1> High Commission of South Africa in London <\/Th> <\/Tr> <\/Td> <\/Tr>
Location <\/Th> , London <\/Td> <\/Tr>
Address <\/Th> Trafalgar Square , London , WC2N 5DP <\/Td> <\/Tr>
Coordinates <\/Th> 51 \u00b0 30 \u2032 30 '' N 0 \u00b0 07 \u2032 37 '' W \ufeff \/ \ufeff 51.5082 \u00b0 N 0.1269 \u00b0 W \ufeff \/ 51.5082 ; - 0.1269 Coordinates : 51 \u00b0 30 \u2032 30 '' N 0 \u00b0 07 \u2032 37 '' W \ufeff \/ \ufeff 51.5082 \u00b0 N 0.1269 \u00b0 W \ufeff \/ 51.5082 ; - 0.1269 <\/Td> <\/Tr>
High Commissioner <\/Th> Vacant <\/Td> <\/Tr> <\/Table> Balcony of South Africa House

The High Commission of South Africa in London is the diplomatic mission from South Africa to the United Kingdom . It is located at South Africa House , a building on Trafalgar Square , London . As well as containing the offices of the High Commissioner , the building also hosts the South African consulate . It has been a Grade II * Listed Building since 1982 . <\/P>

Contents <\/H2>
  • 1 History <\/Li>
  • 2 See also <\/Li>
  • 3 References <\/Li>
  • 4 External links <\/Li> <\/Ul>

    History ( edit ) <\/H2>

    South Africa House was built by Holland , Hannen & Cubitts in the 1930s on the site of what had been Morley 's Hotel until it was demolished in 1936 . The building was designed by Sir Herbert Baker , with architectural sculpture by Coert Steynberg and Sir Charles Wheeler , and opened in 1933 . The building was acquired by the government of South Africa as its main diplomatic presence in the UK . During World War II , Prime Minister Jan Smuts lived there while conducting South Africa 's war plans . <\/P>

    In 1961 , South Africa became a republic , and withdrew from the Commonwealth due to its policy of racial segregation . Accordingly , the building became an Embassy , rather than a High Commission . During the 1980s , the building , which was one of the only South African diplomatic missions in a public area , was targeted by protesters from around the world . During the 1990 , the building was set alight by rioters , although not seriously damaged . <\/P>

    The first fully free democratic elections in South Africa were held on the 27 April 1994 , and 4 days later , the country rejoined the Commonwealth , 33 years to the day after it withdrew upon becoming a republic . Along with country 's diplomatic missions in other Commonwealth countries , the mission once again became a High Commission . <\/P>

    Today , South Africa House is no longer a controversial site , and is the focal point of South African culture in the UK . South African President Nelson Mandela appeared on the balcony of South Africa House in 1996 , as part of his official UK state visit . In 2001 , Mandela again appeared on the balcony of South Africa House to mark the seventh anniversary of Freedom Day , when the apartheid system was officially abolished . <\/P>

    See also ( edit ) <\/H2>
    • List of diplomatic missions of South Africa <\/Li>
    • High Commission of Canada to the United Kingdom <\/Li>
    • High Commission of Uganda , London <\/Li> <\/Ul>

      References ( edit ) <\/H2>
      <\/Td> Wikimedia Commons has media related to South Africa House , London . <\/Td> <\/Tr> <\/Table>
      1. ^ Jump up to : `` The London Diplomatic List '' ( PDF ) . 14 December 2013 . Archived from the original ( PDF ) on 11 December 2013 . <\/Li>
      2. Jump up ^ . `` Details from listed building database ( 1066238 ) '' . National Heritage List for England . Retrieved 28 September 2015 . <\/Li>
      3. Jump up ^ Cubitts 1810 -- 1975 , published 1975 <\/Li>
      4. Jump up ^ `` The east side of Trafalgar Square '' . BHO . Retrieved 22 November 2015 . <\/Li>
      5. Jump up ^ Palliser , David Michael ; Clark , Peter ; Daunton , Martin J. ( 2000 ) . The Cambridge Urban History of Britain : 1840 -- 1950 . Cambridge University Press . p. 126 . <\/Li>
      6. ^ Jump up to : South Africa returns to the Commonwealth fold , The Independent , 31 May 1994 <\/Li>
      7. Jump up ^ Burns , Danny ( 1992 ) . Poll tax rebellion . AK Press . p. 90 . <\/Li>
      8. Jump up ^ United Kingdom of Great Britain and Northern Ireland , Department of International Relations and Cooperation <\/Li>
      9. Jump up ^ Hero 's welcome for Mandela at concert . BBC News . April 30 , 2001 . <\/Li> <\/Ol>

        External links ( edit ) <\/H2>
        • Official site <\/Li> <\/Ul>
          • <\/Li>
          • <\/Li>
          • <\/Li> <\/Ul> Diplomatic missions in the United Kingdom <\/Th> <\/Tr>
          Africa <\/Th>
          • Algeria <\/Li>
          • Angola <\/Li>
          • Botswana <\/Li>
          • Burundi <\/Li>
          • Cameroon <\/Li>
          • Democratic Republic of the Congo <\/Li>
          • Egypt <\/Li>
          • Equatorial Guinea <\/Li>
          • Eritrea <\/Li>
          • Ethiopia <\/Li>
          • Gabon <\/Li>
          • The Gambia <\/Li>
          • Ghana <\/Li>
          • Guinea <\/Li>
          • Ivory Coast <\/Li>
          • Kenya <\/Li>
          • Lesotho <\/Li>
          • Liberia <\/Li>
          • Libya <\/Li>
          • Malawi <\/Li>
          • Mauritania <\/Li>
          • Mauritius <\/Li>
          • Morocco <\/Li>
          • Mozambique <\/Li>
          • Namibia <\/Li>
          • Nigeria <\/Li>
          • Rwanda <\/Li>
          • Senegal <\/Li>
          • Seychelles <\/Li>
          • Sierra Leone <\/Li>
          • South Africa <\/Li>
          • South Sudan <\/Li>
          • Sudan <\/Li>
          • Swaziland <\/Li>
          • Tanzania <\/Li>
          • Togo <\/Li>
          • Tunisia <\/Li>
          • Uganda <\/Li>
          • Zambia <\/Li>
          • Zimbabwe <\/Li> <\/Ul> <\/Td> <\/Tr>
          Americas <\/Th>
          • Antigua and Barbuda <\/Li>
          • Argentina <\/Li>
          • The Bahamas <\/Li>
          • Barbados <\/Li>
          • Belize <\/Li>
          • Bolivia <\/Li>
          • Brazil <\/Li>
          • Canada <\/Li>
          • Chile <\/Li>
          • Colombia <\/Li>
          • Costa Rica <\/Li>
          • Cuba <\/Li>
          • Dominica <\/Li>
          • Dominican Republic <\/Li>
          • Ecuador <\/Li>
          • El Salvador <\/Li>
          • Grenada <\/Li>
          • Guatemala <\/Li>
          • Guyana <\/Li>
          • Haiti <\/Li>
          • Honduras <\/Li>
          • Jamaica <\/Li>
          • Mexico <\/Li>
          • Nicaragua <\/Li>
          • Panama <\/Li>
          • Paraguay <\/Li>
          • Peru <\/Li>
          • Saint Kitts and Nevis <\/Li>
          • Saint Lucia <\/Li>
          • Saint Vincent and the Grenadines <\/Li>
          • Trinidad and Tobago <\/Li>
          • United States of America <\/Li>
          • Uruguay <\/Li>
          • Venezuela <\/Li> <\/Ul> <\/Td> <\/Tr>
          Asia <\/Th>
          • Afghanistan <\/Li>
          • Armenia <\/Li>
          • Azerbaijan <\/Li>
          • Bahrain <\/Li>
          • Bangladesh <\/Li>
          • Brunei <\/Li>
          • Cambodia <\/Li>
          • China <\/Li>
          • East Timor <\/Li>
          • Georgia <\/Li>
          • India <\/Li>
          • Indonesia <\/Li>
          • Iran <\/Li>
          • Iraq <\/Li>
          • Israel <\/Li>
          • Japan <\/Li>
          • Jordan <\/Li>
          • Kazakhstan <\/Li>
          • Kuwait <\/Li>
          • Kyrgyzstan <\/Li>
          • Laos <\/Li>
          • Lebanon <\/Li>
          • Malaysia <\/Li>
          • Maldives <\/Li>
          • Mongolia <\/Li>
          • Myanmar <\/Li>
          • Nepal <\/Li>
          • North Korea <\/Li>
          • Oman <\/Li>
          • Pakistan <\/Li>
          • The Philippines <\/Li>
          • Qatar <\/Li>
          • Saudi Arabia <\/Li>
          • Singapore <\/Li>
          • South Korea <\/Li>
          • Sri Lanka <\/Li>
          • Syria <\/Li>
          • Tajikistan <\/Li>
          • Thailand <\/Li>
          • Turkey <\/Li>
          • Turkmenistan <\/Li>
          • United Arab Emirates <\/Li>
          • Uzbekistan <\/Li>
          • Vietnam <\/Li>
          • Yemen <\/Li> <\/Ul> <\/Td> <\/Tr>
          Europe <\/Th>
          • Albania <\/Li>
          • Austria <\/Li>
          • Belarus <\/Li>
          • Belgium <\/Li>
          • Bosnia and Herzegovina <\/Li>
          • Bulgaria <\/Li>
          • Croatia <\/Li>
          • Cyprus <\/Li>
          • Czech Republic <\/Li>
          • Denmark <\/Li>
          • Estonia <\/Li>
          • Finland <\/Li>
          • France <\/Li>
          • Germany <\/Li>
          • Greece <\/Li>
          • Hungary <\/Li>
          • Iceland <\/Li>
          • Ireland <\/Li>
          • Italy <\/Li>
          • Kosovo <\/Li>
          • Latvia <\/Li>
          • Lithuania <\/Li>
          • Luxembourg <\/Li>
          • Macedonia <\/Li>
          • Malta <\/Li>
          • Moldova <\/Li>
          • Monaco <\/Li>
          • Montenegro <\/Li>
          • The Netherlands <\/Li>
          • Norway <\/Li>
          • Poland <\/Li>
          • Portugal <\/Li>
          • Romania <\/Li>
          • Russia <\/Li>
          • Serbia <\/Li>
          • Slovakia <\/Li>
          • Slovenia <\/Li>
          • Spain <\/Li>
          • Sweden <\/Li>
          • Switzerland <\/Li>
          • Ukraine <\/Li>
          • Vatican City ( Apostolic Nunciature ) <\/Li> <\/Ul> <\/Td> <\/Tr>
          Oceania <\/Th>
          • Australia <\/Li>
          • Fiji <\/Li>
          • New Zealand <\/Li>
          • Papua New Guinea <\/Li>
          • Tonga <\/Li> <\/Ul> <\/Td> <\/Tr>
          States with limited recognition <\/Th>
          • North Cyprus <\/Li>
          • Palestine <\/Li>
          • Taiwan <\/Li> <\/Ul> <\/Td> <\/Tr>
          De facto independent states <\/Th>
          • Somaliland <\/Li> <\/Ul> <\/Td> <\/Tr>
          British Overseas Territories <\/Th>
          • Anguilla <\/Li>
          • Bermuda <\/Li>
          • British Virgin Islands <\/Li>
          • Cayman Islands <\/Li>
          • Falkland Islands <\/Li>
          • Gibraltar <\/Li>
          • Montserrat <\/Li>
          • Saint Helena <\/Li>
          • Tristan da Cunha <\/Li>
          • Turks and Caicos Islands <\/Li> <\/Ul> <\/Td> <\/Tr>
          Other economies with their own representations <\/Th> Hong Kong <\/Td> <\/Tr>
          International organisations <\/Th>
          • Arab League <\/Li>
          • European Union <\/Li>
          • International Organisation for Migration <\/Li>
          • United Nations
            • UNHCR <\/Li>
            • World Food Programme <\/Li> <\/Ul> <\/Li>
            • World Bank <\/Li> <\/Ul> <\/Td> <\/Tr> <\/Table>
              • <\/Li>
              • <\/Li>
              • <\/Li> <\/Ul> Trafalgar Square , London <\/Th> <\/Tr>
              Buildings <\/Th>
              Current <\/Th>
              • Clockwise from North : <\/Li>
              • St Martin - in - the - Fields <\/Li>
              • South Africa House <\/Li>
              • Drummonds Bank <\/Li>
              • <\/Li>
              • Uganda House
                • Embassy of Burundi <\/Li>
                • High Commission of Uganda <\/Li> <\/Ul> <\/Li>
                • Canadian Pacific building <\/Li>
                • Admiralty ( pub ) <\/Li>
                • <\/Li> <\/Ul> <\/Td> <\/Tr>
              Former <\/Th> <\/Td> <\/Tr>
              Statues <\/Th>
              Plinths <\/Th>
              • SE : Henry Havelock <\/Li>
              • SW : Charles Napier <\/Li>
              • NE : George IV <\/Li>
              • NW : Fourth plinth <\/Li> <\/Ul> <\/Td> <\/Tr>
              Busts <\/Th>
              • Lord Beatty <\/Li>
              • Lord Jellicoe <\/Li>
              • Lord Cunningham <\/Li> <\/Ul> <\/Td> <\/Tr>
              Other <\/Th>
              • Charles I
                • <\/Li> <\/Ul> <\/Li>
                • Nelson 's Column <\/Li>
                • James II <\/Li>
                • George Washington <\/Li> <\/Ul> <\/Td> <\/Tr> <\/Table> <\/Td> <\/Tr>
              Adjacent streets <\/Th>
              People <\/Th>
              • <\/Li>
              • Commons <\/Li> <\/Ul> <\/Td> <\/Tr> <\/Table> Retrieved from `` https:\/\/en.wikipedia.org\/w\/index.php?title=High_Commission_of_South_Africa,_Londo n&oldid=850142361 '' Categories :
                • Diplomatic missions in London <\/Li>
                • Trafalgar Square <\/Li>
                • Diplomatic missions of South Africa <\/Li>
                • Herbert Baker buildings and structures <\/Li>
                • South Africa -- United Kingdom relations <\/Li>
                • South Africa and the Commonwealth of Nations <\/Li>
                • Grade II * listed buildings in the City of <\/Li>
                • Buildings and structures completed in 1933 <\/Li> <\/Ul>
                  • <\/Li>
                  • <\/Li> <\/Ul>

                    <\/H2>

                    <\/H3>
                    • <\/Li>
                    • Talk <\/Li>
                    • <\/Li>
                    • <\/Li>
                    • <\/Li> <\/Ul>

                      <\/H3>
                      • <\/Li>
                      • <\/Li> <\/Ul>

                        <\/H3>
                          <\/Ul>

                          <\/H3>
                          • <\/Li>
                          • <\/Li>
                          • <\/Li> <\/Ul>

                            <\/H3>
                              <\/Ul>

                              <\/H3>

                              <\/H3>
                              • <\/Li>
                              • Contents <\/Li>
                              • <\/Li>
                              • <\/Li>
                              • <\/Li>
                              • <\/Li>
                              • <\/Li> <\/Ul>

                                <\/H3>
                                • <\/Li>
                                • About Wikipedia <\/Li>
                                • <\/Li>
                                • <\/Li>
                                • <\/Li> <\/Ul>

                                  <\/H3>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li>
                                  • <\/Li> <\/Ul>

                                    <\/H3>
                                    • <\/Li>
                                    • <\/Li>
                                    • <\/Li> <\/Ul>

                                      <\/H3>
                                      • <\/Li> <\/Ul>

                                        <\/H3>
                                        • Afrikaans <\/Li> <\/Ul> Edit links
                                          • This page was last edited on 13 July 2018 , at 22 : 10 ( UTC ) . <\/Li>
                                          • <\/Li> <\/Ul>
                                            • <\/Li>
                                            • About Wikipedia <\/Li>
                                            • <\/Li>
                                            • <\/Li>
                                            • <\/Li>
                                            • <\/Li>
                                            • <\/Li>
                                            • <\/Li> <\/Ul>
                                              • <\/Li>
• <\/Li> <\/Ul>", "long_answer_candidates": [{"end_token":136,"start_token":18,"top_level":true},{"end_token":30,"start_token":19,"top_level":false},{"end_token":45,"start_token":34,"top_level":false},{"end_token":59,"start_token":45,"top_level":false},{"end_token":126,"start_token":59,"top_level":false},{"end_token":135,"start_token":126,"top_level":false},{"end_token":211,"start_token":141,"top_level":true},{"end_token":336,"start_token":240,"top_level":true},{"end_token":425,"start_token":336,"top_level":true},{"end_token":488,"start_token":425,"top_level":true},{"end_token":570,"start_token":488,"top_level":true}]}
