
Out-of-Distribution Detection using Multiple Semantic Label Representations

Gabi Shalev, Bar-Ilan University, Israel, [email protected]

Yossi Adi, Bar-Ilan University, Israel, [email protected]

Joseph Keshet, Bar-Ilan University, Israel, [email protected]

Abstract

Deep Neural Networks are powerful models that have attained remarkable results on a variety of tasks. These models are shown to be extremely efficient when training and test data are drawn from the same distribution. However, it is not clear how a network will act when it is fed with an out-of-distribution example. In this work, we consider the problem of out-of-distribution detection in neural networks. We propose to use multiple semantic dense representations instead of a sparse representation as the target label. Specifically, we propose to use several word representations obtained from different corpora or architectures as target labels. We evaluated the proposed model on computer vision and speech commands detection tasks and compared it to previous methods. Results suggest that our method compares favorably with previous work. In addition, we present the efficiency of our approach for detecting wrongly classified and adversarial examples.

1 Introduction

Deep Neural Networks (DNNs) have achieved remarkable success, enabling several breakthroughs in notably challenging problems such as image classification [12], speech recognition [1] and machine translation [4]. These models are known to generalize well on inputs that are drawn from the same distribution as the examples that were used to train the model [43]. In real-world scenarios, however, the input instances can be drawn from different distributions, and in these cases DNNs tend to perform poorly. Moreover, it was observed that DNNs often produce high-confidence predictions for unrecognizable inputs [33] or even for random noise [13]. Furthermore, recent work in the field of adversarial example generation shows that small input perturbations can cause DNNs to produce high probabilities while being greatly incorrect [11, 6, 17]. When considering AI safety, it is essential to train DNNs that are aware of the uncertainty in their predictions [2]. Since DNNs are ubiquitous, present in nearly all segments of the technology industry from self-driving cars to automated dialog agents, it becomes critical to design classifiers that can express uncertainty when predicting out-of-distribution inputs.

Recently, several studies proposed different approaches to handle this uncertainty [13, 25, 23, 19]. In [13] the authors proposed a baseline method to detect out-of-distribution examples based on the models' output probabilities. The work in [25] extended the baseline method by using temperature scaling of the softmax function and adding small controlled perturbations to the inputs [14]. In [23] it was suggested to add another term to the loss so as to minimize the Kullback-Leibler (KL) divergence between the models' output for out-of-distribution samples and the uniform distribution.

An ensemble of classifiers with optional adversarial training was proposed in [19] for detecting out-of-distribution examples. Despite their high detection rate, ensemble methods require the optimization of several models and are therefore resource intensive. Additionally, each of the classifiers participating in the ensemble is trained independently, and the representation is not shared among them.

32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada.


In this work, we replace the traditional supervision during training with several word embeddings as the model's supervision, where each of the embeddings was trained on a different corpus or with a different architecture. More specifically, our classifier is composed of several regression functions, each of which is trained to predict a word embedding of the target label. At inference time, we gain robustness in the prediction by making a decision based on the outputs of the regression functions. Additionally, we use the L2-norm of the outputs as a score for detecting out-of-distribution instances.

We were inspired by several studies. In [26] the authors presented a novel technique for robust transfer learning, where they proposed to optimize multiple orthogonal predictors while using a shared representation. Although orthogonal to each other, according to their results, the predictors were likely to produce identical softmax probabilities. Similarly, we train multiple predictors that share a common representation, but instead of using the same supervision and forcing orthogonality between them, we use different supervisions based on word representations. The idea of using word embeddings as supervision was proposed in [8] for the task of zero-shot learning. As opposed to ours, their model was composed of a single predictor. Last, [39] found a link between the L2-norm of the input representation and the ability to discriminate in a target domain. We continue this thread here, where we explore the use of the L2-norm for detecting out-of-distribution samples.

The contributions of this paper are as follows:

• We propose using several different word embeddings as supervision to gain diversity and redundancy in an ensemble model with a shared representation.

• We propose utilizing the semantic structure between word embeddings to produce semantic-quality predictions.

• We propose using the L2-norm of the output vectors for detecting out-of-distribution inputs.

• We examined the use of the above approach for detecting adversarial examples and wrongly classified examples.

The outline of this paper is as follows. In Section 2, we formulate the notation used in the paper. In Section 3 we describe our approach in detail. Section 4 summarizes our empirical results. In Sections 5 and 6 we explore the use of our method for detecting adversarial examples and wrongly classified examples. In Section 7 we list the related work, and we conclude the paper in Section 8.

2 Notations and Definitions

We denote by X ⊆ R^p the set of instances, which are represented as p-dimensional feature vectors, and we denote by Y = {1, . . . , N} the set of class labels. Each label can be referred to as a word, and the set Y can be considered as a dictionary. We assume that each training example (x, y) ∈ X × Y is drawn from a fixed but unknown distribution ρ. Our goal is to train a classifier that performs well on unseen examples drawn from the distribution ρ, and can also identify out-of-distribution examples, which are drawn from a different distribution, µ.

Our model is based on word embedding representations. A word embedding is a mapping of a word or a label in the dictionary Y to a real vector space Z ⊆ R^D, so that words that are semantically close have their corresponding vectors close in Z. Formally, the word embedding is a function e : Y → Z from the set of labels Y to an abstract vector space Z. We assume that distances in the embedding space Z are measured using the cosine distance, which is defined for two vectors u, v ∈ Z as follows:

d_cos(u, v) = (1/2) (1 − u · v / (‖u‖ ‖v‖)).   (1)

Two labels are considered semantically similar if and only if their corresponding embeddings are close in Z, namely, when d_cos(e(y1), e(y2)) is close to 0. When the cosine distance is close to 1, the corresponding labels are semantically far apart.
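As a point of reference, the following is a minimal sketch of this distance in PyTorch (the framework used later in Section 4); the function name is ours, not taken from the paper's code.

```python
import torch
import torch.nn.functional as F

def cosine_distance(u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Eq. (1): d_cos(u, v) = 0.5 * (1 - <u, v> / (||u|| ||v||)), in [0, 1]."""
    return 0.5 * (1.0 - F.cosine_similarity(u, v, dim=-1))
```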

3 Model

Our goal is to build a robust classifier that can identify out-of-distribution inputs. In communication theory, robustness is gained by adding redundancy in different levels of the transmission encoding [20].



Figure 1: Our proposed K-embeddings model, composed of K predictors, where each contains several fully-connected layers. The shared layers consist of a deep neural network.

Borrowing ideas from this theory, our classifier is designed to be trained on different supervisions for each class. Rather than using direct supervision, our classifier is composed of a set of regression functions, where each function is trained to predict a different word embedding (such as GloVe, FastText, etc.). The prediction of the model is based on the outcome of the regression functions.

Refer to the model depicted schematically in Figure 1. Formally, the model is composed of K regression functions f^k_θk : X → Z for 1 ≤ k ≤ K. The input for each function is an instance x ∈ X, and its output is a word embedding vector in Z. Note that the word embedding spaces are not the same for the different regression functions, and specifically the notion of distance is unique for each function. Overall, given an instance x, the output of the model is K different word embedding vectors.

The set of parameters of a regression function k, θ_k = {θ_shared, θ^k_excl}, is composed of a set of parameters, θ_shared, that is shared among all the K functions, and a set of parameters, θ^k_excl, which is exclusive to the k-th function. Each regression function f^k_θk is trained to predict the word embedding vector e^k(y) that corresponds to the word representing the target label y ∈ Y. In the next subsections, we give a detailed account of how the model is trained and then present the inference procedure, followed by the procedure to detect out-of-distribution instances.
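For concreteness, here is a minimal PyTorch sketch of this architecture, assuming a generic feature-extractor backbone for θ_shared; the class name, constructor arguments and layer widths are illustrative, not taken from the paper's code.

```python
import torch
import torch.nn as nn

class KEmbeddingModel(nn.Module):
    """Shared backbone (theta_shared) feeding K exclusive regression heads (theta^k_excl)."""
    def __init__(self, backbone: nn.Module, feat_dim: int, embed_dims: list):
        super().__init__()
        self.backbone = backbone
        # Each head: three fully-connected layers, with a ReLU between the first two (Section 4.1).
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Linear(feat_dim, feat_dim),
                nn.ReLU(),
                nn.Linear(feat_dim, feat_dim),
                nn.Linear(feat_dim, d_k),
            )
            for d_k in embed_dims
        ])

    def forward(self, x):
        h = self.backbone(x)                      # shared representation
        return [head(h) for head in self.heads]   # K predicted embedding vectors
```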

3.1 Training

In classic supervised learning, the training set is composed of M instance-label pairs. In our setting, each example from the training set, S_train, is composed of an instance x ∈ X and a set of K different word embeddings {e^1(y), . . . , e^K(y)} of a target label y ∈ Y. Namely, S_train = {(x_i, e^1(y_i), . . . , e^K(y_i))}_{i=1}^M.

Our goal in training is to minimize a loss function which measures the discrepancy between the predicted label and the desired label. Since our intermediate representation is based on word embeddings, we cannot use the cross-entropy loss function. Moreover, we would like to keep the notion of similarity between embedding vectors from the same space. Our surrogate loss function is the sum of K cosine distances between the predicted embedding and the corresponding embedding vector of the target label, both from the same embedding space. Namely,

ℓ̄(x, y; θ) = ∑_{k=1}^{K} d_cos(e^k(y), f^k_θk(x)).   (2)

The cosine distance (or the cosine similarity) is a function often used in ranking tasks, where the goal is to give a high score to similar embedding vectors and a low score otherwise [9, 30].
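A minimal sketch of this surrogate loss follows, reusing the cosine_distance helper above; representing a batch as lists of K tensors is our assumption about the interface, not the paper's.

```python
import torch

def multi_embedding_loss(preds, targets):
    """Eq. (2): sum over the K heads of the cosine distance between the predicted
    vector f^k(x) and the target label's embedding e^k(y).
    preds, targets: lists of K tensors, each of shape (batch, D_k)."""
    loss = torch.zeros(())
    for f_k, e_k in zip(preds, targets):
        loss = loss + cosine_distance(f_k, e_k).mean()
    return loss
```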

3.2 Inference

At inference time, the regression functions predict K vectors, each corresponding to a vector in a different word embedding space. A straightforward solution is to predict the label using a hard decision over the K output vectors: first predict the label of each output, and then take a majority vote over the predicted labels.


Another, more successful procedure for decoding is the soft decision, where we predict the label y which has the minimal distance to all of the K embedding vectors:

y = \arg\min_{y \in \mathcal{Y}} \sum_{k=1}^{K} d_{\cos}\big(e^k(y), f^k_{\theta_k}(x)\big).   (3)
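A sketch of this soft-decision rule, assuming each embedding space is summarized by a matrix of label embeddings (names and shapes are illustrative):

```python
import torch.nn.functional as F

def predict_label(outputs, label_embeddings):
    """Eq. (3): choose the label with the minimal summed cosine distance.
    outputs: list of K tensors (batch, d_k);
    label_embeddings: list of K matrices (num_labels, d_k)."""
    total = None
    for f_k, E_k in zip(outputs, label_embeddings):
        f = F.normalize(f_k, dim=1)          # unit-norm predictions
        E = F.normalize(E_k, dim=1)          # unit-norm label embeddings
        dist = 1.0 - f @ E.t()               # (batch, num_labels) cosine distances
        total = dist if total is None else total + dist
    return total.argmin(dim=1)               # index of the predicted label
```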

In order to distinguish between in- and out-of-distribution examples, we consider the norms of the predicted embedding vectors. When the sum of all the norms is below a detection threshold α, we denote the example as an out-of-distribution example, namely,

\sum_{k=1}^{K} \big\| f^k_{\theta_k}(x) \big\|_2^2 < \alpha.   (4)

This is inspired by the discussion of [39, Section 4] and motivated empirically in Section 4.3.
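As an illustration, the detection score of Eq. (4) can be computed directly from the K predicted vectors (the function name is an assumption, not from the original code):

```python
def ood_score(outputs):
    """Eq. (4): summed squared L2 norms of the K predicted embedding vectors."""
    return sum(out.pow(2).sum(dim=1) for out in outputs)  # shape: (batch,)

# An example is flagged as out-of-distribution when the score falls below alpha:
# is_ood = ood_score(model(x)) < alpha
```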

4 Experiments

In this section, we present our experimental results. First, we describe our experimental setup. Then, we evaluate our model using in-distribution examples, and lastly, we evaluate our model using out-of-distribution examples from different domains. We implemented the code using PyTorch [35]. It will be available under www.github.com/MLSpeech/semantic_OOD.

4.1 Experimental Setup

Recall that each regression function f^k_{θ_k} is composed of a shared part and an exclusive part. In our setting, we used well-known state-of-the-art architectures as the shared part, and three fully-connected layers, with a ReLU activation function between the first two, as the exclusive part.

We evaluated our approach on CIFAR-10, CIFAR-100 [18], and the Google Speech Commands Dataset1, abbreviated here as GCommands. For CIFAR-10 and CIFAR-100 we trained ResNet-18 and ResNet-34 models [12], respectively, using stochastic gradient descent with momentum for 180 epochs. We used the standard normalization and data augmentation techniques. We used a learning rate of 0.1, a momentum value of 0.9, and a weight decay of 0.0005. During training we divided the learning rate by 5 after 60, 120, and 160 epochs. For the GCommands dataset, we trained a LeNet model [22] using Adam [16] for 20 epochs with a batch size of 100 and a learning rate of 0.001. Similarly to [44], we extracted normalized spectrograms from the original waveforms and zero-padded the spectrograms to equalize their sizes at 160 × 101.
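A minimal sketch of the CIFAR optimization schedule described above, assuming the model, data loader, and loss from the previous sketches; it is illustrative, not the released training script:

```python
import torch.optim as optim

optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
# dividing the learning rate by 5 corresponds to gamma = 0.2
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[60, 120, 160], gamma=0.2)

for epoch in range(180):
    for x, target_embeddings in train_loader:   # target_embeddings: list of K label vectors
        optimizer.zero_grad()
        loss = multi_cosine_loss(model(x), target_embeddings)
        loss.backward()
        optimizer.step()
    scheduler.step()
```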

For our supervision, we fetched five word representations for each label. The first two word representations were based on the Skip-Gram model [29] trained on the Google News dataset and the One Billion Words benchmark [5], respectively. The third and fourth representations were based on GloVe [36]; the third was trained using both the Wikipedia corpus and the Gigaword dataset [34], and the fourth was trained using the Common Crawl dataset. The last word representation was obtained using FastText [28] trained on the Wikipedia corpus. We use the terms 1-embed, 3-embed, and 5-embed to specify the number of embeddings we use as supervision, i.e., the number of predicted embedding vectors. For the 1-embed and 3-embed models we randomly pick 1 or 3 embeddings, respectively, out of the five embeddings.
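For illustration only, the per-label supervision can be assembled from pre-trained vector files, e.g. with gensim; the file names below are placeholders, and GloVe files would first need conversion to the word2vec text format (e.g. via gensim's glove2word2vec script).

```python
import torch
from gensim.models import KeyedVectors

# Placeholder paths; binary word2vec files load with binary=True, text files with binary=False.
paths = ["sgns_googlenews.bin", "sgns_billion_words.bin",
         "glove_wiki_giga.w2v.txt", "glove_common_crawl.w2v.txt", "fasttext_wiki.vec"]
spaces = [KeyedVectors.load_word2vec_format(p, binary=p.endswith(".bin")) for p in paths]

def targets_for_label(label_word):
    """Return the K target vectors e^1(y), ..., e^K(y) for a label word."""
    return [torch.tensor(kv[label_word]) for kv in spaces]
```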

We compared our results to a softmax classifier (baseline) [13], an ensemble of softmax classifiers (ensemble) [19], and to ODIN [25]. For the ensemble method, we followed an approach similar to [19, 24], where we randomly initialized each of the models. For ODIN, we followed the scheme proposed by the authors and performed a grid search over the ε and T values for each setting. In all of these models we optimized the cross-entropy loss function using the same architectures as in the proposed models.

4.2 In-Distribution

Accuracy  In this subsection, we evaluate the performance of our model and compare it to models based on the softmax. Similar to [25], we considered the CIFAR-10, CIFAR-100, and GCommands datasets as in-distribution examples. We report the accuracy for our models using K = 1, 3, or 5 word embeddings, and compare it to the baseline and to the ensemble of softmax classifiers. Results are summarized in Table 1.

1 https://research.googleblog.com/2017/08/launching-speech-commands-dataset.html

Table 1: Accuracy on in-distribution examples and semantic relevance of misclassifications.

Dataset     Model     Accuracy  Avg. WUP  Avg. LCH  Avg. Path
GCommands   Baseline  90.3      0.2562    1.07      0.0937
            1-embed   90.42     0.3279    1.23      0.1204
            3-embed   91.04     0.3215    1.22      0.1184
            5-embed   91.13     0.3095    1.19      0.1141
            Ensemble  90.9      0.2206    0.96      0.0748
CIFAR-10    Baseline  95.28     0.7342    1.7       0.1594
            1-embed   95.11     0.741     1.73      0.1633
            3-embed   94.99     0.7352    1.71      0.1609
            5-embed   95.04     0.7302    1.69      0.157
            Ensemble  95.87     0.733     1.71      0.1601
CIFAR-100   Baseline  79.14     0.506     1.38      0.1263
            1-embed   77.62     0.51      1.39      0.1277
            3-embed   78.31     0.501     1.38      0.1251
            5-embed   78.23     0.5129    1.4       0.1293
            Ensemble  81.38     0.5122    1.4       0.1291

Semantic Measure  Word embeddings usually capture the semantic hierarchy between words [29]; since the proposed models are trained with word embeddings as supervision, they can capture such semantics. To measure the semantic quality of the model, we compute three semantic measures based on the WordNet hierarchy: (i) node counting on the shortest path that connects the senses in the is-a taxonomy; (ii) Wu-Palmer (WUP) [41], which calculates semantic relatedness by considering the depth of the two senses in the taxonomy; and (iii) Leacock-Chodorow (LCH) [21], which calculates relatedness by finding the shortest path between two concepts and scaling that value by twice the maximum depth of the hierarchy. The results suggest that, on average, our model produces labels which have slightly better semantic quality.
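These measures are available, for instance, through NLTK's WordNet interface; a sketch follows (taking the first noun synset of each word is an assumption, not specified above):

```python
from nltk.corpus import wordnet as wn

def semantic_scores(predicted_word, target_word):
    s_pred = wn.synsets(predicted_word, pos=wn.NOUN)[0]
    s_true = wn.synsets(target_word, pos=wn.NOUN)[0]
    return {
        "path": s_pred.path_similarity(s_true),  # node counting on the is-a shortest path
        "wup":  s_pred.wup_similarity(s_true),   # Wu-Palmer
        "lch":  s_pred.lch_similarity(s_true),   # Leacock-Chodorow (same part of speech required)
    }
```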

4.3 Out-of-Distribution

Out-of-Distribution Datasets  For out-of-distribution examples, we followed a setting similar to [25, 13] and evaluated our models on several different datasets. Visual models trained on CIFAR-10 were tested on SVHN [32], LSUN [42] (resized to 32x32x3), and CIFAR-100; models trained on CIFAR-100 were tested on SVHN, LSUN (resized to 32x32x3), and CIFAR-10. For the speech models, we split the dataset into two disjoint subsets: the first contains 7 classes2 (denoted SC-7) and the other contains the remaining 23 classes (denoted SC-23). We trained our models using SC-23 and tested them on SC-7.

Evaluation  We followed the same metrics used by [13, 25]: (i) False Positive Rate (FPR) at 95% True Positive Rate (TPR): the probability that an out-of-distribution example is misclassified as in-distribution when the TPR is as high as 95%; (ii) Detection error: the misclassification probability when the TPR is 95%, where we assume that out- and in-distribution examples have an equal prior probability of appearing; (iii) Area Under the Receiver Operating Characteristic curve (AUROC); and (iv) Area Under the Precision-Recall curve (AUPR) for in-distribution and out-of-distribution examples.
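A sketch of how these four metrics can be computed with scikit-learn, assuming higher scores indicate in-distribution examples (e.g. the summed norms of Eq. (4), or the maximum softmax probability for the baselines):

```python
import numpy as np
from sklearn.metrics import average_precision_score, roc_auc_score, roc_curve

def ood_metrics(scores_in, scores_out):
    y = np.concatenate([np.ones_like(scores_in), np.zeros_like(scores_out)])  # 1 = in-distribution
    s = np.concatenate([scores_in, scores_out])
    fpr, tpr, _ = roc_curve(y, s)
    fpr95 = fpr[np.argmax(tpr >= 0.95)]                       # FPR at 95% TPR
    return {
        "fpr_at_95tpr": fpr95,
        "detection_error": 0.5 * (1.0 - 0.95) + 0.5 * fpr95,  # equal priors assumed
        "auroc": roc_auc_score(y, s),
        "aupr_in": average_precision_score(y, s),
        "aupr_out": average_precision_score(1 - y, -s),
    }
```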

Figure 2 presents the distribution of the L2-norm of the proposed method using 5 word embeddings and the maximum probability of the baseline model for CIFAR-100 (in-distribution) and SVHN (out-of-distribution). Both models were trained on CIFAR-100 and evaluated on the CIFAR-100 test set (in-distribution) and SVHN (out-of-distribution).

Results  Table 2 summarizes the results for all the models. The results suggest that our method outperforms all three methods in all but two settings: CIFAR-100 versus CIFAR-10 and CIFAR-100 versus LSUN.

2 Composed of the following classes: bed, down, eight, five, four, wow, and zero.


Figure 2: Distribution of the L2-norm of the proposed model with 5 word embeddings (left) and of the max probability of the baseline model (right). Both were evaluated on CIFAR-100 (in-distribution) and SVHN (out-of-distribution).

This can be explained by taking a closer look at the structure of the datasets and the predictions made by our model. For these two settings, the in- and out-of-distribution datasets share common classes, and some of the classes of the in-distribution dataset appear in the out-of-distribution images. For example, the class bedroom in LSUN was detected as bed, couch, and wardrobe in CIFAR-100 (56%, 18%, and 8% of the time, respectively); bridge in LSUN was detected as bridge, road, and sea in CIFAR-100 (26%, 13%, and 9%); tower in LSUN was detected as skyscraper, castle, and rocket in CIFAR-100 (34%, 20%, and 6%); and there are many more. Similarly, CIFAR-10 and CIFAR-100 contain shared classes such as truck. Recall that the proportion of this class is 10% in CIFAR-10 and 1% in CIFAR-100. Hence, when the model is trained on CIFAR-100 and evaluated on CIFAR-10, it has 10% in-distribution examples a priori. When the model is trained on CIFAR-10 and evaluated on CIFAR-100, it has 1% in-distribution examples a priori.

5 Adversarial Examples

Next, we evaluated the performance of our approach for detecting adversarial examples [11]. Although it is not clear whether adversarial examples can be considered out-of-distribution, we found that our method is very efficient in detecting these examples. Since our goal is detecting adversarial examples rather than suggesting a defense against them, we generated the adversarial examples in a black-box setting.

We compared our method to an ensemble of softmax classifiers [19], both with K = 5 predictors, on the ImageNet dataset [7]. Both models are based on DenseNet-121 [15], where in our model the K regression functions are composed of three fully-connected layers with ReLU as described earlier. We omitted from ImageNet those classes which do not have “off-the-shelf” word representations and were left with 645 labels3.

We generated adversarial examples in a black-box setting using a third model, namely VGG-19 [38], with the fast gradient sign method and ε = 0.007 [11], and measured the detection rate for each of the models. Notice that our goal was not to be robust to adversarial examples (i.e., to classify them correctly) but rather to detect them.
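A minimal sketch of the black-box attack, assuming inputs in [0, 1] and a pre-trained substitute classifier (the clamping range and function name are assumptions):

```python
import torch.nn.functional as F

def fgsm(substitute_model, x, y, eps=0.007):
    """Craft adversarial examples on a substitute model (e.g. VGG-19); they are then
    fed to the detectors, which never see the substitute's gradients."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(substitute_model(x), y)
    loss.backward()
    return (x + eps * x.grad.sign()).clamp(0.0, 1.0).detach()
```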

We used the method of detecting out-of-distribution examples to detect adversarial examples; the results are reported in Table 3. We further explored the inter-predictor agreement on the predicted class across the K predicted embeddings. The histogram of the differences between the maximum and the minimum rankings of the predicted label is presented in Figure 3. It can be observed that for our model the inter-predictor variability is much higher than that of the ensemble model. One explanation for this behavior is the transferability of adversarial examples across softmax classifiers, which can be reduced by using different supervisions.

3 In [8] the authors suggested to use random vectors as labels. We leave the exploration of this research direction for future work.


Table 2: Results for in- and out-of-distribution detection for various settings. All values are in percentages. ↑ indicates larger value is better, and ↓ indicates lower value is better.

In-Distribution       Out-of-Distribution  Model     FPR (95% TPR) ↓  Detection Error ↓  AUROC ↑  AUPR-In ↑  AUPR-Out ↑
SC-23 (LeNet)         SC-7                 Baseline  77.53            41.26              82.74    94.33      50.44
                                           ODIN      71.02            38.01              85.49    95.11      57.7
                                           1-embed   70.64            37.8               85.85    95.21      58.08
                                           3-embed   67.5             36.24              87.34    95.91      59.97
                                           5-embed   69.23            37.09              86.93    95.74      58.97
                                           Ensemble  72.73            38.85              85.99    95.69      50.71
CIFAR-10 (ResNet18)   SVHN                 Baseline  7.19             6.09               97.2     96.35      98.05
                                           ODIN      4.95             4.97               98.65    96.89      99.21
                                           1-embed   2.41             3.7                99.48    98.77      99.79
                                           3-embed   4.92             4.95               98.52    97.76      99.07
                                           5-embed   4.14             4.57               99.1     98.3       99.55
                                           Ensemble  6.01             5.5                98.24    97.41      98.94
CIFAR-10 (ResNet18)   LSUN                 Baseline  50.25            27.62              91.28    91.58      89.3
                                           ODIN      41.8             23.39              90.35    96.38      75.07
                                           1-embed   26.11            15.55              95.37    95.81      94.85
                                           3-embed   29.2             17.09              95.07    95.63      94.38
                                           5-embed   22.98            13.99              96.05    96.72      94.86
                                           Ensemble  46.16            25.58              92.93    94.1       77.57
CIFAR-10 (ResNet18)   CIFAR-100            Baseline  58.75            31.87              87.76    86.73      85.83
                                           ODIN      54.85            29.92              85.59    82.26      85.41
                                           1-embed   48.72            26.86              89.18    88.91      88.39
                                           3-embed   50.9             27.95              89.76    90.36      88.13
                                           5-embed   45.25            25.12              91.23    91.86      89.63
                                           Ensemble  56.14            30.57              90.03    90.01      88.27
CIFAR-100 (ResNet34)  SVHN                 Baseline  87.88            46.44              74.11    63.6       84.17
                                           ODIN      76.64            40.82              79.86    68.22      90.1
                                           1-embed   75.79            40.39              81.82    71.52      90.1
                                           3-embed   74.74            39.87              82.57    75.01      90.4
                                           5-embed   60.14            32.57              87.42    77.95      93.56
                                           Ensemble  85.92            45.46              79.1     69.23      89.3
CIFAR-100 (ResNet34)  CIFAR-10             Baseline  77.21            41.1               79.18    80.71      75.22
                                           ODIN      74.15            39.57              80.40    80.41      77.2
                                           1-embed   80.82            42.9               75.99    74.27      73.01
                                           3-embed   78.17            41.77              77.35    77.42      74.39
                                           5-embed   77.03            41.01              77.7     77.23      74.61
                                           Ensemble  73.57            39.28              81.49    83.24      78.16
CIFAR-100 (ResNet34)  LSUN                 Baseline  80.41            42.7               78.02    79.25      73.34
                                           ODIN      79.88            42.44              78.94    80.22      73.31
                                           1-embed   80.99            42.99              76.41    75.08      74.02
                                           3-embed   81.08            43.03              74.88    72.21      72.38
                                           5-embed   80.87            42.93              76.08    75.3       72.67
                                           Ensemble  79.53            42.26              79.05    92.61      74.06

We labeled an input as an adversarial example unless all the predictors were in full agreement. We calculated the detection rate of adversarial examples and the false rejection rate of legitimate examples. The ensemble achieved a 43.88% detection rate and an 11.69% false rejection rate, while the embedding model achieved a 62.04% detection rate and a 15.16% false rejection rate. Although the ensemble method achieves a slightly better false rejection rate (3% improvement), the detection rate of our approach is significantly better (18% improvement). To further qualify this, we fixed the false rejection rate of both methods at 3%. In this setting, the ensemble reaches a 15.41% detection rate while our model reaches a 28.64% detection rate (13% improvement).
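A sketch of this agreement rule, reusing the per-space label embeddings from the inference sketch (names are illustrative):

```python
import torch
import torch.nn.functional as F

def flag_disagreement(outputs, label_embeddings):
    """Flag an input as adversarial unless all K predictors agree on the predicted label."""
    votes = []
    for f_k, E_k in zip(outputs, label_embeddings):
        f = F.normalize(f_k, dim=1)
        E = F.normalize(E_k, dim=1)
        votes.append((f @ E.t()).argmax(dim=1))  # per-predictor predicted label
    votes = torch.stack(votes)                   # shape (K, batch)
    return (votes != votes[0]).any(dim=0)        # True -> predictors disagree -> flag
```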

6 Wrongly Classified Examples

Recall that the embedding models were trained to minimize the cosine distance between the output vector and the word representation of the target label according to some embedding space. When we plot the average L2-norm of the models' output vectors as a function of the training epochs, we notice that the norm of the wrongly classified examples is significantly smaller than that of correctly classified examples.


Figure 3: Histogram of the differences between the max and the min rankings of the predicted label in our model (left) and the ensemble model (right), for original and adversarial examples.

Figure 4: The average L2-norm for wrongly and correctly classified examples as a function of training epochs (left). Density of the L2-norm for wrongly classified, correctly classified, and out-of-distribution examples (right).

These findings go hand in hand with the results in [39], which observed that lower representation norms are negatively associated with the softmax predictions, as shown in the left panel of Figure 4. We observe a similar behavior for out-of-distribution examples, as shown in the right panel of Figure 4. These findings suggest that we can adjust the threshold α accordingly. We leave this further exploration for future work.

7 Related Work

The problem of detecting out-of-distribution examples in low-dimensional spaces has been well studied; however, those methods were found to be unreliable in high-dimensional spaces [40]. Recently, out-of-distribution detectors based on deep models have been proposed. Several studies require enlarging or modifying the neural network [23, 37, 3], while other approaches suggest using the output of a pre-trained model with some minor modifications [13, 25].

There has been a variety of work revolving around Bayesian inference approximations, which approximate the posterior distribution over the parameters of the neural network and use it to quantify predictive uncertainty [31, 27]. These Bayesian approximations are often harder to implement and computationally slower to train compared to non-Bayesian approaches. The authors of [10] proposed to use Monte Carlo dropout to estimate uncertainty at test time as a Bayesian approximation. More recently, [19] introduced a non-Bayesian method, using an ensemble of classifiers for predictive uncertainty estimation, which proved to be significantly better than previous methods.


Table 3: Results for in- and out-of-distribution detection, for ImageNet (in) and adversarial examples (out).

Model     FPR (95% TPR) ↓  Detection Error ↓  AUROC ↑  AUPR-In ↑  AUPR-Out ↑
Ensemble  57.3             31.15              88.66    98.71      43.46
5-embed   57.26            31.12              89.58    98.64      47.2

8 Discussion and Future Work

In this paper, we propose to use several semantic representations for each target label as supervision to the model in order to detect out-of-distribution examples, where the detection score is based on the L2-norm of the output representations.

For future work, we would like to further investigate the following: (i) explore a better decision strategy for detecting out-of-distribution examples; (ii) rigorously analyze the notion of confidence based on the L2-norm beyond [39]; and (iii) inspect the relation between detecting wrongly classified examples, adversarial examples, and out-of-distribution examples.

References

[1] Dario Amodei, Rishita Anubhai, Eric Battenberg, Carl Case, Jared Casper, Bryan Catanzaro, Jingdong Chen, Mike Chrzanowski, Adam Coates, Greg Diamos, et al. Deep speech 2: End-to-end speech recognition in English and Mandarin. In International Conference on Machine Learning, pages 173–182, 2016.

[2] Dario Amodei, Chris Olah, Jacob Steinhardt, Paul Christiano, John Schulman, and Dan Mané. Concrete problems in AI safety. arXiv preprint arXiv:1606.06565, 2016.

[3] Jerone TA Andrews, Thomas Tanay, Edward J Morton, and Lewis D Griffin. Transfer representation-learning for anomaly detection. ICML, 2016.

[4] Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.

[5] Ciprian Chelba, Tomas Mikolov, Mike Schuster, Qi Ge, Thorsten Brants, Phillipp Koehn, and Tony Robinson. One billion word benchmark for measuring progress in statistical language modeling. arXiv preprint arXiv:1312.3005, 2013.

[6] Moustapha M Cisse, Yossi Adi, Natalia Neverova, and Joseph Keshet. Houdini: Fooling deep structured visual and speech recognition models with adversarial examples. In Advances in Neural Information Processing Systems, pages 6980–6990, 2017.

[7] J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei. ImageNet: A large-scale hierarchical image database. In CVPR09, 2009.

[8] Andrea Frome, Greg S Corrado, Jon Shlens, Samy Bengio, Jeff Dean, Tomas Mikolov, et al. DeViSE: A deep visual-semantic embedding model. In Advances in Neural Information Processing Systems, pages 2121–2129, 2013.

[9] Tzeviya Fuchs and Joseph Keshet. Spoken term detection automatically adjusted for a given threshold. IEEE Journal of Selected Topics in Signal Processing, 11(8):1310–1317, 2017.

[10] Yarin Gal and Zoubin Ghahramani. Dropout as a Bayesian approximation: Representing model uncertainty in deep learning. In International Conference on Machine Learning, pages 1050–1059, 2016.

[11] Ian J Goodfellow, Jonathon Shlens, and Christian Szegedy. Explaining and harnessing adversarial examples. arXiv preprint arXiv:1412.6572, 2014.


[12] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778, 2016.

[13] Dan Hendrycks and Kevin Gimpel. A baseline for detecting misclassified and out-of-distribution examples in neural networks. arXiv preprint arXiv:1610.02136, 2016.

[14] Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531, 2015.

[15] Gao Huang, Zhuang Liu, Kilian Q Weinberger, and Laurens van der Maaten. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, volume 1, page 3, 2017.

[16] Diederik P Kingma and Jimmy Ba. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.

[17] Felix Kreuk, Yossi Adi, Moustapha Cisse, and Joseph Keshet. Fooling end-to-end speaker verification by adversarial examples. arXiv preprint arXiv:1801.03339, 2018.

[18] Alex Krizhevsky and Geoffrey Hinton. Learning multiple layers of features from tiny images. 2009.

[19] Balaji Lakshminarayanan, Alexander Pritzel, and Charles Blundell. Simple and scalable predictive uncertainty estimation using deep ensembles. In Advances in Neural Information Processing Systems, pages 6405–6416, 2017.

[20] Amos Lapidoth. A foundation in digital communication. Cambridge University Press, 2017.

[21] Claudia Leacock and Martin Chodorow. Combining local context and WordNet similarity for word sense identification. WordNet: An electronic lexical database, 49(2):265–283, 1998.

[22] Yann LeCun, Léon Bottou, Yoshua Bengio, and Patrick Haffner. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324, 1998.

[23] Kimin Lee, Honglak Lee, Kibok Lee, and Jinwoo Shin. Training confidence-calibrated classifiers for detecting out-of-distribution samples. arXiv preprint arXiv:1711.09325, 2017.

[24] Stefan Lee, Senthil Purushwalkam, Michael Cogswell, David Crandall, and Dhruv Batra. Why M heads are better than one: Training a diverse ensemble of deep networks. arXiv preprint arXiv:1511.06314, 2015.

[25] Shiyu Liang, Yixuan Li, and R Srikant. Enhancing the reliability of out-of-distribution image detection in neural networks.

[26] Etai Littwin and Lior Wolf. The multiverse loss for robust transfer learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 3957–3966, 2016.

[27] David JC MacKay. Bayesian methods for adaptive models. PhD thesis, California Institute of Technology, 1992.

[28] Tomas Mikolov, Edouard Grave, Piotr Bojanowski, Christian Puhrsch, and Armand Joulin. Advances in pre-training distributed word representations. In Proceedings of the International Conference on Language Resources and Evaluation (LREC 2018), 2018.

[29] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg S Corrado, and Jeff Dean. Distributed representations of words and phrases and their compositionality. In Advances in Neural Information Processing Systems, pages 3111–3119, 2013.

[30] Einat Naaman, Yossi Adi, and Joseph Keshet. Learning similarity function for pronunciation variations. In Proc. of Interspeech, 2017.

[31] Radford M Neal. Bayesian learning for neural networks, volume 118. Springer Science & Business Media, 2012.


[32] Yuval Netzer, Tao Wang, Adam Coates, Alessandro Bissacco, Bo Wu, and Andrew Y Ng. Reading digits in natural images with unsupervised feature learning. In NIPS Workshop on Deep Learning and Unsupervised Feature Learning, volume 2011, page 5, 2011.

[33] Anh Nguyen, Jason Yosinski, and Jeff Clune. Deep neural networks are easily fooled: High confidence predictions for unrecognizable images. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 427–436, 2015.

[34] Robert Parker, David Graff, Junbo Kong, Ke Chen, and Kazuaki Maeda. English Gigaword fifth edition, Linguistic Data Consortium. Google Scholar, 2011.

[35] Adam Paszke, Sam Gross, Soumith Chintala, Gregory Chanan, Edward Yang, Zachary DeVito, Zeming Lin, Alban Desmaison, Luca Antiga, and Adam Lerer. Automatic differentiation in PyTorch. In NIPS-W, 2017.

[36] Jeffrey Pennington, Richard Socher, and Christopher Manning. GloVe: Global vectors for word representation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1532–1543, 2014.

[37] Thomas Schlegl, Philipp Seeböck, Sebastian M Waldstein, Ursula Schmidt-Erfurth, and Georg Langs. Unsupervised anomaly detection with generative adversarial networks to guide marker discovery. In International Conference on Information Processing in Medical Imaging, pages 146–157. Springer, 2017.

[38] Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.

[39] Yaniv Taigman, Ming Yang, Marc'Aurelio Ranzato, and Lior Wolf. Web-scale training for face identification. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 2746–2754, 2015.

[40] Lucas Theis, Aäron van den Oord, and Matthias Bethge. A note on the evaluation of generative models. ICLR, 2015.

[41] Zhibiao Wu and Martha Palmer. Verbs semantics and lexical selection. In Proceedings of the 32nd Annual Meeting on Association for Computational Linguistics, pages 133–138. Association for Computational Linguistics, 1994.

[42] Fisher Yu, Yinda Zhang, Shuran Song, Ari Seff, and Jianxiong Xiao. LSUN: Construction of a large-scale image dataset using deep learning with humans in the loop. arXiv preprint arXiv:1506.03365, 2015.

[43] Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, and Oriol Vinyals. Understanding deep learning requires rethinking generalization. arXiv preprint arXiv:1611.03530, 2016.

[44] Hongyi Zhang, Moustapha Cisse, Yann N Dauphin, and David Lopez-Paz. mixup: Beyond empirical risk minimization. arXiv preprint arXiv:1710.09412, 2017.
