{"title": "Spherical Text Embedding", "book": "Advances in Neural Information Processing Systems", "page_first": 8208, "page_last": 8217, "abstract": "Unsupervised text embedding has shown great power in a wide range of NLP tasks. While text embeddings are typically learned in the Euclidean space, directional similarity is often more effective in tasks such as word similarity and document clustering, which creates a gap between the training stage and usage stage of text embedding. To close this gap, we propose a spherical generative model based on which unsupervised word and paragraph embeddings are jointly learned. To learn text embeddings in the spherical space, we develop an efficient optimization algorithm with convergence guarantee based on Riemannian optimization. Our model enjoys high efficiency and achieves state-of-the-art performances on various text embedding tasks including word similarity and document clustering.", "full_text": "Spherical Text Embedding\n\nYu Meng1, Jiaxin Huang1, Guangyuan Wang1, Chao Zhang2,\n\nHonglei Zhuang1\u21e4, Lance Kaplan3, Jiawei Han1\n\n1 Department of Computer Science, University of Illinois at Urbana-Champaign\n\n2 College of Computing, Georgia Institute of Technology\n\n3 U.S. Army Research Laboratory\n\n1 {yumeng5,jiaxinh3,gwang10,hzhuang3,hanj}@illinois.edu\n\n2 chaozhang@gatech.edu 3 lance.m.kaplan.civ@mail.mil\n\nAbstract\n\nUnsupervised text embedding has shown great power in a wide range of NLP tasks.\nWhile text embeddings are typically learned in the Euclidean space, directional\nsimilarity is often more effective in tasks such as word similarity and document\nclustering, which creates a gap between the training stage and usage stage of text\nembedding. To close this gap, we propose a spherical generative model based\non which unsupervised word and paragraph embeddings are jointly learned. 
To\nlearn text embeddings in the spherical space, we develop an efficient optimization\nalgorithm with convergence guarantee based on Riemannian optimization. Our\nmodel enjoys high efficiency and achieves state-of-the-art performances on various\ntext embedding tasks including word similarity and document clustering.\n\n1 Introduction\n\nRecent years have witnessed enormous success of unsupervised text embedding techniques [29,\n30, 33] in various natural language processing and text mining tasks. Such techniques capture\nthe semantics of textual units (e.g., words, paragraphs) via learning low-dimensional distributed\nrepresentations in an unsupervised way, which can be either directly used as feature representations\nor further fine-tuned with training data from downstream supervised tasks. Notably, the popular\nWord2Vec method [29, 30] learns word embeddings in the Euclidean space by modeling local word\nco-occurrences in the corpus. This strategy has since been extended to obtain embeddings of other\ntextual units such as sentences [2, 19] and paragraphs [22].\nDespite the success of unsupervised text embedding techniques, an intriguing gap exists between the\ntraining procedure and the practical usage of the learned embeddings. While the embeddings are\nlearned in the Euclidean space, it is often the directional similarity between word vectors that captures\nword semantics more effectively. Across a wide range of word similarity and document clustering\ntasks [3, 16, 23], it is common practice to either use cosine similarity as the similarity metric or first\nnormalize word and document vectors before computing textual similarities. The current practice of\ntraining text embeddings in the Euclidean space but measuring their similarities in the spherical space\nis clearly suboptimal. 
After projecting the embedding from the Euclidean space to the spherical space, the\noptimal solution to the loss function in the original space may not remain optimal in the new space.\nIn this work, we propose a method that learns spherical text embeddings in an unsupervised way. In\ncontrast to existing techniques that learn text embeddings in the Euclidean space and use normalization\nas a post-processing step, we directly learn text embeddings in a spherical space by imposing unit-norm constraints on embeddings. Specifically, we define a two-step generative model on the surface\nof a unit sphere: A word is first generated according to the semantics of the paragraph, and then\nthe surrounding words are generated in a manner consistent with the center word\u2019s semantics. We cast the\nlearning of the generative model as an optimization problem and propose an efficient Riemannian\noptimization procedure to learn spherical text embeddings.\n\n*Currently at Google Research.\n\n33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.\n\nAnother major advantage of our spherical text embedding model is that it can jointly learn word\nembeddings and paragraph embeddings. This property naturally stems from our two-step generative\nprocess, where the generation of a word depends on the paragraph it belongs to through a von Mises-Fisher distribution in the spherical space. Explicitly modeling the generative relationships between\nwords and the paragraphs they belong to allows paragraph embeddings to be directly obtained during\nthe training stage. 
Furthermore, it allows the model to learn better word embeddings by jointly\nexploiting word-word and word-paragraph co-occurrence statistics; this is distinct from existing word\nembedding techniques that learn word embeddings based only on word co-occurrences [5, 29, 30, 33]\nin the corpus.\n\nContributions.\n(1) We propose to learn text embeddings in the spherical space, which closes the mismatch\nbetween how previous Euclidean embedding models train embeddings and how the embeddings are used;\n(2) We propose a two-step generative model that jointly learns unsupervised word and paragraph\nembeddings by exploiting word-word and word-paragraph co-occurrence statistics; (3) We develop\nan efficient optimization algorithm in the spherical space with convergence guarantee; (4) Our model\nachieves state-of-the-art performances on various text embedding applications.\n\n2 Related Work\n2.1 Text Embedding\nMost unsupervised text embedding models such as [5, 19, 22, 29, 30, 33, 37, 42] are trained in the\nEuclidean space. The embeddings are trained to capture the semantic similarity of textual units based on\nco-occurrence statistics, and demonstrate effectiveness on various text semantics tasks such as named\nentity recognition [21], text classification [18, 27, 28, 38] and machine translation [8]. Recently,\nnon-Euclidean embedding spaces have been explored for learning specific structural representations.\nPoincaré [11, 31, 39], Lorentz [32] and hyperbolic cone [15] models have proven successful at\nlearning hierarchical representations in a hyperbolic space for tasks such as lexical entailment and\nlink prediction. Our model also learns unsupervised text embeddings in a non-Euclidean space, but\nstill targets general text embedding applications including word similarity and document clustering.\n\n2.2 Spherical Space Models\nPrevious works have shown that the spherical space is a superior choice for tasks focusing on\ndirectional similarity. 
For example, normalizing document tf-idf vectors is common practice when\nthey are used as features for document clustering and classification, which normalizes away the effect of\ndocument length and leads to better document clustering performance [3, 16]. Spherical generative\nmodeling [4, 43, 44] models the distribution of words on the unit sphere, motivated by the effectiveness\nof directional metrics over word embeddings. Recently, spherical models have also shown great effectiveness\nin deep learning. Spherical normalization [24] of the input leads to easier optimization, faster\nconvergence and better accuracy of neural networks. Also, a spherical loss function can be used to\nreplace the conventional softmax layer in language generation tasks, which results in faster training and better\ngeneration quality [20]. Motivated by the success of these models, we propose to learn unsupervised\ntext embeddings in the spherical space so that the embedding space discrepancy between training and\nusage is eliminated and directional similarity is more effectively captured.\n\n3 Spherical Text Embedding\n\nIn this section, we introduce the spherical generative model for jointly learning word and paragraph\nembeddings and the corresponding loss function.\n\n3.1 The Generative Model\nThe design of our generative model is inspired by the way humans write articles: Each word should be\nsemantically consistent with not only its surrounding words, but also the entire paragraph/document.\nSpecifically, we assume text generation is a two-step process: A center word is first generated\naccording to the semantics of the paragraph, and then the surrounding words are generated based\non the center word\u2019s semantics2. 
Further, we assume the direction in the spherical embedding space\ncaptures textual semantics, and higher directional similarity implies higher co-occurrence probability.\nHence, we model the text generation process as follows: Given a paragraph d, a center word u is first\ngenerated by\n\np(u | d) ∝ exp(cos(u, d)),  (1)\n\nand then a context word v is generated by\n\np(v | u) ∝ exp(cos(v, u)),  (2)\n\nwhere ‖u‖ = ‖v‖ = ‖d‖ = 1, and cos(·, ·) denotes the cosine of the angle between two vectors on\nthe unit sphere.\nNext we derive the analytic forms of Equations (1) and (2).\nTheorem 1. When the corpus has an infinite vocabulary, i.e., |V| → ∞, the analytic forms of Equations (1) and (2) are given by the von Mises-Fisher (vMF) distribution with the prior embedding as\nthe mean direction and constant 1 as the concentration parameter, i.e.,\n\nlim_{|V| → ∞} p(v | u) = vMF_p(v; u, 1),  lim_{|V| → ∞} p(u | d) = vMF_p(u; d, 1).\n\nThe proof of Theorem 1 can be found in Appendix A.\nThe vMF distribution defines a probability density over a hypersphere and is parameterized by a\nmean vector µ and a concentration parameter κ. The density is greater closer to µ, and\nthe spread is controlled by κ. 
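As a minimal numerical sketch of this density, one can evaluate it directly for the p = 3 case, where the Bessel normalizer has the closed form I_{1/2}(κ) = √(2/(πκ)) sinh κ, so that c_3(κ) = κ/(4π sinh κ). The function name vmf3_density is our own illustrative choice, not part of the paper:

```python
import math

def vmf3_density(x, mu, kappa=1.0):
    """vMF density on the 2-sphere (p = 3): f(x; mu, kappa) = c_3(kappa) * exp(kappa * cos(x, mu))."""
    # The general normalizer c_p(kappa) = kappa^{p/2-1} / ((2*pi)^{p/2} * I_{p/2-1}(kappa))
    # reduces to kappa / (4*pi*sinh(kappa)) when p = 3.
    c3 = kappa / (4.0 * math.pi * math.sinh(kappa))
    cos_x_mu = sum(a * b for a, b in zip(x, mu))  # cos(x, mu) is the dot product of unit vectors
    return c3 * math.exp(kappa * cos_x_mu)
```

With κ = 1, as in Theorem 1, the density at the mean direction µ exceeds the density at the antipode −µ by a factor of exp(2κ) = e², illustrating how κ controls the spread around µ.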
Formally, a unit random vector x ∈ S^{p−1} has the p-variate vMF\ndistribution vMF_p(x; µ, κ) if its probability density function is\n\nf(x; µ, κ) = c_p(κ) exp(κ · cos(x, µ)),\n\nwhere ‖µ‖ = 1 is the mean direction, κ ≥ 0 is the concentration parameter, and the normalization\nconstant c_p(κ) is given by\n\nc_p(κ) = κ^{p/2−1} / ((2π)^{p/2} I_{p/2−1}(κ)),\n\nwhere I_r(·) represents the modified Bessel function of the first kind at order r, given by Definition 1\nin the appendix.\nFinally, the probability density function of a context word v appearing in a center word u\u2019s local\ncontext window in a paragraph/document d is given by\n\np(v, u | d) ∝ p(v | u) · p(u | d) ∝ vMF_p(v; u, 1) · vMF_p(u; d, 1).\n\n3.2 Objective\nGiven a positive training tuple (u, v, d) where v appears in the local context window of u in paragraph\nd, we aim to maximize the probability p(v, u | d) while minimizing the probability p(v, u′ | d), where\nu′ is a randomly sampled word from the vocabulary serving as a negative sample. This is similar to\nthe negative sampling technique used in Word2Vec [30] and GloVe [33]. 
To achieve this, we employ\na max-margin loss function, similar to [15, 40, 41], and push the log likelihood of the positive tuple\nabove that of the negative one by a margin:\n\nL(u, v, d) = max(0, m − log(c_p(1) exp(cos(v, u)) · c_p(1) exp(cos(u, d))) + log(c_p(1) exp(cos(v, u′)) · c_p(1) exp(cos(u′, d))))\n= max(0, m − cos(v, u) − cos(u, d) + cos(v, u′) + cos(u′, d)),  (3)\n\nwhere m > 0 is the margin.\n\n2Like previous works, we assume each word has an independent center word representation and context\nword representation, and thus the generation processes of a word as a center word and as a context word are\nindependent.\n\n4 Optimization\n\nIn this section, we describe the approach to optimizing the objective introduced in the previous section\non the unit sphere.\n\n4.1 The Constrained Optimization Problem\nThe unit hypersphere S^{p−1} := {x ∈ R^p | ‖x‖ = 1} is the common choice for spherical space\noptimization problems. The text embedding training is thus a constrained optimization problem:\n\nmin_Θ L(Θ)  s.t.  ∀θ ∈ Θ : ‖θ‖ = 1,\n\nwhere Θ = {u_i}_{i=1}^{|V|} ∪ {v_i}_{i=1}^{|V|} ∪ {d_i}_{i=1}^{|D|} is the set of target word embeddings, context word\nembeddings and paragraph embeddings to be learned.\nSince the optimization problem is constrained to the unit sphere, Euclidean space optimization\nmethods such as SGD cannot be used to optimize our objective: the Euclidean gradient\nprovides an update direction in a flat space, while the parameters in our model must be\nupdated on a surface with constant positive curvature. Therefore, we base our embedding\ntraining on Riemannian optimization.\n\n4.2 Preliminaries\nA Riemannian manifold (M, g) is a real, smooth manifold whose tangent spaces are endowed with a\nsmoothly varying inner product g, also called the Riemannian metric. Let T_xM denote the tangent\nspace at x ∈ M; then g defines the inner product ⟨·, ·⟩_x : T_xM × T_xM → R. 
A unit sphere S^{p−1}\ncan be considered as a Riemannian submanifold of R^p, and its Riemannian metric can be inherited\nfrom R^p, i.e., ⟨α, β⟩_x := α⊤β.\nThe intrinsic distance on the unit sphere between two arbitrary points x, y ∈ S^{p−1} is defined by\nd(x, y) := arccos(x⊤y). A geodesic segment γ : [a, b] → S^{p−1} is the generalization of a straight\nline to the sphere, and it is said to be minimal if its length equals the intrinsic distance between its end\npoints, i.e., ℓ(γ) = arccos(γ(a)⊤γ(b)).\nLet T_xS^{p−1} denote the tangent hyperplane at x ∈ S^{p−1}, i.e., T_xS^{p−1} := {y ∈ R^p | x⊤y = 0}.\nThe projection onto T_xS^{p−1} is given by the linear mapping I − xx⊤ : R^p → T_xS^{p−1}, where I is\nthe identity matrix. The exponential mapping exp_x : T_xS^{p−1} → S^{p−1} projects a tangent vector\nz ∈ T_xS^{p−1} onto the sphere such that exp_x(z) = y, where the geodesic γ satisfies γ(0) = x, γ(1) = y and ∂γ/∂t(0) = z.\n\n4.3 Riemannian Optimization\n\nSince the unit sphere is a Riemannian manifold, we can optimize our objective with Riemannian\nSGD [6, 34]. Specifically, the parameters are updated by\n\nx_{t+1} = exp_{x_t}(−η_t grad f(x_t)),\n\nwhere η_t denotes the learning rate and grad f(x_t) ∈ T_{x_t}S^{p−1} is the Riemannian gradient of a\ndifferentiable function f : S^{p−1} → R.\nOn the unit sphere, the exponential mapping exp_x : T_xS^{p−1} → S^{p−1} is given by\n\nexp_x(z) := cos(‖z‖)x + sin(‖z‖) z/‖z‖ for z ∈ T_xS^{p−1}\\{0}, and exp_x(z) := x for z = 0.  (4)\n\nTo derive the Riemannian gradient grad f(x) at x, we view S^{p−1} as a Riemannian submanifold of\nR^p endowed with the canonical Riemannian metric ⟨α, β⟩_x := α⊤β. Then the Riemannian gradient\nis obtained by using the linear mapping I − xx⊤ : R^p → 
T_xS^{p−1} to project the Euclidean gradient ∇f(x)\nfrom the ambient Euclidean space onto the tangent hyperplane [1, 12], i.e.,\n\ngrad f(x) := (I − xx⊤)∇f(x).  (5)\n
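Putting Eqs. (3)-(5) together, one parameter update can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the margin value m = 0.15 and the function names are our own choices, cos(·, ·) is taken as the dot product of unit vectors, and the c_p(1) factors in Eq. (3) cancel inside the log-ratio, leaving pure cosine margins:

```python
import math

def cos_sim(a, b):
    # For unit vectors, cos(a, b) is simply the dot product.
    return sum(x * y for x, y in zip(a, b))

def margin_loss(u, v, d, u_neg, m=0.15):
    # Eq. (3): the c_p(1) normalizers cancel, so the loss reduces to cosine margins.
    return max(0.0, m - cos_sim(v, u) - cos_sim(u, d)
               + cos_sim(v, u_neg) + cos_sim(u_neg, d))

def rsgd_step(x, egrad, lr):
    """One Riemannian SGD update on the unit sphere."""
    # Eq. (5): Riemannian gradient = (I - x x^T) applied to the Euclidean gradient.
    proj = sum(a * g for a, g in zip(x, egrad))
    z = [-lr * (g - proj * a) for a, g in zip(x, egrad)]  # tangent step -lr * grad f(x)
    # Eq. (4): the exponential map carries the tangent step back onto the sphere.
    nz = math.sqrt(sum(c * c for c in z))
    if nz == 0.0:
        return list(x)
    return [math.cos(nz) * a + math.sin(nz) * c / nz for a, c in zip(x, z)]
```

When the loss is active, the Euclidean gradient of L with respect to the center word u is −(v + d) under the dot-product simplification (the negative-sample terms do not involve u), so u would be updated as `u = rsgd_step(u, [-(a + b) for a, b in zip(v, d)], lr)`; by construction the update keeps ‖u‖ = 1.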
[Figure 1: Illustration of a Riemannian gradient update on the unit sphere: the Euclidean gradient ∇f(x_t) is projected onto the tangent plane T_{x_t}S^{p−1}, and the exponential map carries the resulting step from x_t to x_{t+1} along the sphere S^{p−1}.]
RVFGLcmsIhgrJBwjSacaQMSJ9mqhMeQQSR1fTYfgLJ68DHqnTcduOjdnjdZlGUcVHIE6OAEOOActcA3aoAsQeATP4BW8GU/Gi/FufMxbK0Y5cwj+lPH5A77InBw=\nAAACC3icbVC7TsMwFHXKq5RXgJElaoXEVCUICcYKFsYi+pKaEDmO01p14sh2EJWVnYVfYWEAIVZ+gI2/wWkzQMuVLB+dc6/uuSdIKRHStr+Nysrq2vpGdbO2tb2zu2fuH/QEyzjCXcQo44MACkxJgruSSIoHKccwDijuB5OrQu/fYy4ISzpymmIvhqOERARBqSnfrHd85QaMhmIa6089+DLP3RjKcRCo2/xOOblvNuymPStrGTglaICy2r755YYMZTFOJKJQiKFjp9JTkEuCKM5rbiZwCtEEjvBQwwTGWHhqdktuHWsmtCLG9UukNWN/TygYi8Kr7ixcikWtIP/ThpmMLjxFkjSTOEHzRVFGLcmsIhgrJBwjSacaQMSJ9mqhMeQQSR1fTYfgLJ68DHqnTcduOjdnjdZlGUcVHIE6OAEOOActcA3aoAsQeATP4BW8GU/Gi/FufMxbK0Y5cwj+lPH5A77InBw=\nAAACC3icbVC7TsMwFHXKq5RXgJElaoXEVCUICcYKFsYi+pKaEDmO01p14sh2EJWVnYVfYWEAIVZ+gI2/wWkzQMuVLB+dc6/uuSdIKRHStr+Nysrq2vpGdbO2tb2zu2fuH/QEyzjCXcQo44MACkxJgruSSIoHKccwDijuB5OrQu/fYy4ISzpymmIvhqOERARBqSnfrHd85QaMhmIa6089+DLP3RjKcRCo2/xOOblvNuymPStrGTglaICy2r755YYMZTFOJKJQiKFjp9JTkEuCKM5rbiZwCtEEjvBQwwTGWHhqdktuHWsmtCLG9UukNWN/TygYi8Kr7ixcikWtIP/ThpmMLjxFkjSTOEHzRVFGLcmsIhgrJBwjSacaQMSJ9mqhMeQQSR1fTYfgLJ68DHqnTcduOjdnjdZlGUcVHIE6OAEOOActcA3aoAsQeATP4BW8GU/Gi/FufMxbK0Y5cwj+lPH5A77InBw=\nAAACC3icbVC7TsMwFHXKq5RXgJElaoXEVCUICcYKFsYi+pKaEDmO01p14sh2EJWVnYVfYWEAIVZ+gI2/wWkzQMuVLB+dc6/uuSdIKRHStr+Nysrq2vpGdbO2tb2zu2fuH/QEyzjCXcQo44MACkxJgruSSIoHKccwDijuB5OrQu/fYy4ISzpymmIvhqOERARBqSnfrHd85QaMhmIa6089+DLP3RjKcRCo2/xOOblvNuymPStrGTglaICy2r755YYMZTFOJKJQiKFjp9JTkEuCKM5rbiZwCtEEjvBQwwTGWHhqdktuHWsmtCLG9UukNWN/TygYi8Kr7ixcikWtIP/ThpmMLjxFkjSTOEHzRVFGLcmsIhgrJBwjSacaQMSJ9mqhMeQQSR1fTYfgLJ68DHqnTcduOjdnjdZlGUcVHIE6OAEOOActcA3aoAsQeATP4BW8GU/Gi/FufMxbK0Y5cwj+lPH5A77InBw=\n\nS1\n\nAAAB9XicbVDLSgMxFL3js9ZX1aWbYBFclYkIuiy6cVnRPqCdlkyaaUMzmSHJKGWY/3DjQhG3/os7/8ZMOwttPRA4nHMv9+T4seDauO63s7K6tr6xWdoqb+/s7u1XDg5bOkoUZU0aiUh1fKKZ4JI1DTeCdWLFSOgL1vYnN7nffmRK80g+mGnMvJCMJA84JcZK/V5IzNj30/usn+JsUKm6NXcGtExwQapQoDGofPWGEU1CJg0VROsudmPjpUQZTgXLyr1Es5jQCRmxrqWShEx76Sx1hk6tMkRBpOyTBs3U3xspCbWehr6dzFPqRS8X//O6iQmuvJTLODFM0vmhIBHIRCivAA25YtSIqSWEKm6zIjomilBjiyrbEvDil5dJ67yG3Rq+u6jWr4s6SnAMJ3AGGC6hDrfQgCZQUPAM
r/DmPDkvzrvzMR9dcYqdI/gD5/MHtiaSog==\nAAAB9XicbVDLSgMxFL3js9ZX1aWbYBFclYkIuiy6cVnRPqCdlkyaaUMzmSHJKGWY/3DjQhG3/os7/8ZMOwttPRA4nHMv9+T4seDauO63s7K6tr6xWdoqb+/s7u1XDg5bOkoUZU0aiUh1fKKZ4JI1DTeCdWLFSOgL1vYnN7nffmRK80g+mGnMvJCMJA84JcZK/V5IzNj30/usn+JsUKm6NXcGtExwQapQoDGofPWGEU1CJg0VROsudmPjpUQZTgXLyr1Es5jQCRmxrqWShEx76Sx1hk6tMkRBpOyTBs3U3xspCbWehr6dzFPqRS8X//O6iQmuvJTLODFM0vmhIBHIRCivAA25YtSIqSWEKm6zIjomilBjiyrbEvDil5dJ67yG3Rq+u6jWr4s6SnAMJ3AGGC6hDrfQgCZQUPAMr/DmPDkvzrvzMR9dcYqdI/gD5/MHtiaSog==\nAAAB9XicbVDLSgMxFL3js9ZX1aWbYBFclYkIuiy6cVnRPqCdlkyaaUMzmSHJKGWY/3DjQhG3/os7/8ZMOwttPRA4nHMv9+T4seDauO63s7K6tr6xWdoqb+/s7u1XDg5bOkoUZU0aiUh1fKKZ4JI1DTeCdWLFSOgL1vYnN7nffmRK80g+mGnMvJCMJA84JcZK/V5IzNj30/usn+JsUKm6NXcGtExwQapQoDGofPWGEU1CJg0VROsudmPjpUQZTgXLyr1Es5jQCRmxrqWShEx76Sx1hk6tMkRBpOyTBs3U3xspCbWehr6dzFPqRS8X//O6iQmuvJTLODFM0vmhIBHIRCivAA25YtSIqSWEKm6zIjomilBjiyrbEvDil5dJ67yG3Rq+u6jWr4s6SnAMJ3AGGC6hDrfQgCZQUPAMr/DmPDkvzrvzMR9dcYqdI/gD5/MHtiaSog==\nAAAB9XicbVDLSgMxFL3js9ZX1aWbYBFclYkIuiy6cVnRPqCdlkyaaUMzmSHJKGWY/3DjQhG3/os7/8ZMOwttPRA4nHMv9+T4seDauO63s7K6tr6xWdoqb+/s7u1XDg5bOkoUZU0aiUh1fKKZ4JI1DTeCdWLFSOgL1vYnN7nffmRK80g+mGnMvJCMJA84JcZK/V5IzNj30/usn+JsUKm6NXcGtExwQapQoDGofPWGEU1CJg0VROsudmPjpUQZTgXLyr1Es5jQCRmxrqWShEx76Sx1hk6tMkRBpOyTBs3U3xspCbWehr6dzFPqRS8X//O6iQmuvJTLODFM0vmhIBHIRCivAA25YtSIqSWEKm6zIjomilBjiyrbEvDil5dJ67yG3Rq+u6jWr4s6SnAMJ3AGGC6hDrfQgCZQUPAMr/DmPDkvzrvzMR9dcYqdI/gD5/MHtiaSog==\n\nxt\n\nAAAB+XicdVDNS8MwHE3n15xfVY9egkPwVNJatnkbePE4wc3BVkqapltY+kGSDkfZf+LFgyJe/U+8+d+YbhNU9EHI473fj7y8IONMKoQ+jMra+sbmVnW7trO7t39gHh71ZJoLQrsk5anoB1hSzhLaVUxx2s8ExXHA6V0wuSr9uykVkqXJrZpl1IvxKGERI1hpyTfNYZDyUM5ifRX3c1/5Zh1Zl62G4zYgshBq2o5dEqfpXrjQ1kqJOlih45vvwzAleUwTRTiWcmCjTHkFFooRTue1YS5phskEj+hA0wTHVHrFIvkcnmklhFEq9EkUXKjfNwocyzKcnoyxGsvfXin+5Q1yFbW8giVZrmhClg9FOYcqhWUNMGSCEsVnmmAimM4KyRgLTJQuq6ZL+Pop/J/0HMtGln3j1ttoVUcVnIBTcA5s0ARtcA06oAsImIIH8ASejcJ4NF6M1+VoxVjtHIMfMN4+AbwSlFE=\nAAAB+XicdVDNS8MwHE3n15xfVY9egkPwVNJatnkbePE4wc3BVkqapltY+kGSDkfZf+LFgyJe/U+8+d+YbhNU9EHI473fj7y8IO
NMKoQ+jMra+sbmVnW7trO7t39gHh71ZJoLQrsk5anoB1hSzhLaVUxx2s8ExXHA6V0wuSr9uykVkqXJrZpl1IvxKGERI1hpyTfNYZDyUM5ifRX3c1/5Zh1Zl62G4zYgshBq2o5dEqfpXrjQ1kqJOlih45vvwzAleUwTRTiWcmCjTHkFFooRTue1YS5phskEj+hA0wTHVHrFIvkcnmklhFEq9EkUXKjfNwocyzKcnoyxGsvfXin+5Q1yFbW8giVZrmhClg9FOYcqhWUNMGSCEsVnmmAimM4KyRgLTJQuq6ZL+Pop/J/0HMtGln3j1ttoVUcVnIBTcA5s0ARtcA06oAsImIIH8ASejcJ4NF6M1+VoxVjtHIMfMN4+AbwSlFE=\nAAAB+XicdVDNS8MwHE3n15xfVY9egkPwVNJatnkbePE4wc3BVkqapltY+kGSDkfZf+LFgyJe/U+8+d+YbhNU9EHI473fj7y8IONMKoQ+jMra+sbmVnW7trO7t39gHh71ZJoLQrsk5anoB1hSzhLaVUxx2s8ExXHA6V0wuSr9uykVkqXJrZpl1IvxKGERI1hpyTfNYZDyUM5ifRX3c1/5Zh1Zl62G4zYgshBq2o5dEqfpXrjQ1kqJOlih45vvwzAleUwTRTiWcmCjTHkFFooRTue1YS5phskEj+hA0wTHVHrFIvkcnmklhFEq9EkUXKjfNwocyzKcnoyxGsvfXin+5Q1yFbW8giVZrmhClg9FOYcqhWUNMGSCEsVnmmAimM4KyRgLTJQuq6ZL+Pop/J/0HMtGln3j1ttoVUcVnIBTcA5s0ARtcA06oAsImIIH8ASejcJ4NF6M1+VoxVjtHIMfMN4+AbwSlFE=\nAAAB+XicdVDNS8MwHE3n15xfVY9egkPwVNJatnkbePE4wc3BVkqapltY+kGSDkfZf+LFgyJe/U+8+d+YbhNU9EHI473fj7y8IONMKoQ+jMra+sbmVnW7trO7t39gHh71ZJoLQrsk5anoB1hSzhLaVUxx2s8ExXHA6V0wuSr9uykVkqXJrZpl1IvxKGERI1hpyTfNYZDyUM5ifRX3c1/5Zh1Zl62G4zYgshBq2o5dEqfpXrjQ1kqJOlih45vvwzAleUwTRTiWcmCjTHkFFooRTue1YS5phskEj+hA0wTHVHrFIvkcnmklhFEq9EkUXKjfNwocyzKcnoyxGsvfXin+5Q1yFbW8giVZrmhClg9FOYcqhWUNMGSCEsVnmmAimM4KyRgLTJQuq6ZL+Pop/J/0HMtGln3j1ttoVUcVnIBTcA5s0ARtcA06oAsImIIH8ASejcJ4NF6M1+VoxVjtHIMfMN4+AbwSlFE=\n\nOAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\n\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVd
w6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB2XicbZDNSgMxFIXv1L86Vq1rN8EiuCozbnQpuHFZwbZCO5RM5k4bmskMyR2hDH0BF25EfC93vo3pz0JbDwQ+zknIvSculLQUBN9ebWd3b/+gfugfNfzjk9Nmo2fz0gjsilzl5jnmFpXU2CVJCp8LgzyLFfbj6f0i77+gsTLXTzQrMMr4WMtUCk7O6oyaraAdLMW2IVxDC9YaNb+GSS7KDDUJxa0dhEFBUcUNSaFw7g9LiwUXUz7GgUPNM7RRtRxzzi6dk7A0N+5oYkv394uKZ9bOstjdzDhN7Ga2MP/LBiWlt1EldVESarH6KC0Vo5wtdmaJNChIzRxwYaSblYkJN1yQa8Z3HYSbG29D77odBu3wMYA6nMMFXEEIN3AHD9CBLghI4BXevYn35n2suqp569LO4I+8zx84xIo4\nAAAB3XicbZBLSwMxFIXv1FetVatbN8EiuCozbnQpuHFnC/YB7VAy6Z02NpMZkjtCKf0Fblwo4t9y578xfSy09UDg45yE3HuiTElLvv/tFba2d3b3ivulg/Lh0XHlpNyyaW4ENkWqUtOJuEUlNTZJksJOZpAnkcJ2NL6b5+1nNFam+pEmGYYJH2oZS8HJWY2HfqXq1/yF2CYEK6jCSvV+5as3SEWeoCahuLXdwM8onHJDUiiclXq5xYyLMR9i16HmCdpwuhh0xi6cM2BxatzRxBbu7xdTnlg7SSJ3M+E0suvZ3Pwv6+YU34RTqbOcUIvlR3GuGKVsvjUbSIOC1MQBF0a6WZkYccMFuW5KroRgfeVNaF3VAr8WNHwowhmcwyUEcA23cA91aIIAhBd4g3fvyXv1PpZ1FbxVb6fwR97nD5Uyi4A=\nAAAB3XicbZBLSwMxFIXv1FetVatbN8EiuCozbnQpuHFnC/YB7VAy6Z02NpMZkjtCKf0Fblwo4t9y578xfSy09UDg45yE3HuiTElLvv/tFba2d3b3ivulg/Lh0XHlpNyyaW4ENkWqUtOJuEUlNTZJksJOZpAnkcJ2NL6b5+1nNFam+pEmGYYJH2oZS8HJWY2HfqXq1/yF2CYEK6jCSvV+5as3SEWeoCahuLXdwM8onHJDUiiclXq5xYyLMR9i16HmCdpwuhh0xi6cM2BxatzRxBbu7xdTnlg7SSJ3M+E0suvZ3Pwv6+YU34RTqbOcUIvlR3GuGKVsvjUbSIOC1MQBF0a6WZkYccMFuW5KroRgfeVNaF3VAr8WNHwowhmcwyUEcA23cA91aIIAhBd4g3fvyXv1PpZ1FbxVb6fwR97nD5Uyi4A=\nAAAB6HicbVA9SwN
BEJ2LXzF+RS1tFoNgFe5sTBm0sTMB8wHJEfY2c8mavb1jd08IR36BjYUitv4kO/+Nm+QKTXww8Hhvhpl5QSK4Nq777RQ2Nre2d4q7pb39g8Oj8vFJW8epYthisYhVN6AaBZfYMtwI7CYKaRQI7AST27nfeUKleSwfzDRBP6IjyUPOqLFS835QrrhVdwGyTrycVCBHY1D+6g9jlkYoDRNU657nJsbPqDKcCZyV+qnGhLIJHWHPUkkj1H62OHRGLqwyJGGsbElDFurviYxGWk+jwHZG1Iz1qjcX//N6qQlrfsZlkhqUbLkoTAUxMZl/TYZcITNiagllittbCRtTRZmx2ZRsCN7qy+ukfVX13KrXdCv1mzyOIpzBOVyCB9dQhztoQAsYIDzDK7w5j86L8+58LFsLTj5zCn/gfP4AptWMzw==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667
nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\nAAAB6HicbVBNS8NAEJ3Ur1q/qh69LBbBU0lE0GPRizdbsB/QhrLZTtq1m03Y3Qgl9Bd48aCIV3+SN/+N2zYHbX0w8Hhvhpl5QSK4Nq777RTW1jc2t4rbpZ3dvf2D8uFRS8epYthksYhVJ6AaBZfYNNwI7CQKaRQIbAfj25nffkKleSwfzCRBP6JDyUPOqLFS475frrhVdw6ySrycVCBHvV/+6g1ilkYoDRNU667nJsbPqDKcCZyWeqnGhLIxHWLXUkkj1H42P3RKzqwyIGGsbElD5urviYxGWk+iwHZG1Iz0sjcT//O6qQmv/YzLJDUo2WJRmApiYjL7mgy4QmbExBLKFLe3EjaiijJjsynZELzll1dJ66LquVWvcVmp3eRxFOEETuEcPLiCGtxBHZrAAOEZXuHNeXRenHfnY9FacPKZY/gD5/MHqBWM0w==\n\nxt+1\n\nAAAB/XicdVBLSwMxGMz6rPW1Pm5egkUQhCXRLra3ghePFewD2lKy2Wwbmn2QZMW6FP+KFw+KePV/ePPfmG0rqOhAyDDzfWQyXiK40gh9WAuLS8srq4W14vrG5ta2vbPbVHEqKWvQWMSy7RHFBI9YQ3MtWDuRjISeYC1vdJH7rRsmFY+jaz1OWC8kg4gHnBJtpL693/Vi4atxaK7sdtLP9Ame9O0ScpBbdTGCyHERrp7lpFqtlF0XYgdNUQJz1Pv2e9ePaRqySFNBlOpglOheRqTmVLBJsZsqlhA6IgPWMTQiIVO9bJp+Ao+M4sMgluZEGk7V7xsZCVUe0EyGRA/Vby8X//I6qQ4qvYxHSapZRGcPBamAOoZ5FdDnklEtxoYQKrnJCumQSEK1KaxoSvj6KfyfNE8djBx8VS7V0LyOAjgAh+AYYHAOauAS1EEDUHAHHsATeLburUfrxXqdjS5Y85098APW2yeBtJXZ\nAAAB/XicdVBLSwMxGMz6rPW1Pm5egkUQhCXRLra3ghePFewD2lKy2Wwbmn2QZMW6FP+KFw+KePV/ePPfmG0rqOhAyDDzfWQyXiK40gh9WAuLS8srq4W14vrG5ta2vbPbVHEqKWvQWMSy7RHFBI9YQ3MtWDuRjISeYC1vdJH7rRsmFY+jaz1OWC8kg4gHnBJtpL693/Vi4atxaK7sdtLP9Ame9O0ScpBbdTGCyHERrp7lpFqtlF0XYgdNUQJz1Pv2e9ePaRqySFNBlOpglOheRqTmVLBJsZsqlhA6IgPWMTQiIVO9bJp+Ao+M4sMgluZEGk7V7xsZCVUe0EyGRA/Vby8X//I6qQ4qvYxHSapZRGcPBamAOoZ5FdDnklEtxoYQKrnJCumQSEK1K
axoSvj6KfyfNE8djBx8VS7V0LyOAjgAh+AYYHAOauAS1EEDUHAHHsATeLburUfrxXqdjS5Y85098APW2yeBtJXZ\nAAAB/XicdVBLSwMxGMz6rPW1Pm5egkUQhCXRLra3ghePFewD2lKy2Wwbmn2QZMW6FP+KFw+KePV/ePPfmG0rqOhAyDDzfWQyXiK40gh9WAuLS8srq4W14vrG5ta2vbPbVHEqKWvQWMSy7RHFBI9YQ3MtWDuRjISeYC1vdJH7rRsmFY+jaz1OWC8kg4gHnBJtpL693/Vi4atxaK7sdtLP9Ame9O0ScpBbdTGCyHERrp7lpFqtlF0XYgdNUQJz1Pv2e9ePaRqySFNBlOpglOheRqTmVLBJsZsqlhA6IgPWMTQiIVO9bJp+Ao+M4sMgluZEGk7V7xsZCVUe0EyGRA/Vby8X//I6qQ4qvYxHSapZRGcPBamAOoZ5FdDnklEtxoYQKrnJCumQSEK1KaxoSvj6KfyfNE8djBx8VS7V0LyOAjgAh+AYYHAOauAS1EEDUHAHHsATeLburUfrxXqdjS5Y85098APW2yeBtJXZ\nAAAB/XicdVBLSwMxGMz6rPW1Pm5egkUQhCXRLra3ghePFewD2lKy2Wwbmn2QZMW6FP+KFw+KePV/ePPfmG0rqOhAyDDzfWQyXiK40gh9WAuLS8srq4W14vrG5ta2vbPbVHEqKWvQWMSy7RHFBI9YQ3MtWDuRjISeYC1vdJH7rRsmFY+jaz1OWC8kg4gHnBJtpL693/Vi4atxaK7sdtLP9Ame9O0ScpBbdTGCyHERrp7lpFqtlF0XYgdNUQJz1Pv2e9ePaRqySFNBlOpglOheRqTmVLBJsZsqlhA6IgPWMTQiIVO9bJp+Ao+M4sMgluZEGk7V7xsZCVUe0EyGRA/Vby8X//I6qQ4qvYxHSapZRGcPBamAOoZ5FdDnklEtxoYQKrnJCumQSEK1KaxoSvj6KfyfNE8djBx8VS7V0LyOAjgAh+AYYHAOauAS1EEDUHAHHsATeLburUfrxXqdjS5Y85098APW2yeBtJXZ\n\nzAAAB9XicdVBLSwMxGMzWV62vqkcvwSJ4WpLS1R4LXjxWsA9oa8lms21odrMkWaUu/R9ePCji1f/izX9jtq2gogMhw8z3kcn4ieDaIPThFFZW19Y3ipulre2d3b3y/kFby1RR1qJSSNX1iWaCx6xluBGsmyhGIl+wjj+5yP3OLVOay/jaTBM2iMgo5iGnxFjppu9LEehpZK/sfjYsV5CLzrwarkPkegjXsWdJ1cMIVSF20RwVsERzWH7vB5KmEYsNFUTrHkaJGWREGU4Fm5X6qWYJoRMyYj1LYxIxPcjmqWfwxCoBDKWyJzZwrn7fyEik82h2MiJmrH97ufiX10tNWB9kPE5Sw2K6eChMBTQS5hXAgCtGjZhaQqjiNiukY6IINbaoki3h66fwf9Kuuhi5+KpWaaBlHUVwBI7BKcDgHDTAJWiCFqBAgQfwBJ6dO+fReXFeF6MFZ7lzCH7AefsEn4iTLw==\n\nAAAB9XicdVBLSwMxGMzWV62vqkcvwSJ4WpLS1R4LXjxWsA9oa8lms21odrMkWaUu/R9ePCji1f/izX9jtq2gogMhw8z3kcn4ieDaIPThFFZW19Y3ipulre2d3b3y/kFby1RR1qJSSNX1iWaCx6xluBGsmyhGIl+wjj+5yP3OLVOay/jaTBM2iMgo5iGnxFjppu9LEehpZK/sfjYsV5CLzrwarkPkegjXsWdJ1cMIVSF20RwVsERzWH7vB5KmEYsNFUTrHkaJGWREGU4Fm5X6qWYJoRMyYj1LYxIxPcjmqWfwxCoBDKWyJzZwrn7fyEik82h2MiJmrH97ufiX10tNWB9kPE5Sw2K6eChMBTQS5hXAgCtGjZhaQqjiNiukY6IINbaoki3h66fwf9Kuuhi5+KpWaaBlHUVwBI7BKcDgHDTAJWiCFqBAgQfwBJ6dO+fReXFeF6MF
Z7lzCH7AefsEn4iTLw==\nAAAB9XicdVBLSwMxGMzWV62vqkcvwSJ4WpLS1R4LXjxWsA9oa8lms21odrMkWaUu/R9ePCji1f/izX9jtq2gogMhw8z3kcn4ieDaIPThFFZW19Y3ipulre2d3b3y/kFby1RR1qJSSNX1iWaCx6xluBGsmyhGIl+wjj+5yP3OLVOay/jaTBM2iMgo5iGnxFjppu9LEehpZK/sfjYsV5CLzrwarkPkegjXsWdJ1cMIVSF20RwVsERzWH7vB5KmEYsNFUTrHkaJGWREGU4Fm5X6qWYJoRMyYj1LYxIxPcjmqWfwxCoBDKWyJzZwrn7fyEik82h2MiJmrH97ufiX10tNWB9kPE5Sw2K6eChMBTQS5hXAgCtGjZhaQqjiNiukY6IINbaoki3h66fwf9Kuuhi5+KpWaaBlHUVwBI7BKcDgHDTAJWiCFqBAgQfwBJ6dO+fReXFeF6MFZ7lzCH7AefsEn4iTLw==\nAAAB9XicdVBLSwMxGMzWV62vqkcvwSJ4WpLS1R4LXjxWsA9oa8lms21odrMkWaUu/R9ePCji1f/izX9jtq2gogMhw8z3kcn4ieDaIPThFFZW19Y3ipulre2d3b3y/kFby1RR1qJSSNX1iWaCx6xluBGsmyhGIl+wjj+5yP3OLVOay/jaTBM2iMgo5iGnxFjppu9LEehpZK/sfjYsV5CLzrwarkPkegjXsWdJ1cMIVSF20RwVsERzWH7vB5KmEYsNFUTrHkaJGWREGU4Fm5X6qWYJoRMyYj1LYxIxPcjmqWfwxCoBDKWyJzZwrn7fyEik82h2MiJmrH97ufiX10tNWB9kPE5Sw2K6eChMBTQS5hXAgCtGjZhaQqjiNiukY6IINbaoki3h66fwf9Kuuhi5+KpWaaBlHUVwBI7BKcDgHDTAJWiCFqBAgQfwBJ6dO+fReXFeF6MFZ7lzCH7AefsEn4iTLw==\n\ndcos \u00b7 z\n\nAAACBHicdVDLSgMxFM34rPU16rKbYBFcDZnawXZXcOOygn1Ap5RMJm1DM5MhyQh16MKNv+LGhSJu/Qh3/o2ZtoKKHgg5nHMv994TJJwpjdCHtbK6tr6xWdgqbu/s7u3bB4dtJVJJaIsILmQ3wIpyFtOWZprTbiIpjgJOO8HkIvc7N1QqJuJrPU1oP8KjmA0ZwdpIA7sUDjKfCDXzSSi0Hwgeqmlkvux2NrDLyEFe3XMRRI6H3PpZTur1WtXzoOugOcpgiebAfvdDQdKIxppwrFTPRYnuZ1hqRjidFf1U0QSTCR7RnqExjqjqZ/MjZvDEKCEcCmlerOFc/d6R4Ujlq5nKCOux+u3l4l9eL9XDWj9jcZJqGpPFoGHKoRYwTwSGTFKi+dQQTCQzu0IyxhITbXIrmhC+LoX/k3bFcZHjXlXLjcoyjgIogWNwClxwDhrgEjRBCxBwBx7AE3i27q1H68V6XZSuWMueI/AD1tsnsOqZXg==\nAAACBHicdVDLSgMxFM34rPU16rKbYBFcDZnawXZXcOOygn1Ap5RMJm1DM5MhyQh16MKNv+LGhSJu/Qh3/o2ZtoKKHgg5nHMv994TJJwpjdCHtbK6tr6xWdgqbu/s7u3bB4dtJVJJaIsILmQ3wIpyFtOWZprTbiIpjgJOO8HkIvc7N1QqJuJrPU1oP8KjmA0ZwdpIA7sUDjKfCDXzSSi0Hwgeqmlkvux2NrDLyEFe3XMRRI6H3PpZTur1WtXzoOugOcpgiebAfvdDQdKIxppwrFTPRYnuZ1hqRjidFf1U0QSTCR7RnqExjqjqZ/MjZvDEKCEcCmlerOFc/d6R4Ujlq5nKCOux+u3l4l9eL9XDWj9jcZJqGpPFoGHKoRYwTwSGTFKi+dQQTCQzu0IyxhITbXIrmhC+LoX/k3bFcZHjXlXLjcoyjgIogWNwClxwDhrgEjRBCxBwBx7AE3i27q1H68V6XZSuWMueI/AD1tsnsOqZXg==\nAAACBHi
cdVDLSgMxFM34rPU16rKbYBFcDZnawXZXcOOygn1Ap5RMJm1DM5MhyQh16MKNv+LGhSJu/Qh3/o2ZtoKKHgg5nHMv994TJJwpjdCHtbK6tr6xWdgqbu/s7u3bB4dtJVJJaIsILmQ3wIpyFtOWZprTbiIpjgJOO8HkIvc7N1QqJuJrPU1oP8KjmA0ZwdpIA7sUDjKfCDXzSSi0Hwgeqmlkvux2NrDLyEFe3XMRRI6H3PpZTur1WtXzoOugOcpgiebAfvdDQdKIxppwrFTPRYnuZ1hqRjidFf1U0QSTCR7RnqExjqjqZ/MjZvDEKCEcCmlerOFc/d6R4Ujlq5nKCOux+u3l4l9eL9XDWj9jcZJqGpPFoGHKoRYwTwSGTFKi+dQQTCQzu0IyxhITbXIrmhC+LoX/k3bFcZHjXlXLjcoyjgIogWNwClxwDhrgEjRBCxBwBx7AE3i27q1H68V6XZSuWMueI/AD1tsnsOqZXg==\nAAACBHicdVDLSgMxFM34rPU16rKbYBFcDZnawXZXcOOygn1Ap5RMJm1DM5MhyQh16MKNv+LGhSJu/Qh3/o2ZtoKKHgg5nHMv994TJJwpjdCHtbK6tr6xWdgqbu/s7u3bB4dtJVJJaIsILmQ3wIpyFtOWZprTbiIpjgJOO8HkIvc7N1QqJuJrPU1oP8KjmA0ZwdpIA7sUDjKfCDXzSSi0Hwgeqmlkvux2NrDLyEFe3XMRRI6H3PpZTur1WtXzoOugOcpgiebAfvdDQdKIxppwrFTPRYnuZ1hqRjidFf1U0QSTCR7RnqExjqjqZ/MjZvDEKCEcCmlerOFc/d6R4Ujlq5nKCOux+u3l4l9eL9XDWj9jcZJqGpPFoGHKoRYwTwSGTFKi+dQQTCQzu0IyxhITbXIrmhC+LoX/k3bFcZHjXlXLjcoyjgIogWNwClxwDhrgEjRBCxBwBx7AE3i27q1H68V6XZSuWMueI/AD1tsnsOqZXg==\n\nrf2(xt) = [1,1]>\n\nAAACFXicdVBBSxwxFM5oq+va6qpHL6GLYEGHybi4ehCWeulxC10VdsYhk81oMJMMyRtxGeZPeOlf6cVDS/Fa6K3/ppl1C22pH4R8fN97vPe+tJDCQhD89BYWX7xcWm6ttFdfvV5b72xsnlldGsZHTEttLlJquRSKj0CA5BeF4TRPJT9Pb04b//yWGyu0+gjTgsc5vVIiE4yCk5LO3n6kaCopzpIQ70aplhM7zd1X3dUJvD0Zk719El9WEeiiTjrdwD8+Ogx7hzjwg6BPQtKQsN876GHilAZdNMcw6fyIJpqVOVfAJLV2TIIC4ooaEEzyuh2VlheU3dArPnZU0ZzbuJpdVeMdp0xwpo17CvBM/bOjorltdnWVOYVr+6/XiP/zxiVkR3ElVFECV+xpUFZKDBo3EeGJMJyBnDpCmRFuV8yuqaEMXJBtF8LvS/Hz5Cz0SeCTD73u4N08jhbaRm/QLiKojwboPRqiEWLoHn1GX9BX75P34H3zHp9KF7x5zxb6C973XzNQnj8=\nAAACFXicdVBBSxwxFM5oq+va6qpHL6GLYEGHybi4ehCWeulxC10VdsYhk81oMJMMyRtxGeZPeOlf6cVDS/Fa6K3/ppl1C22pH4R8fN97vPe+tJDCQhD89BYWX7xcWm6ttFdfvV5b72xsnlldGsZHTEttLlJquRSKj0CA5BeF4TRPJT9Pb04b//yWGyu0+gjTgsc5vVIiE4yCk5LO3n6kaCopzpIQ70aplhM7zd1X3dUJvD0Zk719El9WEeiiTjrdwD8+Ogx7hzjwg6BPQtKQsN876GHilAZdNMcw6fyIJpqVOVfAJLV2TIIC4ooaEEzyuh2VlheU3dArPnZU0ZzbuJpdVeMdp0xwpo17CvBM/bOjorltdnWVOYVr+6/XiP/zxiVkR3ElVFECV+xpUFZKDBo3EeGJMJyBnDpCmRFuV8yuqaEMXJBtF8LvS/Hz5Cz0SeCTD73u4N08jhb
aRm/QLiKojwboPRqiEWLoHn1GX9BX75P34H3zHp9KF7x5zxb6C973XzNQnj8=\nAAACFXicdVBBSxwxFM5oq+va6qpHL6GLYEGHybi4ehCWeulxC10VdsYhk81oMJMMyRtxGeZPeOlf6cVDS/Fa6K3/ppl1C22pH4R8fN97vPe+tJDCQhD89BYWX7xcWm6ttFdfvV5b72xsnlldGsZHTEttLlJquRSKj0CA5BeF4TRPJT9Pb04b//yWGyu0+gjTgsc5vVIiE4yCk5LO3n6kaCopzpIQ70aplhM7zd1X3dUJvD0Zk719El9WEeiiTjrdwD8+Ogx7hzjwg6BPQtKQsN876GHilAZdNMcw6fyIJpqVOVfAJLV2TIIC4ooaEEzyuh2VlheU3dArPnZU0ZzbuJpdVeMdp0xwpo17CvBM/bOjorltdnWVOYVr+6/XiP/zxiVkR3ElVFECV+xpUFZKDBo3EeGJMJyBnDpCmRFuV8yuqaEMXJBtF8LvS/Hz5Cz0SeCTD73u4N08jhbaRm/QLiKojwboPRqiEWLoHn1GX9BX75P34H3zHp9KF7x5zxb6C973XzNQnj8=\nAAACFXicdVBBSxwxFM5oq+va6qpHL6GLYEGHybi4ehCWeulxC10VdsYhk81oMJMMyRtxGeZPeOlf6cVDS/Fa6K3/ppl1C22pH4R8fN97vPe+tJDCQhD89BYWX7xcWm6ttFdfvV5b72xsnlldGsZHTEttLlJquRSKj0CA5BeF4TRPJT9Pb04b//yWGyu0+gjTgsc5vVIiE4yCk5LO3n6kaCopzpIQ70aplhM7zd1X3dUJvD0Zk719El9WEeiiTjrdwD8+Ogx7hzjwg6BPQtKQsN876GHilAZdNMcw6fyIJpqVOVfAJLV2TIIC4ooaEEzyuh2VlheU3dArPnZU0ZzbuJpdVeMdp0xwpo17CvBM/bOjorltdnWVOYVr+6/XiP/zxiVkR3ElVFECV+xpUFZKDBo3EeGJMJyBnDpCmRFuV8yuqaEMXJBtF8LvS/Hz5Cz0SeCTD73u4N08jhbaRm/QLiKojwboPRqiEWLoHn1GX9BX75P34H3zHp9KF7x5zxb6C973XzNQnj8=\n\nz = gradf 
(xt)\n\nAAACGHicbVC7SgNBFJ2NrxhfUUubwaDEJu6KoI0QtLGMYB6QDWF2djYZMvtg5q4kLvsZNv6KjYUitun8G2eTFDHxwDCHc+7l3nucSHAFpvlj5FZW19Y38puFre2d3b3i/kFDhbGkrE5DEcqWQxQTPGB14CBYK5KM+I5gTWdwl/nNJyYVD4NHGEWs45NewD1OCWipWzy3nVC4auTrL3lOT29sYENIepK4KfbK8+4w7cJZt1gyK+YEeJlYM1JCM9S6xbHthjT2WQBUEKXalhlBJyESOBUsLdixYhGhA9JjbU0D4jPVSSaHpfhEKy72QqlfAHiiznckxFfZdrrSJ9BXi14m/ue1Y/CuOwkPohhYQKeDvFhgCHGWEna5ZBTESBNCJde7YtonklDQWRZ0CNbiycukcVGxzIr1cFmq3s7iyKMjdIzKyEJXqIruUQ3VEUUv6A19oE/j1Xg3vozvaWnOmPUcoj8wxr+HU6FN\nAAACGHicbVC7SgNBFJ2NrxhfUUubwaDEJu6KoI0QtLGMYB6QDWF2djYZMvtg5q4kLvsZNv6KjYUitun8G2eTFDHxwDCHc+7l3nucSHAFpvlj5FZW19Y38puFre2d3b3i/kFDhbGkrE5DEcqWQxQTPGB14CBYK5KM+I5gTWdwl/nNJyYVD4NHGEWs45NewD1OCWipWzy3nVC4auTrL3lOT29sYENIepK4KfbK8+4w7cJZt1gyK+YEeJlYM1JCM9S6xbHthjT2WQBUEKXalhlBJyESOBUsLdixYhGhA9JjbU0D4jPVSSaHpfhEKy72QqlfAHiiznckxFfZdrrSJ9BXi14m/ue1Y/CuOwkPohhYQKeDvFhgCHGWEna5ZBTESBNCJde7YtonklDQWRZ0CNbiycukcVGxzIr1cFmq3s7iyKMjdIzKyEJXqIruUQ3VEUUv6A19oE/j1Xg3vozvaWnOmPUcoj8wxr+HU6FN\nAAACGHicbVC7SgNBFJ2NrxhfUUubwaDEJu6KoI0QtLGMYB6QDWF2djYZMvtg5q4kLvsZNv6KjYUitun8G2eTFDHxwDCHc+7l3nucSHAFpvlj5FZW19Y38puFre2d3b3i/kFDhbGkrE5DEcqWQxQTPGB14CBYK5KM+I5gTWdwl/nNJyYVD4NHGEWs45NewD1OCWipWzy3nVC4auTrL3lOT29sYENIepK4KfbK8+4w7cJZt1gyK+YEeJlYM1JCM9S6xbHthjT2WQBUEKXalhlBJyESOBUsLdixYhGhA9JjbU0D4jPVSSaHpfhEKy72QqlfAHiiznckxFfZdrrSJ9BXi14m/ue1Y/CuOwkPohhYQKeDvFhgCHGWEna5ZBTESBNCJde7YtonklDQWRZ0CNbiycukcVGxzIr1cFmq3s7iyKMjdIzKyEJXqIruUQ3VEUUv6A19oE/j1Xg3vozvaWnOmPUcoj8wxr+HU6FN\nAAACGHicbVC7SgNBFJ2NrxhfUUubwaDEJu6KoI0QtLGMYB6QDWF2djYZMvtg5q4kLvsZNv6KjYUitun8G2eTFDHxwDCHc+7l3nucSHAFpvlj5FZW19Y38puFre2d3b3i/kFDhbGkrE5DEcqWQxQTPGB14CBYK5KM+I5gTWdwl/nNJyYVD4NHGEWs45NewD1OCWipWzy3nVC4auTrL3lOT29sYENIepK4KfbK8+4w7cJZt1gyK+YEeJlYM1JCM9S6xbHthjT2WQBUEKXalhlBJyESOBUsLdixYhGhA9JjbU0D4jPVSSaHpfhEKy72QqlfAHiiznckxFfZdrrSJ9BXi14m/ue1Y/CuOwkPohhYQKeDvFhgCHGWEna5ZBTESBNCJde7YtonklDQWRZ0CNbiycukcVGxzIr1cFmq3s7iyKMjdIzKyEJXqIruUQ3VEUUv6A19oE/j1Xg3vozvaWnOmPUcoj8wxr+HU6FN\n\ndcos = 1 cos (xt,rf (xt)) = 1 
+\n\nAAACkHicfVHBTuMwEHUC7EIXli575GJRrQRaqBJYaTlQLYgL4gQSBaS6GzmO01o4dmRPEFXI9/A/3PgbnFIkaBEjWX56b2Y8fhPnUlgIgifPn5tf+PJ1canxbXnl+2rzx9ql1YVhvMu01OY6ppZLoXgXBEh+nRtOs1jyq/jmuNavbrmxQqsLGOW8n9GBEqlgFBwVNR+SqCRM2wp3wp0aEMlT2CSxlokdZe4q76oItneIorGkOJ2RtogRgyFsNTrhb5IaysrpjP8EdI5fG+DZDlVJ7j+RyX3ViJqtoB2MA8+CcAJaaBJnUfORJJoVGVfAJLW2FwY59EtqQDDJqwYpLM8pu6ED3nNQ0Yzbfjk2tMK/HJPgVBt3FOAx+7aipJmtJ3SZGYWhndZq8iOtV0C63y+Fygvgir08lBYSg8b1dnAiDGcgRw5QZoSbFbMhdaaC22FtQjj95VlwudsOg3Z4/qd1GEzsWETraANtohD9RYfoBJ2hLmLeirfnHXgdf83f9//5Ry+pvjep+YnehX/6DEUXyX4=\nAAACkHicfVHBTuMwEHUC7EIXli575GJRrQRaqBJYaTlQLYgL4gQSBaS6GzmO01o4dmRPEFXI9/A/3PgbnFIkaBEjWX56b2Y8fhPnUlgIgifPn5tf+PJ1canxbXnl+2rzx9ql1YVhvMu01OY6ppZLoXgXBEh+nRtOs1jyq/jmuNavbrmxQqsLGOW8n9GBEqlgFBwVNR+SqCRM2wp3wp0aEMlT2CSxlokdZe4q76oItneIorGkOJ2RtogRgyFsNTrhb5IaysrpjP8EdI5fG+DZDlVJ7j+RyX3ViJqtoB2MA8+CcAJaaBJnUfORJJoVGVfAJLW2FwY59EtqQDDJqwYpLM8pu6ED3nNQ0Yzbfjk2tMK/HJPgVBt3FOAx+7aipJmtJ3SZGYWhndZq8iOtV0C63y+Fygvgir08lBYSg8b1dnAiDGcgRw5QZoSbFbMhdaaC22FtQjj95VlwudsOg3Z4/qd1GEzsWETraANtohD9RYfoBJ2hLmLeirfnHXgdf83f9//5Ry+pvjep+YnehX/6DEUXyX4=\nAAACkHicfVHBTuMwEHUC7EIXli575GJRrQRaqBJYaTlQLYgL4gQSBaS6GzmO01o4dmRPEFXI9/A/3PgbnFIkaBEjWX56b2Y8fhPnUlgIgifPn5tf+PJ1canxbXnl+2rzx9ql1YVhvMu01OY6ppZLoXgXBEh+nRtOs1jyq/jmuNavbrmxQqsLGOW8n9GBEqlgFBwVNR+SqCRM2wp3wp0aEMlT2CSxlokdZe4q76oItneIorGkOJ2RtogRgyFsNTrhb5IaysrpjP8EdI5fG+DZDlVJ7j+RyX3ViJqtoB2MA8+CcAJaaBJnUfORJJoVGVfAJLW2FwY59EtqQDDJqwYpLM8pu6ED3nNQ0Yzbfjk2tMK/HJPgVBt3FOAx+7aipJmtJ3SZGYWhndZq8iOtV0C63y+Fygvgir08lBYSg8b1dnAiDGcgRw5QZoSbFbMhdaaC22FtQjj95VlwudsOg3Z4/qd1GEzsWETraANtohD9RYfoBJ2hLmLeirfnHXgdf83f9//5Ry+pvjep+YnehX/6DEUXyX4=\nAAACkHicfVHBTuMwEHUC7EIXli575GJRrQRaqBJYaTlQLYgL4gQSBaS6GzmO01o4dmRPEFXI9/A/3PgbnFIkaBEjWX56b2Y8fhPnUlgIgifPn5tf+PJ1canxbXnl+2rzx9ql1YVhvMu01OY6ppZLoXgXBEh+nRtOs1jyq/jmuNavbrmxQqsLGOW8n9GBEqlgFBwVNR+SqCRM2wp3wp0aEMlT2CSxlokdZe4q76oItneIorGkOJ2RtogRgyFsNTrhb5IaysrpjP8EdI5fG+DZDlVJ7j+RyX3ViJqtoB2MA8+CcAJaaBJnUfORJJoVGVfAJLW2FwY59EtqQDDJqwYpLM8pu6ED3nNQ0Yzbfjk2tMK/HJPgV
Bt3FOAx+7aipJmtJ3SZGYWhndZq8iOtV0C63y+Fygvgir08lBYSg8b1dnAiDGcgRw5QZoSbFbMhdaaC22FtQjj95VlwudsOg3Z4/qd1GEzsWETraANtohD9RYfoBJ2hLmLeirfnHXgdf83f9//5Ry+pvjep+YnehX/6DEUXyX4=\n\nx>t rf (xt)\nkrf (xt)k\n\nFigure 1: Example of modi\ufb01ed Riemannian gradient descent on S1. Without modi\ufb01cation, two\nEuclidean descent directions rf1(xt) and rf2(xt) give the same Riemannian gradient z. We\npropose to multiply z with the cosine distance between xt and rf (xt) as the modi\ufb01ed Riemannian\ngradient so that angular distances are taken into account during the parameter update.\n\n4.4 Training Details\nWe describe two sets of design that lead to more ef\ufb01cient and effective training of the above optimiza-\ntion procedure.\nFirst, the exponential mapping requires computation of non-linear functions, speci\ufb01cally, sin(\u00b7)\nand cos(\u00b7) in Equation (4), which is inef\ufb01cient especially when the corpus is large. To tackle this\nissue, we can use a \ufb01rst-order approximation of the exponential mapping, called a retraction, i.e.,\nRx (z) : TxSp1 ! Sp1 such that d(Rx (z) , expx(z)) = O(kzk2). For the unit sphere, we can\nsimply de\ufb01ne the retraction Rx (z) to be an addition in the ambient Euclidean space followed by a\nprojection onto the sphere [1, 6], i.e.,\n\nRx (z) :=\n\n.\n\n(6)\n\nx + z\nkx + zk\n\nSecond, the Riemannian gradient given by Equation (5) provides the correct direction to update the\nparameters on the sphere, but its norm is not optimal when our goal is to train embeddings that capture\ndirectional similarity. This issue can be illustrated by the following toy example shown in Figure\n1: Consider a point xt with Euclidean coordinate (0, 1) on a 2-d unit sphere S1 and two Euclidean\ngradient descent directions rf1(xt) = [1, 1]> and rf2(xt) = [1,1]>. Also, for simplicity,\nassume \u2318t = 1. 
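A quick numerical check of this toy setup, as a minimal Python sketch. It assumes only the tangent projection grad f(x) = (I − xx^⊤)∇f(x) from Equation (5) and the cosine-distance multiplier defined below; the helper names are illustrative:

```python
import math

def project_tangent(x, v):
    """Project v onto the tangent plane of the unit sphere at x: (I - x x^T) v."""
    dot = sum(xi * vi for xi, vi in zip(x, v))
    return [vi - dot * xi for xi, vi in zip(x, v)]

def cosine_distance(x, v):
    """d_cos(x, v) = 1 - cos(x, v); x is assumed to be a unit vector."""
    dot = sum(xi * vi for xi, vi in zip(x, v))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    return 1.0 - dot / norm_v

x_t = [0.0, 1.0]          # a point on the unit circle S^1
descent_1 = [1.0, 1.0]    # -grad f1(x_t)
descent_2 = [1.0, -1.0]   # -grad f2(x_t)

# Both descent directions project to the same tangent vector [1, 0] ...
print(project_tangent(x_t, descent_1))   # [1.0, 0.0]
print(project_tangent(x_t, descent_2))   # [1.0, 0.0]

# ... but their cosine distances from x_t differ, so the modified update
# takes a larger step for f2 than for f1.
print(round(cosine_distance(x_t, descent_1), 3))   # 0.293  (= 1 - 1/sqrt(2))
print(round(cosine_distance(x_t, descent_2), 3))   # 1.707  (= 1 + 1/sqrt(2))
```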
In this case, the Riemannian gradient projected from −∇f_2(x_t) is the same as that of −∇f_1(x_t) and is equal to [1, 0]^⊤, i.e., grad f_1(x_t) = grad f_2(x_t) = [1, 0]^⊤. However, when our goal is to capture directional information and distance is measured by the angles between vectors, −∇f_2(x_t) suggests taking a bigger step than −∇f_1(x_t) at point x_t. To explicitly incorporate angular distance into the optimization procedure, we use the cosine distance between the current point x_t ∈ S^{p−1} and the Euclidean gradient descent direction −∇f(x_t), i.e., (1 + x_t^⊤∇f(x_t)/‖∇f(x_t)‖), as a multiplier to the Riemannian gradient computed according to Equation (5). The rationale of this design is to encourage parameters with greater cosine distance from their target direction to take a larger update step. We find that when updating negative samples, it is empirically better to use negative cosine similarity instead of cosine distance as the multiplier to the Riemannian gradient. This is probably because negative samples are randomly sampled from the vocabulary and most of them are semantically irrelevant to the center word. Therefore, their ideal embeddings should be orthogonal to the center word's embedding, whereas using cosine distance would encourage them to point in the opposite direction of the center word's embedding.
In summary, with the above designs, we optimize the parameters by the following update rule:

    x_{t+1} = R_{x_t}( −η_t (1 + x_t^⊤∇f(x_t) / ‖∇f(x_t)‖) (I − x_t x_t^⊤) ∇f(x_t) ).    (7)

Finally, we provide the convergence guarantee of the above update rule when applied to optimize our objectives.
Theorem 2. 
When the update rule given by Equation (7) is applied to L(x), and the learning rate satisfies the usual condition in stochastic approximation, i.e., Σ_t η_t² < ∞ and Σ_t η_t = ∞, then x converges almost surely to a critical point x* and grad L(x) converges almost surely to 0, i.e.,

    Pr( lim_{t→∞} L(x_t) = L(x*) ) = 1,    Pr( lim_{t→∞} grad L(x_t) = 0 ) = 1.

The proof of Theorem 2 can be found in Appendix B.

5 Evaluation

In this section, we empirically evaluate the quality of spherical text embeddings on three common text embedding application tasks, i.e., word similarity, document clustering and document classification. Our model is named JoSE, for Joint Spherical Embedding. For all three tasks, our spherical embeddings and all baselines are trained under the following setting: the models are trained for 10 iterations on the corpus; the local context window size is 10; the embedding dimension is 100. In our JoSE model, we set the margin in Equation (3) to 0.15, the number of negative samples to 2, and the initial learning rate to 0.04 with linear decay. Other hyperparameters are set to the default values of the corresponding algorithms.
Also, since text embeddings serve as the building block for many downstream tasks, it is essential that embedding training is efficient and can scale to very large datasets. At the end of this section, we report the training time of different word embedding models when trained on the latest Wikipedia dump.

5.1 Word Similarity

We conduct word similarity evaluation on the following benchmark datasets: WordSim353 [13], MEN [7] and SimLex999 [17]. The training corpus for word similarity is the latest Wikipedia dump³ containing 2.4 billion tokens. Words appearing fewer than 100 times are discarded, leaving 239,672 unique tokens. 
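As a concrete illustration of this evaluation protocol, the sketch below scores a few word pairs by the cosine similarity of their embeddings and compares the resulting ranking against human similarity ratings via Spearman's rank correlation. All embeddings, words, and ratings here are made-up toy values, not from the benchmarks:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def spearman(xs, ys):
    """Spearman's rank correlation: Pearson correlation of the ranks.
    (No tie handling; assumes all values are distinct.)"""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical 2-d word embeddings and human similarity ratings:
emb = {"king": [0.9, 0.1], "queen": [0.85, 0.2], "apple": [0.1, 0.95]}
pairs = [("king", "queen"), ("king", "apple"), ("queen", "apple")]
human = [9.0, 1.5, 2.0]
model = [cosine(emb[a], emb[b]) for a, b in pairs]

print(round(spearman(model, human), 3))  # → 1.0 (rankings agree perfectly)
```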
The Spearman's rank correlation is reported in Table 1, which reflects the consistency between the word similarity rankings given by cosine similarity of word embeddings and those given by human raters. We compare our model with the following baselines: Word2Vec [30], GloVe [33], fastText [5] and BERT [10], which are trained in Euclidean space, and Poincaré GloVe [39], which is trained in Poincaré space. The results demonstrate that training embeddings in the spherical space is essential for superior performance on word similarity. We attempt to explain why the recent popular language model, BERT [10], falls behind other baselines on this task: (1) BERT learns contextualized representations, but word similarity evaluation is conducted in a context-free manner; averaging contextualized representations to derive context-free representations may not be the intended usage of BERT. (2) BERT is optimized on specific downstream tasks like predicting masked words and sentence relationships, which have no direct relation to word similarity.

Table 1: Spearman rank correlation on word similarity evaluation.

Embedding Space   Model            WordSim353   MEN     SimLex999
Euclidean         Word2Vec         0.711        0.726   0.311
                  GloVe            0.598        0.710   0.321
                  fastText         0.697        0.722   0.303
                  BERT             0.477        0.594   0.287
Poincaré          Poincaré GloVe   0.623        0.652   0.321
Spherical         JoSE             0.739        0.748   0.339

³ https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

5.2 Document Clustering

We perform document clustering to evaluate the quality of the spherical paragraph embeddings trained by our model. The training corpus is the 20 Newsgroups dataset⁴, and we treat each document as a paragraph in all compared models. The dataset contains around 18,000 newsgroup documents (both training and testing documents are used) partitioned into 20 classes. 
For clustering methods, we use both K-Means and spherical K-Means (SK-Means) [3], which performs clustering in the spherical space. We compare with the following paragraph embedding baselines: averaged word embedding using Word2Vec [30], SIF [2], BERT [10] and Doc2Vec [22]. We use four widely used external measures [3, 25, 36] as metrics: Mutual Information (MI), Normalized Mutual Information (NMI), Adjusted Rand Index (ARI), and Purity. The results are reported in Table 2, with mean and standard deviation computed over 10 runs. It is shown that feature quality is generally more important than the clustering algorithm for document clustering tasks: using spherical K-Means only gives a marginal performance boost over K-Means, while JoSE remains optimal regardless of the clustering algorithm. This demonstrates that directional similarity on document/paragraph-level features, which is captured intrinsically in the spherical space, is beneficial for clustering tasks as well.

Table 2: Document clustering evaluation on the 20 Newsgroup dataset.

Embedding | Clus. Alg. | MI            | NMI           | ARI           | Purity
Avg. W2V  | K-Means    | 1.299 ± 0.031 | 0.445 ± 0.009 | 0.247 ± 0.008 | 0.408 ± 0.014
Avg. W2V  | SK-Means   | 1.328 ± 0.024 | 0.453 ± 0.009 | 0.250 ± 0.008 | 0.419 ± 0.012
Doc2Vec   | K-Means    | 0.893 ± 0.028 | 0.308 ± 0.009 | 0.137 ± 0.006 | 0.285 ± 0.011
Doc2Vec   | SK-Means   | 0.958 ± 0.012 | 0.322 ± 0.004 | 0.164 ± 0.004 | 0.331 ± 0.005
SIF       | K-Means    | 0.719 ± 0.013 | 0.248 ± 0.004 | 0.100 ± 0.003 | 0.233 ± 0.005
SIF       | SK-Means   | 0.854 ± 0.022 | 0.289 ± 0.008 | 0.127 ± 0.003 | 0.281 ± 0.010
BERT      | K-Means    | 1.856 ± 0.020 | 0.626 ± 0.006 | 0.469 ± 0.015 | 0.640 ± 0.016
BERT      | SK-Means   | 1.876 ± 0.020 | 0.630 ± 0.007 | 0.494 ± 0.012 | 0.648 ± 0.017
JoSE      | K-Means    | 1.975 ± 0.026 | 0.663 ± 0.008 | 0.556 ± 0.018 | 0.711 ± 0.020
JoSE      | SK-Means   | 1.982 ± 0.034 | 0.664 ± 0.010 | 0.568 ± 0.020 | 0.721 ± 0.029

5.3 Document Classification

Apart from document clustering, we also evaluate the quality of spherical paragraph embeddings on document classification tasks. Besides the 20 Newsgroup dataset used in Section 5.2, which is a topic classification dataset, we also evaluate different document/paragraph embedding methods on a binary sentiment classification dataset consisting of 1,000 positive and 1,000 negative movie reviews5. We again treat each document in both datasets as a paragraph in all models. For the 20 Newsgroup dataset, we follow the original train/test split; for the movie review dataset, we randomly select 80% of the data for training and 20% for testing. We use k-NN [9] as the classification algorithm with Euclidean distance as the distance metric.
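A minimal sketch of this classification step (majority vote among the k nearest training vectors under Euclidean distance); the random vectors below are toy stand-ins for trained paragraph embeddings, not the paper's pipeline:

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, test_X, k=3):
    """Classify each test vector by majority vote of its k nearest
    training vectors under Euclidean distance."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)  # distances to all training points
        nearest = np.argsort(d)[:k]              # indices of the k closest
        votes = Counter(train_y[i] for i in nearest)
        preds.append(votes.most_common(1)[0][0])
    return preds

# Toy data: two well-separated classes of random "paragraph embeddings".
rng = np.random.default_rng(1)
X0 = rng.normal(loc=0.0, size=(20, 5))
X1 = rng.normal(loc=5.0, size=(20, 5))
train_X = np.vstack([X0, X1])
train_y = [0] * 20 + [1] * 20
test_X = np.vstack([rng.normal(0.0, size=(5, 5)),
                    rng.normal(5.0, size=(5, 5))])
preds = knn_predict(train_X, train_y, test_X, k=3)
```

Because k-NN has no trainable parameters, accuracy on such data depends only on whether same-class embeddings lie close together, which is exactly the property the evaluation probes.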
Since k-NN is a non-parametric method, its performance directly reflects how well the topology of the embedding space captures document-level semantics (i.e., whether documents from the same semantic class are embedded closer together). We set k = 3 in the experiments (we observe similar comparative results when ranging k in [1, 10]) and report the performance of all methods measured by Macro-F1 and Micro-F1 scores in Table 3. JoSE achieves the best performance on both datasets with k-NN classification, demonstrating its effectiveness in capturing both topical and sentiment semantics in the learned paragraph embeddings.

Table 3: Document classification evaluation using k-NN (k = 3).

          | 20 Newsgroup         | Movie Review
Embedding | Macro-F1  | Micro-F1 | Macro-F1 | Micro-F1
Avg. W2V  | 0.630     | 0.631    | 0.712    | 0.713
SIF       | 0.552     | 0.549    | 0.650    | 0.656
BERT      | 0.380     | 0.371    | 0.664    | 0.665
Doc2Vec   | 0.648     | 0.645    | 0.674    | 0.678
JoSE      | 0.703     | 0.707    | 0.764    | 0.765

4 http://qwone.com/~jason/20Newsgroups/
5 http://www.cs.cornell.edu/people/pabo/movie-review-data/

5.4 Training Efficiency

We report the per-iteration training time on the latest Wikipedia dump of all baselines used in Section 5.1 to compare training efficiency. All models except BERT are run on a machine with 20 cores of Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80 GHz; BERT is trained on 8 NVIDIA GeForce GTX 1080 GPUs. The training time is reported in Table 4. All text embedding frameworks are able to scale to large datasets (except BERT, which is not specifically designed for learning text embeddings), but JoSE enjoys the highest efficiency.
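Part of this efficiency comes from how cheap each spherical update is. As a hedged sketch (our own toy re-implementation of the general retraction recipe, not the exact update rule of Equation (7)): project the Euclidean gradient onto the tangent space at the current point, take a step, then retract back onto the sphere by renormalizing. The linear objective below is a hypothetical stand-in for the model's loss:

```python
import numpy as np

def sphere_retraction_step(x, euc_grad, lr):
    """One Riemannian SGD step on the unit sphere with retraction:
    project the Euclidean gradient onto the tangent space at x,
    take a step, then renormalize back onto the sphere."""
    riem_grad = euc_grad - (x @ euc_grad) * x   # tangent-space projection
    x_new = x - lr * riem_grad                  # Euclidean-style step
    return x_new / np.linalg.norm(x_new)        # retraction: renormalize

# Toy objective: minimize f(x) = -x @ target on the sphere;
# the minimizer is x = target, so x @ target should approach 1.
rng = np.random.default_rng(2)
target = rng.normal(size=10)
target /= np.linalg.norm(target)
x = rng.normal(size=10)
x /= np.linalg.norm(x)
for _ in range(200):
    x = sphere_retraction_step(x, -target, lr=0.1)  # Euclidean grad of f is -target
assert abs(np.linalg.norm(x) - 1.0) < 1e-9          # stays on the sphere
```

Each step costs only a few dot products, scalar multiplications and one normalization, consistent with the cost profile described below for the actual update rule.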
The overall efficiency of our model results from both the design of the objective function and the optimization procedure: (1) the objective of our model (Equation (3)) only contains simple operations (note that cosine similarity on the unit sphere is simply a vector dot product), while other models contain non-linear operations (Word2Vec's and fastText's objectives involve exponential functions; GloVe's objective involves logarithm functions); (2) after replacing the original exponential mapping (Equation (4)) with retraction (Equation (6)), the update rule (Equation (7)) only requires vector additions, multiplications and normalization in addition to the Euclidean gradient, all of which are inexpensive operations.

Table 4: Training time (per iteration) on the latest Wikipedia dump.

Word2Vec | GloVe    | fastText | BERT     | Poincaré GloVe | JoSE
0.81 hrs | 0.85 hrs | 2.11 hrs | > 5 days | 1.25 hrs       | 0.73 hrs

6 Conclusions and Future Work

In this paper, we propose to address the discrepancy between the training procedure and the practical usage of Euclidean text embeddings by learning spherical text embeddings that intrinsically capture directional similarity. Specifically, we introduce a spherical generative model consisting of a two-step generative process to jointly learn word and paragraph embeddings. Furthermore, we develop an efficient Riemannian optimization method to train text embeddings on the unit hypersphere. State-of-the-art results on common text embedding applications, including word similarity and document clustering, demonstrate the efficacy of our model.
With a simple training objective and an efficient optimization procedure, our proposed model enjoys better efficiency than previous embedding learning systems.

In future work, it will be interesting to exploit the spherical embedding space for other tasks such as lexical entailment, e.g., by also learning the concentration parameter of each word's vMF distribution in the generative model, or by designing other generative models. It may also be possible to incorporate other signals, such as subword information [5], into spherical text embedding learning for even better embedding quality. Our unsupervised embedding model may also benefit supervised tasks: since word embeddings are commonly used as the first layer in deep neural networks, it might be beneficial to either add norm constraints or apply Riemannian optimization when fine-tuning the word embedding layer.

Acknowledgments

Research was sponsored in part by U.S. Army Research Lab. under Cooperative Agreement No. W911NF-09-2-0053 (NSCTA), DARPA under Agreements No. W911NF-17-C-0099 and FA8750-19-2-1004, National Science Foundation IIS 16-18481, IIS 17-04532, and IIS 17-41317, DTRA HDTRA11810026, and grant 1U54GM114838 awarded by NIGMS through funds provided by the trans-NIH Big Data to Knowledge (BD2K) initiative (www.bd2k.nih.gov). Any opinions, findings, and conclusions or recommendations expressed in this document are those of the author(s) and should not be interpreted as the views of any U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon. We thank anonymous reviewers for valuable and insightful feedback.

References

[1] P.-A. Absil, R. E. Mahony, and R. Sepulchre. Optimization Algorithms on Matrix Manifolds. 2007.

[2] S. Arora, Y. Liang, and T. Ma. A simple but tough-to-beat baseline for sentence embeddings.
In ICLR, 2017.

[3] A. Banerjee, I. S. Dhillon, J. Ghosh, and S. Sra. Clustering on the unit hypersphere using von Mises-Fisher distributions. Journal of Machine Learning Research, 6:1345–1382, 2005.

[4] K. Batmanghelich, A. Saeedi, K. Narasimhan, and S. J. Gershman. Nonparametric spherical topic modeling with word embeddings. In ACL, 2016.

[5] P. Bojanowski, E. Grave, A. Joulin, and T. Mikolov. Enriching word vectors with subword information. TACL, 2017.

[6] S. Bonnabel. Stochastic gradient descent on Riemannian manifolds. IEEE Transactions on Automatic Control, 58:2217–2229, 2013.

[7] E. Bruni, N.-K. Tran, and M. Baroni. Multimodal distributional semantics. J. Artif. Intell. Res., 49:1–47, 2014.

[8] K. Cho, B. van Merrienboer, Ç. Gülçehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In EMNLP, 2014.

[9] T. M. Cover and P. E. Hart. Nearest neighbor pattern classification. IEEE Trans. Information Theory, 13:21–27, 1967.

[10] J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of deep bidirectional transformers for language understanding. CoRR, abs/1810.04805, 2018.

[11] B. Dhingra, C. J. Shallue, M. Norouzi, A. M. Dai, and G. E. Dahl. Embedding text in hyperbolic spaces. In TextGraphs@NAACL-HLT, 2018.

[12] O. P. Ferreira, A. N. Iusem, and S. Z. Németh. Concepts and techniques of optimization on the sphere. 2014.

[13] L. Finkelstein, E. Gabrilovich, Y. Matias, E. Rivlin, Z. Solan, G. Wolfman, and E. Ruppin. Placing search in context: the concept revisited. In WWW, 2001.

[14] D. L. Fisk. Quasi-martingales. Transactions of the American Mathematical Society, 1965.

[15] O.-E. Ganea, G. Bécigneul, and T. Hofmann. Hyperbolic entailment cones for learning hierarchical embeddings. In ICML, 2018.

[16] S. Gopal and Y. Yang.
Von Mises-Fisher clustering models. In ICML, 2014.

[17] F. Hill, R. Reichart, and A. Korhonen. SimLex-999: Evaluating semantic models with (genuine) similarity estimation. Computational Linguistics, 41:665–695, 2015.

[18] Y. Kim. Convolutional neural networks for sentence classification. In EMNLP, 2014.

[19] R. Kiros, Y. Zhu, R. R. Salakhutdinov, R. S. Zemel, A. Torralba, R. Urtasun, and S. Fidler. Skip-thought vectors. In NIPS, 2015.

[20] S. Kumar and Y. Tsvetkov. Von Mises-Fisher loss for training sequence to sequence models with continuous outputs. In ICLR, 2019.

[21] G. Lample, M. Ballesteros, S. Subramanian, K. Kawakami, and C. Dyer. Neural architectures for named entity recognition. In HLT-NAACL, 2016.

[22] Q. V. Le and T. Mikolov. Distributed representations of sentences and documents. In ICML, 2014.

[23] O. Levy and Y. Goldberg. Linguistic regularities in sparse and explicit word representations. In CoNLL, 2014.

[24] W. Liu, Y.-M. Zhang, X. Li, Z. Yu, B. Dai, T. Zhao, and L. Song. Deep hyperspherical learning. In NIPS, 2017.

[25] C. D. Manning, P. Raghavan, and H. Schütze. Introduction to Information Retrieval. 2008.

[26] K. V. Mardia and P. E. Jupp. Directional Statistics, volume 494. John Wiley & Sons, 2009.

[27] Y. Meng, J. Shen, C. Zhang, and J. Han. Weakly-supervised neural text classification. In CIKM, 2018.

[28] Y. Meng, J. Shen, C. Zhang, and J. Han. Weakly-supervised hierarchical text classification. In AAAI, 2019.

[29] T. Mikolov, K. Chen, G. S. Corrado, and J. Dean. Efficient estimation of word representations in vector space. CoRR, abs/1301.3781, 2013.

[30] T. Mikolov, I. Sutskever, K. Chen, G. S. Corrado, and J. Dean. Distributed representations of words and phrases and their compositionality. In NIPS, 2013.

[31] M. Nickel and D. Kiela. Poincaré embeddings for learning hierarchical representations. In NIPS, 2017.

[32] M.
Nickel and D. Kiela. Learning continuous hierarchies in the Lorentz model of hyperbolic geometry. In ICML, 2018.

[33] J. Pennington, R. Socher, and C. D. Manning. GloVe: Global vectors for word representation. In EMNLP, 2014.

[34] S. T. Smith. Optimization techniques on Riemannian manifolds. CoRR, abs/1407.5965, 2014.

[35] S. Sra. Matrix Nearness Problems in Data Mining. PhD thesis, 2007.

[36] D. Steinley. Properties of the Hubert-Arabie adjusted Rand index. Psychological Methods, 9(3):386–396, 2004.

[37] J. Tang, M. Qu, and Q. Mei. PTE: Predictive text embedding through large-scale heterogeneous text networks. In KDD, 2015.

[38] F. Tao, C. Zhang, X. Chen, M. Jiang, T. Hanratty, L. M. Kaplan, and J. Han. Doc2Cube: Allocating documents to text cube without labeled data. In ICDM, 2018.

[39] A. Tifrea, G. Bécigneul, and O.-E. Ganea. Poincaré GloVe: Hyperbolic word embeddings. In ICLR, 2019.

[40] I. Vendrov, R. Kiros, S. Fidler, and R. Urtasun. Order-embeddings of images and language. In ICLR, 2016.

[41] L. Vilnis and A. McCallum. Word representations via Gaussian embedding. In ICLR, 2015.

[42] J. Wieting, M. Bansal, K. Gimpel, and K. Livescu. Towards universal paraphrastic sentence embeddings. In ICLR, 2016.

[43] C. Zhang, L. Liu, D. Lei, Q. Yuan, H. Zhuang, T. Hanratty, and J. Han. TrioVecEvent: Embedding-based online local event detection in geo-tagged tweet streams. In KDD, 2017.

[44] H. Zhuang, C. Wang, F. Tao, L. M. Kaplan, and J. Han. Identifying semantically deviating outlier documents.
In EMNLP, 2017.