{"title": "Deep Reinforcement Learning of Marked Temporal Point Processes", "book": "Advances in Neural Information Processing Systems", "page_first": 3168, "page_last": 3178, "abstract": "In a wide variety of applications, humans interact with a complex environment by means of asynchronous stochastic discrete events in continuous time. Can we design online interventions that will help humans achieve certain goals in such an asynchronous setting? In this paper, we address the above problem from the perspective of deep reinforcement learning of marked temporal point processes, where both the actions taken by an agent and the feedback it receives from the environment are asynchronous stochastic discrete events characterized using marked temporal point processes. In doing so, we define the agent's policy using the intensity and mark distribution of the corresponding process and then derive a flexible policy gradient method, which embeds the agent's actions and the feedback it receives into real-valued vectors using deep recurrent neural networks. Our method does not make any assumptions on the functional form of the intensity and mark distribution of the feedback and it allows for arbitrarily complex reward functions. We apply our methodology to two different applications in viral marketing and personalized teaching and, using data gathered from Twitter and Duolingo, we show that it may be able to find interventions to help marketers and learners achieve their goals more effectively than alternatives.", "full_text": "Deep Reinforcement Learning of\nMarked Temporal Point Processes\n\nUtkarsh Upadhyay\n\nMPI-SWS\n\nutkarshu@mpi-sws.org\n\nAbir De\n\nMPI-SWS\n\nade@mpi-sws.org\n\nManuel Gomez-Rodriguez\n\nMPI-SWS\n\nmanuelgr@mpi-sws.org\n\nAbstract\n\nIn a wide variety of applications, humans interact with a complex environment by means of asynchronous stochastic discrete events in continuous time. 
Can we design online interventions that will help humans achieve certain goals in such an asynchronous setting? In this paper, we address the above problem from the perspective of deep reinforcement learning of marked temporal point processes, where both the actions taken by an agent and the feedback it receives from the environment are asynchronous stochastic discrete events characterized using marked temporal point processes. In doing so, we define the agent\u2019s policy using the intensity and mark distribution of the corresponding process and then derive a flexible policy gradient method, which embeds the agent\u2019s actions and the feedback it receives into real-valued vectors using deep recurrent neural networks. Our method does not make any assumptions on the functional form of the intensity and mark distribution of the feedback and it allows for arbitrarily complex reward functions. We apply our methodology to two different applications in personalized teaching and viral marketing and, using data gathered from Duolingo and Twitter, we show that it may be able to find interventions to help learners and marketers achieve their goals more effectively than alternatives.\n\n1\n\nIntroduction\n\nIn recent years, the framework of marked temporal point processes (MTPPs) [1] has become increasingly popular for modeling asynchronous event data in continuous time, which is ubiquitous in a wide range of application domains, from social and information networks to finance or health informatics. For example, in social and information networks, events may represent users\u2019 posts, clicks or likes; in finance, they may represent buying and selling orders; or, in health informatics, they may represent when a patient exhibits different symptoms or receives treatment. 
In most cases, the development of a new model reduces to the problem of designing an appropriate functional form for the conditional intensity (or intensities) of the events of interest as well as the distribution of the corresponding mark(s).\n\nIn this context, a recent line of work [13, 27, 29, 30, 33, 34] has exploited an alternative view of MTPPs as stochastic differential equations (SDEs) with jumps [10] to design online, adaptive interventions using stochastic optimal control. While this line of work has shown promise at enhancing the functioning of social and information systems, its widespread use and deployment are precluded mainly by two drawbacks. First, these methods make strong assumptions about the functional form of the conditional intensities and mark distributions of the MTPPs, which in turn prevent them from using state-of-the-art MTPP models based on deep learning [5, 11, 17]. Second, the objective functions that the interventions optimize need to be carefully chosen to ensure that the underlying stochastic optimal control problem remains tractable. As a consequence, the use of (more) meaningful objective\n\n32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montr\u00e9al, Canada.\n\n\fFigure 1: Reinforcement learning setups. In the traditional discrete-time setting [26], actions and feedback occur in discrete time; in the continuous-time setting [4], actions and feedback are real-valued functions in continuous time; and, in the marked temporal point process setting (our work), actions and feedback are asynchronous events localized in continuous time.\n\nfunctions with clear semantics is often off limits. 
In our work, we overcome these drawbacks by approaching the problem from the perspective of deep reinforcement learning of MTPPs.\n\nMore specifically, we first introduce a novel reinforcement learning problem where both the actions taken by an agent and the feedback it receives from its environment are asynchronous stochastic events in continuous time, which are characterized using MTPPs. Here, the goal is to find the optimal intensity and mark distribution for the agent\u2019s actions\u2014the optimal policy\u2014that maximize an arbitrary reward function, which may depend on its actions and the feedback. Then, we derive a novel policy gradient method, specially designed to solve the above problem, which embeds the agent\u2019s actions and the feedback from the environment into real-valued vectors using deep recurrent neural networks (RNNs). In contrast with the literature on stochastic optimal control of SDEs with jumps, our method does not make any assumptions on the functional form of the conditional intensity (or intensities) and mark distribution(s) characterizing the feedback, and it allows for arbitrarily complex reward functions. Moreover, it departs from previous work in the reinforcement learning literature [4, 6, 8, 9, 15, 20, 26, 28, 31] in two key aspects, which are also illustrated in Figure 1:\n\nI. The agent\u2019s actions and environment\u2019s feedback are asynchronous stochastic events in continuous time. In contrast, previous work has considered synchronous actions and (potentially delayed) feedback in discrete time [6, 15, 20, 31], with a few notable exceptions [4, 9, 28]. While these exceptions considered continuous time, they assumed actions and feedback to be continuous and deterministic and the dynamics of the environment to be known.1\n\nII. Our policy is a conditional intensity function (and a mark distribution), which is used to sample the times (and marks) of the agent\u2019s actions. 
Here, note that a sampled agent\u2019s action may need to be resampled due to the occurrence of new feedback events before the sampled time. In contrast, previous work considered the policy to be a probability distribution or, more rarely, a deterministic function [4, 9, 28].\n\nFinally, we apply our methodology to two different applications in personalized teaching [14, 22, 27] and viral marketing [12, 25, 29, 33, 34]. For simple dynamics and objective functions, which allow for stochastic optimal control approaches, our method achieves a comparable performance even though it does not have access to the true underlying dynamics. For complex dynamics and/or objective functions, which do not allow for stochastic optimal control approaches, our method is able to successfully find interventions that optimize the corresponding objective function and beat several competitive baselines. To facilitate research in temporal point processes within the reinforcement learning community at large, we are releasing an open-source implementation of our method in TensorFlow as well as the synthetic and real-world data used in our experiments.2\n\n2 Problem formulation\n\nIn this section, we first briefly revisit the theoretical framework of marked temporal point processes [1] and then use it to formally define our novel reinforcement learning problem, where an agent interacts with a complex environment by means of asynchronous stochastic discrete events in continuous time.\n\n1Our setting should not be confused with the asynchronous setting of Mnih et al. [20], where the gradient descent is asynchronous but the actions/observations are synchronous and the system evolves at discrete time steps.\n\n2https://github.com/Networks-Learning/tpprl\n\n2\n\n\fMarked temporal point processes. 
A marked temporal point process (MTPP) is a random process whose realization consists of an ordered sequence of events localized in time, i.e.,\n\nH = {e0 = (t0, z0), e1 = (t1, z1), . . . , en = (tn, zn)},\n\nwhere ti \u2208 R+ is the time of occurrence of event i and zi \u2208 Z is the associated mark. The actual meaning of the events varies across applications, e.g., in social networks, ti may represent the time when a message is posted, clicked or liked, zi may represent the type of interaction, the message content, or its polarity, and the domain of the marks Z is application dependent. Here, we characterize the event times of an MTPP using a conditional intensity function \u03bb*(t), which gives the probability of observing an event in the time window [t, t + dt) given the events history Ht = {ei = (ti, zi) \u2208 H | ti < t}, i.e.,\n\n\u03bb*(t) dt := P{event in [t, t + dt) | Ht},   (1)\n\nwhere the sign * means that the intensity may depend on the history Ht. Moreover, we characterize the marks of the events using a distribution m(z | Ht) = m*(z), which is the probability that mark z is selected if an event has occurred at time t. Then, we can compute the likelihood of a history of events AT \u2286 HT as:\n\nP(AT) := [ \u220f_{ei \u2208 AT} \u03bb*(ti) m*(zi) ] \u00b7 exp( \u2212\u222b_0^T \u03bb*(s) ds ),   (2)\n\nwhere \u03bb*(ti) is the probability of an action at ti, m*(zi) is the probability of mark zi, and the exponential factor is the probability of no actions at t \u2208 [0, T] \\ {ti}. In the remainder of the paper, whenever an intensity function and mark distribution are parametrized by \u03b8, we write \u03bb*\u03b8(\u00b7), m*\u03b8(\u00b7), P\u03b8(AT) and, for notational simplicity, use p*\u03b8 = (\u03bb*\u03b8, m*\u03b8) as a shorthand to denote the joint probability density of the MTPP. 
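To make Eqs. (1)-(2) concrete, the sketch below evaluates log P(AT) for a small toy history. The exponential-kernel Hawkes intensity and the history-independent categorical mark distribution are illustrative assumptions, not the models used in the paper.

```python
import math

# Illustrative assumption: an exponential-kernel Hawkes intensity
# lambda*(t) = mu + alpha * sum_{t_i < t} exp(-omega * (t - t_i)).
def intensity(t, history, mu=0.5, alpha=0.8, omega=1.0):
    return mu + alpha * sum(math.exp(-omega * (t - ti))
                            for ti, _ in history if ti < t)

def log_likelihood(history, T, mark_probs, mu=0.5, alpha=0.8, omega=1.0):
    """log P(A_T) from Eq. (2): the log-intensity and log mark probability
    of every event, minus the integral of lambda* over [0, T]."""
    ll = sum(math.log(intensity(ti, history, mu, alpha, omega))
             + math.log(mark_probs[zi]) for ti, zi in history)
    # The integral of lambda* has a closed form for the exponential kernel.
    integral = mu * T + (alpha / omega) * sum(
        1.0 - math.exp(-omega * (T - ti)) for ti, _ in history)
    return ll - integral

events = [(0.4, "click"), (1.1, "like")]   # toy history A_T
marks = {"click": 0.7, "like": 0.3}        # assumed mark distribution m*(z)
print(log_likelihood(events, T=2.0, mark_probs=marks))
```

A quick sanity check for any such implementation: extending the window T with the same events only grows the integral term, so the log-likelihood of a fixed history strictly decreases in T.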
Recent literature [5, 8, 12, 13, 17, 30, 33] has established that MTPPs outperform other models (e.g., exponential law) in their ability to accurately predict online and offline human actions.\n\nReinforcement learning of marked temporal point processes. Assume there is an agent who takes actions in a complex environment and the environment also provides feedback to the agent over time. Moreover, both the actions and the feedback are asynchronous stochastic events localized in time and thus we characterize them using marked temporal point processes (MTPPs), i.e.,\n\n\u2014 Action events: A = {ei = (ti, yi)}, where (ti, yi) \u223c p*A;\u03b8 = (\u03bb*\u03b8, m*\u03b8)\n\u2014 Feedback events: F = {fi = (ti, zi)}, where (ti, zi) \u223c p*F = (\u03bb*, m*)\n\nIn the above characterization, we allow the joint probability densities p*A;\u03b8 and p*F to depend on the joint history of events Ht := At \u222a Ft. Finally, after a cut-off time T, we assume that the agent receives an arbitrary (stochastic) reward R*(T), which may depend on the agent\u2019s actions AT and the environment\u2019s feedback FT.\n\nGiven the above problem setting, we can formally define our reinforcement learning (RL) problem for marked temporal point processes as follows:\n\nProblem definition. Given an agent with p*A;\u03b8 = (\u03bb*\u03b8, m*\u03b8), an environment with p*F = (\u03bb*, m*), and an arbitrary stochastic reward R*(T), the goal is to find the optimal action intensity and mark distribution\u2014the optimal policy\u2014that maximize the expected reward. Formally,\n\nmaximize_{p*A;\u03b8(\u00b7)}  E_{AT \u223c p*A;\u03b8(\u00b7), FT \u223c p*F(\u00b7)} [R*(T)],   (3)\n\nwhere the expectation is taken over all possible realizations of the marked temporal point processes associated to the agent\u2019s action events and the environment\u2019s feedback events. 
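Since the likelihood in Eq. (2) is differentiable in \u03b8, the objective in Eq. (3) admits a score-function (REINFORCE-style) gradient, E[R*(T) \u2207\u03b8 log P\u03b8(AT)]. The sketch below is a deliberately minimal instance of that idea under strong simplifying assumptions: a one-parameter policy with constant intensity \u03bb\u03b8 = exp(\u03b8), no marks or feedback process, and a hypothetical reward that prefers histories with roughly n_target actions. The paper's actual policy is an RNN-based intensity and mark distribution.

```python
import math, random

random.seed(0)

def sample_actions(lam, T):
    """Sample action times from a constant-intensity policy on [0, T]."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam)
        if t > T:
            return times
        times.append(t)

def reinforce_step(theta, T=10.0, n_target=8, batch=256, lr=0.01):
    """One Monte Carlo step of grad E[R*(T)] = E[R*(T) * grad log P_theta].
    For lambda_theta = exp(theta), grad_theta log P_theta(A_T) = N - lambda*T."""
    lam = math.exp(theta)
    grad = 0.0
    for _ in range(batch):
        n = len(sample_actions(lam, T))
        reward = -(n - n_target) ** 2     # hypothetical reward R*(T)
        grad += reward * (n - lam * T)    # score-function estimator
    return theta + lr * grad / batch      # gradient ascent on Eq. (3)

theta = 0.0                               # initial rate exp(0) = 1
for _ in range(300):
    theta = reinforce_step(theta)
# The learned rate exp(theta) settles close to n_target / T (slightly
# below it, since the squared-error reward also penalizes the variance
# of the action count).
```

In practice one would subtract a baseline from the reward to reduce the variance of the estimator; this sketch omits it for brevity.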
In the remainder of the paper, we will denote the optimal policy using \u03c0*(\u03b8) = argmax_{p*A;\u03b8(\u00b7)} E[R*(T)].\n\nNote that the above definition departs from previous work on reinforcement learning [4, 6, 9, 15, 20, 26, 28, 31] in several ways. First, the agent\u2019s actions and environment\u2019s feedback are asynchronous stochastic events in continuous time. Moreover, note that the agent may receive feedback from the environment asynchronously at any time, not only after each of its actions. This is in contrast with previous work in the literature, which has only considered synchronous actions and (potentially delayed) feedback in discrete time (or, in some cases, continuous actions and feedback), as illustrated\n\n3\n\n[Figure, panel (a) \u201cData and representation\u201d: the agent\u2019s and environment\u2019s event sequences over time t, with embedding weights W3, (Wz, bz) and (Wt, bt); the figure graphics were lost during extraction and only these labels survive.]
+7cWuO+iKMMJ3AK5+DBFTTgFprQAgYSnuEV3pxH58V5dz4WrSWnmDmGP3A+fwBK2I+B\n\nW1\n\nAAAB6nicbVDLSgNBEOyNrxhfUY9eBoPiKex6iceAF4+JmgckS5idzCZDZmeXmV4hLPkELx4U8eqH+A3exJ9x8jhoYkFDUdVNd1eQSGHQdb+c3Nr6xuZWfruws7u3f1A8PGqaONWMN1gsY90OqOFSKN5AgZK3E81pFEjeCkbXU7/1wLURsbrHccL9iA6UCAWjaKW7Vs/rFUtu2Z2BrBJvQUrV8/r3BwDUesXPbj9macQVMkmN6Xhugn5GNQom+aTQTQ1PKBvRAe9YqmjEjZ/NTp2QM6v0SRhrWwrJTP09kdHImHEU2M6I4tAse1PxP6+TYnjlZ0IlKXLF5ovCVBKMyfRv0heaM5RjSyjTwt5K2JBqytCmU7AheMsvr5LmZdlzy17dpnELc+ThBE7hAjyoQBVuoAYNYDCAR3iGF0c6T86r8zZvzTmLmWP4A+f9B/43j90=\nAAAB6nicbVA9SwNBEJ2LXzF+RS1tFoNiFe5stAzYWCZqPiAJYW8zlyzZ2zt294Rw5CdYaBERW8H/Ymkn/hk3H4UmPhh4vDfDzDw/Flwb1/1yMiura+sb2c3c1vbO7l5+/6Cmo0QxrLJIRKrhU42CS6wabgQ2YoU09AXW/cHVxK/fo9I8kndmGGM7pD3JA86osdJtveN18gW36E5Blok3J4XSaeX7Y/z4Xu7kP1vdiCUhSsME1brpubFpp1QZzgSOcq1EY0zZgPawaamkIep2Oj11RE6s0iVBpGxJQ6bq74mUhloPQ992htT09aI3Ef/zmokJLtspl3FiULLZoiARxERk8jfpcoXMiKEllClubyWsTxVlxqaTsyF4iy8vk9p50XOLXsWmcQMzZOEIjuEMPLiAElxDGarAoAcPMIZnRzhPzovzOmvNOPOZQ/gD5+0HdPeRtQ==\nAAAB6nicbVA9SwNBEJ2LXzF+RS1tFoNiFe5stAzYWCZqPiAJYW8zlyzZ2zt294Rw5CdYaBERW8H/Ymkn/hk3H4UmPhh4vDfDzDw/Flwb1/1yMiura+sb2c3c1vbO7l5+/6Cmo0QxrLJIRKrhU42CS6wabgQ2YoU09AXW/cHVxK/fo9I8kndmGGM7pD3JA86osdJtveN18gW36E5Blok3J4XSaeX7Y/z4Xu7kP1vdiCUhSsME1brpubFpp1QZzgSOcq1EY0zZgPawaamkIep2Oj11RE6s0iVBpGxJQ6bq74mUhloPQ992htT09aI3Ef/zmokJLtspl3FiULLZoiARxERk8jfpcoXMiKEllClubyWsTxVlxqaTsyF4iy8vk9p50XOLXsWmcQMzZOEIjuEMPLiAElxDGarAoAcPMIZnRzhPzovzOmvNOPOZQ/gD5+0HdPeRtQ==\nAAAB6nicbVDLSgNBEOz1GeMr6tHLYBA8hR0vegx48RgfeUCyhNnJbDJkdnaZ6RXCkk/w4kERr36RN//GSbIHTSxoKKq66e4KUyUt+v63t7a+sbm1Xdop7+7tHxxWjo5bNskMF02eqMR0QmaFklo0UaISndQIFodKtMPxzcxvPwljZaIfcZKKIGZDLSPJGTrpod2n/UrVr/lzkFVCC1KFAo1+5as3SHgWC41cMWu71E8xyJlByZWYlnuZFSnjYzYUXUc1i4UN8vmpU3LulAGJEuNKI5mrvydyFls7iUPXGTMc2WVvJv7ndTOMroNc6jRDofliUZQpggmZ/U0G0giOauII40a6WwkfMcM4unTKLgS6/PIqaV3WqF+jd361fl/EUYJTOIMLoHAFdbiFBjSBwxCe4RXePOW9eO/ex6J1zStmTuAPvM8f3OuNiw==\n\nAAAB9XicdVDJSgNBEK1xjXGLevTSGIRcHHqGrLeAF49RzAJJDD2dTtKkZ6G7RwlD/sOLB0W8+i/evPkL/oGdREFFHxQ83quiqp4XCa40x
q/W0vLK6tp6aiO9ubW9s5vZ22+oMJaU1WkoQtnyiGKCB6yuuRasFUlGfE+wpjc+nfnNayYVD4NLPYlY1yfDgA84JdpIV6Newk+caUeHaNTjvUwW27jilvMlhO1CIe8W84bgglspYuTYeI5s1cm9vwFArZd56fRDGvss0FQQpdoOjnQ3IVJzKtg03YkViwgdkyFrGxoQn6luMr96io6N0keDUJoKNJqr3ycS4is18T3T6RM9Ur+9mfiX1471oNxNeBDFmgV0sWgQC2SenEWA+lwyqsXEEEIlN7ciOiKSUG2CSpsQvj5F/5OGazvYds5NGhewQAoO4Qhy4EAJqnAGNagDBQm3cA8P1o11Zz1aT4vWJetz5gB+wHr+AJxtlN8=\nAAAB9XicdVC7SgNBFJ2NrxhfUUsLB4OQxmV2ybML2IhVFPOAJIbZyWx2yOyDmVklLPkPGwtFbG38Ejt7f8A/cJIoqOiBC4dz7uXee5yIM6kQejVSC4tLyyvp1cza+sbmVnZ7pynDWBDaICEPRdvBknIW0IZiitN2JCj2HU5bzuh46reuqJAsDC7UOKI9Hw8D5jKClZYuvX7CjqxJV4XQ67N+NodMVLUrhTJEZrFYsEsFTVDRrpYQtEw0Q65m5d/fnk/36/3sS3cQktingSIcS9mxUKR6CRaKEU4nmW4saYTJCA9pR9MA+1T2ktnVE3iolQF0Q6ErUHCmfp9IsC/l2Hd0p4+VJ397U/EvrxMrt9JLWBDFigZkvsiNOdRPTiOAAyYoUXysCSaC6Vsh8bDAROmgMjqEr0/h/6RpmxYyrTOdxjmYIw32wAHIAwuUQQ2cgDpoAAIEuAF34N64Nm6NB+Nx3poyPmd2wQ8YTx8fPpYA\nAAAB9XicdVC7SgNBFJ2NrxhfUUsLB4OQxmV2ybML2IhVFPOAJIbZyWx2yOyDmVklLPkPGwtFbG38Ejt7f8A/cJIoqOiBC4dz7uXee5yIM6kQejVSC4tLyyvp1cza+sbmVnZ7pynDWBDaICEPRdvBknIW0IZiitN2JCj2HU5bzuh46reuqJAsDC7UOKI9Hw8D5jKClZYuvX7CjqxJV4XQ67N+NodMVLUrhTJEZrFYsEsFTVDRrpYQtEw0Q65m5d/fnk/36/3sS3cQktingSIcS9mxUKR6CRaKEU4nmW4saYTJCA9pR9MA+1T2ktnVE3iolQF0Q6ErUHCmfp9IsC/l2Hd0p4+VJ397U/EvrxMrt9JLWBDFigZkvsiNOdRPTiOAAyYoUXysCSaC6Vsh8bDAROmgMjqEr0/h/6RpmxYyrTOdxjmYIw32wAHIAwuUQQ2cgDpoAAIEuAF34N64Nm6NB+Nx3poyPmd2wQ8YTx8fPpYA\nAAAB9XicdVDLSgMxFM34rPVVdekmWAQ3Dpmhz13Bjcsq9gHtOGTSTBuayQxJRilD/8ONC0Xc+i/u/BvTh6CiBy4czrmXe+8JEs6URujDWlldW9/YzG3lt3d29/YLB4dtFaeS0BaJeSy7AVaUM0FbmmlOu4mkOAo47QTji5nfuaNSsVjc6ElCvQgPBQsZwdpItyM/Y+fOtK9jOPKZXygiG9XdWqkKkV0ul9xKyRBUdusVBB0bzVEESzT9wnt/EJM0okITjpXqOSjRXoalZoTTab6fKppgMsZD2jNU4IgqL5tfPYWnRhnAMJamhIZz9ftEhiOlJlFgOiOsR+q3NxP/8nqpDmtexkSSairIYlGYcmienEUAB0xSovnEEEwkM7dCMsISE22CypsQvj6F/5O2azvIdq5QsXG9jCMHjsEJOAMOqIIGuARN0AIESPAAnsCzdW89Wi/W66J1xVrOHIEfsN4+AVr8knU=\n\nhi1 hi\nWh, 
bh\n\nAAAB73icdVDJSgNBEK2JW4xb1KOXxiB4CENPyHoLePEkUcwCSRh6Oj1Jk57F7h4hhPyEFw+KePUP/A5v/o2dREFFHxQ83quiqp4XC640xu9WamV1bX0jvZnZ2t7Z3cvuH7RUlEjKmjQSkex4RDHBQ9bUXAvWiSUjgSdY2xufzf32LZOKR+G1nsSsH5BhyH1OiTZSp+2O8shzR242h21cK1SLFYTtUqlYKBcNwaVCrYyRY+MFcvX86wUYNNzsW28Q0SRgoaaCKNV1cKz7UyI1p4LNMr1EsZjQMRmyrqEhCZjqTxf3ztCJUQbIj6SpUKOF+n1iSgKlJoFnOgOiR+q3Nxf/8rqJ9qv9KQ/jRLOQLhf5iUA6QvPn0YBLRrWYGEKo5OZWREdEEqpNRBkTwten6H/SKtgOtp1Lk8YVLJGGIziGU3CgAnU4hwY0gYKAO3iAR+vGureerOdla8r6nDmEH7BePgCL55Ep\nAAAB73icdVDLSgMxFM3UV61aq+LKTbAILsqQKX3uCm7cCFXsA9oyZNK0Dc1kxiQjlKE/4caFIm79A7/DnWt/w4Vpq6CiBy4czrmXe+/xQs6URujVSiwtr6yuJddTG5tb6e3Mzm5TBZEktEECHsi2hxXlTNCGZprTdigp9j1OW974ZOa3rqlULBCXehLSno+Hgg0YwdpI7ZY7ykHPHbmZLLJRNV8plCGyi8VCvlQwBBXz1RKCjo3myNZyz2fvb/vpupt56fYDEvlUaMKxUh0HhboXY6kZ4XSa6kaKhpiM8ZB2DBXYp6oXz++dwiOj9OEgkKaEhnP1+0SMfaUmvmc6faxH6rc3E//yOpEeVHoxE2GkqSCLRYOIQx3A2fOwzyQlmk8MwUQycyskIywx0SailAnh61P4P2nmbQfZzrlJ4wIskAQH4BAcAweUQQ2cgjpoAAI4uAF34N66sm6tB+tx0ZqwPmf2wA9YTx88NJMq\nAAAB73icdVDLSgMxFM3UV61aq+LKTbAILsqQKX3uCm7cCFXsA9oyZNK0Dc1kxiQjlKE/4caFIm79A7/DnWt/w4Vpq6CiBy4czrmXe+/xQs6URujVSiwtr6yuJddTG5tb6e3Mzm5TBZEktEECHsi2hxXlTNCGZprTdigp9j1OW974ZOa3rqlULBCXehLSno+Hgg0YwdpI7ZY7ykHPHbmZLLJRNV8plCGyi8VCvlQwBBXz1RKCjo3myNZyz2fvb/vpupt56fYDEvlUaMKxUh0HhboXY6kZ4XSa6kaKhpiM8ZB2DBXYp6oXz++dwiOj9OEgkKaEhnP1+0SMfaUmvmc6faxH6rc3E//yOpEeVHoxE2GkqSCLRYOIQx3A2fOwzyQlmk8MwUQycyskIywx0SailAnh61P4P2nmbQfZzrlJ4wIskAQH4BAcAweUQQ2cgjpoAAI4uAF34N66sm6tB+tx0ZqwPmf2wA9YTx88NJMq\nAAAB73icdVDLSgMxFM3UV62vqks3wSK4kCFT+twV3LisYh/QDkMmzXRCM5kxyQhl6E+4caGIW3/HnX9j+hBU9MCFwzn3cu89fsKZ0gh9WLm19Y3Nrfx2YWd3b/+geHjUVXEqCe2QmMey72NFORO0o5nmtJ9IiiOf054/uZz7vXsqFYvFrZ4m1I3wWLCAEayN1O954QX0vdArlpCNmuVGpQ6RXa1WyrWKIahabtYQdGy0QAms0PaK78NRTNKICk04VmrgoES7GZaaEU5nhWGqaILJBI/pwFCBI6rcbHHvDJ4ZZQSDWJoSGi7U7xMZjpSaRr7pjLAO1W9vLv7lDVIdNNyMiSTVVJDloiDlUMdw/jwcMUmJ5lNDMJHM3ApJiCUm2kRUMCF8fQr/J92y7SDbuUal1s0qjjw4AafgHDigDlrgCrRBBxDAwQN4As/WnfVovVivy9actZo5Bj9gvX0Cl1ePtw==\n\nAAAB8nicbVDLSgNBEOz1GeMr6tHLYBBySdj1oseAF49RzAOSZZmd
zCZDZneWmV4hLPkMLx4U8erXePPmL/gHTh4HTSxoKKq66e4KUykMuu6ns7a+sbm1Xdgp7u7tHxyWjo5bRmWa8SZTUulOSA2XIuFNFCh5J9WcxqHk7XB0PfXbD1wboZJ7HKfcj+kgEZFgFK3UxUBUMchF1ZsEpbJbc2cgq8RbkHLdq3x/AUAjKH30+oplMU+QSWpM13NT9HOqUTDJJ8VeZnhK2YgOeNfShMbc+Pns5Ak5t0qfRErbSpDM1N8TOY2NGceh7YwpDs2yNxX/87oZRld+LpI0Q56w+aIokwQVmf5P+kJzhnJsCWVa2FsJG1JNGdqUijYEb/nlVdK6qHluzbu1adzBHAU4hTOogAeXUIcbaEATGCh4hGd4cdB5cl6dt3nrmrOYOYE/cN5/AAQnk1k=\nAAAB8nicbVC7SgNBFJ2NrxhfUUsLF4OQJmHHRsuAjVhFMQ/YLMvsZJIMmZ1ZZu4KYcln2FgoYmvpl9jZ+wP+gZNHoYkHLhzOuZd774kSwQ143qeTW1ldW9/Ibxa2tnd294r7B02jUk1ZgyqhdDsihgkuWQM4CNZONCNxJFgrGl5O/NY904YreQejhAUx6Uve45SAlXwIeQXCjFfwOCyWvKo3hbtM8JyUarj8/fV+fVwPix+drqJpzCRQQYzxsZdAkBENnAo2LnRSwxJCh6TPfEsliZkJsunJY/fUKl23p7QtCe5U/T2RkdiYURzZzpjAwCx6E/E/z0+hdxFkXCYpMElni3qpcEG5k//dLteMghhZQqjm9laXDogmFGxKBRsCXnx5mTTPqtir4hubxi2aIY+O0AkqI4zOUQ1doTpqIIoUekBP6NkB59F5cV5nrTlnPnOI/sB5+wGG6ZR6\nAAAB8nicbVC7SgNBFJ2NrxhfUUsLF4OQJmHHRsuAjVhFMQ/YLMvsZJIMmZ1ZZu4KYcln2FgoYmvpl9jZ+wP+gZNHoYkHLhzOuZd774kSwQ143qeTW1ldW9/Ibxa2tnd294r7B02jUk1ZgyqhdDsihgkuWQM4CNZONCNxJFgrGl5O/NY904YreQejhAUx6Uve45SAlXwIeQXCjFfwOCyWvKo3hbtM8JyUarj8/fV+fVwPix+drqJpzCRQQYzxsZdAkBENnAo2LnRSwxJCh6TPfEsliZkJsunJY/fUKl23p7QtCe5U/T2RkdiYURzZzpjAwCx6E/E/z0+hdxFkXCYpMElni3qpcEG5k//dLteMghhZQqjm9laXDogmFGxKBRsCXnx5mTTPqtir4hubxi2aIY+O0AkqI4zOUQ1doTpqIIoUekBP6NkB59F5cV5nrTlnPnOI/sB5+wGG6ZR6\nAAAB8nicbVBNS8NAEJ3Ur1q/qh69BIvgpSXxoseCF49V7Ae0IWy2m3bpZjfsToQS+jO8eFDEq7/Gm//GbZuDtj4YeLw3w8y8KBXcoOd9O6WNza3tnfJuZW//4PCoenzSMSrTlLWpEkr3ImKY4JK1kaNgvVQzkkSCdaPJ7dzvPjFtuJKPOE1ZkJCR5DGnBK3Ux5DXMcx53Z+F1ZrX8BZw14lfkBoUaIXVr8FQ0SxhEqkgxvR9L8UgJxo5FWxWGWSGpYROyIj1LZUkYSbIFyfP3AurDN1YaVsS3YX6eyIniTHTJLKdCcGxWfXm4n9eP8P4Jsi5TDNkki4XxZlwUbnz/90h14yimFpCqOb2VpeOiSYUbUoVG4K/+vI66Vw1fK/h33u15kMRRxnO4BwuwYdraMIdtKANFBQ8wyu8Oei8OO/Ox7K15BQzp/AHzucPwqeQ7w==\n\nti 
ti1\nWa\n\nAAAB6nicdVBNS8NAEJ3Ur1q/qh69LBbFU0iaauut4MVjq7YV2lA22227dLMJuxuhhP4ELx4U8eoP8Td4E/+M21ZBRR8MPN6bYWZeEHOmtOO8WZmFxaXllexqbm19Y3Mrv73TVFEiCW2QiEfyOsCKciZoQzPN6XUsKQ4DTlvB6Gzqt26oVCwSV3ocUz/EA8H6jGBtpMtWF3fzBccunZ5UiiXk2J7nlo4dQ9yy55U95NrODIXqYf39BQBq3fxrpxeRJKRCE46VartOrP0US80Ip5NcJ1E0xmSEB7RtqMAhVX46O3WCDozSQ/1ImhIazdTvEykOlRqHgekMsR6q395U/MtrJ7pf8VMm4kRTQeaL+glHOkLTv1GPSUo0HxuCiWTmVkSGWGKiTTo5E8LXp+h/0izarkmmbtK4gDmysAf7cAQulKEK51CDBhAYwC3cw4PFrTvr0Xqat2asz5ld+AHr+QOxBZBW\nAAAB6nicdVDLSgNBEOyNrxhfUY9eBoPiadnNRhNvAS8eEzUPSJYwO5kkQ2YfzMwKYckneNBDRLwK/otHb+LPOEkUVLSgoajqprvLiziTyrLejNTC4tLySno1s7a+sbmV3d6pyzAWhNZIyEPR9LCknAW0ppjitBkJin2P04Y3PJv6jWsqJAuDKzWKqOvjfsB6jGClpctGB3eyOcssnJ6U8gVkmY5jF44tTeyi4xQdZJvWDLnyYfX9ZXL7XOlkX9vdkMQ+DRThWMqWbUXKTbBQjHA6zrRjSSNMhrhPW5oG2KfSTWanjtGBVrqoFwpdgUIz9ftEgn0pR76nO32sBvK3NxX/8lqx6pXchAVRrGhA5ot6MUcqRNO/UZcJShQfaYKJYPpWRAZYYKJ0Ohkdwten6H9Sz5u2Tqaq07iAOdKwB/twBDYUoQznUIEaEOjDDUzg3uDGnfFgPM5bU8bnzC78gPH0ASfFki4=\nAAAB6nicdVDLSgNBEOyNrxhfUY9eBoPiadnNRhNvAS8eEzUPSJYwO5kkQ2YfzMwKYckneNBDRLwK/otHb+LPOEkUVLSgoajqprvLiziTyrLejNTC4tLySno1s7a+sbmV3d6pyzAWhNZIyEPR9LCknAW0ppjitBkJin2P04Y3PJv6jWsqJAuDKzWKqOvjfsB6jGClpctGB3eyOcssnJ6U8gVkmY5jF44tTeyi4xQdZJvWDLnyYfX9ZXL7XOlkX9vdkMQ+DRThWMqWbUXKTbBQjHA6zrRjSSNMhrhPW5oG2KfSTWanjtGBVrqoFwpdgUIz9ftEgn0pR76nO32sBvK3NxX/8lqx6pXchAVRrGhA5ot6MUcqRNO/UZcJShQfaYKJYPpWRAZYYKJ0Ohkdwten6H9Sz5u2Tqaq07iAOdKwB/twBDYUoQznUIEaEOjDDUzg3uDGnfFgPM5bU8bnzC78gPH0ASfFki4=\nAAAB6nicdVDLSsNAFL2pr1pfVZduBovgKiRNtXVXcOOyPvqANpTJdNIOnUzCzEQooZ/gxoUibv0id/6N04egogcuHM65l3vvCRLOlHacDyu3srq2vpHfLGxt7+zuFfcPWipOJaFNEvNYdgKsKGeCNjXTnHYSSXEUcNoOxpczv31PpWKxuNOThPoRHgoWMoK1kW7bfdwvlhy7cnFeK1eQY3ueWzlzDHGrnlf1kGs7c5RgiUa/+N4bxCSNqNCEY6W6rpNoP8NSM8LptNBLFU0wGeMh7RoqcESVn81PnaITowxQGEtTQqO5+n0iw5FSkygwnRHWI/Xbm4l/ed1UhzU/YyJJNRVksShMOdIxmv2NBkxSovnEEEwkM7ciMsISE23SKZgQvj5F/5NW2XZNMtdOqX6zjCMPR3AMp+BCFepwBQ1oAoEhPMATPFvcerRerNdFa85azhzCD1hvn4+5jgQ=\n\nW4\n\nAAAB6nicbVDLSgNBEOyNrxhfUY9eBoPiKeyKoMeAF4+JmgckS5id9CZDZmeXmVkhLPkEL
x4U8eqH+A3exJ9x8jhoYkFDUdVNd1eQCK6N6345uZXVtfWN/GZha3tnd6+4f9DQcaoY1lksYtUKqEbBJdYNNwJbiUIaBQKbwfB64jcfUGkey3szStCPaF/ykDNqrHTX7F50iyW37E5Blok3J6XKae37AwCq3eJnpxezNEJpmKBatz03MX5GleFM4LjQSTUmlA1pH9uWShqh9rPpqWNyYpUeCWNlSxoyVX9PZDTSehQFtjOiZqAXvYn4n9dOTXjlZ1wmqUHJZovCVBATk8nfpMcVMiNGllCmuL2VsAFVlBmbTsGG4C2+vEwa52XPLXs1m8YtzJCHIziGM/DgEipwA1WoA4M+PMIzvDjCeXJenbdZa86ZzxzCHzjvPwLSj+A=\nAAAB6nicbVDLSgNBEOz1GeMr6tHLYFA8hV0R9Bjw4jFR84BkCbOTTjJkdnaZmRXCkk/woIeIeBX8F4/exJ9x8jhoYkFDUdVNd1cQC66N6345S8srq2vrmY3s5tb2zm5ub7+qo0QxrLBIRKoeUI2CS6wYbgTWY4U0DATWgv7V2K/do9I8kndmEKMf0q7kHc6osdJtrXXeyuXdgjsBWSTejOSLJ+Xvj9Hje6mV+2y2I5aEKA0TVOuG58bGT6kynAkcZpuJxpiyPu1iw1JJQ9R+Ojl1SI6t0iadSNmShkzU3xMpDbUehIHtDKnp6XlvLP7nNRLTufRTLuPEoGTTRZ1EEBOR8d+kzRUyIwaWUKa4vZWwHlWUGZtO1obgzb+8SKpnBc8teGWbxg1MkYFDOIJT8OACinANJagAgy48wAieHeE8OS/O67R1yZnNHMAfOG8/eYORuA==\nAAAB6nicbVDLSgNBEOz1GeMr6tHLYFA8hV0R9Bjw4jFR84BkCbOTTjJkdnaZmRXCkk/woIeIeBX8F4/exJ9x8jhoYkFDUdVNd1cQC66N6345S8srq2vrmY3s5tb2zm5ub7+qo0QxrLBIRKoeUI2CS6wYbgTWY4U0DATWgv7V2K/do9I8kndmEKMf0q7kHc6osdJtrXXeyuXdgjsBWSTejOSLJ+Xvj9Hje6mV+2y2I5aEKA0TVOuG58bGT6kynAkcZpuJxpiyPu1iw1JJQ9R+Ojl1SI6t0iadSNmShkzU3xMpDbUehIHtDKnp6XlvLP7nNRLTufRTLuPEoGTTRZ1EEBOR8d+kzRUyIwaWUKa4vZWwHlWUGZtO1obgzb+8SKpnBc8teGWbxg1MkYFDOIJT8OACinANJagAgy48wAieHeE8OS/O67R1yZnNHMAfOG8/eYORuA==\nAAAB6nicbVBNS8NAEJ3Ur1q/oh69LBbBU0lE0GPBi8f60Q9oQ9lsN+3SzSbsToQS+hO8eFDEq7/Im//GbZuDtj4YeLw3w8y8MJXCoOd9O6W19Y3NrfJ2ZWd3b//APTxqmSTTjDdZIhPdCanhUijeRIGSd1LNaRxK3g7HNzO//cS1EYl6xEnKg5gOlYgEo2ilh3b/su9WvZo3B1klfkGqUKDRd796g4RlMVfIJDWm63spBjnVKJjk00ovMzylbEyHvGupojE3QT4/dUrOrDIgUaJtKSRz9fdETmNjJnFoO2OKI7PszcT/vG6G0XWQC5VmyBVbLIoySTAhs7/JQGjOUE4soUwLeythI6opQ5tOxYbgL7+8SloXNd+r+XdetX5fxFGGEziFc/DhCupwCw1oAoMhPMMrvDnSeXHenY9Fa8kpZo7hD5zPH+F3jY4=\n\nWf\n\nAAAB6nicdVDJSgNBEK1xjXGLevTSGARPoTuISW4BLx7jkgWSIfR0epImPQvdPUIYAv6AFw+KePUX/BFvfoh3exIFFX1Q8Hiviqp6XiyFNhi/OQuLS8srq7m1/PrG5tZ2YWe3paNEMd5kkYxUx6OaSxHyphFG8k6sOA08ydve+DTz29dcaRGFV2YSczegw1D4glFjpct23+8XiriEMSaEoIyQygm2pFarlkkVkcyyKNaP319uAKDRL
7z2BhFLAh4aJqnWXYJj46ZUGcEkn+Z7ieYxZWM65F1LQxpw7aazU6fo0CoD5EfKVmjQTP0+kdJA60ng2c6AmpH+7WXiX143MX7VTUUYJ4aHbL7ITyQyEcr+RgOhODNyYgllSthbERtRRZmx6eRtCF+fov9Jq1wiuETObRoXMEcO9uEAjoBABepwBg1oAoMh3MI9PDjSuXMenad564LzObMHP+A8fwADC5CU\nAAAB6nicdVC7SgNBFL3rM8ZX1NJmMAhWYSaISbqAjWV85AFJCLOT2WTI7IOZWSEs6WxtLBSxtbOy8Efs/BB7ZxMFFT1w4XDOvdx7jxtJoQ3Gb87c/MLi0nJmJbu6tr6xmdvabugwVozXWShD1XKp5lIEvG6EkbwVKU59V/KmOzpO/eYlV1qEwYUZR7zr00EgPMGosdJ5s+f1cnlcwBgTQlBKSOkIW1KplIukjEhqWeSrh+8vV89PmVov99rphyz2eWCYpFq3CY5MN6HKCCb5JNuJNY8oG9EBb1saUJ/rbjI9dYL2rdJHXqhsBQZN1e8TCfW1Hvuu7fSpGerfXir+5bVj45W7iQii2PCAzRZ5sUQmROnfqC8UZ0aOLaFMCXsrYkOqKDM2nawN4etT9D9pFAsEF8ipTeMMZsjALuzBARAoQRVOoAZ1YDCAa7iFO0c6N8698zBrnXM+Z3bgB5zHD+S7kfw=\nAAAB6nicdVC7SgNBFL3rM8ZX1NJmMAhWYSaISbqAjWV85AFJCLOT2WTI7IOZWSEs6WxtLBSxtbOy8Efs/BB7ZxMFFT1w4XDOvdx7jxtJoQ3Gb87c/MLi0nJmJbu6tr6xmdvabugwVozXWShD1XKp5lIEvG6EkbwVKU59V/KmOzpO/eYlV1qEwYUZR7zr00EgPMGosdJ5s+f1cnlcwBgTQlBKSOkIW1KplIukjEhqWeSrh+8vV89PmVov99rphyz2eWCYpFq3CY5MN6HKCCb5JNuJNY8oG9EBb1saUJ/rbjI9dYL2rdJHXqhsBQZN1e8TCfW1Hvuu7fSpGerfXir+5bVj45W7iQii2PCAzRZ5sUQmROnfqC8UZ0aOLaFMCXsrYkOqKDM2nawN4etT9D9pFAsEF8ipTeMMZsjALuzBARAoQRVOoAZ1YDCAa7iFO0c6N8698zBrnXM+Z3bgB5zHD+S7kfw=\nAAAB6nicdVDLSgMxFL1TX7W+qi7dBIvgqiRd2HZXcOOyPvqAtpRMmmlDM5khyQhl6Ce4caGIW7/InX9jpq2gogcuHM65l3vv8WMpjMX4w8utrW9sbuW3Czu7e/sHxcOjtokSzXiLRTLSXZ8aLoXiLSus5N1Ycxr6knf86WXmd+65NiJSd3YW80FIx0oEglHrpNvOMBgWS7iMMSaEoIyQ6gV2pF6vVUgNkcxyKMEKzWHxvT+KWBJyZZmkxvQIju0gpdoKJvm80E8Mjymb0jHvOapoyM0gXZw6R2dOGaEg0q6URQv1+0RKQ2Nmoe86Q2on5reXiX95vcQGtUEqVJxYrthyUZBIZCOU/Y1GQnNm5cwRyrRwtyI2oZoy69IpuBC+PkX/k3alTHCZXONS42YVRx5O4BTOgUAVGnAFTWgBgzE8wBM8e9J79F6812VrzlvNHMMPeG+fekSN9g==\n\n,AAAB6HicbVDLSgNBEOyNrxhfUY9eBoPgIYRdL3oMePEkiZgHJEuYnfQmY2Znl5lZISz5Ai8eFPHqV/gd3vwbJ4+DJhY0FFXddHcFieDauO63k1tb39jcym8Xdnb39g+Kh0dNHaeKYYPFIlbtgGoUXGLDcCOwnSikUSCwFYyup37rEZXmsbw34wT9iA4kDzmjxkr1cq9YcivuDGSVeAtSqpY/b8Gi1it+dfsxSyOUhgmqdcdzE+NnVBnOBE4K3VRjQtmIDrBjqaQRaj+bHTohZ1bpkzBWtqQhM/X3REYjrcdRYDsjaoZ62ZuK/3md1IRXfsZlkhqUbL4oTAUxMZl+TfpcIT
NibAllittbCRtSRZmx2RRsCN7yy6ukeVHx3IpXt2ncwRx5OIFTOAcPLqEKN1CDBjBAeIIXeHUenGfnzXmft+acxcwx/IHz8QNrKY4u\n\nAAAB6HicbVC7SgNBFL0bXzFqjIqVzWAQLELYtdEyYGMjJGIekCxhdnI3GTP7YGZWCEu+wMZCEVu/wu+ws/Y3LJw8Ck08cOFwzr3ce48XC660bX9amZXVtfWN7GZua3snv1vY22+oKJEM6ywSkWx5VKHgIdY11wJbsUQaeAKb3vBy4jfvUSoehbd6FKMb0H7Ifc6oNlKt1C0U7bI9BVkmzpwUK6X36++vw3y1W/jo9CKWBBhqJqhSbceOtZtSqTkTOM51EoUxZUPax7ahIQ1Quen00DE5MUqP+JE0FWoyVX9PpDRQahR4pjOgeqAWvYn4n9dOtH/hpjyME40hmy3yE0F0RCZfkx6XyLQYGUKZ5OZWwgZUUqZNNjkTgrP48jJpnJUdu+zUTBo3MEMWjuAYTsGBc6jAFVShDgwQHuAJnq0769F6sV5nrRlrPnMAf2C9/QAbdpAv\nAAAB6HicbVC7SgNBFL0bXzFqjIqVzWAQLELYtdEyYGMjJGIekCxhdnI3GTP7YGZWCEu+wMZCEVu/wu+ws/Y3LJw8Ck08cOFwzr3ce48XC660bX9amZXVtfWN7GZua3snv1vY22+oKJEM6ywSkWx5VKHgIdY11wJbsUQaeAKb3vBy4jfvUSoehbd6FKMb0H7Ifc6oNlKt1C0U7bI9BVkmzpwUK6X36++vw3y1W/jo9CKWBBhqJqhSbceOtZtSqTkTOM51EoUxZUPax7ahIQ1Quen00DE5MUqP+JE0FWoyVX9PpDRQahR4pjOgeqAWvYn4n9dOtH/hpjyME40hmy3yE0F0RCZfkx6XyLQYGUKZ5OZWwgZUUqZNNjkTgrP48jJpnJUdu+zUTBo3MEMWjuAYTsGBc6jAFVShDgwQHuAJnq0769F6sV5nrRlrPnMAf2C9/QAbdpAv\nAAAB6HicbVA9SwNBEJ2LXzF+RS1tFoNgIeHOJpYBG8tEzAckR9jbzCVr9vaO3T0hHPkFNhaK2PqT7Pw3bpIrNPHBwOO9GWbmBYng2rjut1PY2Nza3inulvb2Dw6PyscnbR2nimGLxSJW3YBqFFxiy3AjsJsopFEgsBNMbud+5wmV5rF8MNME/YiOJA85o8ZKzatBueJW3QXIOvFyUoEcjUH5qz+MWRqhNExQrXuemxg/o8pwJnBW6qcaE8omdIQ9SyWNUPvZ4tAZubDKkISxsiUNWai/JzIaaT2NAtsZUTPWq95c/M/rpSa88TMuk9SgZMtFYSqIicn8azLkCpkRU0soU9zeStiYKsqMzaZkQ/BWX14n7euq51a9plup3+dxFOEMzuESPKhBHe6gAS1ggPAMr/DmPDovzrvzsWwtOPnMKfyB8/kDdpmMvA==\n\n,AAAB6HicbVDLSgNBEOyNrxhfUY9eBoPgIYRdL3oMePEkiZgHJEuYnfQmY2Znl5lZISz5Ai8eFPHqV/gd3vwbJ4+DJhY0FFXddHcFieDauO63k1tb39jcym8Xdnb39g+Kh0dNHaeKYYPFIlbtgGoUXGLDcCOwnSikUSCwFYyup37rEZXmsbw34wT9iA4kDzmjxkr1cq9YcivuDGSVeAtSqpY/b8Gi1it+dfsxSyOUhgmqdcdzE+NnVBnOBE4K3VRjQtmIDrBjqaQRaj+bHTohZ1bpkzBWtqQhM/X3REYjrcdRYDsjaoZ62ZuK/3md1IRXfsZlkhqUbL4oTAUxMZl+TfpcITNibAllittbCRtSRZmx2RRsCN7yy6ukeVHx3IpXt2ncwRx5OIFTOAcPLqEKN1CDBjBAeIIXeHUenGfnzXmft+acxcwx/IHz8QNrKY4u\n\nAAAB6HicbVC7SgNBFL0bXzFqjIqVzWAQLELYtdEyYGMjJGIekCxhdnI3GTP7YGZWCEu+wMZCEVu/w
u+ws/Y3LJw8Ck08cOFwzr3ce48XC660bX9amZXVtfWN7GZua3snv1vY22+oKJEM6ywSkWx5VKHgIdY11wJbsUQaeAKb3vBy4jfvUSoehbd6FKMb0H7Ifc6oNlKt1C0U7bI9BVkmzpwUK6X36++vw3y1W/jo9CKWBBhqJqhSbceOtZtSqTkTOM51EoUxZUPax7ahIQ1Quen00DE5MUqP+JE0FWoyVX9PpDRQahR4pjOgeqAWvYn4n9dOtH/hpjyME40hmy3yE0F0RCZfkx6XyLQYGUKZ5OZWwgZUUqZNNjkTgrP48jJpnJUdu+zUTBo3MEMWjuAYTsGBc6jAFVShDgwQHuAJnq0769F6sV5nrRlrPnMAf2C9/QAbdpAv\nAAAB6HicbVC7SgNBFL0bXzFqjIqVzWAQLELYtdEyYGMjJGIekCxhdnI3GTP7YGZWCEu+wMZCEVu/wu+ws/Y3LJw8Ck08cOFwzr3ce48XC660bX9amZXVtfWN7GZua3snv1vY22+oKJEM6ywSkWx5VKHgIdY11wJbsUQaeAKb3vBy4jfvUSoehbd6FKMb0H7Ifc6oNlKt1C0U7bI9BVkmzpwUK6X36++vw3y1W/jo9CKWBBhqJqhSbceOtZtSqTkTOM51EoUxZUPax7ahIQ1Quen00DE5MUqP+JE0FWoyVX9PpDRQahR4pjOgeqAWvYn4n9dOtH/hpjyME40hmy3yE0F0RCZfkx6XyLQYGUKZ5OZWwgZUUqZNNjkTgrP48jJpnJUdu+zUTBo3MEMWjuAYTsGBc6jAFVShDgwQHuAJnq0769F6sV5nrRlrPnMAf2C9/QAbdpAv\nAAAB6HicbVA9SwNBEJ2LXzF+RS1tFoNgIeHOJpYBG8tEzAckR9jbzCVr9vaO3T0hHPkFNhaK2PqT7Pw3bpIrNPHBwOO9GWbmBYng2rjut1PY2Nza3inulvb2Dw6PyscnbR2nimGLxSJW3YBqFFxiy3AjsJsopFEgsBNMbud+5wmV5rF8MNME/YiOJA85o8ZKzatBueJW3QXIOvFyUoEcjUH5qz+MWRqhNExQrXuemxg/o8pwJnBW6qcaE8omdIQ9SyWNUPvZ4tAZubDKkISxsiUNWai/JzIaaT2NAtsZUTPWq95c/M/rpSa88TMuk9SgZMtFYSqIicn8azLkCpkRU0soU9zeStiYKsqMzaZkQ/BWX14n7euq51a9plup3+dxFOEMzuESPKhBHe6gAS1ggPAMr/DmPDovzrvzsWwtOPnMKfyB8/kDdpmMvA==\n\nbbAAAB6nicbVDLSgNBEOyNrxhfUY9eBoPiKex6iceAF4+JmgckS5id9CZDZmeXmVkhLPkELx4U8eqH+A3exJ9x8jhoYkFDUdVNd1eQCK6N6345ubX1jc2t/HZhZ3dv/6B4eNTUcaoYNlgsYtUOqEbBJTYMNwLbiUIaBQJbweh66rceUGkey3szTtCP6EDykDNqrHQX9IJeseSW3RnIKvEWpFQ9r39/AECtV/zs9mOWRigNE1Trjucmxs+oMpwJnBS6qcaEshEdYMdSSSPUfjY7dULOrNInYaxsSUNm6u+JjEZaj6PAdkbUDPWyNxX/8zqpCa/8jMskNSjZfFGYCmJiMv2b9LlCZsTYEsoUt7cSNqSKMmPTKdgQvOWXV0nzsuy5Za9u07iFOfJwAqdwAR5UoAo3UIMGMBjAIzzDiyOcJ+fVeZu35pzFzDH8gfP+A1lMkBk=\n\nAAAB6nicbVA9SwNBEJ2LXzF+RS1tFoNiFe5stAzYWCZqPiAJYW+zlyzZ2zt254Rw5CdYaBERW8H/Ymkn/hk3H4UmPhh4vDfDzDw/lsKg6345mZXVtfWN7GZua3tndy+/f1AzUaIZr7JIRrrhU8OlULyKAiVvxJrT0Je87g+uJn79nmsjInWHw5i3Q9pTIhCMopVu/Y7fyRfcojsFWSbenBRKp5Xvj/Hje7mT/2x1I5aEXCGT1Jim58bYTqlGwSQ
f5VqJ4TFlA9rjTUsVDblpp9NTR+TEKl0SRNqWQjJVf0+kNDRmGPq2M6TYN4veRPzPayYYXLZToeIEuWKzRUEiCUZk8jfpCs0ZyqEllGlhbyWsTzVlaNPJ2RC8xZeXSe286LlFr2LTuIEZsnAEx3AGHlxACa6hDFVg0IMHGMOzI50n58V5nbVmnPnMIfyB8/YDz/2R8Q==\nAAAB6nicbVA9SwNBEJ2LXzF+RS1tFoNiFe5stAzYWCZqPiAJYW+zlyzZ2zt254Rw5CdYaBERW8H/Ymkn/hk3H4UmPhh4vDfDzDw/lsKg6345mZXVtfWN7GZua3tndy+/f1AzUaIZr7JIRrrhU8OlULyKAiVvxJrT0Je87g+uJn79nmsjInWHw5i3Q9pTIhCMopVu/Y7fyRfcojsFWSbenBRKp5Xvj/Hje7mT/2x1I5aEXCGT1Jim58bYTqlGwSQf5VqJ4TFlA9rjTUsVDblpp9NTR+TEKl0SRNqWQjJVf0+kNDRmGPq2M6TYN4veRPzPayYYXLZToeIEuWKzRUEiCUZk8jfpCs0ZyqEllGlhbyWsTzVlaNPJ2RC8xZeXSe286LlFr2LTuIEZsnAEx3AGHlxACa6hDFVg0IMHGMOzI50n58V5nbVmnPnMIfyB8/YDz/2R8Q==\nAAAB6nicbVA9SwNBEJ2LXzF+RS1tFoNgFfZstAzYWMaPxEByhL3NXLJkb+/Y3RPCkZ9gY6GIrb/Izn/jJrlCEx8MPN6bYWZemEphLKXfXmltfWNzq7xd2dnd2z+oHh61TZJpji2eyER3QmZQCoUtK6zETqqRxaHEx3B8PfMfn1AbkagHO0kxiNlQiUhwZp10H/bDfrVG63QOskr8gtSgQLNf/eoNEp7FqCyXzJiuT1Mb5ExbwSVOK73MYMr4mA2x66hiMZogn586JWdOGZAo0a6UJXP190TOYmMmceg6Y2ZHZtmbif953cxGV0EuVJpZVHyxKMoksQmZ/U0GQiO3cuII41q4WwkfMc24delUXAj+8surpH1R92ndv6W1xl0RRxlO4BTOwYdLaMANNKEFHIbwDK/w5knvxXv3PhatJa+YOYY/8D5/ADgAjcc=\n\nW2\n\nAAAB6nicdVDLSgNBEOyNrxhfUY9eBoPiadmNWU1uAS8eEzWJkCxhdjKbDJl9MDMrhCWf4MWDIl79EL/Bm/gzTjYKKlrQUFR1093lxZxJZVlvRm5hcWl5Jb9aWFvf2Nwqbu+0ZZQIQlsk4pG49rCknIW0pZji9DoWFAcepx1vfDbzOzdUSBaFV2oSUzfAw5D5jGClpctOv9wvlizzxLEdp4Ys06rUqs5xRpxyuYJs08pQqh82318AoNEvvvYGEUkCGirCsZRd24qVm2KhGOF0WuglksaYjPGQdjUNcUClm2anTtGBVgbIj4SuUKFM/T6R4kDKSeDpzgCrkfztzcS/vG6i/KqbsjBOFA3JfJGfcKQiNPsbDZigRPGJJpgIpm9FZIQFJkqnU9AhfH2K/iftsmlbpt3UaVzAHHnYg304AhtOoQ7n0IAWEBjCLdzDg8GNO+PReJq35ozPmV34AeP5A2+hkCs=\nAAAB6nicdVDLSgMxFM3UV62vqks3waK4GjK1o+2u4MZlq/YB7VAyaaYNzTxIMkIZ+gkudFERt4L/4tKd+DOmUwUVPXDhcM693HuPG3EmFUJvRmZhcWl5JbuaW1vf2NzKb+80ZRgLQhsk5KFou1hSzgLaUExx2o4Exb7Lacsdnc381jUVkoXBlRpH1PHxIGAeI1hp6bLVK/byBWSe2JZtVyAyUalSto9TYheLJWiZKEWhelh/f5nePtd6+dduPySxTwNFOJayY6FIOQkWihFOJ7luLGmEyQgPaEfTAPtUOkl66gQeaKUPvVDoChRM1e8TCfalHPuu7vSxGsrf3kz8y+vEyis7CQuiWNGAzBd5MYcqhLO/YZ8JShQfa4KJYPpWSIZYYKJ0Ojkdwten8H/SLJo
WMq26TuMCzJEFe2AfHAELnIIqOAc10AAEDMANmIJ7gxt3xoPxOG/NGJ8zu+AHjKcP5lKSAw==\nAAAB6nicdVDLSgMxFM3UV62vqks3waK4GjK1o+2u4MZlq/YB7VAyaaYNzTxIMkIZ+gkudFERt4L/4tKd+DOmUwUVPXDhcM693HuPG3EmFUJvRmZhcWl5JbuaW1vf2NzKb+80ZRgLQhsk5KFou1hSzgLaUExx2o4Exb7Lacsdnc381jUVkoXBlRpH1PHxIGAeI1hp6bLVK/byBWSe2JZtVyAyUalSto9TYheLJWiZKEWhelh/f5nePtd6+dduPySxTwNFOJayY6FIOQkWihFOJ7luLGmEyQgPaEfTAPtUOkl66gQeaKUPvVDoChRM1e8TCfalHPuu7vSxGsrf3kz8y+vEyis7CQuiWNGAzBd5MYcqhLO/YZ8JShQfa4KJYPpWSIZYYKJ0Ojkdwten8H/SLJoWMq26TuMCzJEFe2AfHAELnIIqOAc10AAEDMANmIJ7gxt3xoPxOG/NGJ8zu+AHjKcP5lKSAw==\nAAAB6nicdVDLSsNAFJ3UV62vqks3g0VwFSax0XZXcOOyPvqANpTJdNIOnUzCzEQooZ/gxoUibv0id/6N07SCih64cDjnXu69J0g4UxqhD6uwsrq2vlHcLG1t7+zulfcP2ipOJaEtEvNYdgOsKGeCtjTTnHYTSXEUcNoJJpdzv3NPpWKxuNPThPoRHgkWMoK1kW47A3dQriD73HM8rw6Rjar1mneWE891q9CxUY4KWKI5KL/3hzFJIyo04VipnoMS7WdYakY4nZX6qaIJJhM8oj1DBY6o8rP81Bk8McoQhrE0JTTM1e8TGY6UmkaB6YywHqvf3lz8y+ulOqz5GRNJqqkgi0VhyqGO4fxvOGSSEs2nhmAimbkVkjGWmGiTTsmE8PUp/J+0XdtBtnONKo2bZRxFcASOwSlwwAVogCvQBC1AwAg8gCfwbHHr0XqxXhetBWs5cwh+wHr7BE5Vjdk=\n\nWy, 
by\n\nAAAB73icdVDLSgNBEOz1GeMr6tHLYBA8yDIbE2NuAS8eo5gHJMsyO5kkQ2YfzswKSwj4DV48KOLVD/BHvPkh3p0kCipa0FBUddPd5ceCK43xmzU3v7C4tJxZya6urW9s5ra2GypKJGV1GolItnyimOAhq2uuBWvFkpHAF6zpD08nfvOaScWj8FKnMXMD0g95j1OijdRqeumhj7zUy+WxfVws4FIZYRsXK5WjypSUHKeMHBtPka8W319uAKDm5V473YgmAQs1FUSptoNj7Y6I1JwKNs52EsViQoekz9qGhiRgyh1N7x2jfaN0US+SpkKNpur3iREJlEoD33QGRA/Ub28i/uW1E907cUc8jBPNQjpb1EsE0hGaPI+6XDKqRWoIoZKbWxEdEEmoNhFlTQhfn6L/SaNgO9h2zk0aFzBDBnZhDw7AgTJU4QxqUAcKAm7hHh6sK+vOerSeZq1z1ufMDvyA9fwBUTOSdQ==\nAAAB73icdVC7SgNBFJ2NrxhfUUubwSBYyDIbE2O6gI1lFPOAuCyzk9lkyOzDmVlhWdL5BTYWitiKlYU/YueH2DvZKKjogQuHc+7l3nvciDOpEHozcjOzc/ML+cXC0vLK6lpxfaMtw1gQ2iIhD0XXxZJyFtCWYorTbiQo9l1OO+7oaOJ3LqmQLAzOVBJR28eDgHmMYKWlbsdJ9lzoJE6xhMyDShlVaxCZqFKv79czUrWsGrRMlKHUqLy/XD0/5ZtO8fW8H5LYp4EiHEvZs1Ck7BQLxQin48J5LGmEyQgPaE/TAPtU2ml27xjuaKUPvVDoChTM1O8TKfalTHxXd/pYDeVvbyL+5fVi5R3aKQuiWNGATBd5MYcqhJPnYZ8JShRPNMFEMH0rJEMsMFE6ooIO4etT+D9pl00LmdaJTuMUTJEHW2Ab7AIL1EADHIMmaAECOLgGt+DOuDBujHvjYdqaMz5nNsEPGI8fMvKT3Q==\nAAAB73icdVC7SgNBFJ2NrxhfUUubwSBYyDIbE2O6gI1lFPOAuCyzk9lkyOzDmVlhWdL5BTYWitiKlYU/YueH2DvZKKjogQuHc+7l3nvciDOpEHozcjOzc/ML+cXC0vLK6lpxfaMtw1gQ2iIhD0XXxZJyFtCWYorTbiQo9l1OO+7oaOJ3LqmQLAzOVBJR28eDgHmMYKWlbsdJ9lzoJE6xhMyDShlVaxCZqFKv79czUrWsGrRMlKHUqLy/XD0/5ZtO8fW8H5LYp4EiHEvZs1Ck7BQLxQin48J5LGmEyQgPaE/TAPtU2ml27xjuaKUPvVDoChTM1O8TKfalTHxXd/pYDeVvbyL+5fVi5R3aKQuiWNGATBd5MYcqhJPnYZ8JShRPNMFEMH0rJEMsMFE6ooIO4etT+D9pl00LmdaJTuMUTJEHW2Ab7AIL1EADHIMmaAECOLgGt+DOuDBujHvjYdqaMz5nNsEPGI8fMvKT3Q==\nAAAB73icdVDLSsNAFJ34rPVVdelmsAguJExqa+2u4MZlFfuANoTJdNIOnUzizEQIoT/hxoUibv0dd/6N07SCih64cDjnXu69x485UxqhD2tpeWV1bb2wUdzc2t7ZLe3td1SUSELbJOKR7PlYUc4EbWumOe3FkuLQ57TrTy5nfveeSsUicavTmLohHgkWMIK1kXpdLz31oZd6pTKyz6sVVKtDZKNqo3HWyEnNcerQsVGOMlig5ZXeB8OIJCEVmnCsVN9BsXYzLDUjnE6Lg0TRGJMJHtG+oQKHVLlZfu8UHhtlCINImhIa5ur3iQyHSqWhbzpDrMfqtzcT//L6iQ4u3IyJONFUkPmiIOFQR3D2PBwySYnmqSGYSGZuhWSMJSbaRFQ0IXx9Cv8nnYrtINu5RuXmzSKOAjgER+AEOKAOmuAKtEAbEMDBA3gCz9ad9Wi9WK/z1iVrMXMAfsB6+wTIbI/X\n\nV 
y\ncAAAB7XicdVBNTwIxEJ3FL8Qv1KOXRmLCiXRRWLyRePGIRhYSRNItBSrd7qbtmhDCf/DiQWO8+n+86cmfYgFN1OhLJnl5byYz84JYcG0wfnVSC4tLyyvp1cza+sbmVnZ7x9dRoiir00hEqhkQzQSXrG64EawZK0bCQLBGMDyZ+o0bpjSP5IUZxawdkr7kPU6JsZLvdyi6GnWyOVw48rxyGSNcOHSPXVyyxC17JVxBbgHPkKsW82/vAFDrZF8uuxFNQiYNFUTrlotj0x4TZTgVbJK5TDSLCR2SPmtZKknIdHs8u3aCDqzSRb1I2ZIGzdTvE2MSaj0KA9sZEjPQv72p+JfXSkyv0h5zGSeGSTpf1EsEMhGavo66XDFqxMgSQhW3tyI6IIpQYwPK2BC+PkX/E79YcG0yZzaNc5gjDXuwD3lwwYMqnEIN6kDhGm7hHh6cyLlzHp2neWvK+ZzZhR9wnj8AokWRcQ==\n\nAAAB7XicdVDLSgMxFM3UV62vqks3wSJ0NWSq7dRdwY3LKnZaaGvJpJk2NvMgyQjD0H9w40IRt/6I+AHudCv4HaatgooeuHA4517uvceNOJMKoRcjMze/sLiUXc6trK6tb+Q3txwZxoLQBgl5KFoulpSzgDYUU5y2IkGx73LadEdHE795SYVkYXCmkoh2fTwImMcIVlpynB6B50kvX0DmgW1XKggic986tFBZE6til1EVWiaaolArFV/f3h+f6r38c6cfktingSIcS9m2UKS6KRaKEU7HuU4saYTJCA9oW9MA+1R20+m1Y7inlT70QqErUHCqfp9IsS9l4ru608dqKH97E/Evrx0rr9pNWRDFigZktsiLOVQhnLwO+0xQoniiCSaC6VshGWKBidIB5XQIX5/C/4lTMi2dzIlO4xTMkAU7YBcUgQVsUAPHoA4agIALcAVuwK0RGtfGnXE/a80YnzPb4AeMhw/oOpPk\nAAAB7XicdVDLSgMxFM3UV62vqks3wSJ0NWSq7dRdwY3LKnZaaGvJpJk2NvMgyQjD0H9w40IRt/6I+AHudCv4HaatgooeuHA4517uvceNOJMKoRcjMze/sLiUXc6trK6tb+Q3txwZxoLQBgl5KFoulpSzgDYUU5y2IkGx73LadEdHE795SYVkYXCmkoh2fTwImMcIVlpynB6B50kvX0DmgW1XKggic986tFBZE6til1EVWiaaolArFV/f3h+f6r38c6cfktingSIcS9m2UKS6KRaKEU7HuU4saYTJCA9oW9MA+1R20+m1Y7inlT70QqErUHCqfp9IsS9l4ru608dqKH97E/Evrx0rr9pNWRDFigZktsiLOVQhnLwO+0xQoniiCSaC6VshGWKBidIB5XQIX5/C/4lTMi2dzIlO4xTMkAU7YBcUgQVsUAPHoA4agIALcAVuwK0RGtfGnXE/a80YnzPb4AeMhw/oOpPk\nAAAB7XicdVDLSsNAFJ3UV62vqks3g0VwFSZqm7oruHFZxaaFNpbJdNKOnWTCzEQIof/gxoUibv0fd/6N04egogcuHM65l3vvCRLOlEbowyosLa+srhXXSxubW9s75d09T4lUEtoiggvZCbCinMW0pZnmtJNIiqOA03Ywvpj67XsqFRPxjc4S6kd4GLOQEayN5Hl9Am+zfrmC7DPXrdUQRPapc+6gqiFOza2iOnRsNEMFLNDsl997A0HSiMaacKxU10GJ9nMsNSOcTkq9VNEEkzEe0q6hMY6o8vPZtRN4ZJQBDIU0FWs4U79P5DhSKosC0xlhPVK/van4l9dNdVj3cxYnqaYxmS8KUw61gNPX4YBJSjTPDMFEMnMrJCMsMdEmoJIJ4etT+D/xTmzHJHOFKo3rRRxFcAAOwTFwgAsa4BI0QQsQcAcewBN4toT1aL1Yr/PWgrWY2Qc/YL19An54jx0=\n\nV, b, 
!\n\nAAACB3icdVDLSgMxFL1TX7W+Rl0KEiyCi1Iyah/uCm5cSRVbC20ZMmnahmYeJBmhDN258VfcuFCkW3/BnX9j+hBU9EDgcM653NzjRYIrjfGHlVpYXFpeSa9m1tY3Nrfs7Z26CmNJWY2GIpQNjygmeMBqmmvBGpFkxPcEu/UG5xP/9o5JxcPgRg8j1vZJL+BdTok2kmvv192kJUy+Q0Y55LlznkOt0Gc94tpZnD8tlYpFjHD+xDlzcMEQp1gq4DJy8niKbCU3vgSDqmu/tzohjX0WaCqIUk0HR7qdEKk5FWyUacWKRYQOSI81DQ2Iz1Q7md4xQodG6aBuKM0LNJqq3ycS4is19D2T9Inuq9/eRPzLa8a6W24nPIhizQI6W9SNBdIhmpSCOlwyqsXQEEIlN39FtE8kodpUlzElfF2K/if147xjmrkybVzDDGnYgwM4AgdKUIELqEINKNzDIzzDi/VgPVmv1ngWTVnzmV34AevtE2XUmoY=\nAAACB3icdVDLSgMxFM3UV61aR8WVIMEiuCglo7ZTdwU3boQqthXaMmQyaRuaeZBkhDJ058ZfceNCkW79BXeu/Q0Xpg9BRQ8EDuecy809bsSZVAi9Gam5+YXFpfRyZmV1LbtubmzWZRgLQmsk5KG4drGknAW0ppji9DoSFPsupw23fzr2GzdUSBYGV2oQ0baPuwHrMIKVlhxzt+4kLa7zHh7moevMeB62Qp92sWPmUOHYtkslBFHhyDqxUFETq2QXURlaBTRBrpIfnX+8b2erjvna8kIS+zRQhGMpmxaKVDvBQjHC6TDTiiWNMOnjLm1qGmCfynYyuWMI97XiwU4o9AsUnKjfJxLsSznwXZ30serJ395Y/MtrxqpTbicsiGJFAzJd1Ik5VCEclwI9JihRfKAJJoLpv0LSwwITpavL6BK+LoX/k/phwdLNXOg2LsEUabAD9sABsIANKuAMVEENEHAL7sEjeDLujAfj2RhNoyljNrMFfsB4+QQWIZyH\nAAACB3icdVDLSgMxFM3UV61aR8WVIMEiuCglo7ZTdwU3boQqthXaMmQyaRuaeZBkhDJ058ZfceNCkW79BXeu/Q0Xpg9BRQ8EDuecy809bsSZVAi9Gam5+YXFpfRyZmV1LbtubmzWZRgLQmsk5KG4drGknAW0ppji9DoSFPsupw23fzr2GzdUSBYGV2oQ0baPuwHrMIKVlhxzt+4kLa7zHh7moevMeB62Qp92sWPmUOHYtkslBFHhyDqxUFETq2QXURlaBTRBrpIfnX+8b2erjvna8kIS+zRQhGMpmxaKVDvBQjHC6TDTiiWNMOnjLm1qGmCfynYyuWMI97XiwU4o9AsUnKjfJxLsSznwXZ30serJ395Y/MtrxqpTbicsiGJFAzJd1Ik5VCEclwI9JihRfKAJJoLpv0LSwwITpavL6BK+LoX/k/phwdLNXOg2LsEUabAD9sABsIANKuAMVEENEHAL7sEjeDLujAfj2RhNoyljNrMFfsB4+QQWIZyH\nAAACB3icdVDLSgMxFM34rPU16lKQYBFclCGjtlN3BTcuq9gHdIYhk0nb0MyDJCOUoTs3/oobF4q49Rfc+TemD0FFDwQO55zLzT1ByplUCH0YC4tLyyurhbXi+sbm1ra5s9uSSSYIbZKEJ6ITYEk5i2lTMcVpJxUURwGn7WB4MfHbt1RIlsQ3apRSL8L9mPUYwUpLvnnQ8nOX63yIx2UY+HNehm4S0T72zRKyzhynWkUQWaf2uY0qmthVp4Jq0LbQFCUwR8M3390wIVlEY0U4lrJro1R5ORaKEU7HRTeTNMVkiPu0q2mMIyq9fHrHGB5pJYS9ROgXKzhVv0/kOJJyFAU6GWE1kL+9ifiX181Ur+blLE4zRWMyW9TLOFQJnJQCQyYoUXykCSaC6b9CMsACE6WrK+oSvi6F/5PWiWXrZq5QqX49r6MA9sEhOAY2cEAdXIIGaAIC7sADeALPxr3xaLwYr
7PogjGf2QM/YLx9AnFEmRQ=\n\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\n\nt AAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\n\n(b) Policy 
parametrization\n\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\nAAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\n\nt AAAB63icbVA9SwNBEJ2LXzF+RS1tFoNgFe5E0DJgYxnFxEByhL3NJlmye3vszgnhyF+wsVDE1j9k579xc7lCEx8MPN6bYWZelEhh0fe/vdLa+sbmVnm7srO7t39QPTxqW50axltMS206EbVcipi3UKDkncRwqiLJH6PJzdx/fOLGCh0/4DThoaKjWAwFo5hLPdT9as2v+znIKgkKUoMCzX71qzfQLFU8Riaptd3ATzDMqEHBJJ9VeqnlCWUTOuJdR2OquA2z/NYZOXPKgAy1cRUjydXfExlV1k5V5DoVxbFd9ubif143xeF1mIk4SZHHbLFomEqCmswfJwNhOEM5dYQyI9ythI2poQxdPBUXQrD88ippX9QDvx7cXdYa90UcZTiBUziHAK6gAbfQhBYwGMMzvMKbp7wX7937WLSWvGLmGP7A+/wBOdyOZQ==\n\nFigure 2: Reinforcement learning (RL) of of marked temporal point processes (MTPPs). Panel\n(a) shows the type of data and representation used in RL of MTPPs. 
Panel (b) shows the policy parametrization used by our policy gradient method.

in Figure 1. Second, our policy is defined by a conditional intensity function (and a mark distribution), which is used to sample the times (and marks) of the agent's actions. Here, note that a sampled agent's action may need to be resampled due to the occurrence of new feedback events before the sampled time. In contrast, previous work has used probability distributions (or, in some cases, deterministic functions) as policies.

Remarkably, the above problem definition naturally fits numerous problems in a wide variety of application domains, particularly in the context of online social and information systems. For example, in personalized teaching in online learning platforms, the platform that shows content items to learners is the agent, the platform takes an action when it shows an item to a learner, the learners are the environment, and the probability that the learner recalls an item defines the reward. In viral marketing in social networks, a user who aims to increase the visibility of her posts is the agent, the user takes an action when she posts a message, her followers' feeds form the environment, and the visibility (or attention) she receives defines the reward. In all these cases, the environment distribution $p^*_{F}$ may be highly complex and thus our policy gradient method will only assume that it can sample from $p^*_{F}$. In other words, the environment distribution will be considered a black box.

3 Proposed policy gradient method

In this section, we tackle the reinforcement learning problem defined by Eq. 3 using a novel policy gradient method for marked temporal point processes.
More specifically, we first leverage recurrent neural networks (RNNs) to parametrize the policy $p^*_{A;\theta}$ and then use stochastic gradient descent (SGD) to find the policy parameters $\theta$ that maximize the expected reward $\mathbb{E}[R^*]$.

Policy parametrization. In many application domains, at any time $t$, the (optimal) policy $p^*_{A;\theta}$ that maximizes the reward may depend on the previous history of the action events and the feedback events, $\mathcal{H}_t = \mathcal{A}_t \cup \mathcal{F}_t$, in an unknown and complex way. To capture such dependence, we parametrize the policy $p^*_{A;\theta}$ using a recurrent neural network (RNN), where we embed both the action events and the feedback events into real-valued vectors $h$, similarly to several recent state-of-the-art MTPP deep learning models [5, 11, 17]^3. Next, we elaborate further on our architecture^4, which we also summarize in Figure 2, and then discuss how to efficiently sample action events from the (optimal) policy.

— Input layer. After the $i$-th event occurs, be it an action event or a feedback event, the input layer converts the associated information, i.e., the time $t_i$, the marker $z_i$ (or $y_i$), and the type of event $e_i \in \{0, 1\}$, where $e_i = 0$ denotes action and $e_i = 1$ denotes feedback, into compact vectors. Specifically, it computes:

$$\boldsymbol{\tau}_i = W_t (t_i - t_{i-1}) + b_t, \qquad \mathbf{y}_i = W_y y_i + b_y \ \text{if } e_i = 0,$$
$$\mathbf{b}_i = W_a (1 - e_i) + W_f e_i + b_b, \qquad \mathbf{z}_i = W_z z_i + b_z \ \text{if } e_i = 1,$$

^3 Note that previous MTPP deep learning models aim to provide event predictions.
This is in contrast with the current work, which aims to provide optimal event interventions.

^4 Depending on the application domain, action events or feedback events may not contain marks and, thus, the architecture may be slightly simpler.

Algorithm 1: Returns the next action time
1: Input: Parameters $b$, $w_t$, $V$, $h_i$, last event time $t_0$
2: Output: Next action time $t$
3: $CDF(\cdot) \leftarrow$ cumulative distribution of the next arrival time
4: $u \leftarrow \text{UNIF}[0, 1]$
5: $t \leftarrow CDF^{-1}(u)$
6: while $t < T$ do
7:     $(s, z) \leftarrow \text{WAITUNTILNEXTFEEDBACK}(t)$
8:     if feedback arrived before $t$ then
9:         $CDF(\cdot) \leftarrow \text{MODIFY}(CDF(\cdot), s, z)$
10:        $t \leftarrow CDF^{-1}(u)$
11:    else
12:        return $t$
13:    end if
14: end while
15: return $t$

where $W_\bullet$, $b_t$, $b_y$, $b_z$ and $b_b$ are trainable weights. Moreover, note that we encode the action marks $y_i$ and feedback marks $z_i$ separately since they may belong to different domains. To this aim, one of the inputs $\mathbf{y}_i$ and $\mathbf{z}_i$ will be marked as absent using sentinel values, depending on whether $e_i = 0$ or $e_i = 1$, respectively. Finally, these signals are fed into the hidden layer, which we describe next.

— Hidden layer. This layer iteratively updates the latent embedding $h_{i-1}$ by taking the inputs of previous events from the input layer:

$$h_i = \tanh(W_h h_{i-1} + W_1 \boldsymbol{\tau}_i + W_2 \mathbf{y}_i + W_3 \mathbf{z}_i + W_4 \mathbf{b}_i + b_h), \qquad (4)$$

where $W_\bullet$ and $b_h$ are trainable weights.

— Output layer. The output layer computes the policy $p^*_{A;\theta} = (\lambda^*_\theta, m^*_\theta)$, i.e., the intensity function $\lambda^*_\theta$ and the mark distribution $m^*_\theta$. Assume the agent has generated $i$ events by time $t$; then, the output layer computes the intensity as:

$$\lambda^*_\theta(t) = \exp(b + w_t (t - t_i) + V h_i), \qquad (5)$$

where $V$, $b$ and $w_t$ are trainable weights and $t_i$ denotes the time of the $i$-th action event.
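For this exponential intensity, the inverse transform step $CDF^{-1}(u)$ used in Algorithm 1 admits a closed form between feedback events, since the compensator $\Lambda(t) = \int_{t_i}^{t} \lambda^*_\theta(s)\, ds$ can be integrated analytically. The following is a minimal sketch of ours under that assumption; the MODIFY step that accounts for arriving feedback is omitted, and the term $V h_i$ is collapsed into a precomputed scalar `vh`:

```python
import math

def next_action_time(u, t_i, b, w, vh):
    """Invert the CDF of the next action time for the intensity
    lambda(t) = exp(b + w * (t - t_i) + vh), assuming no feedback event
    arrives in between (Algorithm 1 would otherwise modify the CDF).

    The compensator is Lambda(t) = (c / w) * (exp(w * (t - t_i)) - 1)
    with c = exp(b + vh), and CDF(t) = 1 - exp(-Lambda(t)).
    """
    c = math.exp(b + vh)
    target = -math.log(1.0 - u)      # Lambda(t) must equal -log(1 - u)
    if abs(w) < 1e-12:               # constant intensity: t = t_i + target / c
        return t_i + target / c
    arg = 1.0 + w * target / c
    if arg <= 0.0:                   # for w < 0 the event may never occur
        return math.inf
    return t_i + math.log(arg) / w

# Sanity check against the constant-intensity case, lambda = exp(0) = 1,
# whose median inter-event time is log(2):
t = next_action_time(u=0.5, t_i=0.0, b=0.0, w=0.0, vh=0.0)
assert abs(t - math.log(2.0)) < 1e-9
```

When a feedback event does arrive before the sampled time, the hidden state and hence `vh` change, which is exactly the case Algorithm 1 handles by modifying the CDF and re-inverting the same uniform draw $u$.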
Here, $b$ encodes a base intensity level for the occurrence of the $(i+1)$-th action event, the term $w_t (t - t_i)$ encodes the influence of the $i$-th action event, and the term $V h_i$ encodes the influence of previous events. The particular choice of mark distribution $m^*_\theta$ depends on the application domain. Here, we experiment with discrete marks and thus model the marks using a multinomial distribution, i.e.,

$$\mathbb{P}[y_{i+1} = c] = \frac{\exp(V^y_{c,:} h_i)}{\sum_{l \in \mathcal{Y}} \exp(V^y_{l,:} h_i)}, \qquad (6)$$

where $\mathcal{Y}$ denotes the domain of the marks and $V^y$ are trainable weights.

Sampling action events from the policy. To implement the above policy $p^*_{A;\theta} = (\lambda^*_\theta, m^*_\theta)$, we need to be able to sample the action times $t$ and marks $y$ from the intensity function defined by Eq. 5 and the mark distribution defined by Eq. 6, respectively. While the latter reduces to sampling from a multinomial distribution, which is straightforward, the former requires developing a novel sampling algorithm leveraging inverse transform sampling, which we describe in Algorithm 1. The details of calculating $CDF(\cdot)$ and the related modifications are provided in Appendix C.

Maximizing the expected reward. In the following, we denote the expected reward as a function of the policy parameters $\theta$ as:

$$J(\theta) = \mathbb{E}_{\mathcal{A}_T \sim p^*_{A;\theta}(\cdot),\ \mathcal{F}_T \sim p^*_{F}(\cdot)}[R^*(T)]. \qquad (7)$$

Then, we find the optimal policy $p^*_{A;\theta}$ that maximizes the expected reward function $J(\theta)$ using stochastic gradient descent (SGD) [23], i.e., $\theta_{l+1} = \theta_l + \alpha_l \nabla_\theta J(\theta)|_{\theta = \theta_l}$.
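For concreteness, Eqs. 4-6 can be assembled into a single forward pass. The NumPy sketch below is a toy illustration of ours, not the paper's implementation: the dimensions, the random initialization, and the treatment of marks as scalars are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
H, M = 8, 5  # hidden size and number of marks (hypothetical)

# Trainable weights, randomly initialized for illustration.
W = {k: rng.normal(scale=0.1, size=s) for k, s in {
    "Wt": (H,), "Wy": (H,), "Wz": (H,), "Wa": (H,), "Wf": (H,),
    "Wh": (H, H), "W1": (H, H), "W2": (H, H), "W3": (H, H), "W4": (H, H),
    "V": (H,), "Vy": (M, H)}.items()}
b = {k: np.zeros(H) for k in ["bt", "by", "bz", "bb", "bh"]}

def step(h_prev, t_i, t_prev, mark, e_i):
    """Hidden-state update after the i-th event (Eq. 4), with the
    input-layer embeddings of time, mark, and event type inlined."""
    tau = W["Wt"] * (t_i - t_prev) + b["bt"]              # time embedding
    bvec = W["Wa"] * (1 - e_i) + W["Wf"] * e_i + b["bb"]  # event-type embedding
    y = (W["Wy"] * mark + b["by"]) if e_i == 0 else np.zeros(H)  # action mark
    z = (W["Wz"] * mark + b["bz"]) if e_i == 1 else np.zeros(H)  # feedback mark
    return np.tanh(W["Wh"] @ h_prev + W["W1"] @ tau + W["W2"] @ y
                   + W["W3"] @ z + W["W4"] @ bvec + b["bh"])

def intensity(t, t_i, h, b0=0.0, wt=-0.1):
    """Intensity of the next action event (Eq. 5)."""
    return np.exp(b0 + wt * (t - t_i) + W["V"] @ h)

def mark_distribution(h):
    """Multinomial distribution over the next action mark (Eq. 6)."""
    logits = W["Vy"] @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()

h = step(np.zeros(H), t_i=1.0, t_prev=0.0, mark=2, e_i=0)
p = mark_distribution(h)
assert np.isclose(p.sum(), 1.0) and intensity(1.5, 1.0, h) > 0
```

Sampling a trajectory then alternates between drawing the next action time from `intensity` (Algorithm 1), drawing its mark from `mark_distribution`, and calling `step` on every action or feedback event that occurs.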
To do so, we need to compute the gradient of the expected reward function, $\nabla_\theta J(\theta)$; however, this may seem challenging at first, especially since the expectation is taken over realizations of marked temporal point processes. Perhaps surprisingly, we can compute such a gradient using the following proposition (proved in Appendix A).

Proposition 1. Given an agent with $p^*_{A;\theta} = (\lambda^*_\theta, m^*_\theta)$ and an environment with $p^*_{F} = (\lambda^*, m^*)$, the gradient of the expected reward function $J(\theta)$ with respect to $\theta$ is given by:

$$\nabla_\theta J(\theta) = \mathbb{E}_{\mathcal{A}_T \sim p^*_{A;\theta}(\cdot),\ \mathcal{F}_T \sim p^*_{F}(\cdot)}\left[R^*(T)\, \nabla_\theta \log P_\theta(\mathcal{A}_T)\right], \qquad (8)$$

where $\log P_\theta(\mathcal{A}_T) = \sum_{e_i \in \mathcal{A}_T} \left(\log \lambda^*_\theta(t_i) + \log m^*_\theta(z_i)\right) - \int_0^T \lambda^*_\theta(s)\, ds$.

In the above proposition, the gradient of the log-likelihood of the times and marks of a realization of the marked temporal point process associated with the agent's actions, $\nabla_\theta \log P_\theta(\mathcal{A}_T)$, can be easily computed using the policy parametrization defined by Eqs. 5 and 6. Moreover, note that the proposition formally shows that the REINFORCE trick [32] is still valid if the expectation is taken over realizations of marked temporal point processes, which are a type of random element [3] whose values are discrete events localized in continuous time.

Unfortunately, the above procedure does not limit the intensity of actions by the agent, and this may be problematic in practice (e.g., in viral marketing in social networks, a user who aims to increase the visibility of her posts may only be able to post a certain number of times).
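Proposition 1 can be sanity-checked numerically in a degenerate special case of our own choosing, which is not the paper's setup: a markless, feedback-free policy with constant intensity $\lambda_\theta = e^\theta$ and reward $R^*(T)$ equal to the number of actions, for which the analytic gradient is $\nabla_\theta \mathbb{E}[R^*(T)] = e^\theta T$.

```python
import numpy as np

def reinforce_gradient(theta, T, episodes, rng):
    """Monte Carlo estimate of grad_theta E[R(T)] via Proposition 1 for
    the toy policy lambda_theta(t) = exp(theta), no marks, no feedback.

    Here log P_theta(A_T) = n * theta - exp(theta) * T, where n is the
    number of action events, so its gradient is n - exp(theta) * T.
    """
    lam = np.exp(theta)
    n = rng.poisson(lam * T, size=episodes)  # action counts per episode
    reward = n                               # toy reward: one unit per action
    score = n - lam * T                      # gradient of the log-likelihood
    return np.mean(reward * score)

rng = np.random.default_rng(42)
est = reinforce_gradient(theta=0.0, T=2.0, episodes=200_000, rng=rng)
assert abs(est - 2.0) < 0.1  # analytic gradient: exp(0) * 2 = 2
```

The same reward-times-score structure carries over unchanged when the log-likelihood comes from the RNN parametrization of Eqs. 5 and 6 instead of this toy closed form.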
To overcome this, we consider instead a penalized expected reward function $J_r(\theta)$ with differentiable regularizers $g_\lambda(\lambda^*_\theta(t))$ and $g_m(m^*_\theta(t))$, which implicitly impose a budget on the number of action events and marks, respectively, i.e.,

$$J_r(\theta) = \mathbb{E}_{\mathcal{A}_T \sim p^*_{A;\theta}(\cdot),\ \mathcal{F}_T \sim p^*_{F}(\cdot)}\left[R^*(T) - q_l \int_0^T g_\lambda(\lambda^*_\theta(t))\, dt - q_m \int_0^T g_m(m^*_\theta(t))\, dt\right]. \qquad (9)$$

The gradient of the penalized reward can be readily computed using the following proposition (proved in Appendix B):

Proposition 2. Given an agent with $p^*_{A;\theta} = (\lambda^*_\theta, m^*_\theta)$ and an environment with $p^*_{F} = (\lambda^*, m^*)$, the gradient of $J_r(\theta)$ is given by:

$$\nabla_\theta J_r(\theta) = \mathbb{E}_{\mathcal{A}_T \sim p^*_{A;\theta}(\cdot),\ \mathcal{F}_T \sim p^*_{F}(\cdot)}\Bigg[\left(R^*(T) - q_l \int_0^T g_\lambda(\lambda^*_\theta(t))\, dt - q_m \int_0^T g_m(m^*_\theta(t))\, dt\right) \nabla_\theta \log P_\theta(\mathcal{A}_T)$$
$$\qquad - \left(q_l \int_0^T g'_\lambda(\lambda^*_\theta(t))\, \nabla_\theta \lambda^*_\theta(t)\, dt + q_m \int_0^T g'_m(m^*_\theta(t))\, \nabla_\theta m^*_\theta(t)\, dt\right)\Bigg], \qquad (10)$$

where $g'_\lambda(\lambda^*_\theta(t)) = \frac{d\, g_\lambda(\lambda^*_\theta(t))}{d\, \lambda^*_\theta(t)}$ and $g'_m(m^*_\theta(t)) = \frac{d\, g_m(m^*_\theta(t))}{d\, m^*_\theta(t)}$.

In our experiments, we will approximate the expectation in Eq. (10) by first running a batch of realizations (or episodes) of the corresponding marked temporal point processes^5 and then calculating the mean of the resulting gradients for each batch.

4 Experiments on spaced repetition

Problem definition. It is well known in the psychology literature that repeated and temporally distributed reviewing of information aids long-term memorization [14, 16, 19, 18]. Following recent work in the machine learning literature [18, 22, 27], we will consider the following setting: an online learning platform needs to teach one student some number of items of varying difficulty, say, words from the vocabulary of a foreign language.
To this aim, the platform interacts with the student during a studying period by asking her to review each item multiple times, i.e., show a word to the student, ask for its translation, and then show the correct answer. Then, the goal is to help the platform decide when to ask the student to review each item to better prepare her for a test, which will take place sometime after the learning period is over. Under our problem definition, the online platform is the agent, it generates action events $\mathcal{A}$ when it asks a student to review an item, the student is the

^5 In some applications, we may be able to play back historical data from the environment against our policy and, in other domains, we may need to resort to a (complex) environment simulator.

(a) Recall    (b) Items' difficulty    (c) Reviewing events

Figure 3: Spaced Repetition. Performance of our policy gradient method against MEMORIZE [27] and a uniform baseline, which follows a constant reviewing rate and chooses items uniformly at random. Panel (a) shows the empirical recall probability at time $T + \tau$ and Panel (b) shows the difficulty level of the items selected for review by the different methods. In both cases, the solid horizontal line (triangle) shows the median (average) value across review sequences and the box limits correspond to the 25%-75% percentiles. All methods schedule (within a small tolerance) the same number of review events.
Panel (c) compares the average fraction of review events per day across all items for our method (above) and MEMORIZE (below).

environment and she generates feedback events $\mathcal{F}$ when she reviews an item, indicating whether she was able to recall the item or not, and the recall probability at the test time defines the reward.

Interestingly, the above setting has recently been studied from the point of view of stochastic optimal control [27], where the authors derived the optimal scheduling algorithm for a set of items. However, their solution assumes that the difficulty of the items and the student model are known [24] and that the objective function (the reward) has a particular functional form which depends on the average recall probability over time (and not on the actual sampled recall at test time). Here, we use our reinforcement learning method to derive (optimal) policies for arbitrarily complex and unknown student models, items with unknown difficulties, and more intuitive reward definitions.

Experimental setup. Since we cannot make real interventions in an online learning platform, we use data from Duolingo to fit a probabilistic student model, as reported in previous work [24, 27], which we then use to simulate a student's performance over time (refer to Appendix E for further details on the student model). Here, the optimal policy $p^*_{A;\theta} = (\lambda^*_\theta(t), m^*_\theta(t))$ comprises a reviewing intensity function and a multinomial mark distribution. The former characterizes when to review and the latter characterizes which item to review each time.
Then, we train and test our policy gradient method as follows.

Given a student model and a set of items, we train the platform's policy $p^*_{A;\theta}$ by using SGD with a quadratic (entropy) regularizer on the reviewing intensity (mark distribution), i.e., $g(\lambda^*_\theta(t), m^*_\theta(t)) = (\lambda^*_\theta(t))^2 + H(m^*_\theta(t))$, where $H(m^*_\theta(t_i)) := \sum_{c \in \mathcal{Y}} \mathbb{P}[y_i = c] \log \mathbb{P}[y_i = c]$, on a training set consisting of simulated reviewing and test sequences. More specifically, on iteration $i$, we build a batch of $b$ reviewing (or studying) sequences of time length $T$, where we sample the student's recalls from the student model every time our policy $p^*_{\theta_i}$ generates a reviewing event, and compute the reward at the end of each sequence. Here, the reward is the sampled recall at test time $T + \tau$, which is a natural performance measure for the goal stated in the problem definition. To test the trained model, we just generate additional reviewing sequences using the student model and the trained policy and compute the reward at the end of each sequence. Refer to Appendix D for further details on the training and testing procedure. Here, we compare the performance of our method with two alternatives: (i) a state-of-the-art method called MEMORIZE [27], which, in contrast with our work, has full access to the student model and is specially designed to maximize the average recall probability over time; and (ii) a baseline reviewing schedule which follows a constant reviewing rate and chooses items uniformly at random.

Results. Figures 3(a-b) summarize the results, where the number of reviewing events by each method is the same. The results show that: (i) by maximizing the actual reward one is aiming for, our method is able to outperform both MEMORIZE and the baseline by large margins; and (ii) given the limited study time, our method tends to focus on less difficult items.
Finally, in Figure 3(c), we compare how our method and MEMORIZE distribute reviewing events during the studying period. While our method keeps a constant load over time, MEMORIZE initially imposes a heavier studying load.

(a) Average rank    (b) Time at top    (c) Example

Figure 4: Smart broadcasting. Performance of our policy gradient method against REDQUEEN [34] (RQ), a variant of REDQUEEN which has access to true ranks (RQ*), and Karimi's method [12] on feeds using a sorting algorithm based on a priority queue (refer to Appendix F). Panels (a) and (b) show the average rank and the time at the top, where the solid horizontal line shows the median value across users, normalized with respect to the value achieved by a user who follows a uniform Poisson intensity, and the box limits correspond to the 25%-75% percentiles. For the average rank, lower is better and, for the time at the top, higher is better. In both cases, the number of messages posted by each method is the same. Panel (c) shows a user's intensity $\lambda^*_\theta(\cdot)$ (in blue), as provided by our method, the counts of the user's posts (in green), the average rank (in red), the posting times of a competing user with higher priority (in purple), and the posting times of another competing user with lower priority (in yellow).

5 Experiments on smart broadcasting

Problem definition. In the smart broadcasting problem, first introduced by Spasojevic et al. [25], the goal is to help a social media user decide when to post to achieve high visibility in her followers' feeds, i.e., to elicit attention from her followers. Under our problem definition, the user is the agent, she generates action events $\mathcal{A}$ when she posts, her followers' feeds form the environment, the environment generates feedback events $\mathcal{F}$ when any of the other users her followers follow posts, and the visibility she receives defines the reward. Then, the problem reduces to finding the (optimal) policy $p^*_{A;\theta}$ that maximizes the reward.

Following previous work [29, 33, 34], we measure the visibility a user achieves, i.e., the reward, using two different metrics: (i) the position of her most recent post on her followers' feeds over time, or rank, i.e., $R^*(T) = \int_0^T r(t)\, dt$, where position zero, $r(t) = 0$, corresponds to the top and thus lower is better; and (ii) the (amount of) time that her most recent post is at the top of her followers' feeds, or time at the top, i.e., $R^*(T) = \int_0^T \mathbb{I}(r(t) < 1)\, dt$, and thus higher is better. If the followers' feeds are sorted in reverse chronological order, previous work has derived optimal online [34] and offline [12] algorithms for (i) and (ii), respectively, under the additional assumption that the posting intensity of the other users her followers follow adopts a certain functional form. However, as pointed out by previous work, feeds are typically algorithmically sorted, the posting intensity of other users may be highly complex, and thus the derived algorithms may be of limited use in practice. Here, we use our reinforcement learning method to derive (optimal) policies for algorithmically sorted feeds and, by doing so, we are able to help users achieve higher visibility than the above algorithms. Appendix G contains additional experiments for feeds sorted in reverse chronological order.

Experimental setup. We use data gathered from Twitter as reported in previous work [2], which comprises profiles of 52 million users, 1.9 billion directed follow links among these users, and 1.7 billion public tweets posted by the collected users.
The follow link information is based on a snapshot taken at the time of data collection, in September 2009. Here, we focus on the tweets published during a two-month period, from July 1, 2009 to September 1, 2009, and sample 1000 users uniformly at random. For each of these users, we retrieve five of her followers (chosen at random), select five other followees of each follower (chosen at random), and collect all the (re)tweets they published. Each follower represents a wall, and our broadcaster is competing with the other followees of that follower for attention. Since we do not have access to the feed sorting algorithm used by Twitter, we experiment with a relatively simple sorting algorithm based on a priority queue^6 (refer to Appendix F). Here, since our feed sorting algorithm depends only on the time of the post and the identity of the user who posts, not on marks (e.g., the content of the post), the optimal policy only comprises an intensity function, i.e., $p^*_{A;\theta} = \lambda^*_\theta(t)$. Then, we train and test our policy gradient method as follows.

^6 We expect that the more complex the sorting algorithm, the larger the competitive advantage our algorithm will offer in comparison with competing methods designed for feeds sorted in reverse chronological order.

For each user, we divide her feedback events, i.e., the posts by the other users her followers follow, into a training set and a test set. The latter contains all feedback events generated in a time window of length $T$ at the end of the recording period and the former contains all other feedback events. Here, we set the length $T$ such that the overall expected number of events in the test set is $\approx 200$.
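For intuition about the rewards being optimized, both visibility metrics can be computed by replaying a merged timeline of the broadcaster's posts and the competing (feedback) posts. The sketch below is a simplification of ours: it assumes a reverse-chronologically sorted feed rather than the priority-queue feed used in these experiments (Appendix F), and it conventionally treats the interval before the broadcaster's first post as rank zero.

```python
def visibility_rewards(own_posts, other_posts, T):
    """Average-rank integral and time-at-top for one follower's feed,
    assuming reverse chronological sorting: the rank r(t) of the
    broadcaster's latest post equals the number of competing posts
    published after it. Returns (integral of r(t), time with r(t) < 1).
    Before the broadcaster's first post, r(t) = 0 by convention here."""
    events = sorted([(t, "own") for t in own_posts] +
                    [(t, "other") for t in other_posts])
    rank, prev_t = 0, 0.0
    rank_integral, time_at_top = 0.0, 0.0
    for t, who in events:
        if t > T:
            break
        dt = t - prev_t
        rank_integral += rank * dt        # accumulate the rank reward
        if rank < 1:
            time_at_top += dt             # accumulate the time-at-top reward
        rank = 0 if who == "own" else rank + 1
        prev_t = t
    dt = T - prev_t                       # account for the tail [prev_t, T]
    rank_integral += rank * dt
    if rank < 1:
        time_at_top += dt
    return rank_integral, time_at_top

# The broadcaster posts at t=1; a competitor posts at t=2; horizon T=4.
assert visibility_rewards([1.0], [2.0], T=4.0) == (2.0, 2.0)
```

During training, a replay like this interleaves posts sampled from the policy with recorded feedback events and returns the reward at the end of each episode.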
Then, we train each user's policy $\lambda^*_\theta(t)$ by using stochastic gradient descent (SGD) with a quadratic regularizer $g_\lambda(\lambda_\theta(t)) = (\lambda_\theta(t))^2$. More specifically, on each iteration $i$, we build a batch of $b$ sequences of length $T$, taken uniformly at random from the training set, we replay the feedback events from these sequences while interleaving the posts generated by our policy $\lambda^*_{\theta_i}$, and compute the reward at the end of each sequence. To test the trained policy $\lambda^*_\theta(t)$, we just replay the feedback events from the test set while interleaving the posts generated by the policy and compute the reward at the end of the sequence. Appendix D contains additional implementation details.

In the above, we experiment with both rank and time at the top as rewards and compare our method with two state-of-the-art methods, REDQUEEN [34] and the method by Karimi et al. [12]. The former is an online algorithm specially designed to minimize the average rank in feeds sorted in reverse chronological order and the latter is an offline algorithm specially designed to maximize the time at the top in feeds sorted in reverse chronological order. However, because REDQUEEN assumes that the feed is sorted in reverse chronological order and posts with intensity $\propto \text{rank}_{\text{chrono}}(t)$, we also compare our method TPPRL against a stronger heuristic RQ*, which posts with intensity $\propto \text{rank}_{\text{priority}}(t)$.

Results. Figures 4(a-b) summarize the results, where the number of messages posted by each method is the same and all rewards are normalized by the reward achieved by a baseline user who follows a uniform Poisson intensity. The results show that, by not making any assumption about the feed sorting algorithm, our method is able to outperform both REDQUEEN and Karimi's method, which were specially designed to minimize the average rank and maximize the time at the top in feeds sorted in reverse chronological order, respectively.
Moreover, our method provides solutions with smaller variance in performance than REDQUEEN. Finally, in Figure 4(c), we give some intuition about the type of policy our method learns using a toy example, where a user competes for attention with two other users in a follower's feed, one with higher priority and another with lower priority. Our method learns to avoid posting whenever the user with higher priority posts.

6 Conclusions

In this paper, we approached a novel reinforcement learning problem where both actions and feedback are asynchronous stochastic events in continuous time, characterized using marked temporal point processes (MTPPs). In this problem, the policy is a conditional intensity function (and mark distribution), which is then used to sample the times (and marks) of the agent's actions. Then, we derived a flexible policy gradient method, which does not make any assumptions on the functional form of the intensity and mark distribution of the feedback and allows for arbitrarily complex reward functions. Experiments on two different applications in personalized teaching and viral marketing show that our method beats competing methods.

There are many interesting avenues for future work. For example, we have taken a first step towards developing reinforcement learning algorithms for MTPPs; a natural follow-up would be deriving more sophisticated reinforcement learning algorithms, e.g., actor-critic algorithms, for our novel problem setting. We have evaluated our method in two real-world applications in personalized teaching and viral marketing; however, there are many other (high impact) applications fitting our novel problem setting, e.g., quantitative trading. Finally, it would be very interesting to develop multi-agent reinforcement learning algorithms for MTPPs.

References

[1] O. Aalen, O. Borgan, and H. Gjessing.
Survival and event history analysis: A process point of view. Springer Science & Business Media, 2008.

[2] M. Cha, H. Haddadi, F. Benevenuto, and P. K. Gummadi. Measuring user influence in Twitter: The million follower fallacy. ICWSM, 10(10-17):30, 2010.

[3] D. J. Daley and D. Vere-Jones. An introduction to the theory of point processes: Volume II: General theory and structure. Springer Science & Business Media, 2007.

[4] K. Doya. Reinforcement learning in continuous time and space. Neural Computation, 12(1):219–245, 2000.

[5] N. Du, H. Dai, R. Trivedi, U. Upadhyay, M. Gomez-Rodriguez, and L. Song. Recurrent marked temporal point processes: Embedding event history to vector. In KDD, 2016.

[6] Y. Duan, X. Chen, R. Houthooft, J. Schulman, and P. Abbeel. Benchmarking deep reinforcement learning for continuous control. In ICML, 2016.

[7] H. Ebbinghaus. Memory: A contribution to experimental psychology. Teachers College, Columbia University, 1885.

[8] M. Farajtabar, J. Yang, X. Ye, H. Xu, R. Trivedi, E. Khalil, S. Li, L. Song, and H. Zha. Fake news mitigation via point process based intervention. In ICML, 2017.

[9] N. Frémaux, H. Sprekeler, and W. Gerstner. Reinforcement learning using a continuous time actor-critic framework with spiking neurons. PLoS Computational Biology, 9(4):e1003024, 2013.

[10] F. B. Hanson. Applied stochastic processes and control for jump-diffusions: Modeling, analysis, and computation, volume 13. SIAM, 2007.

[11] H. Jing and A. J. Smola. Neural survival recommender. In WSDM, 2017.

[12] M. R. Karimi, E. Tavakoli, M. Farajtabar, L. Song, and M. Gomez Rodriguez. Smart broadcasting: Do you want to be seen? In KDD, 2016.

[13] J. Kim, B. Tabibian, A. Oh, B. Schölkopf, and M. Gomez-Rodriguez. Leveraging the crowd to detect and reduce the spread of fake news and misinformation. In WSDM, 2018.

[14] S. Leitner. So lernt man lernen: Der Weg zum Erfolg [Learning to learn: The road to success].
Herder, 1972.

[15] T. P. Lillicrap, J. J. Hunt, A. Pritzel, N. Heess, T. Erez, Y. Tassa, D. Silver, and D. Wierstra. Continuous control with deep reinforcement learning. arXiv preprint arXiv:1509.02971, 2015.

[16] R. V. Lindsey, J. D. Shroyer, H. Pashler, and M. C. Mozer. Improving students' long-term knowledge retention through personalized review. Psychological Science, 25(3):639–647, 2014.

[17] H. Mei and J. M. Eisner. The neural Hawkes process: A neurally self-modulating multivariate point process. In NIPS, 2017.

[18] E. Mettler, C. M. Massey, and P. J. Kellman. A comparison of adaptive and fixed schedules of practice. Journal of Experimental Psychology: General, 145(7):897, 2016.

[19] C. Metzler-Baddeley and R. J. Baddeley. Does adaptive training work? Applied Cognitive Psychology, 23(2):254–266, 2009.

[20] V. Mnih, A. P. Badia, M. Mirza, A. Graves, T. Lillicrap, T. Harley, D. Silver, and K. Kavukcuoglu. Asynchronous methods for deep reinforcement learning. In ICML, 2016.

[21] H. Pashler, N. Cepeda, R. V. Lindsey, E. Vul, and M. C. Mozer. Predicting the optimal spacing of study: A multiscale context model of memory. In NIPS, 2009.

[22] S. Reddy, I. Labutov, S. Banerjee, and T. Joachims. Unbounded human learning: Optimal scheduling for spaced repetition. In KDD, 2016.

[23] D. E. Rumelhart, G. E. Hinton, and R. J. Williams. Learning representations by back-propagating errors. Nature, 323(6088):533, 1986.

[24] B. Settles and B. Meeder. A trainable spaced repetition model for language learning. In Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1848–1858, 2016.

[25] N. Spasojevic, Z. Li, A. Rao, and P. Bhattacharyya. When-to-post on social networks. In KDD, 2015.

[26] R. S. Sutton and A. G. Barto. Reinforcement learning: An introduction, volume 1. MIT Press, Cambridge, 1998.

[27] B.
Tabibian, U. Upadhyay, A. De, A. Zarezade, B. Schoelkopf, and M. Gomez-Rodriguez. Enhancing human learning via spaced repetition optimization. Proceedings of the National Academy of Sciences, 2019.

[28] E. Vasilaki, N. Frémaux, R. Urbanczik, W. Senn, and W. Gerstner. Spike-based reinforcement learning in continuous state and action space: When policy gradient methods fail. PLoS Computational Biology, 5(12):e1000586, 2009.

[29] Y. Wang, E. Theodorou, A. Verma, and L. Song. A stochastic differential equation framework for guiding online user activities in closed loop. In AISTATS, 2018.

[30] Y. Wang, G. Williams, E. Theodorou, and L. Song. Variational policy for guiding point processes. In ICML, 2017.

[31] D. Wierstra, A. Foerster, J. Peters, and J. Schmidhuber. Solving deep memory POMDPs with recurrent policy gradients. In ICANN, 2007.

[32] R. J. Williams. Simple statistical gradient-following algorithms for connectionist reinforcement learning. Machine Learning, 8(3-4):229–256, 1992.

[33] A. Zarezade, A. De, U. Upadhyay, H. Rabiee, and M. Gomez-Rodriguez. Steering social activity: A stochastic optimal control point of view. JMLR, 2018.

[34] A. Zarezade, U. Upadhyay, H. Rabiee, and M. Gomez-Rodriguez. RedQueen: An online algorithm for smart broadcasting in social networks. In WSDM, 2017.