
GP-VAE: Deep Probabilistic Multivariate Time Series Imputation

Vincent Fortuin (1,2), ETH Zürich, Switzerland; Dmitry Baranchuk (1,3), Yandex, Russia; Gunnar Rätsch, ETH Zürich, Switzerland; Stephan Mandt, UC Irvine, USA

Abstract

Multivariate time series with missing values are common in areas such as healthcare and finance, and have grown in number and complexity over the years. This raises the question whether deep learning methodologies can outperform classical data imputation methods in this domain. However, naïve applications of deep learning fall short in giving reliable confidence estimates and lack interpretability. We propose a new deep sequential latent variable model for dimensionality reduction and data imputation. Our modeling assumption is simple and interpretable: the high-dimensional time series has a lower-dimensional representation which evolves smoothly in time according to a Gaussian process. The non-linear dimensionality reduction in the presence of missing data is achieved using a VAE approach with a novel structured variational approximation. We demonstrate that our approach outperforms several classical and deep learning-based data imputation methods on high-dimensional data from the domains of computer vision and healthcare, while additionally improving the smoothness of the imputations and providing interpretable uncertainty estimates.

1 Introduction

Time series are often associated with missing values, for instance due to faulty measurement devices, partially observed states, or costly measurement procedures [Honaker and King, 2010]. These missing values impair the usefulness and interpretability of the data, leading to the problem of data imputation: estimating those missing values from the observed ones [Rubin, 1976].

Multivariate time series, consisting of multiple correlated univariate time series or channels, give rise to two distinct ways of imputing missing information: (1) by exploiting temporal correlations within each channel, and (2) by exploiting correlations across channels, for example by using lower-dimensional representations of the data. For instance, in a medical setting, if the blood pressure of a patient is unobserved, it can be informative that the heart rate at the current time is higher than normal and that the blood pressure was also elevated an hour ago. An ideal imputation model for multivariate time series should therefore take both of these sources of information into account. Another desirable property of such models is to offer a probabilistic interpretation, allowing for uncertainty estimation.

Unfortunately, current imputation approaches fall short with respect to at least one of these desiderata. While there are many time-tested statistical methods for multivariate time series analysis (e.g., Gaussian processes [Roberts et al., 2013]) that work well in the case of complete data, these methods are generally not applicable when features are missing. On the other hand, classical methods for time series imputation often do not take the potentially complex interactions between the different channels into account [Little and Rubin, 2002, Pedersen et al., 2017]. Finally, recent work has explored the use of non-linear dimensionality reduction using variational autoencoders for i.i.d. data points with missing values [Nazabal et al., 2018, Ainsworth et al., 2018, Ma et al., 2018], but this work has not considered temporal data and strategies for sharing statistical strength across time.

Following these considerations, it is promising to combine non-linear dimensionality reduction with an expressive time series model. This can be done by jointly learning a mapping from the data space (where features are missing) into a latent space (where all dimensions are fully determined). A statistical model of choice can then be applied in this latent space to model temporal dynamics. If the dynamics model and the mapping for dimensionality reduction are both differentiable, the approach can be trained end-to-end.

1 Equal contribution. 2 Contact: [email protected] 3 Work partially done at ETH Zürich.

In this paper, we propose an architecture that uses deep variational autoencoders (VAEs) to map the missing data time series into a latent space without missingness, where we model the low-dimensional dynamics with a Gaussian process (GP). As we will discuss below, we hereby propose a prior model that efficiently operates at multiple time scales, taking into account that the multivariate time series may have different channels (e.g., heart rate, blood pressure, etc.) that change with different characteristic frequencies. Finally, our variational inference approach makes use of efficient structured variational approximations, where we fit another multivariate Gaussian process in order to approximate the intractable true posterior.
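To make the latent prior concrete, the following minimal NumPy sketch (not the authors' code; the kernel variance, length scale, sequence length, and latent dimensionality are illustrative placeholders) draws smooth latent trajectories from a zero-mean GP prior with a Cauchy kernel, one independent process per latent dimension over the T time steps:

```python
import numpy as np

def cauchy_kernel(ts, sigma=1.0, lengthscale=2.0):
    """Cauchy kernel k(t, t') = sigma^2 / (1 + (t - t')^2 / l^2)."""
    diff = ts[:, None] - ts[None, :]
    return sigma ** 2 / (1.0 + diff ** 2 / lengthscale ** 2)

T, latent_dim = 50, 4                       # illustrative sequence length and latent size
ts = np.arange(T, dtype=float)
K = cauchy_kernel(ts) + 1e-6 * np.eye(T)    # jitter for numerical stability

# One independent GP prior per latent dimension: z_j ~ N(0, K) over the T time steps.
rng = np.random.default_rng(0)
z_prior = rng.multivariate_normal(np.zeros(T), K, size=latent_dim)   # shape (latent_dim, T)
```

The Cauchy kernel has polynomial rather than exponential tails and can be viewed as a mixture of RBF kernels with different length scales, which is one way to motivate its multi-scale behavior.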

We make the following contributions:

• A new model. We propose a VAE architecture for multivariate time series imputation with a GP prior in the latent space to capture temporal dynamics. We propose a Cauchy kernel to allow the time series to display dynamics at multiple scales in a reduced dimensionality.

• Efficient inference. We use a structured variational approximation that models posterior correlations in the time domain. By construction, inference is efficient and the time complexity for sampling from the variational distribution, used for training, is linear in the number of time steps (as opposed to cubic when done naïvely); see the sketch after this list.

• Benchmarking on real-world data. We carry out extensive comparisons to classical imputation methods as well as state-of-the-art deep learning approaches, and perform experiments on data from two different domains. Our method shows favorable performance in both cases.
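To illustrate the complexity claim in the second contribution, the sketch below (illustrative only; the exact parameterization used in this paper is described in Sec. 3.3) assumes the variational posterior over the T latent time steps has precision B^T B with B upper bidiagonal. A sample z = mu + B^{-1} eps then only requires back-substitution through a bidiagonal system, i.e., O(T) work per latent dimension instead of O(T^3) for a dense covariance factorization:

```python
import numpy as np

def sample_structured_gaussian(mu, b_diag, b_off, rng):
    """Draw z ~ N(mu, (B^T B)^{-1}) for an upper-bidiagonal B in O(T).

    b_diag holds the main diagonal of B (length T), b_off the first
    superdiagonal (length T - 1). Since z = mu + B^{-1} eps has covariance
    (B^T B)^{-1}, sampling reduces to one back-substitution pass.
    """
    T = mu.shape[0]
    eps = rng.standard_normal(T)
    x = np.empty(T)
    x[-1] = eps[-1] / b_diag[-1]
    for t in range(T - 2, -1, -1):           # single O(T) backward pass over time
        x[t] = (eps[t] - b_off[t] * x[t + 1]) / b_diag[t]
    return mu + x

rng = np.random.default_rng(0)
T = 50
mu = np.zeros(T)                             # placeholder variational mean
b_diag = np.full(T, 1.5)                     # placeholder precision-factor parameters
b_off = np.full(T - 1, -0.5)
z = sample_structured_gaussian(mu, b_diag, b_off, rng)
```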

We start by reviewing the related literature in Sec. 2, describe the general setting in Sec. 3.1, and introduce our model and inference scheme in Sec. 3.2 and Sec. 3.3, respectively. Experiments and conclusions are presented in Sec. 4 and 5.

2 Related work

Classical statistical approaches. The problem of missing values has been a long-standing challenge in many time series applications, especially in the field of medicine [Pedersen et al., 2017]. The earliest approaches to deal with this problem often relied on heuristics, such as mean imputation or forward imputation. Despite their simplicity, these methods are still widely applied today due to their efficiency and interpretability [Honaker and King, 2010]. Orthogonal to these ideas, methods along the lines of expectation-maximization (EM) have been proposed, but they often require additional modeling assumptions [Bashir and Wei, 2018].
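For concreteness, a small NumPy sketch of these two heuristics on a single channel, with NaN marking missing values (the values themselves are made up for illustration):

```python
import numpy as np

x = np.array([0.8, np.nan, np.nan, 1.4, np.nan, 1.1])   # one channel, NaN = missing

# Mean imputation: replace every missing value with the mean of the observed values.
mean_imputed = np.where(np.isnan(x), np.nanmean(x), x)

# Forward imputation: carry the last observed value forward in time.
forward_imputed = x.copy()
for t in range(1, len(forward_imputed)):
    if np.isnan(forward_imputed[t]):
        forward_imputed[t] = forward_imputed[t - 1]
```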

Bayesian methods. When it comes to estimating likelihoods and uncertainties relating to the imputations, Bayesian methods, such as Gaussian processes (GPs) [Rasmussen, 2003], have a clear advantage over non-Bayesian methods such as single imputation [Little and Rubin, 2002]. There has been much recent work in making these methods more expressive and incorporating prior knowledge from the domain (e.g., medical time series) [Wilson et al., 2016, Fortuin and Rätsch, 2019] or adapting them to work on discrete domains [Fortuin et al., 2018a], but their widespread adoption is hindered by their limited scalability and the challenges in designing kernels that are robust to missing values. Our latent GP prior bears certain similarities to the GP latent variable model (GP-LVM) [Lawrence, 2004, Titsias and Lawrence, 2010], but in contrast to this line of work, we propose an efficient amortized inference scheme.
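As a minimal illustration of GP-based imputation for a single channel (a generic RBF kernel and noise level chosen here purely for illustration, not the setup used in this paper), the standard GP posterior mean at the missing time points, conditioned on the observed ones, can be computed as follows:

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=2.0, sigma=1.0):
    d = a[:, None] - b[None, :]
    return sigma ** 2 * np.exp(-0.5 * d ** 2 / lengthscale ** 2)

ts = np.arange(10, dtype=float)
y = np.sin(0.5 * ts)                                    # toy univariate channel
observed = np.array([True, True, False, True, False,
                     False, True, True, False, True])

noise = 1e-2
K_oo = rbf_kernel(ts[observed], ts[observed]) + noise * np.eye(int(observed.sum()))
K_mo = rbf_kernel(ts[~observed], ts[observed])

# GP posterior mean at the missing time points, given the observed ones.
y_missing = K_mo @ np.linalg.solve(K_oo, y[observed])
```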

Deep learning techniques. Another avenue of research in this area uses deep learning techniques, such as variational autoencoders (VAEs) [Nazabal et al., 2018, Ainsworth et al., 2018, Ma et al., 2018, Dalca et al., 2019] or generative adversarial networks (GANs) [Yoon et al., 2018, Li et al., 2019]. It should be noted that VAEs allow for tractable likelihoods, while GANs generally do not and have to rely on additional optimization processes to find latent representations of a given input [Lipton and Tripathi, 2017]. Unfortunately, none of these models explicitly takes the temporal dynamics of time series data into account. Conversely, there are deep probabilistic models for time series [e.g., Krishnan et al., 2015, 2017, Fortuin et al., 2018b], but those do not explicitly handle missing data. There are also some VAE-based imputation methods that are designed for a setting where the data are complete at training time and the missingness only occurs at test time [Garnelo et al., 2018a,b, Ivanov et al., 2018]. We do not consider this setting in our work.

HI-VAE. Our approach borrows some ideas from the HI-VAE [Nazabal et al., 2018]. This model deals with missing data by defining an ELBO whose reconstruction error term only sums over the observed part of the data. For inference, the incomplete data are filled with arbitrary values (e.g., zeros) before they are fed into the inference network, which induces an unavoidable bias. The main difference to our approach is that the
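A schematic NumPy sketch (not the original HI-VAE implementation) of the two ingredients described in this paragraph: a reconstruction term that only sums over observed entries, and zero-filling of missing entries before the input enters the inference network:

```python
import numpy as np

def masked_gaussian_recon_term(x, x_rec, mask, sigma=1.0):
    """Gaussian reconstruction log-likelihood summed only over observed entries.

    x, x_rec: arrays of shape (T, D); mask: boolean, True where a value was observed.
    Missing entries contribute nothing to the reconstruction term of the ELBO.
    """
    loglik = -0.5 * ((x - x_rec) ** 2 / sigma ** 2 + np.log(2 * np.pi * sigma ** 2))
    return np.sum(np.where(mask, loglik, 0.0))

def fill_for_encoder(x, mask):
    """Zero-fill missing entries before the sequence is passed to the inference network."""
    return np.where(mask, x, 0.0)

# Example: a 3-step, 2-channel sequence with two missing entries.
x = np.array([[0.5, np.nan], [np.nan, 1.2], [0.3, 0.7]])
mask = ~np.isnan(x)
x_filled = fill_for_encoder(x, mask)        # what the inference network would see
recon = np.zeros_like(x_filled)             # stand-in for a decoder output
elbo_recon_term = masked_gaussian_recon_term(x, recon, mask)
```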


[Figure: inputs x_1, x_2, x_3, ..., x_T; latent variables z_1, z_2, z_3, ..., z_T; reconstructions x^rec_1, x^rec_2, x^rec_3, ..., x^rec_T.]

CNN CNN CNN CNN

(a) Architecture sketch

zt xtGP

m(·)k(·, ·)

t ∈ {1, . . . , T}

(b) Graphical model

Figure 1: Overview of our proposed model with a convolutional inference network, a deep feed-forward generative network and a Gaussian process prior with mean function m(·) and kernel function k(·, ·) in latent space. The several CNN blocks as well as the MLP blocks are each sharing their respective parameters.

HI-VAE was not formulated for sequential data and therefore does not exploit temporal information in the imputation task.

Deep learning for time series imputation. While the mentioned deep learning approaches are very promising, most of them do not take the time series nature of the data directly into account, that is, they do not model the temporal dynamics of the data when dealing with missing values. To the best of our knowledge, the only deep generative model for missing value imputation that does account for the time series nature of the data is the GRUI-GAN [Luo et al., 2018], which we describe in Sec. 4. Another deep learning model for time series imputation is BRITS [Cao et al., 2018], which uses recurrent neural networks (RNNs). It is trained in a self-supervised way, predicting the observations in a time series sequentially. We compare against both of these models in our experiments.

Other related work. Our proposed model combines several ideas from the domains of Bayesian deep learning and classical probabilistic modeling; thus, removing elements from our model naturally relates to other approaches. For example, removing the latent GP for modeling dynamics as well as our proposed structured variational distribution results in the HI-VAE [Nazabal et al., 2018] described above. Furthermore, our idea of using a latent GP in the context of a deep generative model bears similarities to the GPPVAE [Casale et al., 2018] and can be seen as an extension of this model. While the GPPVAE allows for a joint GP prior over the whole data set through the use of a specialized inference mechanism, our model puts a separate GP prior on every time series and can hence rely on more standard inference techniques. However, our model takes missingness into account and uses a structured variational distribution, while the GPPVAE uses mean field inference and is designed for fully observed data.

Lastly, the GP prior with the Cauchy kernel is reminiscent of Jähnichen et al. [2018], and the structured variational distribution is similar to the one used by Bamler and Mandt [2017b] in the context of modeling word embeddings over time, neither of which used amortized inference.

3 Model

We propose a novel architecture for missing value imputation, an overview of which is depicted in Figure 1. Our model can be seen as a way to perform amortized approximate inference on a latent Gaussian process model.

The main idea of our proposed approach is to embed the data into a latent space of reduced dimensionality, in which every dimension is fully determined, and then model the temporal dynamics in this latent space. Since many features in the data might be correlated, the latent representation captures these correlations and uses them to reconstruct the missing values. Moreover, the GP prior in the latent space encourages the model to embed the data into a representation in which the temporal dynamics are smoother and more easily explainable than in the original data space. Finally, the structured variational distribution of the inference network allows the model to integrate temporal information into the representations, such that the reconstructions of missing values can be informed not only by correlated observed features at the same time point, but also by the rest of the time series.

Specifically, we combine ideas from VAEs [Kingma and Welling, 2014], GPs [Rasmussen, 2003], Cauchy kernels [Jähnichen et al., 2018], structured variational distributions with efficient inference [Bamler and Mandt, 2017b], and a special ELBO for missing data [Nazabal et al., 2018], and synthesize these ideas into a general framework for missing data imputation on time series. In the following, we will outline the problem setting, describe the assumed generative model, and derive our proposed inference scheme.

3.1 Problem setting and notation

We assume a data set X ∈ R^{T×d} with T data points x_t = [x_{t1}, . . . , x_{tj}, . . . , x_{td}]^⊤ ∈ R^d. Let us assume that the T data points were measured at T consecutive time points τ = [τ_1, . . . , τ_T]^⊤ with τ_t < τ_{t+1} ∀t. By convention, we usually set τ_1 = 0. The data X can thus be viewed as a time series of length τ_T in time.

We moreover assume that any number of these data features x_{tj} can be missing, that is, that their values can be unknown. We can now partition each data point into observed and unobserved features. The observed features of data point x_t are x_t^o := [x_{tj} | x_{tj} is observed]. Equivalently, the missing features are x_t^m := [x_{tj} | x_{tj} is missing], with x_t^o ∪ x_t^m ≡ x_t.

We can now use this partitioning to define the problem of missing value imputation. Missing value imputation describes the problem of estimating the true values of the missing features X^m := [x_t^m]_{1:T} given the observed features X^o := [x_t^o]_{1:T}. Many methods assume the different data points to be independent, in which case the inference problem reduces to T separate problems of estimating p(x_t^m | x_t^o). In the time series setting, this independence assumption is not satisfied, which leads to the more complex estimation problem of p(x_t^m | x_{1:T}^o).
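To make the partitioning concrete, the following minimal Python/NumPy sketch splits each data point into observed and missing feature indices. It assumes that missing entries are encoded as NaN, which is an illustrative convention and not something the paper prescribes.

    import numpy as np

    def split_observed_missing(x_t):
        """Return the indices of the observed (x_t^o) and missing (x_t^m) features of one data point."""
        missing_mask = np.isnan(x_t)               # assumption: missing entries are stored as NaN
        observed_idx = np.where(~missing_mask)[0]
        missing_idx = np.where(missing_mask)[0]
        return observed_idx, missing_idx

    # Toy example with T = 2 and d = 3; imputation then asks for p(x_t^m | x_{1:T}^o).
    X = np.array([[0.3, np.nan, 1.2],
                  [np.nan, 0.7, np.nan]])
    for t, x_t in enumerate(X):
        obs, mis = split_observed_missing(x_t)
        print(t, obs, mis)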

3.2 Generative model

In this subsection, we describe the details of our proposed approach of reducing the observed time series with missing data into a representation without missingness, and of modeling the dynamics in this lower-dimensional representation using Gaussian processes. It may seem tempting to skip the step of dimensionality reduction and instead model the incomplete data directly in the observed space using GPs. We argue that this is not practical for several reasons.

Gaussian processes are well suited for time series modeling [Roberts et al., 2013] and offer many advantages, such as data efficiency and calibrated posterior probabilities. However, they come at the cost of inverting the kernel matrix, which has a time complexity of O(n³). Moreover, designing a kernel function that accurately captures correlations in feature space and also in the temporal dimension is difficult.

This problem becomes even worse if certain observations are missing. One option is to fill the missing values with some numerical value (e.g., zero) to make the kernel computable. However, this arbitrary filling may make two data points with different missingness patterns look very dissimilar when in fact they are close to each other in the ground-truth space. Another alternative is to treat every channel of the multivariate time series separately and let the GP infer missing values, but this ignores valuable correlations across channels.

In this work, we overcome the problem of defining a suitable GP kernel in the data space with missing observations by instead applying the GP in the latent space of a variational autoencoder, where the encoded feature representations are complete. That is, we assign a latent variable z_t ∈ R^k to every x_t, and model temporal correlations in this reduced representation using a GP, z(τ) ∼ GP(m_z(·), k_z(·, ·)). This way, we decouple the step of filling in missing values and capturing instantaneous correlations between the different feature dimensions from modeling dynamical aspects. The graphical model is depicted in Figure 1b.

A remaining practical difficulty that we encountered is that many multivariate time series display dynamics at multiple time scales. One of our main motivations is to model time series that arise in medical setups, where doctors measure different patient variables and vital signs, such as heart rate, blood pressure, etc. When using conventional GP kernels (e.g., the RBF kernel, k_RBF(r | λ) = exp(−λ r² / 2)), one implicitly assumes a single time scale of relevance (λ). We found that this choice does not reflect the dynamics of medical data very well.

In order to model data that varies at multiple time scales, we consider a mixture of RBF kernels with different λ's [Rasmussen, 2003]. By defining a Gamma distribution over the length scale, that is, p(λ | α, β) ∝ λ^{α−1} exp(−αλ/β), we can compute an infinite mixture of RBF kernels,

∫ p(λ | α, β) k_RBF(r | λ) dλ ∝ (1 + r² / (2αβ^{−1}))^{−α}.

This yields the so-called Rational Quadratic kernel [Rasmussen, 2003]. For α = 1 and l² = 2β^{−1}, it reduces to the Cauchy kernel

k_Cau(τ, τ′) = σ² (1 + (τ − τ′)² / l²)^{−1},    (1)

which has previously been successfully used in the context of robust dynamic topic modeling, where similar multi-scale time dynamics occur [Jähnichen et al., 2018]. We therefore choose this kernel for our Gaussian process prior.
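As an illustration, the Cauchy kernel matrix from Eq. (1) can be computed directly as follows; the values of σ² and l are placeholders, not the settings used in our experiments.

    import numpy as np

    def cauchy_kernel(tau, sigma2=1.0, length_scale=2.0):
        """Cauchy kernel matrix K with K[s, t] = sigma^2 * (1 + (tau_s - tau_t)^2 / l^2)^(-1), cf. Eq. (1)."""
        diff = tau[:, None] - tau[None, :]
        return sigma2 / (1.0 + diff ** 2 / length_scale ** 2)

    tau = np.arange(10, dtype=float)   # T = 10 evenly spaced observation times
    K = cauchy_kernel(tau)             # prior covariance of one latent dimension z_{1:T, j}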

Given the latent time series z_{1:T}, the observations x_t are generated time-point-wise by

p_θ(x_t | z_t) = N(g_θ(z_t), σ² I),    (2)


where g_θ(·) is a potentially nonlinear function parameterized by the parameter vector θ. Considering the scenario of a medical time series, z_t can be thought of as the latent physiological state of the patient, and g_θ(·) would be the process of generating observable measurements x_t (e.g., heart rate, blood pressure, etc.) from that physiological state. In our experiments, the function g_θ is implemented by a deep neural network.
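The following sketch illustrates the generative step of Eq. (2) with a small, randomly initialized feed-forward network standing in for g_θ; the dimensions, architecture, and weights are purely illustrative and not those of the trained model.

    import numpy as np

    rng = np.random.default_rng(0)
    k, d, hidden = 4, 8, 16                           # latent, data, and hidden dimensions (illustrative)

    # Stand-in for the decoder parameters theta (in practice, a trained deep neural network).
    W1, b1 = rng.normal(size=(hidden, k)), np.zeros(hidden)
    W2, b2 = rng.normal(size=(d, hidden)), np.zeros(d)

    def g_theta(z_t):
        """Map a latent state z_t to the mean of p_theta(x_t | z_t)."""
        return W2 @ np.tanh(W1 @ z_t + b1) + b2

    sigma = 0.1
    z_t = rng.normal(size=k)                          # e.g., the latent physiological state at time t
    x_t = g_theta(z_t) + sigma * rng.normal(size=d)   # sample from N(g_theta(z_t), sigma^2 I), Eq. (2)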

3.3 Inference model

In order to learn the parameters of the deep generative model described above, and in order to efficiently infer its latent state, we are interested in the posterior distribution p(z_{1:T} | x_{1:T}). Since the exact posterior is intractable, we use variational inference [Jordan et al., 1999, Blei et al., 2017, Zhang et al., 2018]. Furthermore, to avoid inference over per-datapoint (local) variational parameters, we apply inference amortization [Kingma and Welling, 2014]. To make our variational distribution more expressive and capture the temporal correlations of the data, we employ a structured variational distribution [Wainwright and Jordan, 2008] with efficient inference that leads to an approximate posterior which is also a GP.

We approximate the true posterior p(z_{1:T,j} | x_{1:T}) with a multivariate Gaussian variational distribution

q(z_{1:T,j} | x_{1:T}^o) = N(m_j, Λ_j^{−1}),    (3)

where j indexes the dimensions in the latent space. Our approximation implies that our variational posterior is able to reflect correlations in time, but breaks dependencies across the different dimensions in z-space (which is typical in VAE training [Kingma and Welling, 2014, Rezende et al., 2014]).

We choose the variational family to be the family of multivariate Gaussian distributions in the time domain, where the precision matrix Λ_j is parameterized in terms of a product of bidiagonal matrices,

Λ_j := B_j^⊤ B_j,  with  {B_j}_{tt′} = b_{jtt′} if t′ ∈ {t, t+1}, and 0 otherwise.    (4)

Above, the b_{jtt′}'s are local variational parameters and B_j is an upper triangular band matrix. Similar structured distributions were also employed by Blei and Lafferty [2006] and Bamler and Mandt [2017a].

This parameterization automatically leads to Λ_j being positive definite, symmetric, and tridiagonal. Samples from q can thus be generated in linear time in T [Huang and McColl, 1997, Mallik, 2001, Bamler and Mandt, 2017b], as opposed to the cubic time complexity for a full-rank matrix. Moreover, compared to a fully factorized variational approximation, the number of variational parameters is merely doubled. Note that while the precision matrix is sparse, the covariance matrix can still be dense, allowing it to reflect long-range dependencies in time.
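This parameterization can be sketched as follows: the band entries define an upper bidiagonal B_j, the product B_j^⊤ B_j is tridiagonal, and a sample from q is obtained by back-substitution on B_j. The band values below are illustrative stand-ins, not learned variational parameters, and the dense triangular solve is shown only for clarity; exploiting the band structure yields the stated O(T) cost.

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(0)
    T = 6
    diag = rng.uniform(0.5, 1.5, size=T)              # band entries b_{t,t}   (illustrative)
    offdiag = rng.uniform(-0.5, 0.5, size=T - 1)      # band entries b_{t,t+1} (illustrative)

    # Upper bidiagonal B_j; Lambda_j = B_j^T B_j is symmetric, tridiagonal, and positive definite.
    B = np.diag(diag) + np.diag(offdiag, k=1)
    Lam = B.T @ B

    # Sampling z ~ N(m, Lambda^{-1}): with eps ~ N(0, I), z = m + B^{-1} eps has covariance
    # B^{-1} B^{-T} = (B^T B)^{-1} = Lambda^{-1}.
    m = np.zeros(T)
    eps = rng.normal(size=T)
    z = m + solve_triangular(B, eps, lower=False)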

Instead of optimizing m and B separately for every data point, we amortize the inference through an inference network with parameters ψ that computes the variational parameters based on the inputs as (m, B) = h_ψ(x_{1:T}^o). In the following, we accordingly denote the variational distribution as q_ψ(·). Following VAE training, the parameters of the generative model θ and the inference network ψ can be jointly trained by optimizing the evidence lower bound (ELBO),

log p(X^o) ≥ Σ_{t=1}^{T} E_{q_ψ(z_t | x_{1:T}^o)} [log p_θ(x_t^o | z_t)] − β D_KL[q_ψ(z_{1:T} | x_{1:T}^o) ‖ p(z_{1:T})].    (5)

Following Nazabal et al. [2018] (see Sec. 2), we evaluate the ELBO only on the observed features of the data, since the remaining features are unknown, and set these missing features to a fixed value (zero) during inference. We also include an additional tradeoff parameter β in our ELBO, similar to the β-VAE [Higgins et al., 2017]. This parameter can be used to rebalance the influence of the likelihood term and KL term on the ELBO. Since the likelihood factorizes into a sum over observed feature dimensions, its absolute magnitude depends on the missingness rate in the data. We thus choose β in our experiments dependent on the missingness rate to counteract this effect. Our training objective is the RHS of (5).
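To make the objective concrete, the sketch below spells out the two terms on the RHS of (5) for one time series: a Gaussian log-likelihood summed only over the observed entries, and a KL term per latent dimension between the Gaussian posterior and the GP prior (computed densely here for clarity, assuming a zero prior mean; the names and shapes are illustrative, not those of the reference implementation).

    import numpy as np

    def masked_gaussian_loglik(x, x_mean, sigma, obs_mask):
        """Sum of log N(x_tj | x_mean_tj, sigma^2) over the observed entries only (obs_mask == 1)."""
        ll = -0.5 * (((x - x_mean) / sigma) ** 2 + np.log(2.0 * np.pi * sigma ** 2))
        return np.sum(ll * obs_mask)

    def gaussian_kl(m_q, cov_q, cov_p):
        """KL( N(m_q, cov_q) || N(0, cov_p) ) for one latent dimension over the T time steps."""
        T = len(m_q)
        cov_p_inv = np.linalg.inv(cov_p)
        _, logdet_q = np.linalg.slogdet(cov_q)
        _, logdet_p = np.linalg.slogdet(cov_p)
        return 0.5 * (np.trace(cov_p_inv @ cov_q) + m_q @ cov_p_inv @ m_q - T + logdet_p - logdet_q)

    # Schematically, the per-series training loss (negative RHS of Eq. (5)) would then be
    #   loss = -masked_gaussian_loglik(x, x_mean, sigma, obs_mask)
    #          + beta * sum_j gaussian_kl(m[:, j], cov_q[j], K_prior)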

4 Experiments

We performed experiments on the benchmark data set Healing MNIST [Krishnan et al., 2015], which combines the classical MNIST data set [LeCun et al., 1998] with properties common to medical time series, the SPRITES data set [Li and Mandt, 2018], and on a real-world medical data set from the 2012 Physionet Challenge [Silva et al., 2012]. We compared our model against classical imputation baselines as well as modern deep learning approaches. We found strong quantitative and qualitative evidence that our proposed model outperforms most of the baseline methods in terms of imputation quality on all three tasks and performs comparably to the state of the art, while also offering a probabilistic interpretation and uncertainty estimates. In the following, we are first going to give an overview of the baseline methods and then present our experimental findings. Details about the data sets and neural network architectures can be found in Appendix A. An implementation of our model can be retrieved from https://github.com/ratschlab/GP-VAE.


Table 1: Performance of the different models on the Healing MNIST test set and the SPRITES test set in terms of negative log likelihood [NLL] and mean squared error [MSE] (lower is better), as well as downstream classification performance [AUROC] (higher is better). The reported values are means and their respective standard errors over the test set. The proposed model outperforms all the baselines.

                       Healing MNIST                                     SPRITES
Model                  NLL             MSE             AUROC             MSE
Mean imputation        -               0.168 ± 0.000   0.938 ± 0.000     0.013 ± 0.000
Forward imputation     -               0.177 ± 0.000   0.935 ± 0.000     0.028 ± 0.000
VAE                    0.599 ± 0.002   0.232 ± 0.000   0.922 ± 0.000     0.028 ± 0.000
HI-VAE                 0.372 ± 0.008   0.134 ± 0.003   0.962 ± 0.001     0.007 ± 0.000
GP-VAE (proposed)      0.350 ± 0.007   0.114 ± 0.002   0.960 ± 0.002     0.002 ± 0.000

4.1 Baseline methods

Forward imputation and mean imputation. Forward and mean imputation are so-called single imputation methods, which do not attempt to fit a distribution over possible values for the missing features, but only predict one estimate [Little and Rubin, 2002]. Forward imputation always predicts the last observed value for any given feature, while mean imputation predicts the mean of all the observations of the feature in a given time series.
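For concreteness, minimal NumPy versions of these two single-imputation baselines could look as follows (missing entries are again assumed to be encoded as NaN; leading missing values have no previous observation, so forward imputation leaves them untouched in this sketch).

    import numpy as np

    def mean_impute(x):
        """Replace each missing entry by the observed mean of its channel; x has shape (T, d)."""
        out = x.copy()
        channel_means = np.nanmean(out, axis=0)
        rows, cols = np.where(np.isnan(out))
        out[rows, cols] = channel_means[cols]
        return out

    def forward_impute(x):
        """Replace each missing entry by the last observed value of the same channel."""
        out = x.copy()
        for j in range(out.shape[1]):
            last = np.nan
            for t in range(out.shape[0]):
                if np.isnan(out[t, j]):
                    out[t, j] = last      # stays NaN if nothing has been observed yet
                else:
                    last = out[t, j]
        return out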

Gaussian process in data space. One option to deal with missingness in multivariate time series is to fit independent Gaussian processes to each channel. As discussed previously (Sec. 3.2), this ignores the correlation between channels. The missing values are then imputed by taking the mean of the respective posterior of the GP for that feature.
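A per-channel GP baseline can be sketched as below: each channel is fit independently on its observed time points and the missing entries are filled with the GP posterior mean. The RBF kernel and noise level here are illustrative choices, not necessarily the exact setup of this baseline.

    import numpy as np

    def rbf_kernel(a, b, length_scale=4.0):
        """Squared-exponential kernel between two sets of time points (illustrative choice)."""
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale ** 2)

    def gp_impute_channel(tau, y, noise_var=1e-2):
        """Fill the NaNs of one channel y(tau) with the GP posterior mean given the observed points."""
        obs = ~np.isnan(y)
        K = rbf_kernel(tau[obs], tau[obs]) + noise_var * np.eye(obs.sum())
        K_star = rbf_kernel(tau[~obs], tau[obs])
        out = y.copy()
        out[~obs] = K_star @ np.linalg.solve(K, y[obs])
        return out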

VAE and HI-VAE. The VAE [Kingma and Welling, 2014] and HI-VAE [Nazabal et al., 2018] are fit to the data using the same training procedure as the proposed GP-VAE model. The VAE uses a standard ELBO that is defined over all the features, while the HI-VAE uses the ELBO from (5), which is only evaluated on the observed part of the feature space. During inference, missing features are filled with constant values, such as zero.

GRUI-GAN and BRITS. The GRUI-GAN [Luo et al., 2018] uses a recurrent neural network (RNN), namely a gated recurrent unit (GRU). Once the GAN is trained on a set of time series, an unseen time series is imputed by optimizing the latent vector in the input space of the generator, such that the generator's output on the observed features is closest to the true values. BRITS [Cao et al., 2018] also uses an RNN, namely a bidirectional long short-term memory network (BiLSTM). For a series of partially observed states, the network is trained to predict any intermediate state given its past and future states, aggregated by two separate LSTM layers. Similarly to our approach, the loss function is only computed on the observed values.

4.2 Healing MNIST

Time series with missing values play a crucial role in the medical field, but are often hard to obtain. Krishnan et al. [2015] generated a data set called Healing MNIST, which is designed to reflect many properties that one also finds in real medical data. We benchmark our method on a variant of this data set, which consists of short sequences of moving MNIST digits [LeCun et al., 1998] that rotate randomly between frames. The analogy to healthcare is that every frame may represent the collection of measurements that describe a patient's health state, which contains many missing measurements at each moment in time. The temporal evolution represents the non-linear evolution of the patient's health state. The image frames contain around 60 % missing pixels and the rotations between two consecutive frames are normally distributed.

The benefit of this data set is that we know the ground truth of the imputation task. We compare our model against a standard VAE (no latent GP and standard ELBO over all features), the HI-VAE [Nazabal et al., 2018], as well as mean imputation and forward imputation. The models were trained on time series of digits from the Healing MNIST training set (50,000 time series) and tested on digits from the Healing MNIST test set (10,000 time series). Negative log likelihoods on the ground truth values of the missing pixels and mean squared errors (MSE) are reported in Table 1, and qualitative results are shown in Figure 2. To assess the usefulness of the imputations for downstream tasks, we also trained a linear classifier on the imputed MNIST digits to predict the digit class and measured its performance in terms of area under the receiver-operator-characteristic curve (AUROC) (Tab. 1).


Table 2: Performance of different models on Healing MNIST data with artificial missingness and different missingness mechanisms. We report mean squared error (lower is better). The reported values are means and their respective standard errors over the test set. Our model outperforms the baselines on all missingness mechanisms.

Mechanism    Mean imp.        Forward imp.     VAE              HI-VAE           GP-VAE (proposed)
Random       0.069 ± 0.000    0.099 ± 0.000    0.100 ± 0.000    0.046 ± 0.001    0.036 ± 0.000
Spatial      0.090 ± 0.000    0.099 ± 0.000    0.122 ± 0.000    0.097 ± 0.000    0.071 ± 0.001
Temporal+    0.107 ± 0.000    0.117 ± 0.000    0.101 ± 0.000    0.047 ± 0.001    0.038 ± 0.001
Temporal−    0.086 ± 0.000    0.093 ± 0.000    0.101 ± 0.000    0.046 ± 0.001    0.037 ± 0.001
MNAR         0.168 ± 0.000    0.177 ± 0.000    0.232 ± 0.000    0.134 ± 0.003    0.114 ± 0.002

[Figure 2 panels: rows Input, VAE, HI-VAE, GP-VAE, Original; columns Healing MNIST and SPRITES.]

Figure 2: Reconstructions from Healing MNIST and SPRITES. The GP-VAE (proposed) is stable over time and yields the highest fidelity.

Our approach outperforms the baselines in terms of likelihood and MSE. The reconstructions (Fig. 2) reveal the benefits of the GP-VAE approach: related approaches yield unstable reconstructions over time, while our approach offers more stable reconstructions, using temporal information from neighboring frames. Moreover, our model also yields the most useful imputations for downstream classification in terms of AUROC. The downstream classification performance correlates well with the test likelihood on the ground truth data, supporting the intuition that it is a good proxy measure in cases where the ground truth likelihood is not available.

We also observe that our model outperforms the baselines on different missingness mechanisms (Tab. 2). Details regarding this evaluation are laid out in the appendix (Sec. B.2).

4.3 SPRITES data

To assess our model's performance on more complex data, we applied it to the SPRITES data set, which has previously been used with sequential autoencoders [Li and Mandt, 2018]. The data set consists of 9,000 sequences of animated characters with different clothes, hair styles, and skin colors, performing different actions. Each frame has a size of 64×64 pixels and each time series features 8 frames. We again introduced about 60 % of missing pixels and compared the same methods as above. The results are reported in Table 1 and example reconstructions are shown in Figure 2. As in the previous experiment, our model outperforms the baselines and also yields the most convincing reconstructions.

4.4 Real medical time series data

We also applied our model to the data set from the 2012 Physionet Challenge [Silva et al., 2012]. The data set contains 12,000 patients who were monitored in the intensive care unit (ICU) for 48 hours each. At each hour, there is a measurement of 35 different variables (heart rate, blood pressure, etc.), any number of which might be missing.

We again compare our model against the standard VAE and HI-VAE, as well as a GP fit feature-wise in the data space, the GRUI-GAN [Luo et al., 2018], and the BRITS model [Cao et al., 2018], which reported state-of-the-art imputation performance.

The main challenge is the absence of ground truth data for the missing values. This cannot easily be circumvented by introducing additional missingness since (1) the mechanism by which measurements were omitted is not random, and (2) the data set is already very sparse, with about 80 % of the features missing. To overcome this issue, Luo et al. [2018] proposed a downstream task as a proxy for the imputation quality. They chose the task of mortality prediction, which was one of the main tasks of the Physionet Challenge on this data set, and measured the performance in terms of AUROC. In this paper, we adopt this measure.

For the sake of interpretability, we used a logistic regression as a downstream classification model. This model tries to optimally separate the whole time series in the input


[Figure 3: time axis spans hours 0–48; legend: Forward, GP-VAE, BRITS.]

Figure 3: Imputations of several clinical variables with different amounts of missingness for an example patient from the Physionet 2012 test set. Blue dots are observed ground-truth points, red dots are ground-truth points that were withheld from the models. BRITS (red) and forward imputation (green) yield single imputations, while the GP-VAE (blue) allows drawing samples from the posterior. The GP-VAE produces smoother curves, reducing noise from the original input, and exhibits an interpretable posterior uncertainty.

Table 3: Performance of the different models on the Physionet data set in terms of AUROC of a logistic regression trained on the imputed time series. We observe that the proposed model performs comparably to the state of the art.

Model                AUROC
Mean imputation      0.703 ± 0.000
Forward imputation   0.710 ± 0.000
GP                   0.704 ± 0.007
VAE                  0.677 ± 0.002
HI-VAE               0.686 ± 0.010
GRUI-GAN             0.702 ± 0.009
BRITS                0.742 ± 0.008
GP-VAE (proposed)    0.730 ± 0.006

space using a linear hyperplane. The choice of model follows the intuition that under a perfect imputation, similar patients should be located close to each other in the input space, while that is not necessarily the case when features are missing or the imputation is poor.
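A sketch of this proxy evaluation (shapes and function names are illustrative; the exact preprocessing is described in the appendix): the imputed time series are flattened and fed to a linear classifier, whose held-out AUROC is reported.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    def downstream_auroc(X_train, y_train, X_test, y_test):
        """X_*: imputed arrays of shape (n_series, T, d); y_*: binary labels (e.g., ICU mortality)."""
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X_train.reshape(len(X_train), -1), y_train)
        scores = clf.predict_proba(X_test.reshape(len(X_test), -1))[:, 1]
        return roc_auc_score(y_test, scores)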

Note that it is unrealistic to ask for high accuracies in this task, as the clean data are unlikely to be perfectly separable. As seen in Table 1, this proxy measure correlates well with the ground truth likelihood.

The performances of the different methods under this measure are reported in Table 3. Our model outperforms most baselines, including the GRUI-GAN, and performs comparably to the state-of-the-art method BRITS. This provides strong evidence that our model is well suited for real-world imputation tasks. Interestingly, forward imputation performs on a competitive level with many of the baselines, suggesting that those baselines do not succeed in discovering any nonlinear structure in the data.

Note that, while BRITS outperforms our proposed model quantitatively, it does not fit a generative model to the data and does not offer any probabilistic interpretation. In contrast, our model can be used to sample different imputations from the variational posterior and thus provides an interpretable way of estimating the uncertainty of the predictions, as is depicted in Figure 3. The imputations on different clinical variables in the figure show how the posterior of our model exhibits different levels of uncertainty for different features (blue shaded area). The uncertainty estimates correlate qualitatively with the sparseness of the features and the noise levels of the measurements. This could be communicated to end-users, such as clinicians, to help them make informed decisions about the degree of trust they should have in the model.

It can also be seen in Figure 3 that our model produces smoother curves than the baselines. This is likely due to the denoising effect of our GP prior, which acts in a similar way to a Kalman filter in this case [Kalman, 1960]. Especially when the data are very noisy, such as medical time series, this smoothing can make the imputations more interpretable for humans and can help in identifying temporal trends.

5 Conclusion

We presented a deep probabilistic model for multivariate time series imputation, combining ideas from variational autoencoders and Gaussian processes. We observed that our model outperforms classical baselines as well as modern deep learning approaches on benchmark data sets and real-world medical data, and performs comparably to the state of the art.


Acknowledgments

VF is supported by a PhD fellowship from the Swiss Data Science Center and by the PHRT grant #2017-110 of the ETH Domain. DB was partially supported by the ETH summer fellowship program. SM acknowledges funding from DARPA (HR001119S0038), NSF (FW-HTF-RM), and Qualcomm. We thank Robert Bamler, Gideon Dresdner, Claire Vernade, Rich Turner, and the anonymous reviewers for their valuable feedback.

This is version #645023440864087831186872693254 of this paper. The other 2^100 − 1 versions exist on different branches of the universe's wave function [Everett III, 1957]. We thank Sean Carroll [Carroll, 2019] for the inspiration for this versioning system.

References

Samuel K Ainsworth, Nicholas J Foti, and Emily B Fox. Disentangled vae representations for multi-aspect and missing data. arXiv preprint arXiv:1806.09060, 2018.

Robert Bamler and Stephan Mandt. Dynamic word embeddings. In Proceedings of the 34th International Conference on Machine Learning, Volume 70, pages 380–389. JMLR.org, 2017a.

Robert Bamler and Stephan Mandt. Structured black box variational inference for latent time series models. arXiv preprint arXiv:1707.01069, 2017b.

Faraj Bashir and Hua-Liang Wei. Handling missing data in multivariate time series using a vector autoregressive model-imputation (var-im) algorithm. Neurocomputing, 276:23–30, 2018.

David M Blei and John D Lafferty. Dynamic topic models. In Proceedings of the 23rd international conference on Machine learning, pages 113–120. ACM, 2006.

David M Blei, Alp Kucukelbir, and Jon D McAuliffe. Variational inference: A review for statisticians. Journal of the American Statistical Association, 112(518):859–877, 2017.

Wei Cao, Dong Wang, Jian Li, Hao Zhou, Lei Li, and Yitan Li. Brits: bidirectional recurrent imputation for time series. In Advances in Neural Information Processing Systems, pages 6775–6785, 2018.

Sean Carroll. Something deeply hidden: quantum worlds and the emergence of spacetime. Dutton, 2019.

Francesco Paolo Casale, Adrian Dalca, Luca Saglietti, Jennifer Listgarten, and Nicolo Fusi. Gaussian process prior variational autoencoders. In Advances in Neural Information Processing Systems, pages 10369–10380, 2018.

Adrian V Dalca, John Guttag, and Mert R Sabuncu. Unsupervised data imputation via variational inference of deep subspaces. arXiv preprint arXiv:1903.03503, 2019.

Hugh Everett III. Relative state formulation of quantum mechanics. Reviews of modern physics, 29(3):454, 1957.

Vincent Fortuin and Gunnar Rätsch. Deep mean functions for meta-learning in gaussian processes. arXiv preprint arXiv:1901.08098, 2019.

Vincent Fortuin, Gideon Dresdner, Heiko Strathmann, and Gunnar Rätsch. Scalable gaussian processes on discrete domains. arXiv preprint arXiv:1810.10368, 2018a.

Vincent Fortuin, Matthias Hüser, Francesco Locatello, Heiko Strathmann, and Gunnar Rätsch. Som-vae: Interpretable discrete representation learning on time series. arXiv preprint arXiv:1806.02199, 2018b.

Marta Garnelo, Dan Rosenbaum, Chris J Maddison, Tiago Ramalho, David Saxton, Murray Shanahan, Yee Whye Teh, Danilo J Rezende, and SM Eslami. Conditional neural processes. arXiv preprint arXiv:1807.01613, 2018a.

Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J Rezende, SM Eslami, and Yee Whye Teh. Neural processes. arXiv preprint arXiv:1807.01622, 2018b.

Irina Higgins, Loic Matthey, Arka Pal, Christopher Burgess, Xavier Glorot, Matthew Botvinick, Shakir Mohamed, and Alexander Lerchner. beta-vae: Learning basic visual concepts with a constrained variational framework. ICLR, 2(5):6, 2017.

James Honaker and Gary King. What to do about missing values in time-series cross-section data. American Journal of Political Science, 54(2):561–581, 2010.

Y Huang and WF McColl. Analytical inversion of general tridiagonal matrices. Journal of Physics A: Mathematical and General, 30(22):7919, 1997.

Oleg Ivanov, Michael Figurnov, and Dmitry Vetrov. Variational autoencoder with arbitrary conditioning. arXiv preprint arXiv:1806.02382, 2018.

Patrick Jähnichen, Florian Wenzel, Marius Kloft, and Stephan Mandt. Scalable generalized dynamic topic models. Conference on Artificial Intelligence and Statistics, 2018.

Michael I Jordan, Zoubin Ghahramani, Tommi S Jaakkola, and Lawrence K Saul. An introduction to variational methods for graphical models. Machine learning, 37(2):183–233, 1999.

Rudolph Emil Kalman. A new approach to linear filtering and prediction problems. Journal of basic Engineering, 82(1):35–45, 1960.


Diederik P Kingma and Jimmy Ba. Adam: A method for stochastic optimization. International Conference on Learning Representations, 2015.

Diederik P Kingma and Max Welling. Auto-encoding variational bayes. International Conference on Learning Representations, 2014.

Rahul G Krishnan, Uri Shalit, and David Sontag. Deep kalman filters. arXiv preprint arXiv:1511.05121, 2015.

Rahul G Krishnan, Uri Shalit, and David Sontag. Structured inference networks for nonlinear state space models. In Thirty-First AAAI Conference on Artificial Intelligence, 2017.

Neil D Lawrence. Gaussian process latent variable models for visualisation of high dimensional data. In Advances in neural information processing systems, pages 329–336, 2004.

Yann LeCun, Léon Bottou, Yoshua Bengio, Patrick Haffner, et al. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324, 1998.

Steven Cheng-Xian Li, Bo Jiang, and Benjamin Marlin. Misgan: Learning from incomplete data with generative adversarial networks. International Conference on Learning Representations, 2019.

Yingzhen Li and Stephan Mandt. Disentangled sequential autoencoder. International Conference on Machine Learning, 2018.

Zachary C Lipton and Subarna Tripathi. Precise recovery of latent vectors from generative adversarial networks. International Conference on Learning Representations, 2017.

Roderick JA Little and Donald B Rubin. Single imputation methods. Statistical analysis with missing data, pages 59–74, 2002.

Yonghong Luo, Xiangrui Cai, Ying Zhang, Jun Xu, et al. Multivariate time series imputation with generative adversarial networks. In Advances in Neural Information Processing Systems, pages 1596–1607, 2018.

Chao Ma, Sebastian Tschiatschek, Konstantina Palla, Jose Miguel Hernandez Lobato, Sebastian Nowozin, and Cheng Zhang. Eddi: Efficient dynamic discovery of high-value information with partial vae. International Conference on Machine Learning, 2018.

Ranjan K Mallik. The inverse of a tridiagonal matrix. Linear Algebra and its Applications, 325(1-3):109–139, 2001.

Alfredo Nazabal, Pablo M Olmos, Zoubin Ghahramani, and Isabel Valera. Handling incomplete heterogeneous data using vaes. arXiv preprint arXiv:1807.03653, 2018.

Alma B Pedersen, Ellen M Mikkelsen, Deirdre Cronin-Fenton, Nickolaj R Kristensen, Tra My Pham, Lars Pedersen, and Irene Petersen. Missing data and multiple imputation in clinical epidemiological research. Clinical epidemiology, 9:157, 2017.

Carl Edward Rasmussen. Gaussian processes in machine learning. In Summer School on Machine Learning, pages 63–71. Springer, 2003.

Danilo Jimenez Rezende, Shakir Mohamed, and Daan Wierstra. Stochastic backpropagation and approximate inference in deep generative models. International Conference on Machine Learning, 2014.

Stephen Roberts, Michael Osborne, Mark Ebden, Steven Reece, Neale Gibson, and Suzanne Aigrain. Gaussian processes for time-series modelling. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 371(1984):20110550, 2013.

Donald B Rubin. Inference and missing data. Biometrika, 63(3):581–592, 1976.

Ikaro Silva, George Moody, Daniel J Scott, Leo A Celi, and Roger G Mark. Predicting in-hospital mortality of icu patients: The physionet/computing in cardiology challenge 2012. In 2012 Computing in Cardiology, pages 245–248. IEEE, 2012.

Michalis Titsias and Neil D Lawrence. Bayesian gaussian process latent variable model. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pages 844–851, 2010.

Martin J Wainwright and Michael Jordan. Graphical models, exponential families, and variational inference. Foundations and Trends® in Machine Learning, 1(1–2):1–305, 2008.

Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, and Eric P Xing. Deep kernel learning. In Artificial Intelligence and Statistics, pages 370–378, 2016.

Jinsung Yoon, James Jordon, and Mihaela van der Schaar. Gain: Missing data imputation using generative adversarial nets. International Conference on Machine Learning, 2018.

Cheng Zhang, Judith Butepage, Hedvig Kjellstrom, and Stephan Mandt. Advances in variational inference. IEEE transactions on pattern analysis and machine intelligence, 2018.