TODO: THIS PAGE IS OUT OF DATE; NEED TO UPDATE IT! I was born in Jingmen, Hubei, a small town in Central China, where I lived for ten years until I moved to Shenzhen, Guangdong, a prosperous metropolis in South China. I received my B.S. in Physics from Peking University in 2020, and I’m currently a first-year graduate student at the Department of Astrophysical Sciences, Princeton University. I’m interested in a variety of topics, including cosmology, black hole astrophysics, and Bayesian statistics. I’m the main developer of the BayesFast and CosmoFast packages, which provide novel algorithms for posterior sampling and evidence evaluation that are orders of magnitude more efficient than traditional competitors. Here is a link to my two-page academic CV.
Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
T. S. Eliot, Burnt Norton
Spotlights
We present BayesFast, a method for fast and scalable Bayesian posterior inference, based on polynomial surrogate model expansion combined with Hamiltonian Monte Carlo and importance sampling. We apply the method to several cosmology examples, including the 7-9 dimensional Planck and 27 dimensional DES analyses. A typical application needs O(1-10) minutes of runtime and O(10^2-10^3) likelihood calls, yielding posteriors nearly identical to those from a full MCMC run using O(10^5-10^6) likelihood calls. BayesFast also implements a new Bayesian evidence estimator, which likewise achieves orders-of-magnitude speed-ups and higher accuracy compared to standard methods. Many of the components can be parallelized, enabling efficient scaling.
He Jia and Uros Seljak, BayesFast: A Fast and Scalable Method for Cosmological Bayesian Inference, in prep.
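The surrogate-then-correct pipeline described above can be sketched in a few lines of Python. This is an illustrative toy only, not the actual BayesFast implementation: it fits a quadratic surrogate by least squares, samples it with a simple Metropolis sampler in place of HMC, and then corrects with importance weights from the true likelihood. All names here (`true_log_like`, `features`, etc.) are invented for the example.

```python
import numpy as np

# Toy sketch of surrogate-accelerated posterior inference (NOT BayesFast):
# 1. fit a cheap polynomial surrogate to a few expensive likelihood calls,
# 2. sample the surrogate with MCMC,
# 3. importance-reweight the samples with the true likelihood.

rng = np.random.default_rng(0)
d = 2

def true_log_like(x):  # stand-in for an expensive log-likelihood
    return -0.5 * np.sum((x - 1.0) ** 2, axis=-1)

# 1. Fit a quadratic surrogate from a small design set (50 calls).
design = rng.normal(size=(50, d))
y = true_log_like(design)

def features(x):  # [1, x_i, x_i * x_j] polynomial features
    quad = np.einsum('ni,nj->nij', x, x).reshape(len(x), -1)
    return np.hstack([np.ones((len(x), 1)), x, quad])

coef, *_ = np.linalg.lstsq(features(design), y, rcond=None)
surrogate = lambda x: features(np.atleast_2d(x)) @ coef

# 2. Sample the cheap surrogate (plain Metropolis here; BayesFast uses HMC).
chain, x, lp = [], np.zeros(d), surrogate(np.zeros(d))[0]
for _ in range(2000):
    prop = x + 0.5 * rng.normal(size=d)
    lp_prop = surrogate(prop)[0]
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
    chain.append(x)
samples = np.array(chain[500:])  # drop burn-in

# 3. Correct with importance weights from the true likelihood.
log_w = true_log_like(samples) - surrogate(samples)
w = np.exp(log_w - log_w.max())
w /= w.sum()
posterior_mean = (w[:, None] * samples).sum(axis=0)  # close to [1, 1]
```

Since the toy log-likelihood is itself quadratic, the surrogate fit is essentially exact and the importance weights are nearly uniform; in realistic applications the weights absorb the surrogate's approximation error.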
The normalizing constant (also called the partition function, Bayesian evidence, or marginal likelihood) is one of the central quantities of Bayesian inference, yet most existing methods for computing it are both expensive and inaccurate. Here we develop a new approach, starting from posterior samples obtained with standard Markov Chain Monte Carlo (MCMC). We apply a novel Normalizing Flow (NF) approach to obtain an analytic density estimator from these samples, followed by Optimal Bridge Sampling (OBS) to obtain the normalizing constant. We compare our method, which we call Gaussianized Bridge Sampling (GBS), to existing methods such as Nested Sampling (NS) and Annealed Importance Sampling (AIS) on several examples, showing that it is both significantly faster and substantially more accurate, and comes with reliable error estimates.
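The OBS step can be illustrated on a toy problem where the normalizing constant is known analytically. The sketch below is an assumption-laden simplification, not the paper's implementation: it fits a plain Gaussian to the posterior samples in place of the normalizing flow, then iterates the standard optimal bridge sampling fixed-point equation for log Z.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Toy sketch of bridge sampling with a Gaussian proposal fitted to posterior
# samples (GBS replaces the Gaussian fit with a normalizing flow).

rng = np.random.default_rng(1)
d = 3

def log_p(x):  # unnormalized log posterior
    return -0.5 * np.sum(x ** 2, axis=-1)

# True log Z for this toy case: log (2*pi)^(d/2)
true_logZ = 0.5 * d * np.log(2 * np.pi)

# "MCMC" posterior samples (drawn exactly here for simplicity)
samples_p = rng.normal(size=(4000, d))

# Fit an analytic, normalized density q to the samples.
q = multivariate_normal(samples_p.mean(axis=0), np.cov(samples_p.T))
samples_q = q.rvs(4000, random_state=rng)

s1 = s2 = 0.5  # mixture weights for equal sample sizes
lp_q, lq_q = log_p(samples_q), q.logpdf(samples_q)
lp_p, lq_p = log_p(samples_p), q.logpdf(samples_p)

# Iterate the optimal bridge sampling fixed point:
# Z = E_q[p / (s1*p + s2*Z*q)] / E_p[q / (s1*p + s2*Z*q)]
logZ = 0.0
for _ in range(50):
    num = np.exp(lp_q - np.logaddexp(np.log(s1) + lp_q,
                                     np.log(s2) + logZ + lq_q)).mean()
    den = np.exp(lq_p - np.logaddexp(np.log(s1) + lp_p,
                                     np.log(s2) + logZ + lq_p)).mean()
    logZ = np.log(num) - np.log(den)

# logZ now approximates true_logZ
```

The closer the fitted density q is to the true posterior, the lower the variance of the estimator, which is why Gaussianizing the samples with a flow before bridging pays off.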
Publications
| No. | Author(s) | Title | Reference | arXiv |
|---|---|---|---|---|
| - / 2 | He Jia and Uros Seljak | BayesFast: A Fast and Scalable Method for Cosmological Bayesian Inference | in prep. | N/A |
| 1 / 1 | He Jia and Uros Seljak | Normalizing Constant Estimation with Gaussianized Bridge Sampling | AABI 2019 Proceedings, PMLR 118:1-14, 2020 | 1912.06073 |
Talks
| Date | Title | Place | Link |
|---|---|---|---|
| Jan 13, 2020 | BayesFast: A Fast and Scalable Method for Cosmological Bayesian Inference | BCCP Workshop, Berkeley, US | Slides |
| Dec 29, 2019 | Efficient Bayesian Methods for Posterior Sampling and Evidence Estimation | NAOC Workshop, Beijing, China | Slides |
| Dec 8, 2019 | Normalizing Constant Estimation with Gaussianized Bridge Sampling | AABI 2019, Vancouver, Canada | Poster |