ZHU Wanchuang, JI Chunlin, DENG Ke. Recent Progress of Approximate Bayesian Computation and Its Applications[J]. Applied Mathematics and Mechanics, 2019, 40(11): 1179-1203. doi: 10.21656/1000-0887.400245

Recent Progress of Approximate Bayesian Computation and Its Applications

doi: 10.21656/1000-0887.400245
Funds: The National Natural Science Foundation of China (11771242)
  • Received Date: 2019-08-26
  • Rev Recd Date: 2019-09-01
  • Publish Date: 2019-11-01
  • In the era of big data and artificial intelligence, extracting valuable information and knowledge from complex data and models is a common challenge for applied mathematics, statistics and computer science. Generative models are a class of powerful models with the potential to address this difficulty. From a macro point of view, differential equations and dynamical systems in applied mathematics, probability distributions in statistical models, and the typical generative models of computer science (generative adversarial networks and variational auto-encoders) can all be regarded as generalized generative models. As data sets grow larger, their structures also become more complicated, so more powerful generative models are needed for real problems. Describing the mathematical structures of such models is difficult, which raises a natural question: how can generative models whose analytic forms are unavailable, or hard to obtain, be analyzed? Originating from Bayesian inference, approximate Bayesian computation (ABC), as a likelihood-free technique, plays an important role in handling complex statistical models and generative models. Starting from classic approximate Bayesian computation, the development and recent advances of approximate Bayesian computation are systematically reviewed. Finally, the application of approximate Bayesian computation to complex data and its deep connections with cutting-edge artificial intelligence methods are discussed.
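
For readers unfamiliar with the classic (rejection) ABC scheme referred to in the abstract, a minimal illustrative sketch is given below. It assumes a toy normal model with the sample mean as summary statistic and a uniform prior; the function names (simulate, summary, rejection_abc), the tolerance eps and all numerical settings are illustrative assumptions, not taken from the paper.

# Minimal sketch of classic rejection ABC (illustrative only).
# Toy setup: infer the mean theta of a normal model from observed data,
# using the sample mean as summary statistic and a uniform prior on theta.
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=100):
    """Generative model: draw n observations given parameter theta."""
    return rng.normal(loc=theta, scale=1.0, size=n)

def summary(x):
    """Summary statistic s(x); here simply the sample mean."""
    return np.mean(x)

def rejection_abc(x_obs, n_samples=1000, eps=0.05):
    """Classic rejection ABC:
    1) draw theta from the prior,
    2) simulate pseudo-data from the generative model,
    3) accept theta if the simulated and observed summaries are within eps."""
    s_obs = summary(x_obs)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)          # prior draw
        x_sim = simulate(theta, n=len(x_obs))   # likelihood-free: only forward simulation
        if abs(summary(x_sim) - s_obs) < eps:   # distance between summary statistics
            accepted.append(theta)
    return np.array(accepted)                   # approximate posterior sample

if __name__ == "__main__":
    x_obs = simulate(theta=1.5)                 # stand-in for observed data
    posterior = rejection_abc(x_obs)
    print(posterior.mean(), posterior.std())

The key point of the sketch is that only forward simulation from the generative model is required; the likelihood is never evaluated, which is what makes ABC a likelihood-free method.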