Publications

  1. Scarabosio, Laura; Faghihi, Danial; Wohlmuth, B.; Oden, J.T. “Goal-Oriented Adaptive Modeling of Random Heterogeneous Media and Model-Based Multilevel Monte Carlo Methods.” Computers & Mathematics with Applications, 2019 (arXiv preprint arXiv:1808.01923).
  2. Khristenko, Ustim; Constantinescu, Andrei; Le Tallec, Patrick; Oden, J. Tinsley; and Wohlmuth, Barbara. “A Statistical Framework for Generating Microstructures of Two-Phase Random Materials: Application to Fatigue Analysis.” arXiv preprint arXiv:1907.02412, July 2019 (in review, Multiscale Modeling and Simulation: A SIAM Interdisciplinary Journal).
  3. Fritz, Marvin; Lima, E. A. B. F.; Nikolic, Vanja; Oden, J. T.; and Wohlmuth, B. “Local and Nonlocal Phase-Field Models of Tumor Growth and Invasion Due to ECM Degradation.” M3AS (Mathematical Models and Methods in Applied Sciences) (in review, 2019).
  4. Swischuk, R., Kramer, B., Huang, C., and Willcox, K., Learning physics-based reduced-order models for a single-injector combustion process, AIAA Journal, Vol. 58, No. 6, June 2020, pp. 2658-2672 (also Oden Institute Report 19-13). Also in Proceedings of the 2020 AIAA SciTech Forum & Exhibition, Orlando, FL, January 2020.
  5. Kramer, B., and Willcox, K., in Realization and Model Reduction of Dynamical Systems, Springer, to appear.
  6. Qian, E., Kramer, B., Peherstorfer, B., and Willcox, K., Lift & Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems, Physica D: Nonlinear Phenomena, Volume 406, May 2020, 132401.
  7. Benner, P., Goyal, P., Kramer, B., Peherstorfer, B., and Willcox, K., Operator inference for non-intrusive model reduction of systems with non-polynomial nonlinear terms. Computer Methods in Applied Mechanics and Engineering, to appear.
  8. G. Kabacaoglu and G. Biros, Machine learning acceleration of simulations of Stokesian suspensions, Physical Review E, 99(6), 2019 (also Oden Institute Report 19-13).
  9. Chenhan Yu, Severin Reiz, and George Biros, Distributed O(N) Linear Solver for Dense Symmetric Hierarchical Semi-Separable Matrices, IEEE 13th International Symposium on Embedded Multicore/Many-core Systems-on-Chip, Nanyang Technological University, Singapore, October 2019.
  10. Amir Gholami, Kurt Keutzer, and George Biros, ANODE: Unconditionally Accurate Memory-Efficient Gradients for Neural ODEs, Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), Macao, China (19% acceptance rate), August 2019.
  11. Amir Gholami, George Biros, et al., ANODEV2: A Coupled Neural ODE Framework, to appear in NeurIPS 2019, Vancouver (21% acceptance rate), December 2019.
  12. O. Burkovska and M. Gunzburger, Affine Approximation of Parametrized Kernels and Model Order Reduction for Nonlocal and Fractional Laplace Models, SIAM Journal on Numerical Analysis, 58(3):1469-1494, May 2020.
  13. O. Burkovska and M. Gunzburger, On a nonlocal Cahn-Hilliard model permitting sharp interfaces. Submitted, April 2020.
  14. O. Burkovska, C. Glusa, and M. D’Elia, An optimization-based approach to parameter learning for fractional type nonlocal models. Submitted, October 2020.
  15. M. McKerns, F. J. Alexander, K. Hickmann, T. J. Sullivan, and D. Vaughan, Optimal Bounds on Nonlinear Partial Differential Equations in Model Certification, Validation, and Experiment Design, in Advanced Analysis Solutions for Leading Experimental Techniques (2019, to appear).
  16. K. G. Reyes and F. J. Alexander, Autonomous Experimental Design and Execution, in Advanced Analysis Solutions for Leading Experimental Techniques (2019, to appear).
  17. Zollanvari, A., and E. R. Dougherty, “Optimal Bayesian Classification with Autoregressive Data Dependency,” IEEE Transactions on Signal Processing, 67(12):3073-3086, 2019.
  18. Guang Zhao, Xiaoning Qian, Byung-Jun Yoon, Francis J. Alexander, and Edward R. Dougherty, “Model-based robust filtering and experimental design for stochastic differential equation systems,” IEEE Transactions on Signal Processing, vol. 68, 3849-3859, 2020.
  19. Guang Zhao, Edward R. Dougherty, Byung-Jun Yoon, Francis J. Alexander, and Xiaoning Qian, “Uncertainty-aware Active Learning for Optimal Bayesian Classifier,” International Conference on Learning Representations (ICLR), 2021 (submitted).
  20. Shahin Boluki, Xiaoning Qian, and Edward R. Dougherty, “Optimal Bayesian supervised domain adaptation for RNA sequencing data,” Bioinformatics, 2020 (submitted).
  21. Youngjoon Hong, Bongsuk Kwon, and Byung-Jun Yoon, “Optimal experimental design for uncertain systems based on coupled differential equations,” arXiv:2007.06117 (submitted to IEEE Signal Processing Letters).
  22. A. Petrosyan, H. Tran, and C. G. Webster. Reconstruction of jointly sparse vectors via manifold optimization. Applied Numerical Mathematics, 144:140–150, 2019.
  23. X. Xie, G. Zhang, and C. G. Webster. Non-intrusive inference reduced order model for fluids using deep multistep neural network. Mathematics, 7(8):1–15, 2019.
  24. N. Dexter, H. Tran, and C. G. Webster. Reconstructing high-dimensional Hilbert-valued functions via compressed sensing. IEEE Signal Processing, 2019. Accepted (arXiv:1905.05853).
  25. N. Dexter, H. Tran, and C. G. Webster. A mixed l1 regularization approach for sparse simultaneous approximation of parameterized PDEs. ESAIM: Mathematical Modelling and Numerical Analysis (M2AN), 53(6):2025–2045, 2019 (arXiv:1812.06174).
  26. A. Dereventsov and C. G. Webster. The Natural Greedy Algorithm for reduced bases in Banach spaces. Foundations of Computational Mathematics, 2019. Submitted (arXiv:1905.06448).
  27. J. Daws and C. G. Webster. A polynomial-based approach for architectural design and learning with deep neural networks. Submitted (arXiv:1905.10457), 2019.
  28. A. Dereventsov, A. Petrosyan, and C. G. Webster. Greedy shallow networks: A new approach for constructing and training neural networks. Submitted (arXiv:1905.06448), 2019.
  29. V. Reshniak and C. G. Webster. Robust learning with implicit residual networks. Submitted (arXiv:1905.10479), 2019.
  30. X. Xie, F. Bao, T. Maier, and C. G. Webster. Analytic continuation of noisy data using Adams-Bashforth ResNet. Submitted (arXiv:1905.10457).
  31. A. Petrosyan, A. Dereventsov, and C. G. Webster. Neural network integral representations with the ReLU activation function. Proceedings of Machine Learning Research, 2020. Accepted (arXiv:1910.02743).
  32. V. Reshniak, J. Trageser, and C. G. Webster. A nonlocal feature-driven exemplar-based approach for image inpainting. SIAM Journal on Imaging Sciences, 2020. Accepted (arXiv:1909.09301).
  33. X. Xie, C. G. Webster, and T. Iliescu. Closure Learning for Nonlinear Model Reduction Using Deep Residual Neural Network. Fluids, 5(1):39, 1–15, 2020.
  34. Y. Xu, A. Narayan, H. Tran, and C. G. Webster. Analysis of the ratio of l1 and l2 norms in compressed sensing. Applied and Computational Harmonic Analysis, 2020. Submitted (arXiv:2004.05873).
  35. A. Dereventsov, C. G. Webster, and J. Daws. Analysis of the ratio of l1 and l2 norms in compressed sensing. In Advances in Neural Information Processing Systems, Curran Associates, Inc., 2020. Submitted (arXiv:2006.10887).
  36. J. Daws, A. Petrosyan, H. Tran, and C. G. Webster. A weighted l1-minimization approach for wavelet reconstruction of signals and images. IEEE Transactions on Signal Processing, 2019. Submitted (arXiv:1909.07270).
  37. N. Dexter, H. Tran, and C. G. Webster. Reconstructing high-dimensional Hilbert-valued functions via compressed sensing. In 13th International Conference on Sampling Theory and Applications (SampTA), pages 1–5, Bordeaux, France, 2019.
  38. H. Tran and C. G. Webster. A class of null space conditions for sparse recovery via nonconvex, non-separable minimizations. Results in Applied Mathematics, 3:100011, 2019.
  39. U. Villa, N. Petra, and O. Ghattas. hIPPYlib: An Extensible Software Framework for Large-Scale Inverse Problems Governed by PDEs; Part I: Deterministic Inversion and Linearized Bayesian Inference. Submitted, 2019. http://arxiv.org/abs/1909.03948
  40. I. Ambartsumyan, W. Boukaram, T. Bui-Thanh, O. Ghattas, D. Keyes, G. Stadler, G. Turkiyyah, and S. Zampini. Hierarchical Matrix Approximations of Hessians Arising in Inverse Problems Governed by PDEs. Submitted, 2019.
  41. T. O’Leary-Roseberry, N. Alger, and O. Ghattas. Inexact Newton Methods for Stochastic Non-Convex Optimization with Applications to Neural Network Training. Submitted. https://arxiv.org/abs/1905.06738
  42. P. Chen and O. Ghattas. Sparse polynomial approximations for affine parametric saddle point problems. Submitted. https://arxiv.org/abs/1809.10251
  43. P. Chen and O. Ghattas. Sparse polynomial approximation for optimal control problems constrained by elliptic PDEs with lognormal random coefficients. Submitted. https://arxiv.org/abs/1903.05547
  44. P. Chen, K. Wu, J. Chen, T. O’Leary-Roseberry, and O. Ghattas. Projected Stein variational Newton: A fast and scalable Bayesian inference method in high dimensions. NeurIPS 2019. https://arxiv.org/abs/1901.08659
  45. Tong, X. T., Morzfeld, M., and Marzouk, Y. M. “MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure.” Submitted (2019). arXiv:1908.09429
  46. Spantini, A., Baptista, R., and Marzouk, Y. M. “Coupling techniques for nonlinear ensemble filtering.” Submitted (2019). arXiv:1907.00389
  47. Bigoni, D., Zahm, O., Spantini, A., and Marzouk, Y. M. “Greedy inference with layers of lazy maps.” Submitted (2019). arXiv:1906.00031
  48. Musolas, A., Massart, E., Hendrickx, J., Absil, P.-A., and Marzouk, Y. M. “Low-rank multi-parametric covariance estimation.” Submitted (2019).
  49. Wang, Z., Cui, T., Bardsley, J., and Marzouk, Y. M. “Scalable optimization-based sampling on function space.” Submitted (2019). arXiv:1903.00870

Posters

  1. “Block Copolymer Directed Self-Assembly: Optimal Design of Nanoscale Manufacturing Under Uncertainty.” Danial Faghihi, Max Gunzburger, Clayton G. Webster, and Frank Alexander. Institute for Computational Engineering and Sciences (ICES), The University of Texas at Austin; Department of Computational and Applied Mathematics, Oak Ridge National Laboratory; Computational Science Initiative Directorate, Brookhaven National Laboratory. Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019.
  2. “Optimal Control under Uncertainty and Multiscale Heterogeneous Materials Modeling in Additive Manufacturing.” Yuanxun Bao, George Biros, and John A. Turner. Institute for Computational Engineering & Sciences, The University of Texas at Austin; Computational Sciences and Engineering Division, Oak Ridge National Laboratory. Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019.
  3. “Learning Low-dimensional Models via Operator Lifting and Natural Greedy Algorithms.” Anton Dereventsov, Max Gunzburger, Clayton Webster, Karen Willcox, and Guannan Zhang. Department of Computational and Applied Mathematics, Oak Ridge National Laboratory; Institute for Computational Engineering & Sciences, The University of Texas at Austin. Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019.
  4. “Transport-Based Variational Bayesian Methods for Learning from Data.” Daniele Bigoni, Joshua Chen, Peng Chen, Omar Ghattas, Youssef Marzouk, Tom O’Leary-Roseberry, and Keyi Wu. Center for Computational Engineering, Massachusetts Institute of Technology; Institute for Computational Engineering & Sciences, The University of Texas at Austin. Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019.
© 2019 AEOLUS. All Rights Reserved.