Publications

Peer-reviewed publications

2022
  1. Cao, L., Ghattas, O. and Oden, J.T., 2022. A Globally Convergent Modified Newton Method for the Direct Minimization of the Ohta--Kawasaki Energy with Application to the Directed Self-Assembly of Diblock Copolymers. SIAM Journal on Scientific Computing, 44(1), pp.B51-B79.
  2. Khodabakhshi, P. and Willcox, K.E., 2022. Non-intrusive data-driven model reduction for differential–algebraic equations derived from lifting transformations. Computer Methods in Applied Mechanics and Engineering, 389, p.114296.
  3. Geelen, R. and Willcox, K., 2022. Localized non-intrusive reduced-order modeling in the operator inference framework. Philosophical Transactions of the Royal Society A, accepted for publication.
  4. Qin, Y., Bao, Y., DeWitt, S., Radhakrishnan, R. and Biros, G., 2022. Dendrite-resolved, full-melt-pool phase-field simulations to reveal non-steady-state effects and to test an approximate model. Computational Materials Science, 207, p.111262.
  5. Maddouri, O., Qian, X., Alexander, F.J., Dougherty, E.R. and Yoon, B.J., 2022. Robust importance sampling for error estimation in the context of optimal Bayesian transfer learning. Patterns, p.100428.
  6. Fan, M., Yoon, B.J., Alexander, F.J., Dougherty, E.R. and Qian, X., 2022. Adaptive Group Testing with Mismatched Models. The 47th International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2022), arXiv preprint arXiv:2110.02265.
  7. Portone, T. and Moser, R.D., 2022. Bayesian inference of an uncertain generalized diffusion operator. SIAM/ASA Journal on Uncertainty Quantification, 10(1), pp.151-178.
  8. Dereventsov, A., Webster, C.G. and Daws, J., 2022. An adaptive stochastic gradient-free approach for high-dimensional blackbox optimization. In Proceedings of International Conference on Computational Intelligence (pp. 333-348). Springer, Singapore.
  9. Adcock, B., Brugiapaglia, S. and Webster, C.G., 2022. Sparse Polynomial Approximation of High-Dimensional Functions. SIAM Computational Science and Engineering book series.
  10. O'Leary-Roseberry, T., Villa, U., Chen, P. and Ghattas, O., 2022. Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs. Computer Methods in Applied Mechanics and Engineering, 388, p.114199.
  11. Wang, Y., Chen, P., Pilanci, M. and Li, W., 2022. Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization, submitted.
  12. Zahm, O., Cui, T., Law, K., Spantini, A. and Marzouk, Y., 2022. Certified dimension reduction in nonlinear Bayesian inverse problems. Mathematics of Computation. (in press)
  13. Bigoni, D., Marzouk, Y., Prieur, C. and Zahm, O., 2022. Nonlinear dimension reduction for surrogate modeling using gradient information. Information and Inference: a Journal of the IMA. (in press)
  14. Zech, J. and Marzouk, Y., 2022. Sparse approximation of triangular transports. Part I: the finite dimensional case. Constructive Approximation. (in press)
  15. Zech, J. and Marzouk, Y., 2022. Sparse approximation of triangular transports. Part II: the infinite dimensional case. Constructive Approximation. (in press)
  16. Spantini, A., Baptista, R. and Marzouk, Y., 2022. Coupling techniques for nonlinear ensemble filtering. SIAM Review. (in press)
  17. Musolas, A., Massart, E., Hendrickx, J.M., Absil, P.A. and Marzouk, Y., 2022. Low-rank multi-parametric covariance identification. BIT Numerical Mathematics, 62(1), pp.221-249.

2021
  1. Hormuth, D.A., Phillips, C.M., Wu, C., Lima, E.A., Lorenzo, G., Jha, P.K., Jarrett, A.M., Oden, J.T. and Yankeelov, T.E., 2021. Biologically-Based Mathematical Modeling of Tumor Vasculature and Angiogenesis via Time-Resolved Imaging Data. Cancers, 13(12), p.3008.
  2. Fritz, M., Jha, P.K., Köppl, T., Oden, J.T. and Wohlmuth, B., 2021. Analysis of a new multispecies tumor growth model coupling 3D phase-fields with a 1D vascular network. Nonlinear Analysis: Real World Applications, 61, p.103331.
  3. Fritz, M., Jha, P.K., Köppl, T., Oden, J.T., Wagner, A. and Wohlmuth, B., 2021. Modeling and simulation of vascular tumors embedded in evolving capillary networks. Computer Methods in Applied Mechanics and Engineering, 384, p.113975.
  4. Bukowski, R., Schulz, K., Gaither, K., Stephens, K.K., Semeraro, D., Drake, J., Smith, G., Cordola, C., Zariphopoulou, T., Hughes, T.J. and Zarins, C., 2021. Computational medicine, present and the future: obstetrics and gynecology perspective. American Journal of Obstetrics and Gynecology, 224(1), pp.16-34.
  5. Willcox, K.E., Ghattas, O. and Heimbach, P., 2021. The imperative of physics-based modeling and inverse theory in computational science. Nature Computational Science, 1(3), pp.166-168.
  6. McQuarrie, S.A., Huang, C. and Willcox, K.E., 2021. Data-driven reduced-order models via regularised Operator Inference for a single-injector combustion process. Journal of the Royal Society of New Zealand, 51(2), pp.194-211.
  7. Khodabakhshi, P., Willcox, K.E. and Gunzburger, M., 2021. A multifidelity method for a nonlocal diffusion model. Applied Mathematics Letters, 121, p.107361.
  8. Kapteyn, M.G., Pretorius, J.V. and Willcox, K.E., 2021. A probabilistic graphical model foundation for enabling predictive digital twins at scale. Nature Computational Science, 1(5), pp.337-347.
  9. Ghattas, O. and Willcox, K., 2021. Learning physics-based models from data: perspectives from inverse problems and model reduction. Acta Numerica, 30, pp.445-554.
  10. Ehre, M., Papaioannou, I., Willcox, K.E. and Straub, D., 2021. Conditional reliability analysis in high dimensions based on controlled mixture importance sampling and information reuse. Computer Methods in Applied Mechanics and Engineering, 381, p.113826.
  11. Tunç, B., Hormuth II, D.A., Biros, G. and Yankeelov, T.E., 2021. Modeling of glioma growth with mass effect by longitudinal magnetic resonance imaging. IEEE Transactions on Biomedical Engineering, 68(12), pp.3713-3724.
  12. Chen, C., Reiz, S., Yu, C.D., Bungartz, H.J. and Biros, G., 2021. Fast Approximation of the Gauss--Newton Hessian Matrix for the Multilayer Perceptron. SIAM Journal on Matrix Analysis and Applications, 42(1), pp.165-184.
  13. Brunn, M., Himthani, N., Biros, G., Mehl, M. and Mang, A., 2021. Fast GPU 3D diffeomorphic image registration. Journal of Parallel and Distributed Computing, 149, pp.149-162.
  14. Brunn, M., Himthani, N., Biros, G., Mehl, M. and Mang, A., 2021. CLAIRE: Constrained Large Deformation Diffeomorphic Image Registration on Parallel Computing Architectures. Journal of Open Source Software, 6(61), p.3038.
  15. Burkovska, O. and Gunzburger, M., 2021. On a nonlocal Cahn–Hilliard model permitting sharp interfaces. Mathematical Models and Methods in Applied Sciences, 31(09), pp.1749-1786.
  16. Burkovska, O., Glusa, C. and D'Elia, M., 2021. An optimization-based approach to parameter learning for fractional type nonlocal models. Computers & Mathematics with Applications.
  17. Zhao, G., Dougherty, E., Yoon, B.J., Alexander, F. and Qian, X., 2021, January. Uncertainty-aware active learning for optimal Bayesian classifier. In International Conference on Learning Representations (ICLR 2021).
  18. Zhao, G., Dougherty, E., Yoon, B.J., Alexander, F. and Qian, X., 2021. Efficient active learning for Gaussian process classification by error reduction. Advances in Neural Information Processing Systems, 34.
  19. Zhao, G., Dougherty, E., Yoon, B.J., Alexander, F.J. and Qian, X., 2021, March. Bayesian active learning by soft mean objective cost of uncertainty. In International Conference on Artificial Intelligence and Statistics (pp. 3970-3978). PMLR.
  20. Yoon, B.J., Qian, X. and Dougherty, E.R., 2021. Quantifying the multi-objective cost of uncertainty. IEEE Access, 9, pp.80351-80359.
  21. Boluki, S., Qian, X. and Dougherty, E.R., 2021. Optimal Bayesian supervised domain adaptation for RNA sequencing data. Bioinformatics, 37(19), pp.3212-3219.
  22. Woo, H.M., Hong, Y., Kwon, B. and Yoon, B.J., 2021. Accelerating optimal experimental design for robust synchronization of uncertain Kuramoto oscillator model using machine learning. IEEE Transactions on Signal Processing, 69, pp.6473-6487.
  23. McBane, S. and Choi, Y., 2021. Component-wise reduced order model lattice-type structure design. Computer Methods in Applied Mechanics and Engineering, 381, p.113813.
  24. Hong, Y., Kwon, B. and Yoon, B.J., 2021. Optimal experimental design for uncertain systems based on coupled differential equations. IEEE Access, 9, pp.53804-53810.
  25. Xu, Y., Narayan, A., Tran, H. and Webster, C.G., 2021. Analysis of the ratio of ℓ1 and ℓ2 norms in compressed sensing. Applied and Computational Harmonic Analysis, 55, pp.486-511.
  26. Xie, X., Bao, F., Maier, T. and Webster, C., 2021. Analytic continuation of noisy data using Adams Bashforth residual neural network. Discrete & Continuous Dynamical Systems-S.
  27. Reshniak, V. and Webster, C.G., 2021. Robust learning with implicit residual networks. Machine Learning and Knowledge Extraction, 3(1), pp.34-55.
  28. Dexter, N., Tran, H. and Webster, C.G., 2021. On the strong convergence of forward-backward splitting in reconstructing jointly sparse signals. Set-Valued and Variational Analysis, pp.1-15.
  29. Villa, U., Petra, N. and Ghattas, O., 2021. hIPPYlib: an extensible software framework for large-scale inverse problems governed by PDEs: part I: deterministic inversion and linearized Bayesian inference. ACM Transactions on Mathematical Software (TOMS), 47(2), pp.1-34.
  30. Chen, P., Wu, K. and Ghattas, O., 2021. Bayesian inference of heterogeneous epidemic models: Application to COVID-19 spread accounting for long-term care facilities. Computer Methods in Applied Mechanics and Engineering, 385, p.114020.
  31. Chen, P., Haberman, M.R. and Ghattas, O., 2021. Optimal design of acoustic metamaterial cloaks under uncertainty. Journal of Computational Physics, 431, p.110114.
  32. Chen, P. and Ghattas, O., 2021. Stein variational reduced basis Bayesian inversion. SIAM Journal on Scientific Computing, 43(2), pp.A1163-A1193.
  33. Chen, P. and Ghattas, O., 2021. Taylor approximation for chance constrained optimization problems governed by partial differential equations with high-dimensional random parameters. SIAM/ASA Journal on Uncertainty Quantification, 9(4), pp.1381-1410.
  34. Alghamdi, A., Hesse, M.A., Chen, J., Villa, U. and Ghattas, O., 2021. Bayesian Poroelastic Aquifer Characterization From InSAR Surface Deformation Data. Part II: Quantifying the Uncertainty. Water Resources Research, 57(11), p.e2021WR029775.
  35. Alghamdi, A., Hesse, M.A., Chen, J. and Ghattas, O., 2020. Bayesian poroelastic aquifer characterization from InSAR surface deformation data. Part I: Maximum a posteriori estimate. Water Resources Research, 56(10), p.e2020WR027391.
  36. Aretz, N., Chen, P. and Veroy, K., 2021. Sensor selection for hyper‐parameterized linear Bayesian inverse problems. PAMM, 20(S1), p.e202000357.
  37. Aretz-Nellesen, N., Chen, P., Grepl, M.A. and Veroy, K., 2021. A sequential sensor selection strategy for hyper-parameterized linear Bayesian inverse problems. In Numerical Mathematics and Advanced Applications ENUMATH 2019 (pp. 489-497). Springer, Cham.
  38. Uribe, F., Papaioannou, I., Marzouk, Y.M. and Straub, D., 2021. Cross-entropy-based importance sampling with failure-informed dimension reduction for rare event simulation. SIAM/ASA Journal on Uncertainty Quantification, 9(2), pp.818-847.
  39. Jagalur-Mohan, J. and Marzouk, Y., 2021. Batch greedy maximization of non-submodular functions: Guarantees and applications to experimental design. Journal of Machine Learning Research, 22(252), pp.1-62.

2020
  1. Khristenko, U., Constantinescu, A., Tallec, P.L., Oden, J.T. and Wohlmuth, B., 2020. A statistical framework for generating microstructures of two-phase random materials: application to fatigue analysis. Multiscale Modeling & Simulation, 18(1), pp.21-43.
  2. Jha, P.K., Cao, L. and Oden, J.T., 2020. Bayesian-based predictions of COVID-19 evolution in Texas using multispecies mixture-theoretic continuum models. Computational Mechanics, 66(5), pp.1055-1068.
  3. Faghihi, D., Feng, X., Lima, E.A., Oden, J.T. and Yankeelov, T.E., 2020. A coupled mass transport and deformation theory of multi-constituent tumor growth. Journal of the Mechanics and Physics of Solids, 139, p.103936.
  4. Benner, P., Goyal, P., Kramer, B., Peherstorfer, B. and Willcox, K., 2020. Operator inference for non-intrusive model reduction of systems with non-polynomial nonlinear terms. Computer Methods in Applied Mechanics and Engineering, 372, p.113433.
  5. Swischuk, R., Kramer, B., Huang, C. and Willcox, K., 2020. Learning physics-based reduced-order models for a single-injector combustion process. AIAA Journal, 58(6), pp.2658-2672.
  6. Qian, E., Kramer, B., Peherstorfer, B. and Willcox, K., 2020. Lift & Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems. Physica D: Nonlinear Phenomena, 406, p.132401.
  7. Scheufele, K., Subramanian, S. and Biros, G., 2020. Fully automatic calibration of tumor-growth models using a single mpMRI scan. IEEE transactions on medical imaging, 40(1), pp.193-204.
  8. Subramanian, S., Scheufele, K., Himthani, N. and Biros, G., 2020, October. Multiatlas calibration of biophysical brain tumor growth models with mass effect. In International Conference on Medical Image Computing and Computer-Assisted Intervention (pp. 551-560). Springer, Cham.
  9. Brunn, M., Himthani, N., Biros, G., Mehl, M. and Mang, A., 2020, November. Multi-node multi-GPU diffeomorphic image registration for large-scale imaging problems. In SC20: International Conference for High Performance Computing, Networking, Storage and Analysis (pp. 1-17). IEEE.
  10. Burkovska, O. and Gunzburger, M., 2020. Affine approximation of parametrized kernels and model order reduction for nonlocal and fractional Laplace models. SIAM Journal on Numerical Analysis, 58(3), pp.1469-1494.
  11. McKerns, M., Alexander, F.J., Hickmann, K.S., Sullivan, T.J. and Vaughan, D.E., 2020. Optimal Bounds on Nonlinear Partial Differential Equations in Model Certification, Validation, and Experiment Design. In Handbook on Big Data and Machine Learning in the Physical Sciences: Volume 2. Advanced Analysis Solutions for Leading Experimental Techniques (pp. 271-306).
  12. Reyes, K.G. and Alexander, F.J., 2020. Autonomous Experimental Design and Execution. In Handbook on Big Data and Machine Learning in the Physical Sciences: Volume 2. Advanced Analysis Solutions for Leading Experimental Techniques (pp. 241-270).
  13. Zhao, G., Qian, X., Yoon, B.J., Alexander, F.J. and Dougherty, E.R., 2020. Model-based robust filtering and experimental design for stochastic differential equation systems. IEEE Transactions on Signal Processing, 68, pp.3849-3859.
  14. Xie, X., Webster, C. and Iliescu, T., 2020. Closure learning for nonlinear model reduction using deep residual neural network. Fluids, 5(1), p.39.
  15. Reshniak, V., Trageser, J. and Webster, C.G., 2020. A nonlocal feature-driven exemplar-based approach for image inpainting. SIAM Journal on Imaging Sciences, 13(4), pp.2140-2168.
  16. Petrosyan, A., Dereventsov, A. and Webster, C.G., 2020, August. Neural network integral representations with the ReLU activation function. In Mathematical and Scientific Machine Learning (pp. 128-143). PMLR.
  17. Chen, P. and Ghattas, O., 2020. Projected Stein variational gradient descent. Advances in Neural Information Processing Systems, 33, pp.1947-1958.
  18. Ambartsumyan, I., Boukaram, W., Bui-Thanh, T., Ghattas, O., Keyes, D., Stadler, G., Turkiyyah, G. and Zampini, S., 2020. Hierarchical matrix approximations of Hessians arising in inverse problems governed by PDEs. SIAM Journal on Scientific Computing, 42(5), pp.A3397-A3426.
  19. Alger, N., Chen, P. and Ghattas, O., 2020. Tensor train construction from tensor actions, with application to compression of large high order derivative tensors. SIAM Journal on Scientific Computing, 42(5), pp.A3516-A3539.
  20. Brennan, M., Bigoni, D., Zahm, O., Spantini, A. and Marzouk, Y., 2020. Greedy inference with structure-exploiting lazy maps. Advances in Neural Information Processing Systems, 33, pp.8330-8342.
  21. Tong, X.T., Morzfeld, M. and Marzouk, Y.M., 2020. MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure. SIAM Journal on Scientific Computing, 42(3), pp.A1765-A1788.
  22. Bardsley, J.M., Cui, T., Marzouk, Y.M. and Wang, Z., 2020. Scalable optimization-based sampling on function space. SIAM Journal on Scientific Computing, 42(2), pp.A1317-A1347.

2019
  1. Scarabosio, L., Wohlmuth, B., Oden, J.T. and Faghihi, D., 2019. Goal-oriented adaptive modeling of random heterogeneous media and model-based multilevel Monte Carlo methods. Computers & Mathematics with Applications, 78(8), pp.2700-2718.
  2. Fritz, M., Lima, E.A., Nikolić, V., Oden, J.T. and Wohlmuth, B., 2019. Local and nonlocal phase-field models of tumor growth and invasion due to ECM degradation. Mathematical Models and Methods in Applied Sciences, 29(13), pp.2433-2468.
  3. Gholami, A., Keutzer, K. and Biros, G., 2019, August. ANODE: unconditionally accurate memory-efficient gradients for neural ODEs. In Proceedings of the 28th International Joint Conference on Artificial Intelligence (pp. 730-736).
  4. Kabacaoğlu, G. and Biros, G., 2019. Machine learning acceleration of simulations of Stokesian suspensions. Physical Review E, 99(6), p.063313.
  5. Yu, C.D., Reiz, S. and Biros, G., 2019, October. Distributed O(N) linear solver for dense symmetric hierarchical semi-separable matrices. In 2019 IEEE 13th International Symposium on Embedded Multicore/Many-core Systems-on-Chip (MCSoC) (pp. 1-8). IEEE.
  6. Zhang, T., Yao, Z., Gholami, A., Gonzalez, J.E., Keutzer, K., Mahoney, M.W. and Biros, G., 2019. ANODEV2: A coupled neural ODE framework. Advances in Neural Information Processing Systems, 32.
  7. Zollanvari, A. and Dougherty, E.R., 2019. Optimal Bayesian classification with vector autoregressive data dependency. IEEE Transactions on Signal Processing, 67(12), pp.3073-3086.
  8. Petrosyan, A., Tran, H. and Webster, C., 2019. Reconstruction of jointly sparse vectors via manifold optimization. Applied Numerical Mathematics, 144, pp.140-150.
  9. Xie, X., Zhang, G. and Webster, C.G., 2019. Non-intrusive inference reduced order model for fluids using deep multistep neural network. Mathematics, 7(8), p.757.
  10. Dexter, N., Tran, H. and Webster, C., 2019, July. Reconstructing high-dimensional Hilbert-valued functions via compressed sensing. In 2019 13th International conference on Sampling Theory and Applications (SampTA) (pp. 1-5). IEEE.
  11. Dexter, N., Tran, H. and Webster, C., 2019. A mixed ℓ1 regularization approach for sparse simultaneous approximation of parameterized PDEs. ESAIM: Mathematical Modelling and Numerical Analysis, 53(6), pp.2025-2045.
  12. Tran, H. and Webster, C., 2019. A class of null space conditions for sparse recovery via nonconvex, non-separable minimizations. Results in Applied Mathematics, 3, p.100011.
  13. Chen, P., Wu, K., Chen, J., O'Leary-Roseberry, T. and Ghattas, O., 2019. Projected Stein variational Newton: A fast and scalable Bayesian inference method in high dimensions. Advances in Neural Information Processing Systems, 32.
  14. Alger, N., Rao, V., Myers, A., Bui-Thanh, T. and Ghattas, O., 2019. Scalable matrix-free adaptive product-convolution approximation for locally translation-invariant operators. SIAM Journal on Scientific Computing, 41(4), pp.A2296-A2328.

Pre-prints

  1. Wu, K., O'Leary-Roseberry, T., Chen, P. and Ghattas, O., 2022. Derivative-informed projected neural network for large-scale Bayesian optimal experimental design. arXiv preprint arXiv:2201.07925.
  2. Lorenzo, G., Hormuth II, D.A., Jarrett, A.M., Lima, E.A., Subramanian, S., Biros, G., Oden, J.T., Hughes, T.J. and Yankeelov, T.E., 2021. Quantitative in vivo imaging to enable tumor forecasting and treatment optimization. arXiv preprint arXiv:2102.12602.
  3. Woo, H.M., Qian, X., Tan, L., Jha, S., Alexander, F.J., Dougherty, E.R. and Yoon, B.J., 2021. Optimal Decision Making in High-Throughput Virtual Screening Pipelines. arXiv preprint arXiv:2109.11683.
  4. Chen, P. and Royset, J., 2021. Performance Bounds in PDE-Constrained Optimization under Uncertainty. arXiv preprint arXiv:2110.10269.
  5. Wang, Y., Chen, P. and Li, W., 2021. Projected Wasserstein gradient descent for high-dimensional Bayesian inference. arXiv preprint arXiv:2102.06350.
  6. Kim, K.T., Villa, U., Parno, M., Marzouk, Y., Ghattas, O. and Petra, N., 2021. hIPPYlib-MUQ: A Bayesian Inference Software Framework for Integration of Data with Complex Predictive Models under Uncertainty. arXiv preprint arXiv:2112.00713.
  7. Wu, K., Chen, P. and Ghattas, O., 2021. An efficient method for goal-oriented linear Bayesian optimal experimental design: Application to optimal sensor placement. arXiv preprint arXiv:2102.06627.
  8. Wu, K., Chen, P. and Ghattas, O., 2020. A fast and scalable computational framework for large-scale and high-dimensional Bayesian optimal experimental design. arXiv preprint arXiv:2010.15196.
  9. Kramer, B. and Willcox, K.E., 2019. Balanced truncation model reduction for lifted nonlinear systems. arXiv preprint arXiv:1907.12084.
  10. Dereventsov, A., Petrosyan, A. and Webster, C., 2019. Greedy Shallow Networks: An Approach for Constructing and Training Neural Networks. arXiv preprint arXiv:1905.10409.
  11. Dereventsov, A. and Webster, C., 2019. The Natural Greedy Algorithm for reduced bases in Banach spaces. arXiv preprint arXiv:1905.06448.
  12. Daws Jr, J. and Webster, C.G., 2019. A Polynomial-Based Approach for Architectural Design and Learning with Deep Neural Networks. arXiv preprint arXiv:1905.10457.
  13. O'Leary-Roseberry, T., Alger, N. and Ghattas, O., 2019. Inexact Newton methods for stochastic nonconvex optimization with applications to neural network training. arXiv preprint arXiv:1905.06738.
  14. Chen, P. and Ghattas, O., 2019. Sparse polynomial approximation for optimal control problems constrained by elliptic PDEs with lognormal random coefficients. arXiv preprint arXiv:1903.05547.
  15. Daws Jr, J., Petrosyan, A., Tran, H. and Webster, C.G., 2019. A Weighted ℓ1-Minimization Approach For Wavelet Reconstruction of Signals and Images. arXiv preprint arXiv:1909.07270.
  16. Chen, P. and Ghattas, O., 2018. Sparse polynomial approximations for affine parametric saddle point problems. arXiv preprint arXiv:1809.10251.

Posters

  1. Block Copolymer Directed Self-Assembly: Optimal Design of Nanoscale Manufacturing Under Uncertainty. Danial Faghihi, Max Gunzburger, Clayton G. Webster, Frank Alexander. Institute for Computational Engineering and Sciences (ICES), The University of Texas at Austin; Department of Computational and Applied Mathematics, Oak Ridge National Laboratory; Computational Science Initiative Directorate, Brookhaven National Laboratory | Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019
  2. Optimal Control under Uncertainty and Multiscale Heterogeneous Materials Modeling in Additive Manufacturing. Yuanxun Bao, George Biros, John A. Turner. Institute for Computational Engineering & Sciences, The University of Texas at Austin; Computational Sciences and Engineering Division, Oak Ridge National Laboratory | Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019
  3. Learning Low-dimensional Models via Operator Lifting and Natural Greedy Algorithms. Anton Dereventsov, Max Gunzburger, Clayton Webster, Karen Willcox, Guannan Zhang. Department of Computational and Applied Mathematics, Oak Ridge National Laboratory; Institute for Computational Engineering & Sciences, The University of Texas at Austin | Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019
  4. Transport-Based Variational Bayesian Methods for Learning from Data. Daniele Bigoni, Joshua Chen, Peng Chen, Omar Ghattas, Youssef Marzouk, Tom O'Leary-Roseberry, Keyi Wu. Center for Computational Engineering, Massachusetts Institute of Technology; Institute for Computational Engineering & Sciences, The University of Texas at Austin | Presented at the Department of Energy ASCR Applied Math PI Meeting, January 2019