
Open Source Component-Based Modeling with ModelingToolkit

By: Christopher Rackauckas

Re-posted from: https://www.stochasticlifestyle.com/open-source-component-based-modeling-with-modelingtoolkit/

Component-based modeling systems such as Simulink and Dymola allow scientific models to be built compositionally. For example, Bob can build a model of an engine, and Alice can build a model of a drive shaft, and you can then connect the two models and have a model of a car. These kinds of tools are used throughout industrial modeling and simulation to enable "separation of concerns", letting experts engineer their own domain and compose the final digital twins from reusable scientific modules. But what about open source? In this talk we will introduce ModelingToolkit, an open source component-based modeling framework that allows for composing pre-built models and scales to large high-fidelity digital twins.
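The engine/drive-shaft composition above can be sketched in a few lines of plain Python. This is a deliberately simplified causal sketch of the composition idea only, with made-up component laws; it is not ModelingToolkit's API, which composes acausal equation systems rather than chaining functions:

```python
# Toy sketch of component composition -- not ModelingToolkit's API.
# Each "component" exposes an output that the next component consumes,
# so Bob's engine and Alice's drive shaft can be built independently.

def engine(throttle):
    """Bob's component: throttle (0-1) -> torque in N*m (made-up law)."""
    return 200.0 * throttle

def drive_shaft(torque, gear_ratio=3.5):
    """Alice's component: input torque -> wheel torque (made-up ratio)."""
    return torque * gear_ratio

def car(throttle):
    """Composed model: wire the engine's output into the drive shaft."""
    return drive_shaft(engine(throttle))

wheel_torque = car(0.5)  # 200*0.5 = 100 N*m in, 350.0 N*m at the wheel
```

The point of the real tooling is that neither author needs to know the other's internals: the composed model is defined entirely by which outputs connect to which inputs.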

PyData is an educational program of NumFOCUS, a 501(c)3 non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.

PyData conferences aim to be accessible and community-driven, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.

The post Open Source Component-Based Modeling with ModelingToolkit appeared first on Stochastic Lifestyle.

The Numerical Analysis of Differentiable Simulation: Automatic Differentiation Can Be Incorrect

By: Christopher Rackauckas

Re-posted from: https://www.stochasticlifestyle.com/the-numerical-analysis-of-differentiable-simulation-automatic-differentiation-can-be-incorrect/

ISCL Seminar Series

The Numerical Analysis of Differentiable Simulation: How Automatic Differentiation of Physics Can Give Incorrect Derivatives

Scientific machine learning (SciML) relies heavily on automatic differentiation (AD): the construction of gradients through mechanistic models with embedded machine learning, for the purpose of gradient-based optimization. While differentiable programming approaches pitch the idea of "simply put the simulator into a loss function and use AD", in practice there are many more subtle details to consider. In this talk we will dive into the numerical analysis of differentiable simulation and ask: how numerically stable and robust is AD? We will use examples from the Python-based JAX (diffrax) and PyTorch (torchdiffeq) libraries to demonstrate how canonical formulations of AD and adjoint methods can give inaccurate gradients in the context of ODEs and PDEs. We demonstrate cases where the methodologies are "mathematically correct", yet due to the intricacies of numerical error propagation their approaches can give 60% or greater error even in simple cases like linear ODEs. We'll then describe some of the non-standard modifications to AD which are done in the Julia SciML libraries to overcome these numerical instabilities and achieve accurate results, crucially also describing the engineering trade-offs which must be made in the process. The audience should leave with a greater appreciation of the numerical challenges which still need to be addressed in the field of AD for SciML.
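The JAX and PyTorch case studies from the talk are not reproduced here, but the basic tension can be sketched in plain Python with a toy forward-Euler solver (an illustration of discretize-then-differentiate, not any library's actual adjoint implementation). AD differentiates the discrete program exactly, yet the gradient still inherits the solver's discretization error:

```python
import math

# Toy linear ODE: u' = p*u, u(0) = 1, so u(T) = exp(p*T) and the true
# sensitivity is d u(T)/dp = T*exp(p*T).
p, T, N = 1.0, 1.0, 100
h = T / N

# Forward Euler gives the discrete solution u_N = (1 + p*h)^N.
# Differentiating this *discrete* program exactly (what AD does) gives
# d u_N / dp = N*h*(1 + p*h)^(N-1).
u_N = (1 + p * h) ** N
ad_grad = N * h * (1 + p * h) ** (N - 1)
true_grad = T * math.exp(p * T)

# AD is exact for the solver-as-a-program, but the solver itself carries
# O(h) discretization error, so the gradient is off by a similar amount.
rel_err = abs(ad_grad - true_grad) / abs(true_grad)  # roughly 1.5% here
```

For adaptive solvers and stiff problems the interaction between step-size control, adjoint passes, and error propagation is far subtler than this fixed-step toy, which is precisely the territory the talk explores.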

The post The Numerical Analysis of Differentiable Simulation: Automatic Differentiation Can Be Incorrect appeared first on Stochastic Lifestyle.

Symbolic-Numerics: how compiler smarts can help improve the performance of numerical methods (nonlinear solvers in Julia)

By: Christopher Rackauckas

Re-posted from: http://www.stochasticlifestyle.com/symbolic-numerics-how-compiler-smarts-can-help-improve-the-performance-of-numerical-methods-nonlinear-solvers-in-julia/

Many problems can be reduced to solving f(x)=0, maybe even more than you think! Solving a stiff differential equation? Finding out where the ball hits the ground? Solving an inverse problem to find the parameters that fit a model? In this talk we'll showcase how SciML's NonlinearSolve.jl is a general system for solving nonlinear equations and demonstrate its ability to efficiently handle these kinds of problems with high stability and performance. We will focus on how compilers are being integrated into the numerical stack so that many things which were previously manual, such as defining sparsity patterns, Jacobians, and adjoints, are all automated out of the box, allowing it to greatly outperform purely numerical codes like SciPy or NLsolve.jl.
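The f(x)=0 framing, including the "where does the ball hit the ground?" example, can be illustrated with a bare-bones Newton iteration in plain Python (a toy sketch of the idea; NonlinearSolve.jl supplies robust, high-performance versions of this with Jacobians derived automatically):

```python
def newton(f, dfdx, x0, tol=1e-12, maxiter=50):
    """Minimal scalar Newton iteration for f(x) = 0."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton did not converge")

# "Where does the ball hit the ground?" becomes a root-finding problem:
# height h(t) = h0 + v0*t - g*t^2/2 = 0, with made-up initial conditions.
g, h0, v0 = 9.81, 10.0, 2.0
t_impact = newton(lambda t: h0 + v0 * t - 0.5 * g * t**2,  # f(t)
                  lambda t: v0 - g * t,                     # f'(t)
                  x0=2.0)
```

Hand-coding the derivative, as above, is exactly the kind of manual work the talk describes compilers automating away.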

PyData Global 2023

The post Symbolic-Numerics: how compiler smarts can help improve the performance of numerical methods (nonlinear solvers in Julia) appeared first on Stochastic Lifestyle.