Janus provides a powerful symbolic computing layer built on top of CasADi, abstracted to feel like native C++ with Eigen integration. This allows you to compute derivatives, generate code, and optimize systems using the same code you write for simulation. Symbolic mode works by building a computational graph instead of executing immediately, enabling automatic differentiation, sensitivity analysis, and matrix-free second-order products.
Quick Start
auto x = janus::sym("x");
auto f = x * x;
janus::Function fn({x}, {f, janus::jacobian(f, x)});
auto res = fn(3.0);
std::cout << "f(3) = " << res[0] << ", f'(3) = " << res[1] << std::endl;
Core API
- janus::SymbolicScalar: Alias for casadi::MX. Represents a symbolic value or expression.
- janus::SymbolicMatrix: Alias for Eigen::Matrix<casadi::MX, -1, -1>. Allows you to use familiar Eigen syntax (block operations, coeff access) on symbolic variables.
- janus::sym(name): Create a scalar symbolic variable.
- janus::sym(name, n): Create a column vector symbolic variable (n x 1).
- janus::sym(name, r, c): Create a matrix symbolic variable (r x c).
- janus::Function({inputs}, {outputs}): Compile symbolic expressions into a callable function.
- janus::jacobian({outputs}, {inputs}): Compute the Jacobian of outputs with respect to inputs.
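Because janus::SymbolicMatrix is an Eigen matrix of symbolic entries, the usual Eigen operations carry over directly. A minimal sketch (the umbrella header name is an assumption):

```cpp
#include <janus/Janus.hpp>  // assumed umbrella header for the Janus public API

// Eigen-style access on a symbolic matrix: block views and coefficient
// access both work, and coefficients are SymbolicScalar values.
janus::SymbolicMatrix A = janus::sym("A", 4, 4);
auto block = A.block(0, 0, 2, 2);   // top-left 2x2 symbolic block
auto entry = A(1, 2);               // single symbolic coefficient
```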
Usage Patterns
Creating Variables
Use the janus::sym helper to create symbolic variables cleanly.
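A sketch of the three overloads listed under Core API:

```cpp
// Scalar, vector, and matrix symbolic variables via janus::sym.
auto x = janus::sym("x");          // scalar
auto v = janus::sym("v", 3);       // 3x1 column vector
auto A = janus::sym("A", 2, 3);    // 2x3 matrix
```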
Building Expressions
You can use standard arithmetic operators (+, -, *, /) and Janus math functions. These build a computational graph instead of executing immediately.
auto A = janus::sym("A", 3, 3);
auto v = janus::sym("v", 3);
auto z = A * v;                     // symbolic matrix-vector product
auto w = janus::sin(z(0)) +
         janus::pow(z(1), janus::SymbolicScalar(2.0));
Defining Functions (janus::Function)
To evaluate expressions numerically, you must wrap them in a janus::Function. This wrapper handles type conversion between Eigen and CasADi.
auto t = janus::sym("t");
auto p = janus::sym("p", 3);
janus::Function f({t, p}, {p * t}); // scale the vector by t
auto result = f(1.5, Eigen::Vector3d::Zero());
std::cout << result[0] << std::endl;
Features:
- Automatic Naming: You don't need to provide a string name; one is generated automatically.
- Eigen Integration: Inputs and outputs are converted to/from Eigen types seamlessly.
Automatic Differentiation (janus::jacobian)
Compute derivatives effortlessly. Janus handles the tedious task of concatenating variables for CasADi.
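A minimal sketch of the pattern, using the janus::sym and janus::Function helpers from above:

```cpp
// Differentiate a scalar expression with respect to a vector variable,
// then compile the Jacobian into a callable function.
auto x = janus::sym("x", 2);
auto f = janus::sin(x(0)) * x(1);
auto J = janus::jacobian(f, x);     // 1x2 symbolic Jacobian

janus::Function J_fun({x}, {J});
auto Jv = J_fun(Eigen::Vector2d(0.5, 2.0));
```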
Advanced Usage
Sensitivity Regime Switching
For compiled janus::Function objects, Janus can choose between forward and adjoint Jacobian construction automatically based on parameter count, output count, and optional trajectory hints.
auto J_fun = janus::sensitivity_jacobian(
    f,
    /*output_idx=*/0,
    /*input_idx=*/0,
    /*horizon_length=*/400,
    /*stiff=*/true
);
auto J = J_fun.eval(x_val);
- janus::select_sensitivity_regime(parameter_count, output_count, horizon_length = 1, stiff = false, opts = {}): Recommends a sensitivity regime (a SensitivityRecommendation) from parameter/output counts.
- janus::sensitivity_jacobian(fn, output_idx = 0, input_idx = 0, horizon_length = 1, stiff = false, opts = {}): Builds a Jacobian function for one output/input block using the recommended regime.
Call integrator_options() on the returned SensitivityRecommendation when you want the same recommendation expressed as CasADi/SUNDIALS options (nfwd or nadj, plus checkpoint settings for long-horizon adjoints).
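For example, the recommendation can be queried directly; the parameter and output counts below are illustrative:

```cpp
// Ask Janus which regime it would pick for a 3-parameter system with
// 400 outputs, then fetch the equivalent integrator options.
auto rec = janus::select_sensitivity_regime(/*parameter_count=*/3,
                                            /*output_count=*/400);
auto integ_opts = rec.integrator_options();  // nfwd or nadj, plus checkpointing
```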
Matrix-Free Second-Order Products
For large optimization problems, you often want H * v without ever forming the dense Hessian. Janus exposes matrix-free Hessian-vector products for both plain scalar expressions and Lagrangians:
auto x = janus::sym("x", 3);        // decision variables
auto v = janus::sym("v", 3);        // direction
auto lam = janus::sym("lam", 2);    // constraint multipliers

casadi::MX x0 = x(0);
casadi::MX x1 = x(1);
casadi::MX x2 = x(2);
auto objective = x0 * x0 + x1 * x2 +
                 janus::sin(x2);
auto constraints = casadi::MX::vertcat({x0 + x1, x1 * x2});

auto hvp = janus::hessian_vector_product(objective, x, v);
auto lag_hvp = janus::lagrangian_hessian_vector_product(
    objective, constraints, x, lam, v);
- janus::hessian_vector_product(expr, vars, direction): Hessian-vector product for a scalar expression without forming the dense Hessian.
- janus::lagrangian_hessian_vector_product(objective, constraints, vars, multipliers, direction): Hessian-vector product of a Lagrangian, i.e. a second-order adjoint action.
For compiled janus::Function objects, the wrappers return another janus::Function:
auto hv = hvp_fun.eval(x_val, v_val);  // hvp_fun: the janus::Function returned by the wrapper
The Lagrangian variant appends the multiplier block first and the direction block last:
auto hv = lag_hvp_fun.eval(x_val, lam_val, v_val);
These products use CasADi's forward-over-reverse AD internally, so the dense Hessian is never constructed as an intermediate.