TMB Documentation  v1.9.11
newton::NewtonOperator< Functor, Hessian_Type > Struct Template Reference

Generalized Newton solver similar to the TMB R function 'newton'.

#include <newton.hpp>

Public Member Functions

 NewtonOperator (Functor &F, vector< TMBad::ad_aug > start, newton_config cfg)
 Constructor.
 
- Public Member Functions inherited from TMBad::global::Operator< ninput, noutput >
void dependencies_updating (Args<> &args, Dependencies &dep) const
 Default implementation of OperatorPure::dependencies_updating()
 
void * operator_data ()
 Return operator specific dynamic information (optional)
 
OperatorPure * other_fuse (OperatorPure *self, OperatorPure *other)
 How to fuse this operator (self) with another (other)
 
void print (print_config cfg)
 Print this operator (optional)
 

Additional Inherited Members

- Static Public Attributes inherited from TMBad::global::DynamicOperator< -1, -1 >
static const bool dynamic
 
static const int max_fuse_depth
 
- Static Public Attributes inherited from TMBad::global::Operator< ninput, noutput >
static const bool add_forward_replay_copy
 Should this operator replay itself by invoking the copy constructor?
 
static const bool add_static_identifier
 Should this operator have a static identifier ?
 
static const bool allow_remap
 Is it safe to remap the inputs of this operator?
 
static const int dependent_variable
 Is output of this operator a dependent variable ?
 
static const bool dynamic
 Does this operator require dynamic allocation ?
 
static const bool elimination_protected
 Protect this operator from elimination by the tape optimizer ?
 
static const bool have_dependencies
 Have dependencies member been defined ?
 
static const bool have_eval
 Does this class have an eval member from which to define forward ?
 
static const bool have_forward_incr_reverse_decr
 Have forward_incr and reverse_incr members been defined ?
 
static const bool have_forward_mark_reverse_mark
 Have forward_mark and reverse_mark members been defined ?
 
static const bool have_forward_reverse
 Have forward and reverse members been defined ?
 
static const bool have_increment_decrement
 Have increment and decrement members been defined ?
 
static const bool have_input_size_output_size
 Have input_size and output_size members been defined ?
 
static const bool implicit_dependencies
 Does this operator have implicit dependencies?
 
static const int independent_variable
 Is output of this operator an independent variable ?
 
static const bool is_constant
 Is this a constant operator ?
 
static const bool is_linear
 Is this a linear operator ?
 
static const int max_fuse_depth
 How many times can this operator be doubled ?
 
static const int ninput
 Number of operator inputs.
 
static const int noutput
 Number of operator outputs.
 
static const bool smart_pointer
 Is this operator a 'smart pointer' (with reference counting) ?
 
static const bool updating
 May this operator update existing variables?
 

Detailed Description

template<class Functor, class Hessian_Type = jacobian_dense_t<>>
struct newton::NewtonOperator< Functor, Hessian_Type >

Generalized Newton solver similar to the TMB R function 'newton'.

This operator represents a Newton solver that can be put on the AD tape. Using the notation

\[ f(u,\theta) \]

for an objective function with 'inner' parameters \(u\) and 'outer' parameters \(\theta\), the operator is defined by

\[ \theta \rightarrow \hat{u}(\theta) := \text{argmin}_u f(u, \theta). \]

Applied optimizations

  1. If cfg.decompose=true (default), all sub-expressions of \(f(u,\theta)\) that do not depend on \(u\) are moved out of the solver, effectively corresponding to a re-parameterization of the outer parameters. For example, if the objective function calculates a determinant that depends only on \(\theta\), the value of that determinant becomes the new outer parameter rather than \(\theta\).
  2. A slightly overlapping, but different, optimization is applied if cfg.simplify=true (default). We call it a 'dead gradient' analysis because it detects which \(\theta_i\) do not affect the gradient of the objective with respect to \(u\). Such \(\theta_i\) cannot affect the solution \(\hat{u}\) either and can therefore be removed from the list of outer parameters, effectively reducing the input dimension of the Newton operator.
Template Parameters
Functor  Class of the objective function.
Hessian_Type  Class of the Hessian structure (sparse, dense or sparse_plus_lowrank).

Definition at line 766 of file newton.hpp.

Constructor & Destructor Documentation

§ NewtonOperator()

template<class Functor , class Hessian_Type = jacobian_dense_t<>>
newton::NewtonOperator< Functor, Hessian_Type >::NewtonOperator ( Functor &  F,
  vector< TMBad::ad_aug >  start,
  newton_config  cfg 
)
inline

Constructor.

Parameters
F  Objective function taking vector<TMBad::ad_aug> as input and TMBad::ad_aug as output.
start  Initial guess for the optimizer.
cfg  Configuration parameters - see newton_config.
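The constructor's shape (an objective functor, a start vector, and a configuration object) can be mimicked in plain C++. The sketch below is not the TMB implementation: newton_config here is a hypothetical two-field stand-in, the Objective functor is an invented separable example with a diagonal Hessian, and derivatives are hand-coded rather than AD-generated.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for newton_config with two typical knobs.
struct newton_config { int maxit = 100; double grad_tol = 1e-10; };

// Invented separable objective f(u, theta) = sum_i cosh(u_i) - theta_i * u_i,
// so the Hessian is diagonal and each component solves sinh(u_i) = theta_i.
struct Objective {
  std::vector<double> theta;
  double grad(const std::vector<double>& u, std::size_t i) const {
    return std::sinh(u[i]) - theta[i];
  }
  double hess(const std::vector<double>& u, std::size_t i) const {
    return std::cosh(u[i]);
  }
};

// Interface analogue of NewtonOperator(Functor& F, vector start, newton_config cfg).
template<class Functor>
struct NewtonSolver {
  Functor& F;
  std::vector<double> u;
  newton_config cfg;
  NewtonSolver(Functor& F, std::vector<double> start, newton_config cfg)
      : F(F), u(start), cfg(cfg) { solve(); }
  void solve() {
    for (int it = 0; it < cfg.maxit; ++it) {
      double gmax = 0.0;
      for (std::size_t i = 0; i < u.size(); ++i) {
        double g = F.grad(u, i);
        gmax = std::max(gmax, std::fabs(g));
        u[i] -= g / F.hess(u, i);  // diagonal Newton step per component
      }
      if (gmax < cfg.grad_tol) break;
    }
  }
};
```

In the real operator, F is an AD functor, the start vector holds TMBad::ad_aug values, and the Hessian structure is selected by the Hessian_Type template parameter rather than assumed diagonal.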

Definition at line 782 of file newton.hpp.


The documentation for this struct was generated from the following file:
newton.hpp
License: GPL v2