Wrapper functions for the API of ReverseDiff.jl at http://www.juliadiff.org/ReverseDiff.jl/api/. These functions help you calculate the gradient, jacobian and hessian of your functions using reverse mode automatic differentiation. For more details, see http://www.juliadiff.org/ReverseDiff.jl/api/.
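A minimal sketch of a basic call (the function and input values here are made up for illustration, and assume the package and its Julia backend are already set up):

  # gradient of a scalar-valued function at a vector input
  f <- function(x) sum(x^2)
  reverse_grad(f, c(1, 2, 3))   # expected: c(2, 4, 6), since the gradient is 2 * x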

reverse_grad(f_or_tape, input, cfg = NULL, diffresult = NULL,
  debug = TRUE)

reverse_jacobian(f_or_tape, input, cfg = NULL, diffresult = NULL,
  debug = TRUE)

reverse_hessian(f_or_tape, input, cfg = NULL, diffresult = NULL,
  debug = TRUE)

reverse_grad_config(input, diffresult = NULL)

reverse_jacobian_config(input, diffresult = NULL)

reverse_hessian_config(input, diffresult = NULL)

reverse_grad_tape(f, input, cfg = NULL)

reverse_jacobian_tape(f, input, cfg = NULL)

reverse_hessian_tape(f, input, cfg = NULL)

reverse_compile(tape)

Arguments

f_or_tape

the target function f, or a tape recording the execution trace of f.

input

the point at which the gradient, jacobian or hessian is evaluated. Note that it should be a vector of length greater than 1. If you want to calculate the derivative of a function, consider using forward_deriv instead.

cfg

Config object containing the preallocated tape and work buffers used by reverse mode automatic differentiation. ReverseDiff's API methods allocate the Config object automatically by default, but you can preallocate it yourself and reuse it across subsequent calls to reduce memory usage.
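For instance, a Config object might be preallocated once and reused roughly as follows (a sketch; the function and inputs are illustrative):

  f <- function(x) sum(x^2)
  x0 <- c(1, 2, 3)

  # preallocate the tape and work buffers once ...
  cfg <- reverse_grad_config(x0)

  # ... then reuse them for repeated gradient evaluations at same-length inputs
  reverse_grad(f, x0, cfg = cfg)
  reverse_grad(f, c(4, 5, 6), cfg = cfg)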

diffresult

Optional DiffResult object to store the derivative information.

debug

Whether to run the wrapper functions in debug mode. In debug mode, error messages are more informative; with debug mode turned off, the wrapper functions are more performant.

f

the function whose gradient, jacobian or hessian you want to calculate. Note that f(x) should return a scalar for grad and hessian, and a vector of length greater than 1 for jacobian.
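For example (illustrative functions only), a scalar-valued f is suitable for grad and hessian, while a vector-valued f is required for jacobian:

  # scalar-valued: use with reverse_grad and reverse_hessian
  f <- function(x) sum(x^2) + prod(x)
  reverse_grad(f, c(1, 2))
  reverse_hessian(f, c(1, 2))

  # vector-valued: use with reverse_jacobian
  g <- function(x) c(x[1]^2, x[1] * x[2], x[2]^3)
  reverse_jacobian(g, c(1, 2))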

tape

the object recording the target function's execution trace, used by reverse mode automatic differentiation. In many cases, pre-recording and pre-compiling a reusable tape for a given function and differentiation operation can improve the performance of reverse mode automatic differentiation. Note that a pre-recorded tape only captures the execution trace of the target function at the given input values. In other words, the tape cannot re-enact branching behavior that depends on the input values. If the target function contains control flow that depends on the input values, be careful about using the tape-related APIs, or avoid them altogether. A sketch of the tape workflow follows below.
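As a sketch (the function and inputs are illustrative), a tape can be pre-recorded, optionally compiled, and then passed in place of the function for later evaluations:

  f <- function(x) sum(x^2)
  x0 <- c(1, 2, 3)

  # record the execution trace of f at x0 once
  tape <- reverse_grad_tape(f, x0)

  # optionally compile the tape for faster replays
  ctape <- reverse_compile(tape)

  # reuse the (compiled) tape instead of f; this only stays valid
  # because f has no branching that depends on the input values
  reverse_grad(ctape, x0)
  reverse_grad(ctape, c(4, 5, 6))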

Value

reverse_grad, reverse_jacobian and reverse_hessian return the gradient, jacobian and hessian of f or tape, respectively, evaluated at input. reverse_grad_config, reverse_jacobian_config and reverse_hessian_config return Config instances containing the preallocated tape and work buffers used by reverse mode automatic differentiation. reverse_grad_tape, reverse_jacobian_tape and reverse_hessian_tape return Tape instances containing the execution trace of the target function at the given input values.