Questions tagged [automatic-differentiation]
Also known as algorithmic differentiation, or AD for short. Techniques that take a procedure evaluating a numerical function and transform it into a procedure that additionally evaluates directional derivatives, gradients, or higher-order derivatives.
automatic-differentiation
209 questions
1 vote · 1 answer · 35 views
Taking derivatives with multiple inputs in JAX
I am trying to take first and second derivatives of functions in JAX; however, my approaches return wrong values or zeros. I have an array with two columns for each variable and two rows ...
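A minimal sketch of the usual pattern for per-argument first and second derivatives in JAX (the function f below is a stand-in, not the asker's code):

import jax
import jax.numpy as jnp

def f(x, y):                         # stand-in scalar function of two inputs
    return jnp.sin(x) * y + x ** 2

df_dx = jax.grad(f, argnums=0)       # first derivative w.r.t. x
df_dy = jax.grad(f, argnums=1)       # first derivative w.r.t. y
d2f_dx2 = jax.hessian(f, argnums=0)  # second derivative w.r.t. x

print(df_dx(1.0, 2.0), df_dy(1.0, 2.0), d2f_dx2(1.0, 2.0))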
1 vote · 1 answer · 51 views
Error in automatic derivative calculation with pytorch
x2 = torch.tensor(x, requires_grad=True)
t2 = torch.tensor(t, requires_grad=True)
def mr():
    for k in range(n):
        z01 = solve_burgers(torch.tensor([x2[k]]), 0.25, 0.01/np.pi)[0]
        ...
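A hedged sketch of the general pattern, with a stand-in for the asker's solve_burgers: gradients flow only through unbroken chains of torch ops, and re-wrapping a tensor with torch.tensor(...) (as in the excerpt) detaches it from the graph, so direct indexing is preferable:

import torch

x = torch.linspace(0.0, 1.0, 5, requires_grad=True)

def model(x):                        # stand-in for solve_burgers
    return torch.sin(x) * x ** 2

# Index directly (x[k]) rather than torch.tensor([x[k]]), which detaches
y = model(x).sum()
y.backward()                         # populates x.grad via reverse-mode AD
print(x.grad)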
1 vote · 1 answer · 38 views
JAX custom_jvp with 'None' output leads to TypeError
I am trying to define a function whose JVP is only defined for selected output(s). Below is a simple example:
from jax import custom_jvp, jacobian

@custom_jvp
def func(x, y):
    return x + y, x * y
@func....
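A sketch of one way this is commonly resolved (an assumption about the asker's intent): the tangent returned by a custom JVP rule must mirror the primal output structure, so an output treated as non-differentiated gets a zero tangent rather than None:

import jax
import jax.numpy as jnp
from jax import custom_jvp

@custom_jvp
def func(x, y):
    return x + y, x * y

@func.defjvp
def func_jvp(primals, tangents):
    x, y = primals
    dx, dy = tangents
    # Zero tangent (not None) for the output we do not differentiate
    return func(x, y), (dx + dy, jnp.zeros_like(x * y))

print(jax.jacfwd(lambda x: func(x, 2.0))(3.0))   # (1.0, 0.0)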
0 votes · 0 answers · 13 views
In PyTorch, when using loss.backward() to compute gradients, how can I prevent it from overriding the gradients I've manually computed?
I'm currently working on an RNN for a speech recognition task, where I've designed an algorithm to compute the gradients for w_in and w_rec myself. However, I want to let PyTorch's ...
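Two facts worth noting: loss.backward() accumulates (adds) into .grad rather than overriding it, and a tensor hook can substitute the value autograd would otherwise accumulate. A minimal sketch, with manual_grad standing in for the asker's hand-computed gradient:

import torch

w_in = torch.randn(4, 4, requires_grad=True)   # name taken from the question
manual_grad = torch.randn(4, 4)                # stand-in for the hand-computed gradient

# The hook's return value replaces what autograd would accumulate into .grad
w_in.register_hook(lambda g: manual_grad)

loss = (w_in ** 2).sum()
loss.backward()
print(torch.allclose(w_in.grad, manual_grad))  # True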
0 votes · 1 answer · 62 views
Is there any need to modify the backward function in PyTorch?
Recently I have been working on custom models with a custom backward function (since the forward pass is not implemented via PyTorch AD). Say I have a model whose forward ...
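The standard mechanism for a hand-written backward pass is subclassing torch.autograd.Function; a minimal sketch (the op itself is made up for illustration):

import torch

class MyOp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3                     # forward pass outside normal AD

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * 3 * x ** 2      # hand-written derivative

x = torch.tensor(2.0, requires_grad=True)
MyOp.apply(x).backward()
print(x.grad)                             # 12.0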
2 votes · 1 answer · 141 views
Computing the Jacobian of an image: How to reshape the numpy array properly?
I have a batch x of images of shape [k, width, height, channel_count]. This batch is transformed by a function f. The result has the same shape, and I need to compute the divergence (i.e. trace of ...
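One way to get the trace of the Jacobian without materializing it is a Hutchinson estimator via a single VJP per probe vector; a sketch under the assumption that f acts per sample (f below is a stand-in):

import torch

def f(x):                                  # stand-in for the asker's transform
    return torch.sin(x) + 0.5 * x

x = torch.randn(3, 4, 4, 2, requires_grad=True)   # [k, width, height, channels]
y = f(x)

v = torch.randn_like(x)                    # random probe vector
vjp, = torch.autograd.grad(y, x, grad_outputs=v)  # v^T J, reshaped like x
div_est = (vjp * v).flatten(1).sum(dim=1)  # per-sample estimate of tr(J)
print(div_est)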
0 votes · 1 answer · 62 views
How to generate jacobian of a tensor-valued function using torch.autograd?
Computing the Jacobian of a function f : R^d -> R^d is not too hard:
def jacobian(y, x):
    k, d = x.shape
    jacobian = list()
    for i in range(d):
        v = torch.zeros_like(y)
        ...
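For comparison, torch ships a ready-made helper; a minimal sketch with a stand-in function:

import torch
from torch.autograd.functional import jacobian

def f(x):
    return torch.stack([x[0] * x[1], x.sum(), torch.sin(x[2])])

x = torch.randn(3)
J = jacobian(f, x)    # shape [3, 3]: J[i, j] = d f_i / d x_j
print(J.shape)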
2 votes · 1 answer · 72 views
How to implement Carleman Matrix in Haskell?
I am trying to implement the Carleman matrix of a differentiable function in Haskell using the Numeric.AD library. I'm using https://en.wikipedia.org/wiki/Carleman_matrix for reference.
So far I have the ...
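The question is about Haskell's Numeric.AD, but as a reference for the construction itself, here is a sketch in Python with JAX (a swapped-in language; the helper name is made up): entry M[i][j] is the j-th Taylor coefficient of f(x)**i at 0, obtained by nested differentiation:

import math
import jax

def carleman(f, n):
    M = [[1.0] + [0.0] * (n - 1)]            # row i = 0: the constant function 1
    for i in range(1, n):
        g = lambda x, i=i: f(x) ** i
        row, d = [], g
        for j in range(n):
            row.append(float(d(0.0)) / math.factorial(j))
            d = jax.grad(d)                  # next-order derivative
        M.append(row)
    return M

print(carleman(lambda x: x + x ** 2, 3))     # [[1,0,0], [0,1,1], [0,0,1]]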
4 votes · 2 answers · 129 views
Calculating two gradients in pytorch and reusing an intermediate gradient
Suppose we have a function f whose gradient is slow to compute, and two functions g1 and g2 whose gradient is easy to compute. In pytorch, how can I calculate the gradients of z1 = g1(f(x)) and z2 = ...
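A sketch of one way to reuse the expensive forward pass: run f once, take the cheap gradients at the intermediate y, then push each cotangent back through f's retained graph (f, g1, g2 below are stand-ins):

import torch

def f(x):                                   # stand-in for the expensive function
    return torch.sin(x) * x

def g1(y): return (y ** 2).sum()
def g2(y): return y.prod()

x = torch.randn(5, requires_grad=True)
y = f(x)                                    # forward through f only once

dy1, = torch.autograd.grad(g1(y), y, retain_graph=True)   # cheap
dy2, = torch.autograd.grad(g2(y), y, retain_graph=True)   # cheap

# One VJP through f per cotangent; the graph is reused, not rebuilt
dx1, = torch.autograd.grad(y, x, grad_outputs=dy1, retain_graph=True)
dx2, = torch.autograd.grad(y, x, grad_outputs=dy2)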
1 vote · 1 answer · 152 views
solving an ODE using neural networks
I want to solve this ODE using neural nets: du/dt + 2u + t = 0, with initial condition u(0) = 1, for t between 0 and 2.
I want to use PyTorch and automatic differentiation to solve this equation. ...
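A minimal PINN-style sketch for this specific ODE (architecture and hyperparameters are arbitrary choices, not from the question):

import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    t = (torch.rand(64, 1) * 2.0).requires_grad_(True)  # collocation points in [0, 2]
    u = net(t)
    du_dt, = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)
    residual = du_dt + 2 * u + t                        # du/dt + 2u + t = 0
    ic = (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()   # u(0) = 1
    loss = (residual ** 2).mean() + ic
    opt.zero_grad()
    loss.backward()
    opt.step()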
0 votes · 0 answers · 37 views
How to preallocate using JacobianConfig for "Hessian of vector valued function" double Jacobian in Julia ForwardDiff package
I have a vector-valued function that is fixed in input and output size, but it is called in a loop, and I would like to calculate the gradient and Hessian of all output entries.
In addition, each call ...
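Not the Julia API the question asks about, but the analogous double-Jacobian pattern in JAX, where jit caches the traced computation across loop iterations (f is a stand-in):

import jax
import jax.numpy as jnp

def f(x):                                    # stand-in: R^3 -> R^2
    return jnp.array([x[0] * x[1], jnp.sin(x[2]) + x[0]])

jac = jax.jit(jax.jacfwd(f))                 # shape [2, 3]
hess = jax.jit(jax.jacfwd(jax.jacfwd(f)))    # shape [2, 3, 3]: one Hessian per output

x = jnp.ones(3)
print(jac(x).shape, hess(x).shape)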
1 vote · 1 answer · 133 views
JAX `custom_vjp` for functions with multiple outputs
In the JAX documentation, custom derivatives for functions with a single output are covered. I'm wondering how to implement custom derivatives for functions with multiple outputs, such as this one:
# ...
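A sketch of the multi-output case (the rule below is hand-derived for this toy function): the backward rule receives one cotangent per output and returns one entry per input:

import jax
from jax import custom_vjp

@custom_vjp
def func(x, y):
    return x + y, x * y

def func_fwd(x, y):
    return func(x, y), (x, y)            # residuals saved for the backward pass

def func_bwd(res, cotangents):
    x, y = res
    g1, g2 = cotangents                  # one cotangent per output
    return (g1 + g2 * y, g1 + g2 * x)    # one gradient per input

func.defvjp(func_fwd, func_bwd)

print(jax.grad(lambda x: sum(func(x, 2.0)))(3.0))   # 3.0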
1 vote · 1 answer · 49 views
JAX `vjp` fails for vmapped function with `custom_vjp`
Below is an example where a function with a custom-defined vector-Jacobian product (custom_vjp) is vmapped. For a simple function like this, invoking vjp fails:
@partial(custom_vjp, nondiff_argnums=(0,...
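For reference, a stripped-down example (without the question's nondiff_argnums) where a custom_vjp function does compose with vmap and jax.vjp:

import jax
import jax.numpy as jnp
from jax import custom_vjp, vmap

@custom_vjp
def f(x):
    return x ** 2

f.defvjp(lambda x: (f(x), x),            # forward: output plus residual
         lambda x, g: (2 * x * g,))      # backward: hand-written VJP

xs = jnp.arange(3.0)
y, vjp_fun = jax.vjp(vmap(f), xs)
print(vjp_fun(jnp.ones(3)))              # (Array([0., 2., 4.]),)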
1 vote · 1 answer · 61 views
JAX `vjp` does not recognize cotangent argument with `custom_vjp`
I have a JAX function cart_deriv() which takes another function f and returns the Cartesian derivative of f, implemented as follows:
@partial(custom_vjp, nondiff_argnums=0)
def cart_deriv(f: Callable[....
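For context, plain jax.vjp usage: the returned function takes a single cotangent argument whose structure must mirror the primal output exactly (f below is a stand-in):

import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x), x ** 2            # two outputs

x = jnp.array(1.5)
primals, vjp_fun = jax.vjp(f, x)

cotangent = (jnp.array(1.0), jnp.array(0.5))   # mirrors the output structure
print(vjp_fun(cotangent))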
1 vote · 1 answer · 43 views
Failing to return gradients
This code is supposed to calculate the gradients of a network's output w.r.t. its inputs, but it seems to return wrong values. What is the problem with the code?
For more context, the function "B&...