staticderiv

Static derivative function

Syntax

staticderiv('dperf_dwb',net,X,T,Xi,Ai,EW)
staticderiv('de_dwb',net,X,T,Xi,Ai,EW)

Description

This function calculates derivatives using the chain rule from the network's performance or outputs back to its inputs. For time series data and dynamic networks, this function ignores the delay connections, resulting in an approximation (which may or may not be accurate) of the actual derivative. This function is used by Elman networks (elmannet), a dynamic network that is trained with the static derivative approximation because full derivative calculations are not available for it. Because all the other derivative functions calculate full derivatives, this function is not recommended for dynamic networks, except for research into training algorithms.
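
The effect of ignoring delays can be seen by comparing staticderiv with a full dynamic derivative function such as bttderiv. The following is a minimal sketch, not part of the original example; it assumes the simpleseries_dataset example data and a timedelaynet dynamic network, and the variable names are illustrative.

[X,T] = simpleseries_dataset;
net = timedelaynet(1:2,5);               % dynamic network with input delays
[Xs,Xi,Ai,Ts] = preparets(net,X,T);      % shift data into initial delay states
net = train(net,Xs,Ts,Xi,Ai);
gStatic = staticderiv('dperf_dwb',net,Xs,Ts,Xi,Ai);  % ignores the delay connections
gFull = bttderiv('dperf_dwb',net,Xs,Ts,Xi,Ai);       % full dynamic derivative
norm(gStatic-gFull)    % typically nonzero, showing the static result is only an approximation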

staticderiv('dperf_dwb',net,X,T,Xi,Ai,EW) takes these arguments,

net - Neural network

X - Inputs, an RxQ matrix (or NxTS cell array of RixQ matrices)

T - Targets, an SxQ matrix (or MxTS cell array of SixQ matrices)

Xi - Initial input delay states (optional)

Ai - Initial layer delay states (optional)

EW - Error weights (optional)

and returns the gradient of performance with respect to the network's weights and biases, where R and S are the number of input and output elements, Q is the number of samples, N and M are the number of input and output signals, Ri and Si are the number of elements in each input and output signal, and TS is the number of time steps.
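
As a concrete illustration of these shapes (the sizes and the con2seq conversion here are illustrative assumptions, not part of this function's documentation):

X = rand(2,100);            % matrix form: R = 2 input elements, Q = 100 samples
Xc = con2seq(rand(2,100));  % cell-array form: 1x100 cell array of 2x1 columns (N = 1, TS = 100, Q = 1)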

staticderiv('de_dwb',net,X,T,Xi,Ai,EW) returns the Jacobian of errors with respect to the network’s weights and biases.
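
For a purely static network there are no delay connections to ignore, so the static derivative should agree with a full derivative calculation. The following sketch (an added illustration, comparing against defaultderiv) assumes the same simplefit_dataset data used in the example below:

[x,t] = simplefit_dataset;
net = train(feedforwardnet(10),x,t);
gStatic = staticderiv('dperf_dwb',net,x,t);
gFull = defaultderiv('dperf_dwb',net,x,t);
norm(gStatic-gFull)    % expected to be near zero for a static network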

Examples

Here a feedforward network is trained and both the gradient and Jacobian are calculated.

[x,t] = simplefit_dataset;              % load example fitting data
net = feedforwardnet(20);               % feedforward network with 20 hidden neurons
net = train(net,x,t);                   % train the network
y = net(x);                             % simulate the trained network
perf = perform(net,t,y);                % evaluate performance
gwb = staticderiv('dperf_dwb',net,x,t)  % gradient of performance w.r.t. weights and biases
jwb = staticderiv('de_dwb',net,x,t)     % Jacobian of errors w.r.t. weights and biases
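
Assuming the example above, the returned gradient should have one element per weight and bias, which can be checked against the network's numWeightElements property (an added sanity check, not part of the original example):

numel(gwb) == net.numWeightElements   % expected to be true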

Version History

Introduced in R2010b