Linear
The linear activation function simply returns its input without applying any modification:
$$g(x) = x$$
Its derivative is then given by:
$$\frac{\partial g}{\partial x} = 1$$
This activation function is used, for instance, in the output layer of a regression network, where predictions must remain unbounded.
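As a minimal sketch, the function and its derivative can be written in NumPy (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def linear(x):
    # Identity activation: g(x) = x, the input passes through unchanged.
    return x

def linear_derivative(x):
    # dg/dx = 1 everywhere, so the gradient has the same shape as x.
    return np.ones_like(x)

z = np.array([-2.0, 0.0, 3.5])
print(linear(z))             # [-2.   0.   3.5]
print(linear_derivative(z))  # [1. 1. 1.]
```

Because the derivative is constant, gradients flow through a linear output layer unchanged during backpropagation.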