Which activation function is commonly used in deep learning?

Explanation:

The activation function commonly used in deep learning is ReLU, which stands for Rectified Linear Unit.

It is widely used because it is simple, computationally cheap, and mitigates the vanishing gradient problem that affects saturating activations such as sigmoid and tanh.

Formula:

f(x) = max(0, x)
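The formula above can be sketched in a few lines of NumPy (an illustrative sketch; the function name `relu` is our own choice):

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x), applied element-wise
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative inputs are clamped to 0; positive inputs pass through
```

In practice you would use a framework's built-in version (e.g. `torch.nn.ReLU` or `tf.nn.relu`) rather than writing it by hand.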