**Sigmoid unit:** σ(x) = 1 / (1 + e^(−x))

**Tanh unit:** tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x))

**Rectified linear unit (ReLU):** f(x) = max(0, x)

We call:

- a sum of sigmoids with shifted biases, σ(x − 0.5) + σ(x − 1.5) + σ(x − 2.5) + …, a
**stepped sigmoid**

- log(1 + e^x) the
**softplus** function

The **softplus** function can be approximated by the **max function** (or **hard max**), i.e. max(0, x). The max function is commonly known as the rectified linear function.
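The activation functions above, and the closeness of softplus to the hard max, can be sketched numerically. This is a minimal illustration using standard definitions; the function names and the truncation of the stepped-sigmoid sum to a finite number of terms are my own choices:

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Hard max: max(0, x)
    return max(0.0, x)

def softplus(x):
    # log(1 + e^x), a smooth approximation to ReLU
    return math.log1p(math.exp(x))

def stepped_sigmoid(x, n=100):
    # Finite sum of n sigmoids with shifted biases,
    # sigma(x - 0.5) + sigma(x - 1.5) + ...; approximates softplus
    return sum(sigmoid(x - i + 0.5) for i in range(1, n + 1))

# Softplus tracks max(0, x) closely away from the origin:
for x in (-5.0, 0.0, 5.0):
    print(f"x={x:+.1f}  relu={relu(x):.4f}  softplus={softplus(x):.4f}")
```

Away from x = 0 the two curves agree to within about e^(−|x|); the largest gap is at the origin, where softplus(0) = log 2 ≈ 0.693 while max(0, 0) = 0.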
