The most basic algorithm for optimizing the cost function of a machine learning model is Gradient Descent, but it has some drawbacks:

- You have to choose the learning rate alpha for the convergence steps yourself.
- It can take a large number of iterations to converge.
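To make the drawbacks concrete, here is a minimal sketch of plain gradient descent (the function names and default values are illustrative, not from any particular library) — note how alpha must be supplied by hand:

```python
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, max_iters=1000, tol=1e-8):
    """Plain gradient descent: repeatedly step as x <- x - alpha * grad(x).

    alpha is a fixed learning rate we must pick ourselves; a bad choice
    either diverges (too large) or converges very slowly (too small).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        # Stop once the gradient is (numerically) zero.
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
```

Even on this trivial one-dimensional problem, the loop runs dozens of iterations with a fixed alpha; on badly conditioned problems the cost is far worse.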

There are some more advanced alternatives:

- Conjugate Gradient
- BFGS
- L-BFGS

These are cleverer approaches because:

- They choose alpha intelligently at each iteration (via a line search) instead of leaving it to you.
- They typically converge in far fewer iterations.
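As a sketch of using them as off-the-shelf tools, `scipy.optimize.minimize` exposes all three methods behind one interface (assuming SciPy is installed; the Rosenbrock test function here is just a standard example, not from the original text):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function, a classic optimization benchmark, and its gradient.
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad_f(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])

# No alpha to tune: each method selects its own step size internally.
res_cg    = minimize(f, x0, jac=grad_f, method="CG")        # Conjugate Gradient
res_bfgs  = minimize(f, x0, jac=grad_f, method="BFGS")      # BFGS
res_lbfgs = minimize(f, x0, jac=grad_f, method="L-BFGS-B")  # limited-memory BFGS
```

All three converge to the known minimum at (1, 1) without us ever specifying a learning rate.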

Now you can study these algorithms in more depth, or simply keep their names in mind and use them as out-of-the-box tools in your ML problems.