Neural Network In Python: Introduction, Structure And Trading Strategies – Part IV

Posted September 24, 2019 at 10:17 am
Devang Singh
QuantInsti

In the previous installment, Devang Singh discussed how to train the Neural Network. In today’s post, Devang will demonstrate the concept of Gradient Descent.

Gradient Descent involves analyzing the slope of the curve of the cost function. Based on the slope, we adjust the weights in steps that reduce the cost function, rather than computing its value for every possible combination of weights.
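To make the idea concrete, here is a minimal sketch of this update rule for a single weight. The quadratic cost function, the starting weight and the learning rate are assumptions chosen purely for illustration; they are not values from the article or its sample code.

```python
def cost(w):
    # Hypothetical quadratic cost with its minimum at w = 3
    return (w - 3) ** 2

def cost_slope(w):
    # Slope (derivative) of the cost with respect to the weight
    return 2 * (w - 3)

w = 0.0              # assumed starting weight
learning_rate = 0.1  # assumed step size

for step in range(25):
    w -= learning_rate * cost_slope(w)  # step against the slope to reduce the cost

print(round(w, 4))  # approaches 3, the weight that minimizes the cost
```

Each iteration moves the weight a small distance against the slope, so the cost falls step by step instead of being evaluated for every possible weight.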

Visualizations of Gradient Descent are shown in the diagrams below. The first plot shows the case of a single weight and is hence two dimensional: the cost plotted against the weight. It can be seen that the red ball moves in a zig-zag pattern to arrive at the minimum of the cost function.

In the second diagram, we have to adjust two weights in order to minimize the cost function. We can therefore visualize it as a contour plot, as shown in the graph, where we move in the direction of the steepest slope in order to reach the minimum in the fewest steps. With this approach, we do not have to perform many computations, so training the model remains a feasible task.
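As a companion sketch for the two-weight case, the snippet below takes repeated steps in the direction of steepest descent on a hypothetical bowl-shaped cost. The cost function, starting point and learning rate are again assumptions made only for illustration.

```python
import numpy as np

def cost(w):
    # Hypothetical bowl-shaped cost over two weights, with its minimum at (1, -2)
    return (w[0] - 1) ** 2 + (w[1] + 2) ** 2

def gradient(w):
    # Gradient of the cost; it points up the steepest slope
    return np.array([2 * (w[0] - 1), 2 * (w[1] + 2)])

w = np.array([4.0, 3.0])  # assumed starting point on the contour plot
learning_rate = 0.1

for step in range(50):
    w -= learning_rate * gradient(w)  # move against the gradient, i.e. down the steepest slope

print(np.round(w, 4))  # converges towards [1, -2]
```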

[Figure: gradient descent visualizations for the single-weight and two-weight cases]

Gradient Descent can be done in three possible ways:

  • batch gradient descent
  • stochastic gradient descent
  • mini-batch gradient descent

In batch gradient descent, the cost function is computed by summing the individual cost functions over the entire training dataset; the slope is then computed and the weights are adjusted once per pass through the data.
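The toy example below sketches batch gradient descent for a single-weight linear model. The data, learning rate and number of passes are hypothetical, chosen only to show the full-dataset update.

```python
import numpy as np

# Hypothetical training data: y is roughly 2 * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

w = 0.0
learning_rate = 0.01

for epoch in range(100):
    errors = w * x - y
    # Slope of the summed squared-error cost, using every data entry at once
    slope = 2 * np.sum(errors * x)
    w -= learning_rate * slope  # one weight update per full pass over the dataset

print(round(w, 3))  # settles close to 2
```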

In stochastic gradient descent, the slope of the cost function is computed and the weights are adjusted after each data entry in the training dataset. This is extremely useful for avoiding getting stuck at a local minimum when the curve of the cost function is not strictly convex. Each time you run stochastic gradient descent, the path taken to the global minimum will be different. Batch gradient descent, by contrast, may end up with a suboptimal result if it stops at a local minimum.
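A stochastic version of the same toy problem is sketched below; the only change is that the weight is updated after every individual data entry, visited in a random order, so each run follows a slightly different path. The data and learning rate are again assumptions.

```python
import numpy as np

rng = np.random.default_rng()  # unseeded, so each run takes a different path

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

w = 0.0
learning_rate = 0.02

for epoch in range(100):
    for i in rng.permutation(len(x)):  # visit the data entries in random order
        error = w * x[i] - y[i]
        slope = 2 * error * x[i]       # slope computed from a single data entry
        w -= learning_rate * slope     # weights adjusted after every entry

print(round(w, 3))  # hovers around 2 rather than settling exactly
```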

The third type is mini-batch gradient descent, which is a combination of the batch and stochastic methods. Here, we create different batches by clubbing together multiple data entries into one batch. This essentially amounts to running stochastic gradient descent on bigger batches of data entries from the training dataset, as sketched below.
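The sketch below extends the same toy problem to mini-batches; the batch size, the data and the learning rate are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = 2.0 * x + rng.normal(0.0, 0.1, size=x.shape)  # hypothetical noisy targets

w = 0.0
learning_rate = 0.005
batch_size = 2

for epoch in range(200):
    order = rng.permutation(len(x))                # shuffle, then club entries into batches
    for start in range(0, len(x), batch_size):
        batch = order[start:start + batch_size]
        errors = w * x[batch] - y[batch]
        slope = 2 * np.mean(errors * x[batch])     # slope averaged over the mini-batch
        w -= learning_rate * slope                 # one update per mini-batch

print(round(w, 3))  # close to 2
```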

In the next installment, Devang will discuss how backpropagation works to adjust the weights according to the error that has been generated. Visit the QuantInsti website to download the sample code.

Disclosure: Interactive Brokers

Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from QuantInsti and is being posted with its permission. The views expressed in this material are solely those of the author and/or QuantInsti and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.
