Building a Neural Network from scratch
[2023-07-03]

I'm trying to improve my knowledge of backpropagation. I feel like my current understanding relies too much on "black-box" abstractions. So the next best step? Build a simple NN from scratch.
My code on GitHub
Technical Specifics on Notion

[2023-07-08]
Graph vs. Matrix Implementation

You can express the layered model of a neural network as a graph (interconnected nodes with weighted edges) or represent it as a set of matrices. The matrix implementation requires less compute by leveraging fast matrix operations, but it limits the modifications you can make to how backpropagation adjusts weights, etc. For now I'll be using the matrix implementation.
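To make this concrete, here's a minimal sketch (assuming NumPy and made-up layer sizes, not the actual code from the repo) of what the matrix representation looks like: all the connection weights between two layers live in a single array, and applying them is one matrix multiply instead of a walk over thousands of edge objects.

```python
import numpy as np

# Hypothetical 784 -> 16 layer. The graph view would store
# 784 * 16 separate weighted edges; the matrix view packs them
# into one (784, 16) array.
W = np.random.randn(784, 16) * 0.01  # all edge weights at once
b = np.zeros(16)                     # one bias per output node
x = np.random.randn(1, 784)          # a single flattened input

layer_out = x @ W + b                # every edge applied in one matmul
```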

Forward propagation

Forward prop is how the network generates a probability vector for your given inputs. The input layer is just the data you're passing in; every layer after that combines the previous layer's activations with the connection weights between the nodes and a bias value.
To put this into an equation: A_L = ReLU(A_{L-1} · W_L + B_L), where A_{L-1} is the previous layer's activations, W_L is the weight matrix, and B_L is the bias vector. There's a detailed visual on the notion page for reference.
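As a rough sketch of that equation in code (assuming NumPy and batched inputs of shape (samples, features); the function names are my own, not from the repo):

```python
import numpy as np

def relu(z):
    # Element-wise ReLU: max(0, z)
    return np.maximum(0.0, z)

def forward_layer(a_prev, W, b):
    # A_L = ReLU(A_{L-1} · W_L + B_L)
    return relu(a_prev @ W + b)

def forward(x, layers):
    # The input layer is just the data itself; each subsequent
    # layer is computed from the previous one.
    a = x
    for W, b in layers:
        a = forward_layer(a, W, b)
    return a
```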

Backpropagation

The specifics of this are a little complex, so there's detailed info on the notion page. In short, we're trying to minimize the network's loss function using gradient descent, an iterative algorithm that adjusts the weights based on the difference between generated and expected outputs, using derivatives of the loss to step toward its minimum.
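As an illustration of one gradient-descent step (a sketch only, assuming a mean-squared-error loss and a single ReLU layer, and reusing relu from the forward-prop sketch above; the learning rate and names are my assumptions, not the repo's):

```python
def gradient_step(a_prev, W, b, y_true, lr=0.01):
    # Forward pass, same equation as before
    z = a_prev @ W + b
    a = relu(z)
    # Gradient of MSE loss w.r.t. the output: dL/dA = 2(A - Y) / n
    d_a = 2.0 * (a - y_true) / y_true.shape[0]
    # Chain rule through ReLU: derivative is 1 where z > 0, else 0
    d_z = d_a * (z > 0)
    # Gradients w.r.t. this layer's parameters
    d_W = a_prev.T @ d_z
    d_b = d_z.sum(axis=0)
    # Step the parameters against the gradient to reduce the loss
    W -= lr * d_W
    b -= lr * d_b
    return W, b
```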

[2023-07-22]
Overall thoughts

Today I finished all parts of the network, including backpropagation. Unfortunately, its performance is not great; you can review the specifics, including the loss progression, on the notion page. I'm going to experiment and improve performance over the next few weeks, as I suspect something might be wrong with how the network is ingesting data. But for now I'm going to focus on the OS project. Thanks for joining the journey!