Last time, we played with Karmen Blake's Neural Network Library. We just used it to construct some basic networks. Let's have a look at the code to see how he pulled it together.

We'll start by looking at the `or` mix task we created last time. The first thing we did was:

`{:ok, network_pid} = Network.start_link([2,1])`

Let's look at that function. It takes a list of integers, where each integer specifies the number of neurons in one layer of the network, in the order the layers are declared. In our case, we have an input layer with two neurons and an output layer with one neuron, with no hidden layers.
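To make the list-of-sizes convention concrete, here are a couple of topologies spelled out (the `[2, 3, 1]` variant is our own example, not from the library's docs):

```elixir
alias NeuralNetwork.Network

# Two inputs, one output, no hidden layers — the network from last time.
{:ok, _pid} = Network.start_link([2, 1])

# Two inputs, a single hidden layer of three neurons, one output.
{:ok, _pid} = Network.start_link([2, 3, 1])
```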

```elixir
defmodule NeuralNetwork.Network do
  @moduledoc """
  Contains layers which makes up a matrix of neurons.
  """

  # ...

  @doc """
  Pass in layer sizes which will generate the layers for the network.
  The first number represents the number of neurons in the input layer.
  The last number represents the number of neurons in the output layer.
  [Optionally] The middle...
```
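Even though the excerpt cuts off, the shape of the work is clear from the docstring: walk the list of sizes and build one layer per integer. Here is a minimal standalone sketch of that idea in plain Elixir — this is not the library's actual code, and the `Layer` struct is a stand-in we define purely for illustration:

```elixir
defmodule Layer do
  # Hypothetical stand-in for the library's layer: just holds its neurons.
  defstruct neurons: []
end

defmodule SketchNetwork do
  # Turn a list of sizes like [2, 3, 1] into a list of layers,
  # one layer per integer, each with that many placeholder neurons.
  def build(sizes) when is_list(sizes) do
    Enum.map(sizes, fn size ->
      %Layer{neurons: Enum.map(1..size, fn _ -> :neuron end)}
    end)
  end
end

# [2, 1] -> two layers: one with two neurons, one with one.
[_input, _output] = SketchNetwork.build([2, 1])
```

The real library wraps each layer in a process (hence `start_link` returning a pid), but the core transformation is the same mapping from integers to layers.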

