synapses.lib

package synapses.lib

Type members

Classlikes

case class Codec(attributes: LazyList[Attribute])

The methods of a codec.

Encode a data point:

codec.encode(Map("petal_length" -> "1.5", "species" -> "setosa"))

Decode a data point:

codec.decode(List(0.0, 1.0, 0.0))

Get the JSON representation of the codec:

codec.json()
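
The JSON string returned by json() can be stored and later passed back to the Codec constructor that accepts a JSON representation (see object Codec below). A minimal persistence sketch, assuming the rebuilt codec behaves like the original:

// Save the codec as JSON and rebuild an equivalent codec from that string.
val savedJson = codec.json()
val restoredCodec = Codec(savedJson)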
Companion
object
object Codec

The constructors of a codec.

One-hot encoding turns a discrete attribute into a list of 0.0s and 1.0s. Min-max normalization scales a continuous attribute to a value between 0.0 and 1.0.

A codec can encode and decode every data point.

There are two ways to create a codec:

  1. By providing a list of pairs that define the name and the type of each attribute:
val codec = Codec(
  List( ("petal_length", false),
        ("species", true) ),
  Iterator(Map("petal_length" -> "1.5",
               "species" -> "setosa"),
           Map("petal_length" -> "3.8",
               "species" -> "versicolor"))
)
  2. By providing its JSON representation.
val codec = Codec(
  """[{"Case":"SerializableContinuous",
       "Fields":[{"key":"petal_length","min":1.5,"max":3.8}]},
      {"Case":"SerializableDiscrete",
       "Fields":[{"key":"species","values":["setosa","versicolor"]}]}]"""
)
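
A short sketch of what the encoding is expected to look like for the codec created with the first constructor above; the exact ordering of the encoded values is an assumption here, not something the documentation guarantees.

// "1.5" is the minimum of the observed petal_length range [1.5, 3.8],
// so min-max normalization is expected to map it to 0.0.
// "species" is discrete with two observed values, so it is expected to
// become a one-hot pair such as (1.0, 0.0) for "setosa".
// The encoded list therefore holds 3 values, matching the 3-element
// list accepted by codec.decode above.
val encoded = codec.encode(Map("petal_length" -> "1.5", "species" -> "setosa"))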
Companion
class
object Fun

The activation functions a neuron can have.

They can be used in the arguments of the neural network's constructor.

import scala.util.Random

Net(List(2, 3, 1), _ => Fun.sigmoid, _ => Random().nextDouble())
Net(List(4, 6, 8, 5, 3), _ => Fun.identity, _ => Random().nextDouble())
Net(List(4, 8, 3), _ => Fun.tanh, _ => Random().nextDouble())
Net(List(2, 1), _ => Fun.leakyReLU, _ => Random().nextDouble())
case class Net(layers: LazyList[Layer])

The methods of a neural network.

Get the prediction for an input:

net.predict(List(0.4, 0.05, 0.2))

Fit network to a single observation:

net.fit(0.1, List(0.4, 0.05, 0.2), List(0.03, 0.8))

Get the JSON representation of the network:

net.json()
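
A minimal training sketch over several observations, assuming fit returns the updated network rather than mutating it in place; the observations collection and its values are illustration-only.

// Illustration-only collection of (input, expectedOutput) pairs.
val observations: List[(List[Double], List[Double])] =
  List(
    (List(0.4, 0.05, 0.2), List(0.03, 0.8)),
    (List(0.1, 0.9, 0.6), List(0.4, 0.1))
  )

// Fold over the observations, feeding each one to fit with a
// learning rate of 0.1, and keep the resulting network.
val trainedNet = observations.foldLeft(net) {
  case (currentNet, (input, expected)) =>
    currentNet.fit(0.1, input, expected)
}

val prediction = trainedNet.predict(List(0.4, 0.05, 0.2))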
Companion
object
object Net

The constructors of a neural network.

There are four ways to create a neural network:

  1. By providing its layer sizes. This constructor creates a random sigmoid neural network.
val net = Net(List(2, 3, 1))
  2. By providing its layer sizes and a seed. This constructor creates a non-random sigmoid neural network.
val net = Net(List(4, 6, 8, 5, 3), 1000)
  3. By providing its JSON representation.
val net = Net("""[[{"activationF":"sigmoid","weights":[-0.4,-0.1,-0.8]}]]""")
  4. By providing the size, the activation function and the weights for each layer.
val net = Net(List(4, 8, 3), _ => Fun.tanh, _ => Random().nextDouble())
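
A small sketch of how the seeded constructor is expected to behave: the same layer sizes and seed should produce identical initial weights, so the JSON representations match. This determinism is an assumption drawn from the constructor being described as non-random.

val netA = Net(List(4, 6, 8, 5, 3), 1000)
val netB = Net(List(4, 6, 8, 5, 3), 1000)

// Both networks were initialized from the same seed, so their JSON
// representations are expected to be identical.
val sameInit = netA.json() == netB.json()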
Companion
class
object Stats

Measure the difference between the values predicted by a neural network and the observed values.

Calculate the root mean square error:

Stats.rmse(
  Iterator(
    (List(0.0, 0.0, 1.0), List(0.0, 0.0, 1.0)),
    (List(0.0, 0.0, 1.0), List(0.0, 1.0, 1.0))
  )
)
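
A hand computation of the example above, as a sketch under the assumption that each pair contributes the sum of its squared differences and the mean is taken over the pairs:

// First pair: identical lists, so its squared error is 0.0.
// Second pair: one value differs by 1.0, so its squared error is 1.0.
val perPairSquaredErrors = List(0.0, 1.0)
val rmseByHand = math.sqrt(perPairSquaredErrors.sum / perPairSquaredErrors.size)
// ≈ 0.707 under this assumption.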

Calculate the classification accuracy score:

Stats.score(
  Iterator(
    (List(0.0, 0.0, 1.0), List(0.0, 0.1, 0.9)),
    (List(0.0, 1.0, 0.0), List(0.8, 0.2, 0.0)),
    (List(1.0, 0.0, 0.0), List(0.7, 0.1, 0.2)),
    (List(1.0, 0.0, 0.0), List(0.3, 0.3, 0.4)),
    (List(0.0, 0.0, 1.0), List(0.2, 0.2, 0.6))
  )
)
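
A minimal evaluation sketch tying Stats to a trained network: pair each expected output with the network's prediction. The testSet collection and its values are illustration-only; the (expected, predicted) ordering follows the examples above.

// Illustration-only test set of (input, expectedOutput) pairs.
val testSet: List[(List[Double], List[Double])] =
  List(
    (List(0.4, 0.05, 0.2), List(0.0, 1.0)),
    (List(0.1, 0.9, 0.6), List(1.0, 0.0))
  )

// Build (expected, predicted) pairs and measure the error.
val pairs = testSet.iterator.map { case (input, expected) =>
  (expected, net.predict(input))
}

val error = Stats.rmse(pairs)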

Types

type Fun = Activation