Name: jbarrow/LambdaNet
URL: https://github.com/jbarrow/LambdaNet
Language: Haskell 100.0%

# LambdaNet

LambdaNet is an artificial neural network library written in Haskell that abstracts network creation, training, and use as higher order functions. The benefit of this approach is that it provides a framework in which users can quickly iterate on network designs by composing the built-in functional components, and extend the library by writing new ones.
The library comes with a pre-defined set of functions that can be composed in many ways to operate on real-world data. These will be enumerated later in the documentation.

## Current Release

The code in this repo doesn't reflect the current release of LambdaNet. The README for the current release on Hackage can be found here.

## Installation

The first step is to follow the HMatrix installation instructions. After that, LambdaNet can be installed through Cabal:
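Assuming the package is published on Hackage under the name LambdaNet, the install command would be:

```
cabal install LambdaNet
```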
## Installing the Most Recent Build

Alternatively, you can use the nightly build. The API may differ from what is covered in this README. To install the nightly build, simply run:
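A sketch of installing from source, assuming a standard Cabal build of the GitHub repo:

```
git clone https://github.com/jbarrow/LambdaNet.git
cd LambdaNet
cabal install
```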
## Using LambdaNet

Using LambdaNet to rapidly prototype networks with the built-in functions requires only a minimal level of Haskell knowledge (although getting the data into the right form may be more difficult). Extending the library, however, may require a more in-depth knowledge of Haskell and functional programming techniques. You can find a quick example of using the network in `XOR.hs`.
The rest of this section dissects the XOR network in order to discuss the design of LambdaNet.

### Training Data

Before you can train or use a network, you must have training data. Each training example is a tuple of vectors: the first value is the input to the network, and the second is the expected output. For the XOR network, the data is easily hardcoded:
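A minimal sketch of that hardcoded data, assuming hmatrix's `fromList` for building the vectors:

```haskell
import Numeric.LinearAlgebra (Vector, fromList)

-- Each tuple pairs a network input with its expected XOR output.
xorData :: [(Vector Double, Vector Double)]
xorData = [ (fromList [0.0, 0.0], fromList [0.0])
          , (fromList [0.0, 1.0], fromList [1.0])
          , (fromList [1.0, 0.0], fromList [1.0])
          , (fromList [1.0, 1.0], fromList [0.0])
          ]
```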
However, for any non-trivial application, the most difficult work will be getting the data into this form. Unfortunately, LambdaNet does not currently have tools to support data handling.

### Layer Definitions

The first step in creating a network is to define a list of layer definitions. The `LayerDefinition` type takes a neuron type, a count of neurons in the layer, and a connectivity function. Creating the layer definitions for a three-layer XOR network -- with 2 neurons in the input layer, 2 hidden neurons, and 1 output neuron -- can be done as:
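A sketch of the three definitions, using the `sigmoidNeuron` type and `connectFully` connectivity function described in the next two subsections:

```haskell
-- Input, hidden, and output layers for the XOR network
l1 = LayerDefinition sigmoidNeuron 2 connectFully
l2 = LayerDefinition sigmoidNeuron 2 connectFully
l3 = LayerDefinition sigmoidNeuron 1 connectFully
```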
### Neuron Types

A neuron is simply defined as an activation function and its derivative, and the LambdaNet library provides three built-in neuron types:

- `sigmoidNeuron` -- a neuron with a sigmoid activation function
- `tanhNeuron` -- a neuron with a hyperbolic tangent activation function
- `recluNeuron` -- a neuron with a rectified linear activation function
By passing one of these functions into a `LayerDefinition`, you can create a layer with neurons of that type.

### Connectivity

A connectivity function is a bit more opaque. Currently, the library only provides one connectivity function, `connectFully`, which creates a fully connected feed-forward graph. Simply put, a connectivity function takes the number of neurons in layer l and the number of neurons in layer l + 1, and returns a boolean matrix of integers (0/1) that represents the connectivity graph of the two layers -- a 0 means two neurons are not connected and a 1 means they are. The starting weights are defined later.

### Creating the Network

The `createNetwork` function takes a random transform, a source of entropy, and the list of layer definitions, and returns a network. For the XOR network, the `createNetwork` call is:
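A sketch of that call, assuming the `normals` transform described below and `System.Random`'s `mkStdGen` as the entropy source:

```haskell
import System.Random (mkStdGen)

n = createNetwork normals (mkStdGen 4) [l1, l2, l3]
```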
Our source of entropy is the very random `mkStdGen 4`!

### Random Transforms

A random transform is a function that operates on a stream of uniformly distributed random numbers and returns a stream of floating point numbers drawn from some distribution. Currently, the two defined distributions are:

- `uniforms` -- a stream of uniformly distributed values
- `normals` -- a stream of normally distributed values
Work is being done to offer a Student's t-distribution, which would require support for a chi-squared distribution transformation.

### Training the Network

In order to train a network, you must create a new trainer:
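A sketch, assuming the `quadraticCost` function and its derivative described under Cost Functions below (the learning rate of 3 is illustrative):

```haskell
t = BackpropTrainer (3 :: Float) quadraticCost quadraticCost'
```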
The `BackpropTrainer` type takes in a learning rate, a cost function, and its derivative. The actual training of the network is performed by one of the training functions.
LambdaNet provides three training methods:

- `trainUntil`
- `trainUntilErrorLessThan`
- `trainNTimes`
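A sketch of a training call with `trainUntilErrorLessThan`, assuming it takes the network, the trainer, a selection function, the data, and an error threshold (the argument order is an assumption):

```haskell
n' = trainUntilErrorLessThan n t online xorData 0.01
```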
The calculated error is what is returned by the cost function.

### Cost Functions

Currently, the only provided cost function is the quadratic error cost function, `quadraticCost`, and its derivative, `quadraticCost'`.
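For reference, the quadratic error is half the squared Euclidean distance between the expected and actual output. A minimal sketch of such a function over hmatrix vectors (the library's own definition may differ):

```haskell
import Numeric.LinearAlgebra (Vector, norm_2)

-- Quadratic error: E = 1/2 * ||y - a||^2
quadCost :: Vector Double -> Vector Double -> Double
quadCost y a = 0.5 * norm_2 (y - a) ** 2
```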
### Selection Functions

Selection functions break up a dataset for each round of training. The currently provided selection functions are:

- `online` -- trains on each training example individually
- `minibatch` -- trains on a randomly chosen subset of the data in each round
For small data sets it's better to use online training, while for larger data sets training can go much faster with a reasonably sized minibatch.

### Using the Network

Once the network is trained, you can use it with your test data or production data:
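For the XOR network, a sketch using the trained network `n'` from above:

```haskell
predict (fromList [1, 0]) n'
```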
LambdaNet at least attempts to follow a Scikit-Learn style naming scheme, with `fit` and `predict` functions.

### Storing and Loading

Once a network has been trained, the weights and biases can be stored in a file:
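A sketch, assuming a `saveNetwork` function that takes a file path and the network (the counterpart of `loadNetwork` below):

```haskell
saveNetwork "xor_network" n'
```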
By calling `saveNetwork` with a file path, you can save the state of the network.

Loading a network requires passing in the list of layer definitions for the original network, but it will load all the weights and biases of the saved network:
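A sketch of loading and then using a network inside `IO`, reusing the file name and layer definitions from the earlier examples:

```haskell
main :: IO ()
main = do
  -- loadNetwork runs in IO, so bind its result before calling predict
  n'' <- loadNetwork "xor_network" [l1, l2, l3]
  print $ predict (fromList [1, 0]) n''
```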
Note that the `loadNetwork` function returns an `IO Network`, so you can't simply call `predict` or `train` on the object it returns. Using the approach in `XOR.hs` should allow you to work with the returned object.

## Currently Under Development

What has been outlined above is only the first stage of LambdaNet. I intend to support some additional features, such as:

- Self-Organizing Maps (SOMs, or Kohonen Maps)
- Regularization functions and momentum
## Unit Testing

In order to develop more complex network architectures, it is important to ensure that all of the basics are working -- especially as the API undergoes changes. To run the unit tests:
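Assuming a standard Cabal test suite, something like:

```
git clone https://github.com/jbarrow/LambdaNet.git
cd LambdaNet
cabal install --enable-tests
cabal test
```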
This will download the most recent version of LambdaNet and run all the unit tests.

## Self-Organizing Maps (SOMs, or Kohonen Maps)

SOMs were chosen as the next architecture to develop because they make different assumptions than feed-forward networks. This allows us to see how the current library handles building out new architectures. Already, this has forced a change in the neuron model and spurred the development of a visualization package (in order to usefully understand the outputs of the SOMs).

## Regularization Functions and Momentum

Standard backprop training is subject to overfitting and to falling into local minima. By providing support for regularization and momentum, LambdaNet will be able to provide more extensible and robust training.

## Future Goals

The future goals are:
## Generating the Documentation Images

All of the images in this documentation were generated in the following manner. In the docs folder, run:
Note that I am currently working on removing the Python image analysis from the library and replacing it with Haskell and gnuplot. I'm also working on using the generated images in the network documentation.