SimpleDNN is part of KotlinNLP and has been designed to support relevant neural network architectures in natural language processing tasks such as part-of-speech tagging, syntactic parsing and named-entity recognition.
As it is written in Kotlin, it is interoperable with other JVM languages (e.g. Java and Scala). Mathematical operations within the library are performed with jblas. Other libraries can be used instead, although jblas proved to be the fastest during our experiments. Switching requires minimal effort, as the essential mathematical functions are wrapped in a single, fully tested package.
SimpleDNN does not use the computational graph model and does not perform automatic differentiation of functions.
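Concretely, this means gradients are written out by hand rather than derived automatically from a graph of operations. As a plain-Kotlin illustration (this is not SimpleDNN's API, just a sketch of the idea), here is a single neuron `y = sigmoid(w*x + b)` trained with an explicitly derived backward pass:

```kotlin
import kotlin.math.exp

// Illustration only (not SimpleDNN's API): one neuron y = sigmoid(w*x + b)
// trained with a hand-derived gradient step — the kind of explicit backward
// pass a library without automatic differentiation relies on.
fun sigmoid(z: Double): Double = 1.0 / (1.0 + exp(-z))

// Fit the neuron to map x = 1.0 onto target = 1.0 and return its output.
fun trainSingleNeuron(steps: Int = 1000, lr: Double = 0.5): Double {
    var w = 0.5
    var b = 0.0
    val x = 1.0
    val target = 1.0
    repeat(steps) {
        val y = sigmoid(w * x + b)         // forward pass
        val dLdy = y - target              // dL/dy for L = 0.5 * (y - target)^2
        val dLdz = dLdy * y * (1.0 - y)    // chain rule applied by hand
        w -= lr * dLdz * x                 // dL/dw = dL/dz * x
        b -= lr * dLdz                     // dL/db = dL/dz
    }
    return sigmoid(w * x + b)
}

fun main() {
    println(trainSingleNeuron())           // moves towards the target 1.0
}
```

The trade-off is clear: every layer type must ship with its own hand-written backward computation, but there is no graph-building overhead at runtime.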
If you are looking for state-of-the-art technology to create sophisticated, flexible network architectures, you should consider libraries such as Keras, TensorFlow and Deeplearning4j.
If instead you need a simpler yet well-structured neural network that is almost ready to use, then you are in the right place!
Building a basic Neural Network with SimpleDNN does not require much more effort than just configuring a stack of layers, with one input layer, any number of hidden layers, and one output layer:
```kotlin
/**
 * Create a fully connected neural network with an input layer with dropout,
 * two hidden layers with ELU activation function and an output one with
 * Softmax activation for classification purposes.
 */
val network = StackedLayersParameters(
    LayerInterface( // input layer
        size = 784,
        dropout = 0.25),
    LayerInterface( // first hidden layer
        size = 100,
        activationFunction = ELU(),
        connectionType = LayerType.Connection.Feedforward),
    LayerInterface( // second hidden layer
        size = 100,
        activationFunction = ELU(),
        connectionType = LayerType.Connection.Feedforward),
    LayerInterface( // output layer
        size = 10,
        activationFunction = Softmax(),
        connectionType = LayerType.Connection.Feedforward))
```
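For reference, the two activation functions named in the configuration above have standard definitions: ELU returns `x` for non-negative inputs and `alpha * (e^x - 1)` otherwise, while Softmax normalizes the output scores into a probability distribution over the 10 classes. A plain-Kotlin sketch of both (independent of SimpleDNN's own implementations):

```kotlin
import kotlin.math.exp

// Standard definitions, independent of SimpleDNN's own implementations.

// ELU(x) = x if x >= 0, alpha * (e^x - 1) otherwise; smooth near zero
// and bounded below by -alpha for large negative inputs.
fun elu(x: Double, alpha: Double = 1.0): Double =
    if (x >= 0.0) x else alpha * (exp(x) - 1.0)

// Softmax turns raw scores into a probability distribution; subtracting
// the maximum score first keeps exp() numerically stable.
fun softmax(scores: DoubleArray): DoubleArray {
    val max = scores.maxOrNull() ?: 0.0
    val exps = DoubleArray(scores.size) { exp(scores[it] - max) }
    val sum = exps.sum()
    return DoubleArray(scores.size) { exps[it] / sum }
}
```

With a Softmax output layer, the network's 10 output values can be read directly as class probabilities that sum to 1.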
Import with Maven
```xml
<dependency>
    <groupId>com.kotlinnlp</groupId>
    <artifactId>simplednn</artifactId>
    <version>0.14.0</version>
</dependency>
```
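If your build uses Gradle rather than Maven, the same artifact can be declared with the coordinates from the Maven snippet above (shown here with the Gradle Kotlin DSL):

```kotlin
dependencies {
    implementation("com.kotlinnlp:simplednn:0.14.0")
}
```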
Try some usage examples of SimpleDNN by running the example files. To make the examples work, download the datasets here, then set their paths by copying the file example/config/configuration.yml and editing it.
This software is released under the terms of the Mozilla Public License, v. 2.0.
We greatly appreciate bug reports and contributions, which can be made by filing an issue or opening a pull request on the GitHub page.