Lasagne is a lightweight library to build and train neural networks in Theano. Its main features are support for feed-forward networks such as convolutional neural networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof, together with a variety of optimization methods and transparent support for CPUs and GPUs through Theano's expression compiler.
Its design is governed by six principles: simplicity, transparency, modularity, pragmatism, restraint and focus.
In short, you can install a known compatible version of Theano and the latest Lasagne development version via:
pip install -r https://raw.githubusercontent.com/Lasagne/Lasagne/master/requirements.txt
pip install https://github.com/Lasagne/Lasagne/archive/master.zip
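To confirm the installation worked, a quick check (assuming a standard Python setup) is to print the installed version, which should run without errors:
python -c "import lasagne; print(lasagne.__version__)"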
For more details and alternatives, please see the Installation instructions.
Documentation is available online: http://lasagne.readthedocs.org/
For support, please refer to the lasagne-users mailing list.
import lasagne
import theano
import theano.tensor as T
# create Theano variables for input and target minibatch
input_var = T.tensor4('X')
target_var = T.ivector('y')
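# (a tensor4 is a 4-dimensional array, here a minibatch of images laid out as
#  (batchsize, channels, rows, columns); an ivector holds one integer class
#  label per example)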
# create a small convolutional neural network
from lasagne.nonlinearities import leaky_rectify, softmax
network = lasagne.layers.InputLayer((None, 3, 32, 32), input_var)
network = lasagne.layers.Conv2DLayer(network, 64, (3, 3),
                                     nonlinearity=leaky_rectify)
network = lasagne.layers.Conv2DLayer(network, 32, (3, 3),
                                     nonlinearity=leaky_rectify)
network = lasagne.layers.Pool2DLayer(network, (3, 3), stride=2, mode='max')
network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                    128, nonlinearity=leaky_rectify,
                                    W=lasagne.init.Orthogonal())
network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                    10, nonlinearity=softmax)
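# optional sanity check (not part of the original example): count_params()
# sums the sizes of all parameter tensors in the network
print("network has %d parameters" % lasagne.layers.count_params(network))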
# create loss function
prediction = lasagne.layers.get_output(network)
loss = lasagne.objectives.categorical_crossentropy(prediction, target_var)
loss = loss.mean() + 1e-4 * lasagne.regularization.regularize_network_params(
        network, lasagne.regularization.l2)
# create parameter update expressions
params = lasagne.layers.get_all_params(network, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params, learning_rate=0.01,
                                            momentum=0.9)
# compile training function that updates parameters and returns training loss
train_fn = theano.function([input_var, target_var], loss, updates=updates)
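# optional sketch (not part of the original example): a loss function for
# monitoring held-out data; deterministic=True disables the dropout layers
val_prediction = lasagne.layers.get_output(network, deterministic=True)
val_loss = lasagne.objectives.categorical_crossentropy(val_prediction,
                                                       target_var).mean()
val_fn = theano.function([input_var, target_var], val_loss)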
# train network (assuming you've got some training data in numpy arrays)
for epoch in range(100):
    loss = 0
    for input_batch, target_batch in training_data:
        loss += train_fn(input_batch, target_batch)
    print("Epoch %d: Loss %g" % (epoch + 1, loss / len(training_data)))
# use trained network for predictions
test_prediction = lasagne.layers.get_output(network, deterministic=True)
predict_fn = theano.function([input_var], T.argmax(test_prediction, axis=1))
print("Predicted class for first test input: %r" % predict_fn(test_data[0]))
For a fully-functional example, see examples/mnist.py, and check the Tutorial for an in-depth explanation of it. More examples, code snippets and reproductions of recent research papers are maintained in the separate Lasagne Recipes repository.
If you find Lasagne useful for your scientific work, please consider citing it in resulting publications. We provide a ready-to-use BibTeX entry for citing Lasagne.
Lasagne is a work in progress; input is welcome.
Please see the Contribution instructions for details on how you can contribute!