Welcome to Pystacks

Pystacks is a modular hierarchical machine learning library built on top of Theano. At its core, Pystacks provides modules that can be combined to create complex Theano graphs.
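In practice, you build a graph by composing layer modules and calling forward on a symbolic input. The sketch below assumes Sequential, LinearLayer, and ReLU are imported from pystacks and uses arbitrary layer sizes; the examples further down expand on this pattern.

import theano
import theano.tensor as T

sym_X = T.fmatrix()                            # symbolic minibatch of inputs
net = Sequential([LinearLayer(4, 3), ReLU()])  # compose modules into a single module
sym_out = net.forward(sym_X, train=False)      # an ordinary Theano expression
f = theano.function([sym_X], sym_out)          # compile the graph as usual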

Installation

pip install git+https://github.com/vzhong/pystacks.git

# for development, install from a local clone
git clone https://github.com/vzhong/pystacks.git
cd pystacks
pip install -e .

# run tests
python setup.py test
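
If the install succeeded, the package should import cleanly (assuming the module is named pystacks, matching the repository):

python -c "import pystacks"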

Examples

MNIST Multilayer perceptron

import theano.tensor as T
from theano import function
# Sequential, LinearLayer, ReLU, Dropout, Softmax, cross_entropy_loss, and RMSProp are imported from pystacks

sym_X = T.fmatrix()  # minibatch of flattened 28x28 images
net = Sequential([
    LinearLayer(784, 128),
    ReLU(),
    Dropout(0.2),
    LinearLayer(128, 128),
    ReLU(),
    Dropout(0.2),
    LinearLayer(128, 10),
    Softmax(),
])
sym_prob = net.forward(sym_X, train=True)  # class probabilities, with dropout enabled
sym_Y = T.ivector()                        # integer class labels
sym_loss = cross_entropy_loss(sym_prob, sym_Y, one_hot_num_classes=10)

sym_lr = T.fscalar()  # learning rate, supplied at call time

optim = RMSProp()

# compile one function that applies gradient updates and one that only predicts
train_f = function([sym_X, sym_Y, sym_lr], sym_loss,
                   updates=net.grad_updates(sym_loss, sym_lr, optimizer=optim))
pred_f = function([sym_X], net.forward(sym_X, train=False).argmax(axis=1))
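
A training loop over the compiled functions could look like the sketch below; load_mnist, the batch size, learning rate, and epoch count are illustrative placeholders rather than part of Pystacks.

import numpy as np

X_train, Y_train = load_mnist()  # hypothetical loader: float32 (N, 784) images, int32 (N,) labels
lr = np.float32(0.01)            # cast to float32 to match the fscalar input
for epoch in range(10):
    for start in range(0, len(X_train), 128):
        loss = train_f(X_train[start:start + 128], Y_train[start:start + 128], lr)
    acc = (pred_f(X_train) == Y_train).mean()
    print('epoch %d loss %.4f train acc %.4f' % (epoch, loss, acc))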

Character recurrent language model

# Recurrent, LookupTable, LSTMMemoryLayer, UnitNorm, and ClipGradientNorm are imported from pystacks;
# vocab, emb_size, h1_size, and h2_size come from your preprocessing and hyperparameters
net = Recurrent([
    LookupTable(len(vocab), emb_size, E_transform=UnitNorm()),
    LSTMMemoryLayer(emb_size, h1_size),
    LSTMMemoryLayer(h1_size, h2_size),
    LinearLayer(h2_size, len(vocab)),
    Softmax()])

sym_X = T.itensor3()  # integer character indices
sym_Y = T.ftensor3()  # one-hot targets (next character at each time step)
sym_prob = net.forward(sym_X, train=True, truncate_grad=50)  # truncated backpropagation through time
sym_pred = sym_prob.argmax(-1)
sym_loss = cross_entropy_loss(sym_prob, sym_Y)
sym_acc = T.mean(T.eq(sym_pred, sym_Y.argmax(-1)))
sym_lr = T.fscalar()

optimizer = RMSProp()
# clip gradient norms to keep recurrent training stable
updates = net.grad_updates(sym_loss, lr=sym_lr, optimizer=optimizer, default_grad_transformer=ClipGradientNorm(20.))

train_f = function([sym_X, sym_Y, sym_lr], [sym_loss, sym_acc], updates=updates)
pred_f = function([sym_X], net.forward(sym_X, train=False).argmax(-1))
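
Calling the compiled functions mirrors the MLP example, except that train_f returns both outputs. X_batch and Y_batch below stand for one preprocessed minibatch (an int32 array of character indices shaped like sym_X and a float32 one-hot target array shaped like sym_Y); the preprocessing itself is not shown.

import numpy as np

loss, acc = train_f(X_batch, Y_batch, np.float32(0.002))  # one clipped-gradient update; returns [sym_loss, sym_acc]
char_idx = pred_f(X_batch)                                 # index of the most likely next character at each position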