Monday, March 2, 2015

Install Konsole/gnome-terminal for a better Vim experience in OS X

Go to the App Store and install Xcode

Next, download and install MacPorts.

After this, install the terminal of your choice.

For gnome-terminal:
sudo port install gnome-terminal

For Konsole:
sudo port install konsole

For both apps, you will also need
sudo port install xorg-server

In the Applications folder, under MacPorts, there should be an X11 app. Launch it before trying to launch either gnome-terminal or Konsole.

Sunday, February 15, 2015

IFT6266 Week 5

The convnet has gone no farther this week since it is still massively overfitting, but I had a few interesting discussions with Roland about computationally efficient pooling, which should be useful once I solve my current issues.

I also got the convolutional VAE working for MNIST. If I can get a good run for CIFAR10, it might also be useful to slap a one or two layer MLP on the hidden space representation to see if that gets above 80% for cats and dogs. If not, it would also be fun to train on the dataset itself, folding in all the data, and then fine-tune for prediction. This is a sidetrack from "the list" but could be fruitful.
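As a rough sketch of that idea, a linear classifier head on the latent codes might look like the following. This is pure NumPy with synthetic stand-in features (the real latent codes would come from the ConvVAE encoder, which isn't reproduced here):

```python
import numpy as np

# Hypothetical sketch: a logistic-regression head trained on VAE latent
# codes. The "latent codes" below are synthetic stand-ins; in practice
# they would be the encoder's hidden-space representation of each image.
rng = np.random.RandomState(0)
n, latent_dim = 200, 32

# Two synthetic classes ("cats" vs. "dogs") separated in latent space
Z = np.vstack([rng.randn(n, latent_dim) + 1.0,
               rng.randn(n, latent_dim) - 1.0])
y = np.concatenate([np.ones(n), np.zeros(n)])

W = np.zeros(latent_dim)
b = 0.0
lr = 0.1
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-(Z.dot(W) + b)))  # sigmoid probabilities
    grad = p - y                               # d(cross-entropy)/d(logits)
    W -= lr * Z.T.dot(grad) / len(y)
    b -= lr * grad.mean()

acc = ((p > 0.5) == y).mean()
print("train accuracy: %.3f" % acc)
```

A one or two hidden-layer MLP would replace the single weight vector with stacked affine-plus-relu layers, but the training loop has the same shape.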

Here are some samples from the ConvVAE on MNIST (also a link to the current code here).

Monday, February 9, 2015

IFT6266 Week 4

I got the convolutional-deconvolutional VAE working as a standalone script now - training it on LFW to see the results. The code can be found here:
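For reference, the two pieces of generic VAE math that the convolutional encoder and deconvolutional decoder wrap around, the reparameterization trick and the Gaussian KL term, can be sketched in a few lines of NumPy. This is textbook VAE math, not the code from the linked script:

```python
import numpy as np

# Illustrative sketch of the VAE latent-space machinery (generic math,
# not the code from this post).
rng = np.random.RandomState(42)

mu = np.zeros((4, 8))        # encoder means: batch of 4, latent dim 8
log_var = np.zeros((4, 8))   # encoder log-variances

# Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I),
# so gradients can flow through the sampling step.
eps = rng.randn(*mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# KL(q(z|x) || N(0, I)), summed over latent dims, averaged over the batch
kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1 - log_var, axis=1).mean()
print(kl)  # 0.0 when the posterior is exactly a standard normal
```

The training loss is the reconstruction term from the decoder plus this KL term.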

I have also completed coding a convnet in pure Theano which heavily overfits the dogs and cats data. See here:

Current training stats:
Epoch 272
Train Accuracy  0.993350
Valid Accuracy  0.501600
Loss 0.002335

The architecture is:
load in data as color and resize all to 48x48
1000 epochs, batch size 128
SGD with 0.01 learning rate, no momentum

layer 1 - 10 filters, 3x3 kernel, 2x2 max pool, relu
layer 2 - 10 filters, 3x3 kernel, 1x1 max pool, relu
layer 3 - 10 filters, 3x3 kernel, 1x1 max pool, relu
layer 4 - fully connected 3610x100,  relu
layer 5 - softmax
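As a sanity check on the 3610 input size of layer 4, the feature-map sizes can be traced through the stack, assuming "valid" (no-padding) convolutions and non-overlapping max pooling:

```python
# Feature-map bookkeeping for the architecture above, assuming "valid"
# 3x3 convolutions and non-overlapping max pooling.
def conv_pool(size, kernel=3, pool=1):
    return (size - kernel + 1) // pool

size = 48                       # input images resized to 48x48
size = conv_pool(size, pool=2)  # layer 1: 46 -> 23 after the 2x2 pool
size = conv_pool(size, pool=1)  # layer 2: 21 (a 1x1 pool is a no-op)
size = conv_pool(size, pool=1)  # layer 3: 19

n_filters = 10
flat = n_filters * size * size
print(flat)  # 10 * 19 * 19 = 3610, the fan-in of the 3610x100 layer
```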

The next step is quite obviously to add dropout. With this much overfitting I am hopeful that this architecture can get me above 80%. Other things to potentially add include ZCA preprocessing, maxout instead of relu, network-in-network, inception layers, and more. Also considering bumping the default image size to 64x64, random subcrops, image flipping, and other preprocessing tricks.
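A minimal sketch of the dropout I plan to add, in the "inverted" variant that scales activations at train time so nothing needs to change at test time (NumPy stand-in, not the Theano code):

```python
import numpy as np

# Inverted dropout: at train time, zero each activation with probability p
# and scale the survivors by 1/(1-p); at test time, use the layer as-is.
def dropout(activations, p, rng):
    mask = rng.binomial(1, 1.0 - p, size=activations.shape)
    return activations * mask / (1.0 - p)

rng = np.random.RandomState(0)
h = np.ones((128, 100))              # a batch of fc-layer activations
h_drop = dropout(h, p=0.5, rng=rng)

# Each entry is either 0.0 (dropped) or 2.0 (kept, scaled by 1/0.5)
print(sorted(set(h_drop.ravel())))
```

In the convnet this would go after the relu of layer 4, with the mask resampled every minibatch.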

Once above 80%, I want to experiment with some of the "special sauce" from Dr. Ben Graham - fractional max pooling and spatially sparse convolution. His minibatch dropout also seems quite nice!

Sunday, February 1, 2015

IFT6266 Week 3

Alec Radford shared some very interesting results on LFW using a convolutional VAE, and I have been working to convert his code into something more generally usable, as his version depends on other local code from Indico.

This *probably* won't be the thing that gets above our 80% baseline, but it would be cool to get it working for another dataset. It may also be interesting for other projects since we know convolutional nets can work well for sound.

Friday, January 23, 2015

IFT6266 Week 2

Test Vincent Dumoulin's dataset and get a baseline with a simple convnet or fully connected net.

Monday, January 12, 2015

IFT6266 Week 1

Current plans:

GoogleNet (almost done)
DeepCNet/Deep Sparse Network (Benjamin Graham)
Deep Scattering Networks (Mallat)
Convolutional Kernel Networks (Mairal)
Disjunctive Normal Networks (this paper)

Friday, November 28, 2014

FreeDNS Cron Update

A Dynamic DNS Bash Script for FreeDNS
First, make sure you have registered an account for FreeDNS, and set up a subdomain as an A record.

Save the contents below as a shell script, making sure to change /path/to/log/dns.log, <YOUR_API_KEY>, and <YOUR_SUBDOMAIN>. You can get both your API key and the domain from the FreeDNS site. The domain should be one of the FreeDNS domains.

You can get the full update line (the https address and the full wget call, seen in the if statement below) from your FreeDNS dashboard, under Main Menu -> Dynamic DNS -> quick cron example, toward the bottom of the page.

#!/bin/bash
# FreeDNS updater script
LOG=/path/to/log/dns.log
SUBDOMAIN=<YOUR_SUBDOMAIN>
# Full update URL (includes <YOUR_API_KEY>), from the FreeDNS dashboard
UPDATEURL=<FULL_UPDATE_URL>
# An external "what is my IP" service (the link was stripped from this post);
# it should return a page containing "Current IP Address: x.x.x.x"
CHECKIP_URL=<IP_CHECK_SERVICE_URL>

echo "-----------------------" >> $LOG
echo "Running" >> $LOG
echo $(date) >> $LOG
# get the current ip...
CURRENT_IP=$(wget -q -O - $CHECKIP_URL | sed -e 's/.*Current IP Address: //' -e 's/<.*$//')
echo "Current IP:"$CURRENT_IP >> $LOG
# get the IP the subdomain currently resolves to
PREVIOUS_IP=$(nslookup $SUBDOMAIN | tail -n2 | grep A | sed s/[^0-9.]//g)
echo "Previous IP:"$PREVIOUS_IP >> $LOG
if [ "$CURRENT_IP" != "$PREVIOUS_IP" ]; then
        echo "Current and previous IP differ! Updating FreeDNS..." >> $LOG
        wget -q -O /dev/null $UPDATEURL
        echo "DNS updated on:"$(date) >> $LOG
fi
echo "-----------------------" >> $LOG

Now run sudo vim /etc/crontab, adding this line at the bottom,
replacing /path/to/script/ with the path where you saved the script,
and replacing <USER> with your username:
*/5 * * * * <USER> /path/to/script/