how to fix a blank login screen on a ubuntu laptop with nvidia drivers

I’m running Ubuntu on my Lenovo Y50 laptop, which has an Nvidia GPU. Every time I do an update (or maybe it’s a restart?), I see the Ubuntu logo, hear the login chime, and then see a blank black screen, or a small white dot in the upper corner.

Other times, after a reboot, I get to the login screen, enter my username and password, then everything flickers violently and it loops back, asking me to enter my info again.

Today this post is not about how to permanently fix this (although that would be nice), but rather how to get your GUI back (until you update/restart your machine again).

It seems that on some laptops, the Nvidia drivers and Ubuntu do not always play together nicely. Why? I am not sure.

But anyways, here’s how to fix your laptop when Ubuntu shows a black screen on login (assuming your problem is related to the Nvidia drivers).
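The full walkthrough is in the post itself, but the general shape of the recovery looks something like this (a sketch only — the driver version is an assumption; use whatever dpkg actually lists on your machine):

```shell
# Switch to a text console with Ctrl+Alt+F2 and log in, then:

# See which Nvidia driver packages are installed
dpkg -l | grep -i nvidia

# Reinstall the driver that the update broke
# (replace nvidia-driver-535 with the version dpkg listed)
sudo apt install --reinstall nvidia-driver-535

# Restart the display manager to bring the login screen back
sudo systemctl restart gdm
```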
Continue reading “how to fix a blank login screen on a ubuntu laptop with nvidia drivers”

multi-resolution-tract CNN with hybrid pretrained and skin-lesion trained layers

Our paper entitled: “Multi-resolution-Tract CNN with Hybrid Pretrained and Skin-Lesion Trained Layers” was accepted and presented as an oral talk in the Machine Learning in Medical Imaging (MLMI) Workshop (part of the MICCAI conference).

In this work, we used a convolutional neural network (CNN) to classify 10 different types of skin lesions, including melanoma and non-melanoma types.

The key technical contribution was to use multiple tracts (or paths) within the neural network, to train (and test) the network on an image using multiple image resolutions simultaneously. Additionally, we extended a CNN pretrained on a single image resolution to work for multiple image resolutions.
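The multi-tract idea can be sketched in a toy way like this (my own illustration, not the paper’s architecture — the average-pool resizer and the random linear-plus-ReLU “tract” are stand-ins for real CNN layers):

```python
import numpy as np

def downsample(img, factor):
    """Average-pool an image by an integer factor (toy resizer)."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor] \
        .reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def tract_features(img, n_features, seed):
    """Stand-in for one CNN tract: a fixed random linear map plus ReLU."""
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((n_features, img.size))
    return np.maximum(weights @ img.ravel(), 0)

img = np.arange(64 * 64, dtype=float).reshape(64, 64)

# Each tract sees the same image at a different resolution...
full_res = tract_features(img, 8, seed=0)
half_res = tract_features(downsample(img, 2), 8, seed=1)

# ...and their feature vectors are concatenated before classification.
combined = np.concatenate([full_res, half_res])
print(combined.shape)  # (16,)
```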

Here are our slides presented at MLMI (thanks Aïcha!) showing our deep learning approach to classify skin disease:
Continue reading “multi-resolution-tract CNN with hybrid pretrained and skin-lesion trained layers”

(deep convolutional) generative adversarial nets – slides

There’s this really neat new idea on how to train neural networks that recently came out, known as generative adversarial nets (GANs).

The basic idea of a GAN is to train two networks to compete with each other (hence the name “adversarial“). One network (called the generator) creates images that are meant to look just like real images. The other network (called the discriminator) distinguishes between real images and the images the generator produces.

Thus the two networks compete with each other, where the generator generates images to fool the discriminator, and the discriminator discriminates between the generator’s images and real images.
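The adversarial loop shrinks down to very few lines in one dimension. Here’s a toy sketch (my own, not from the slides): the “real data” are samples from N(4, 0.5), the generator is a linear map of noise, and the discriminator is a logistic unit, with both updated by hand-derived gradient ascent on their competing objectives.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda u: 1.0 / (1.0 + np.exp(-u))

a, b_g = 1.0, 0.0   # generator:     G(z) = a*z + b_g
w, b_d = 0.1, 0.0   # discriminator: D(x) = sigmoid(w*x + b_d)
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(4.0, 0.5, batch)   # samples of "real" data
    z = rng.normal(0.0, 1.0, batch)      # noise input
    fake = a * z + b_g                   # generator samples

    # Discriminator step: push D(real) up and D(fake) down
    d_real = sigmoid(w * real + b_d)
    d_fake = sigmoid(w * fake + b_d)
    grad_w = np.mean((1 - d_real) * real) + np.mean(-d_fake * fake)
    grad_b = np.mean(1 - d_real) + np.mean(-d_fake)
    w += lr * grad_w
    b_d += lr * grad_b

    # Generator step: push D(fake) up (non-saturating loss)
    d_fake = sigmoid(w * fake + b_d)
    grad_out = (1 - d_fake) * w          # d log D(fake) / d fake
    a += lr * np.mean(grad_out * z)
    b_g += lr * np.mean(grad_out)

# After training, the generator's mean output should sit near the real mean of 4
print(round(b_g, 1))
```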

Continue reading “(deep convolutional) generative adversarial nets – slides”

Using numpy on google app engine with the anaconda python distribution

The setup:
– you are using the Google App Engine (GAE) development server with Python
– you installed the Anaconda Python distribution
– you want to use the Numpy library with GAE

On Ubuntu and on Mac (but not Windows for some reason), you get this error when trying to deploy:
google app engine ImportError: No module named _ctypes

The tl;dr solution
Create an Anaconda environment using numpy 1.6 and python 2.7:

conda create -n np16py27 anaconda numpy=1.6 python=2.7

Load this specific environment from the command line:

source activate np16py27

Run your GAE dev server on your project:

dev_appserver.py my_gae_project
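Before launching the dev server, you can sanity-check that the environment actually took (a quick check of my own, not part of the original steps) — with np16py27 active this should report Python 2.7.x and numpy 1.6.x:

```python
import sys
import numpy

print(sys.version.split()[0])   # e.g. 2.7.x inside the np16py27 environment
print(numpy.__version__)        # e.g. 1.6.x inside the np16py27 environment
```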

That’s it! You can read more details below if you are interested.
Continue reading “Using numpy on google app engine with the anaconda python distribution”

how to compute true/false positives and true/false negatives in python for binary classification problems

Here’s how to compute true positives, false positives, true negatives, and false negatives in Python using the Numpy library.

Note that we are assuming a binary classification problem here. That is, a value of 1 indicates the positive class, and a value of 0 indicates the negative class. For multi-class problems, this doesn’t really hold.
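As a sketch of the kind of computation the post walks through (the example arrays here are my own), Numpy’s element-wise comparisons plus a sum over the resulting booleans do all the work:

```python
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # ground-truth labels
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 1])  # model predictions

# Element-wise comparisons, then count each kind of (mis)match
tp = np.sum((y_true == 1) & (y_pred == 1))  # predicted 1, actually 1
fp = np.sum((y_true == 0) & (y_pred == 1))  # predicted 1, actually 0
tn = np.sum((y_true == 0) & (y_pred == 0))  # predicted 0, actually 0
fn = np.sum((y_true == 1) & (y_pred == 0))  # predicted 0, actually 1

print(tp, fp, tn, fn)  # 3 2 2 1
```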

Continue reading “how to compute true/false positives and true/false negatives in python for binary classification problems”

Deep Dreams and a Neural Algorithm of Artistic Style – slides and explanations

Perhaps you saw an earlier post I wrote about deep dreaming Prague pictures, and you said to your self, “self, I wish I knew more about the techniques to make those crazy looking pictures.”

Well you are in luck since I’ve now posted the slides where I attempted to explain these two works to our reading group: 1) Google’s DeepDream, and 2) A Neural Algorithm of Artistic Style.

Continue reading “Deep Dreams and a Neural Algorithm of Artistic Style – slides and explanations”

Napoleon: A Life – by Andrew Roberts – audiobook review and notes

I recently finished the audiobook, “Napoleon: A Life” by Andrew Roberts. As you may expect from the title, this hefty audiobook (nearly 33 hours) gives great detail on the life of Napoleon Bonaparte.

What struck me most about the life of Napoleon was his ability to cope – no, more than cope, to thrive – in such an environment of confrontation. Napoleon waged wars against the strongest powers in Europe, and even in the midst of massive confrontations, he wrote obsessive letters to people on seemingly trivial topics, detailing their marriages and affairs. Even when opposed by such great forces, he did not shut down; he did not seek escapism, except perhaps for brief periods with his mistresses.

Continue reading “Napoleon: A Life – by Andrew Roberts – audiobook review and notes”

The Signal and the Noise (book/audiobook review, summary and notes)

During an odd phase of fascination with American politics (I’m Canadian), I stumbled across Nate Silver’s website’s (FiveThirtyEight’s) political coverage of the 2016 Idaho primary. Their cold, analytical coverage of the election appealed to me. It turns out Nate also wrote a book about prediction, which, luckily for me, is also in audiobook format.

The core idea
The takeaway from this book is essentially this: prediction is really hard and most people (and machines) suck at it (except for weather forecasters). More concerningly (is that a word?), most people don’t even know that they suck at it. Oh, and you should use Bayesian statistics to give probabilistic estimates and update your probabilities when you get new information.

This book encouraged me to take a hard look at my own predictions. Do I suck at them? Do I actually understand Bayes theorem?
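Bayes’ theorem itself fits in a few lines. Here’s a worked example with my own numbers (not from the book): a test that is 90% sensitive and 95% specific for a condition with 1% prevalence.

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
p_h = 0.01              # prior: 1% of people have the condition
p_e_given_h = 0.90      # sensitivity: P(positive test | condition)
p_e_given_not_h = 0.05  # false positive rate: P(positive | no condition)

# Total probability of a positive test (law of total probability)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: probability of the condition given a positive test
posterior = p_e_given_h * p_h / p_e
print(round(posterior, 3))  # 0.154
```

Even with a positive result from a fairly accurate test, the posterior is only about 15% — the low prior dominates, which is exactly the kind of update most people fail to make.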

Continue reading “The Signal and the Noise (book/audiobook review, summary and notes)”

What is the derivative of ReLU?