Archive | Tutorials

Installing Keras with TensorFlow backend

A few months ago I demonstrated how to install the Keras deep learning library with a Theano backend. In today’s blog post I provide detailed, step-by-step instructions to install Keras with a TensorFlow backend (TensorFlow was originally developed by the researchers and engineers on the Google Brain Team). I’ll also (optionally) demonstrate how you can integrate OpenCV into […]
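As a quick sanity check once both packages are installed (a minimal sketch, not the post's exact steps, assuming Keras and TensorFlow were installed with pip into the active Python environment):

```python
# Minimal sanity check, assuming Keras and TensorFlow are already installed.
import keras

# Keras reports which backend it is configured to use (via ~/.keras/keras.json);
# with a TensorFlow backend this should print "tensorflow".
print(keras.backend.backend())
```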


Ubuntu 16.04: How to install OpenCV

Over the past two years running the PyImageSearch blog, I’ve authored two tutorials detailing the required steps to install OpenCV (with Python bindings) on Ubuntu. You can find the two tutorials here: Install OpenCV 3.0 and Python 2.7+ on Ubuntu 14.04, and Install OpenCV 3.0 and Python 3.4+ on Ubuntu 14.04. However, with support of Ubuntu […]
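After any OpenCV compile-and-install, a quick generic check (not specific to this post) is to import the Python bindings and print the version:

```python
# Verify the OpenCV Python bindings are importable, assuming the install
# finished and the relevant virtual environment (if any) is active.
import cv2

# For an OpenCV 3 build this should print something like "3.x.x".
print(cv2.__version__)
```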


Stochastic Gradient Descent (SGD) with Python

In last week’s blog post, we discussed gradient descent, a first-order optimization algorithm that can be used to learn a set of classifier coefficients for parameterized learning. However, the “vanilla” implementation of gradient descent can be prohibitively slow to run on large datasets — in fact, it can even be considered computationally wasteful. Instead, we should apply Stochastic […]
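To make the idea concrete, here is a minimal mini-batch SGD sketch for a sigmoid-based classifier; the function names, batch size, and learning rate are illustrative assumptions, not the code from the full post:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def next_batch(X, y, batch_size):
    # Yield successive mini-batches of the training data.
    for i in range(0, X.shape[0], batch_size):
        yield X[i:i + batch_size], y[i:i + batch_size]

def sgd(X, y, alpha=0.01, epochs=100, batch_size=32):
    # X: (n_samples, n_features) design matrix (bias column assumed appended)
    # y: (n_samples, 1) binary labels
    W = np.random.randn(X.shape[1], 1)
    for epoch in range(epochs):
        for batch_X, batch_y in next_batch(X, y, batch_size):
            preds = sigmoid(batch_X.dot(W))
            error = preds - batch_y
            gradient = batch_X.T.dot(error) / batch_X.shape[0]
            W += -alpha * gradient  # step against the gradient on this batch only
    return W
```

Because each update uses only a small batch rather than the entire dataset, the weights are updated far more frequently per pass over the data, which is the whole point of the stochastic variant.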


Bubble sheet multiple choice scanner and test grader using OMR, Python and OpenCV

Over the past few months I’ve gotten quite the number of requests landing in my inbox to build a bubble sheet/Scantron-like test reader using computer vision and image processing techniques. And while I’ve been having a lot of fun doing this series on machine learning and deep learning, I’d be lying if I said this little […]


Understanding regularization for image classification and machine learning

In previous tutorials, I’ve discussed two important loss functions: Multi-class SVM loss and cross-entropy loss (which we usually refer to in conjunction with Softmax classifiers). In order to keep our discussions of these loss functions straightforward, I purposely left out an important component: regularization. While our loss function allows us to determine how well (or poorly) our […]
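As a small illustration of the idea (using illustrative names, not the post's exact code), an L2 penalty simply adds the sum of squared weights, scaled by a regularization strength, to the data loss:

```python
import numpy as np

def regularized_loss(data_loss, W, lam=0.01):
    # L2 ("weight decay") penalty: discourage large weights by adding
    # lam * sum(W^2) to whatever data loss (hinge, cross-entropy, ...) we use.
    return data_loss + lam * np.sum(W ** 2)

# Example: a made-up data loss of 1.25 with a small random weight matrix.
W = np.random.randn(3, 4)
print(regularized_loss(1.25, W, lam=0.1))
```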


Softmax Classifiers Explained

Last week, we discussed Multi-class SVM loss; specifically, the hinge loss and squared hinge loss functions. A loss function, in the context of Machine Learning and Deep Learning, allows us to quantify how “good” or “bad” a given classification function (also called a “scoring function”) is at correctly classifying data points in our dataset. However, […]
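For reference, here is a minimal NumPy sketch of the softmax function and its cross-entropy loss for a single example (the score values below are made up for illustration):

```python
import numpy as np

def softmax(scores):
    # Shift by the max score for numerical stability before exponentiating.
    exp_scores = np.exp(scores - np.max(scores))
    return exp_scores / np.sum(exp_scores)

def cross_entropy_loss(scores, true_class):
    # Negative log of the probability assigned to the correct class.
    probs = softmax(scores)
    return -np.log(probs[true_class])

# Example: three raw class scores, where index 0 is the correct label.
print(cross_entropy_loss(np.array([3.2, 5.1, -1.7]), 0))
```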
