I find there are a lot of tutorials and toy examples on convolutional neural networks - so many ways to skin an MNIST cat! - but not so many on other types of scenarios. So I've decided to put together a quick sample notebook on regression using the bike-share dataset. After learning the basics of neural... Continue Reading →
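In the spirit of that notebook, here is a minimal sketch of neural-network regression. It uses scikit-learn's `MLPRegressor` on synthetic data standing in for the bike-share features (the feature names and coefficients here are made up for illustration, not taken from the actual dataset):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for bike-share features (say: temperature, humidity, windspeed)
rng = np.random.default_rng(42)
X = rng.uniform(size=(500, 3))
# Hypothetical target: ridership loosely driven by the first two features, plus noise
y = 200 * X[:, 0] - 50 * X[:, 1] + rng.normal(scale=5, size=500)

# Scaling the inputs matters for neural nets far more than for linear models
X_scaled = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_scaled, y)
print(model.score(X_scaled, y))  # R^2 on the training data
```

The real notebook uses PyTorch; the sklearn version above just shows the shape of the problem: continuous target, a few features, one small hidden layer.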

# Data structures for deep learning

I recently completed the Udacity Deep Learning Nanodegree (highly worth doing by the way), which focuses on implementing a variety of deep learning architectures using PyTorch. At the outset, it's pretty fundamental to understand the data structures you'll be encountering as inputs to and outputs from your neural network architecture. What I noticed was that... Continue Reading →
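The single most useful habit from that course was checking tensor shapes at every step. A minimal sketch of the shapes you meet constantly (the batch size and image dimensions below are arbitrary examples):

```python
import torch

# A batch of 64 RGB images, 28x28: the standard (N, C, H, W) layout
images = torch.randn(64, 3, 28, 28)
print(images.shape)  # torch.Size([64, 3, 28, 28])

# Flatten each image to a vector before a fully-connected layer
flat = images.view(64, -1)
print(flat.shape)    # torch.Size([64, 2352])

# A single image needs a batch dimension added back before inference
one = images[0]
batched = one.unsqueeze(0)
print(batched.shape)  # torch.Size([1, 3, 28, 28])
```

Most shape errors in a new architecture come down to one of these three transitions.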

# How to – KMeans clustering

Clustering is a type of unsupervised learning. We humans would think of it as 'categorisation', perhaps. For example, if I gave you a bag of red, blue and white balls and asked you to sort them (without telling you how), you would probably naturally gravitate towards sorting them by colour, as this would be the... Continue Reading →
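The ball-sorting analogy translates directly into code. Here is a minimal sketch using scikit-learn's `KMeans` on three synthetic blobs of 2D points, standing in for the red, blue and white balls:

```python
import numpy as np
from sklearn.cluster import KMeans

# Three tight blobs of points -- our bag of red, blue and white balls
rng = np.random.default_rng(0)
points = np.vstack([
    rng.normal(loc=(0, 0), scale=0.3, size=(30, 2)),
    rng.normal(loc=(5, 5), scale=0.3, size=(30, 2)),
    rng.normal(loc=(0, 5), scale=0.3, size=(30, 2)),
])

# We tell KMeans how many groups to look for, but never what the groups are
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
print(len(set(km.labels_)))  # 3 -- one label per blob
```

Note that we never labelled a single point: the algorithm recovers the grouping from the geometry alone, which is exactly what "unsupervised" means.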

# How to – Principal Component Analysis

I always like to understand concepts well before I use them (which is good because it's the right thing to do, but bad because it slows me down a lot!), so it was with great excitement that I came across Matt Brems' article A One-Stop Shop for Principal Component Analysis recently. If you read this... Continue Reading →
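To make the idea concrete, here is a minimal sketch of PCA on synthetic 2D data where almost all the variance lies along a single direction (the data is fabricated purely to illustrate the effect):

```python
import numpy as np
from sklearn.decomposition import PCA

# Two strongly correlated features: most variance lies along one line
rng = np.random.default_rng(1)
t = rng.normal(size=200)
X = np.column_stack([t, 2 * t + rng.normal(scale=0.1, size=200)])

pca = PCA(n_components=2).fit(X)
# The first component should carry nearly all of the variance
print(pca.explained_variance_ratio_)
```

When the first ratio is close to 1, you can drop the second component and keep one feature instead of two with almost no loss of information, which is the whole point of PCA.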

# If you buy one book in 2020…

...make it Grokking Deep Learning by Andrew Trask! This gem of a book breaks deep learning down to its smallest component parts and then builds up your understanding from there. It's the equivalent of stripping your car down to nuts and bolts and then re-building it: at the end, you will know to a certainty... Continue Reading →

# Finding relationships between words

This week I did something a bit different and rather fun! My colleague Carel phoned to say he was bringing his 11-year-old daughter, Lisa-Marie, to work the next day, and did I have anything interesting to share with her about the world of data science? As it happens I've spent the past couple of... Continue Reading →

I've just discovered the awesome Brandon Rohrer and his blog while trying to find an intelligible article on Bayesian inference. What a goldmine - this guy is a born educator! Thank you for sharing your knowledge - it is well-appreciated!

# Getting results vs Understanding

Alexander Pope is famously quoted as saying: "A little learning is a dangerous thing; drink deep, or taste not the Pierian spring: there shallow draughts intoxicate the brain, and drinking largely sobers us again." I've been thinking about these words the past few days as I worked on my latest challenge: a text classifier using... Continue Reading →

# Multivariate regression

So: with linear regression (aka simple linear regression) we have one feature which we are using to predict a dependent value (for example number of rooms as a predictor of house price). With multivariate regression (aka multiple linear regression) we simply have multiple features which could be used to predict that dependent value (for example... Continue Reading →
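A minimal sketch of the difference in code, using scikit-learn's `LinearRegression` on synthetic housing-style data (the three features and their coefficients are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical features: number of rooms, age of house, distance to centre
rng = np.random.default_rng(7)
X = rng.uniform(size=(100, 3))
# Made-up price: depends on all three features, plus a little noise
price = 50 * X[:, 0] - 10 * X[:, 1] - 20 * X[:, 2] + rng.normal(scale=1, size=100)

model = LinearRegression().fit(X, price)
print(model.coef_)           # roughly recovers [50, -10, -20]
print(model.score(X, price))  # R^2 close to 1 on this clean data
```

Simple linear regression is just the special case where `X` has a single column; the fitting call is identical.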

# Polynomial regression

Polynomial regression is considered a special case of linear regression in which higher-order powers (x², x³, etc.) of an independent variable are included. It's appropriate where your data may best be fitted to some sort of curve rather than a simple straight line. The polynomial module of numpy is easily used to explore fitting the best... Continue Reading →
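A minimal sketch with numpy's polynomial module, fitting a quadratic to synthetic curved data (the coefficients 1, 2, 3 are made up for the example):

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Quadratic data with a little noise: y = 1 + 2x + 3x^2 + noise
rng = np.random.default_rng(3)
x = np.linspace(-3, 3, 50)
y = 1 + 2 * x + 3 * x**2 + rng.normal(scale=0.5, size=50)

# Fit a degree-2 polynomial; coefficients come back lowest order first
coeffs = P.polyfit(x, y, 2)
fitted = P.polyval(x, coeffs)
print(coeffs)  # close to [1, 2, 3]
```

This is still *linear* regression under the hood: the model is linear in the coefficients, and x² is simply treated as an extra feature.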