Polynomial regression is considered a special case of linear regression in which higher-order powers (x², x³, etc.) of an independent variable are included as predictors. It's appropriate when your data is better fitted by some sort of curve than by a simple straight line.
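For concreteness, a degree-n polynomial model looks like this (I'm using β for the fitted coefficients and ε for the noise term):

y = β₀ + β₁x + β₂x² + … + βₙxⁿ + ε

It still counts as linear regression because the model is linear in the coefficients β, even though the fitted curve is anything but a straight line in x.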
The polynomial fitting routines in numpy make it easy to explore which curve best fits your data – as usual, I got fixated on understanding how this actually works behind the scenes before I could get on with life (!), so I offer my notes on what is actually happening and how to work with polyfit() and polyval() in practice: How it works – Polynomial Regression.
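Here's a minimal sketch of the basic workflow – fit with polyfit(), evaluate with polyval(). The sample data is made up purely for illustration:

```python
import numpy as np

# Made-up sample data with a roughly quadratic trend plus noise
rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
y = 0.5 * x**2 - 2.0 * x + 3.0 + rng.normal(scale=2.0, size=x.size)

# Fit a degree-2 polynomial; coefficients are returned highest power first
coeffs = np.polyfit(x, y, deg=2)

# Evaluate the fitted polynomial at the original x values (or any new points)
y_fit = np.polyval(coeffs, x)

print("fitted coefficients (highest power first):", coeffs)
```

Trying a few different values of deg and comparing the fitted curves against a scatter plot of the data is the quickest way to get a feel for which degree is actually appropriate.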
As a side note, I tried computing r-squared values for my data and came up with strange results – I found this article on Why Is There No R-Squared for Nonlinear Regression? I'm not 100% sure yet, but I suspect it's pertinent to polynomial regression as well, even though polynomial regression is technically a type of linear regression… any thoughts on this would be welcome if you have them :).
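For what it's worth, this is how I'd compute R² by hand from the residuals of a fit, in case anyone wants to compare notes (the function name and variables are just for illustration, reusing the x, y and coeffs from the sketch above):

```python
import numpy as np

def r_squared(y, y_fit):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - y_fit) ** 2)        # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

# e.g. using the fit from the earlier example:
# r2 = r_squared(y, np.polyval(coeffs, x))
# print("R^2:", r2)
```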