Python on OranLooney.com
http://www.oranlooney.com/tags/python/
Recent content in Python on OranLooney.com
Hugo  gohugo.io
en
© Copyright {year} Oran Looney
Fri, 01 Mar 2019 00:00:00 +0000

ML From Scratch, Part 4: Decision Trees
http://www.oranlooney.com/post/mlfromscratchpart4decisiontree/
Fri, 01 Mar 2019 00:00:00 +0000
http://www.oranlooney.com/post/mlfromscratchpart4decisiontree/
So far in this series we’ve followed one particular thread: linear regression → logistic regression → neural network. This is a very natural progression of ideas, but it really represents only one possible approach. Today we’ll switch gears and look at a model with a completely different pedigree: the decision tree, sometimes also referred to as Classification and Regression Trees, or simply CART models. In contrast to the earlier progression, decision trees are designed from the start to represent nonlinear features and interactions.

A Fairly Fast Fibonacci Function
http://www.oranlooney.com/post/fibonacci/
Tue, 19 Feb 2019 00:00:00 +0000
http://www.oranlooney.com/post/fibonacci/
A common example of recursion is the function to calculate the \(n\)th Fibonacci number:
def naive_fib(n):
    if n < 2:
        return n
    else:
        return naive_fib(n - 1) + naive_fib(n - 2)
This follows the mathematical definition very closely but its performance is terrible: roughly \(\mathcal{O}(2^n)\). This is commonly patched up with dynamic programming. Specifically, either memoization:
from functools import lru_cache

@lru_cache(100)
def memoized_fib(n):
    if n < 2:
        return n
    else:
        return memoized_fib(n - 1) + memoized_fib(n - 2)
or tabulation:
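(The tabulated version is cut off in this excerpt. As a sketch of what tabulation usually looks like here, not necessarily the article's own code, the sequence can be built bottom-up while keeping only the last two values, giving \(\mathcal{O}(n)\) time and \(\mathcal{O}(1)\) space:)

```python
def tabulated_fib(n):
    # Build the sequence bottom-up; a and b hold F(i) and F(i+1),
    # so no recursion or cache is needed.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```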

ML From Scratch, Part 3: Backpropagation
http://www.oranlooney.com/post/mlfromscratchpart3backpropagation/
Sun, 03 Feb 2019 00:00:00 +0000
http://www.oranlooney.com/post/mlfromscratchpart3backpropagation/
In today’s installment of Machine Learning From Scratch we’ll build on the logistic regression from last time to create a classifier which is able to automatically represent nonlinear relationships and interactions between features: the neural network. In particular, I want to focus on one central algorithm which allows us to apply gradient descent to deep neural networks: the backpropagation algorithm. The history of this algorithm appears to be somewhat complex (as you can hear from Yann LeCun himself in this 2018 interview) but luckily for us the algorithm in its modern form is not difficult, although it does require a solid handle on linear algebra and calculus.

ML From Scratch, Part 2: Logistic Regression
http://www.oranlooney.com/post/mlfromscratchpart2logisticregression/
Thu, 27 Dec 2018 00:00:00 +0000
http://www.oranlooney.com/post/mlfromscratchpart2logisticregression/
In this second installment of the machine learning from scratch series we switch the point of view from regression to classification: instead of estimating a number, we will be trying to guess which of two possible classes a given input belongs to. A modern example is looking at a photo and deciding if it’s a cat or a dog.
In practice, it’s extremely common to need to decide between \(k\) classes where \(k > 2\), but in this article we’ll limit ourselves to just two classes (the so-called binary classification problem) because generalizations to many classes are usually both tedious and straightforward.

ML From Scratch, Part 1: Linear Regression
http://www.oranlooney.com/post/mlfromscratchpart1linearregression/
Thu, 29 Nov 2018 00:00:00 +0000
http://www.oranlooney.com/post/mlfromscratchpart1linearregression/
To kick off this series, we will start with something simple yet foundational: linear regression via ordinary least squares.
While not exciting, linear regression finds widespread use both as a standalone learning algorithm and as a building block in more advanced learning algorithms. For example, the output layer of a deep neural network trained for regression with MSE loss, simple AR time series models, and the “local regression” part of LOWESS smoothing are all (modified) examples of linear regression.
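(As a concrete sketch of ordinary least squares, not the article's own code: for a single feature, the closed-form solution minimizing the sum of squared residuals can be written in a few lines of plain Python. The function name `ols_fit` is just an illustrative choice.)

```python
def ols_fit(xs, ys):
    # Closed-form simple linear regression: the slope is the sample
    # covariance of x and y divided by the sample variance of x, and
    # the fitted line always passes through the point of means.
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept, slope

# ols_fit([0, 1, 2], [1.0, 3.0, 5.0]) recovers intercept 1, slope 2
```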

ML From Scratch, Part 0: Introduction
http://www.oranlooney.com/post/mlfromscratchpart0introduction/
Sun, 11 Nov 2018 00:00:00 +0000
http://www.oranlooney.com/post/mlfromscratchpart0introduction/
Motivation
“As an apprentice, every new magician must prove to his own satisfaction, at least once, that there is truly great power in magic.” - The Flying Sorcerers, by David Gerrold and Larry Niven
How do you know if you really understand something? You could just rely on the subjective experience of feeling like you understand. This sounds plausible: surely you, of all people, should know, right? But this runs headfirst into the Dunning-Kruger effect.

Craps Variants
http://www.oranlooney.com/post/crapsgamevariants/
Wed, 11 Jul 2018 00:00:00 +0000
http://www.oranlooney.com/post/crapsgamevariants/
Craps is a surprisingly fair game. I remember calculating the probability of winning craps for the first time in an undergraduate discrete math class: I went back through my calculations several times, certain there was a mistake somewhere. How could it be closer than $\frac{1}{36}$?
(Spoiler Warning: If you haven’t calculated these odds for yourself, then you may want to do so before reading further. I’m about to spoil it for you rather thoroughly in the name of exploring a more general case.)
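(One quick way to check a hand calculation like this, not the article's own method, is a Monte Carlo simulation of the pass-line bet under standard craps rules:)

```python
import random

def play_craps(rng):
    # One round of the pass-line bet: 7 or 11 wins immediately,
    # 2, 3, or 12 loses immediately; any other total becomes the
    # "point", which must be rolled again before a 7 to win.
    roll = rng.randint(1, 6) + rng.randint(1, 6)
    if roll in (7, 11):
        return True
    if roll in (2, 3, 12):
        return False
    point = roll
    while True:
        roll = rng.randint(1, 6) + rng.randint(1, 6)
        if roll == point:
            return True
        if roll == 7:
            return False

rng = random.Random(0)
trials = 100_000
wins = sum(play_craps(rng) for _ in range(trials))
estimate = wins / trials  # the exact value is 244/495, about 0.4929
```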

Semantic Code
http://www.oranlooney.com/post/semanticcode/
Wed, 30 Apr 2008 00:00:00 +0000
http://www.oranlooney.com/post/semanticcode/
semantic (si-man’tik) adj. 1. Of or relating to meaning, especially meaning in language.
Programming destroys meaning. When we program, we first replace concepts with symbols and then replace those symbols with arbitrary codes — that’s why it’s called coding.
At its worst, programming is write-only: the program accomplishes a task, but is incomprehensible to humans. See, for example, the story of Mel. Such a program is correct, yet at the same time meaningless.