
StOp 1.4: Simulated Annealing in Python, Part 3: Allocating Groups by AI

I quite enjoy discussions. However, talking only to people within my friend circle can sometimes become boring. After all, I tend to attract similar perspectives, and I'd like to occasionally challenge my own outlook and see things from another vantage point. Events with discussion groups are a great place to do this. But one thing that tends to be a challenge with such events is deciding which group to allocate to whom. Let's say there are 5 discussion groups, and the Google Form you have created allows a prospective participant to select a first choice and a second choice. How do you decide which group to allocate to each participant? And this is a more general problem than the obscure idea of a discussion event: it could apply to a virtual conference of educational seminars, or even to class allocations. The most interesting part about AI is its ability to express and solve seemingly subjective problems. Yes, I do mean that Simulated Annealing can solve this…
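As a taste of where the post heads, here is a minimal simulated-annealing sketch for this allocation problem. The participant preferences, group capacity, and penalty weights below are all made-up placeholders, not the notebook's actual code:

```python
import math
import random

# Hypothetical data: each participant lists a first and second choice among 5 groups.
prefs = {"Asha": (0, 2), "Ben": (2, 1), "Chloe": (0, 1), "Dev": (3, 0), "Ema": (2, 4)}
names = list(prefs)
n_groups = 5
capacity = 1  # assumed per-group cap, just for this toy example

def cost(assignment):
    """Penalize unmet preferences and overfull groups (weights are arbitrary)."""
    total = 0
    counts = [0] * n_groups
    for name, group in zip(names, assignment):
        first, second = prefs[name]
        total += 0 if group == first else 1 if group == second else 5
        counts[group] += 1
    total += 10 * sum(max(0, c - capacity) for c in counts)
    return total

def anneal(steps=20000, temp=5.0, cooling=0.9995):
    current = [random.randrange(n_groups) for _ in names]
    cur_cost = cost(current)
    best, best_cost = current[:], cur_cost
    for _ in range(steps):
        # Neighbour: move one random participant to a random group.
        candidate = current[:]
        candidate[random.randrange(len(names))] = random.randrange(n_groups)
        cand_cost = cost(candidate)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cand_cost <= cur_cost or random.random() < math.exp((cur_cost - cand_cost) / temp):
            current, cur_cost = candidate, cand_cost
            if cur_cost < best_cost:
                best, best_cost = current[:], cur_cost
        temp *= cooling
    return dict(zip(names, best)), best_cost

print(anneal())
```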
Recent posts

StOp 1.3: Simulated Annealing in Python, Part 2: Sorting by Searching

I've written all the instructions and code into another Python Notebook. This will be a more non-traditional application of simulated annealing. We'll implement one more application, and then move on to the next algorithm. Below is a viewer for the notebook. From there, you can click Open with Google Colab, log in to your Google Account, and you will be able to edit your own copy of the notebook. If you do this, ignore the request in the notebook to make a copy before editing.

StOp 1.2: Simulated Annealing in Python, Part 1: Function Minimization

I've written all the instructions and code into a Python Notebook. Below is a viewer for the notebook. From there, you can click Open with Google Colab, log in to your Google Account, and you will be able to edit your own copy of the notebook. If you do this, ignore the request in the notebook to make a copy before editing.

StOp 1.1: Anvils, Annealing and Algorithms

Introduction: Now that the strange title has attracted you to the article: StOp stands for Stochastic Optimization, and this is the first episode in our mini-series. I've been mulling over this article for months now, which is kind of absurd considering that this is meant to be a quick series, so I apologize for my online dormancy. In the meantime, I was working on writing content for a course on Machine Learning. If you're still in school (not college) and you want to learn more, check out: https://code-4-tomorrow.thinkific.com/courses/machine-learning At any rate, let's get started. Exploration and Exploitation: In some ways, the more of this you read about, the more you begin to think of the world as an array of optimization processes - from the bargain you settle on with the grocer to the conversation you had before you sold your company. But an unfortunate side-effect of this kind of outlook is that you often become a visibly more selfish person. You spend more time exp…

Stochastic Optimization: NEW MINISERIES!

This is part of my new miniseries on Stochastic Optimization. While this is not taught in a lot of Machine Learning courses, it's an interesting perspective, applicable in an incredible number of fields. Nevertheless, this won't be a very long series, and when we exit it, it'll be time to dive straight into our first Machine Learning algorithm! Introduction to Optimization: Ok, so what is Optimization? As the name may suggest, Optimization is about finding the optimal configuration of a particular system. Of course, in the real world, the important question in any such process is: optimal in what sense? i.e., by what criteria do you intend to optimize the system? We will not delve too much into that just yet, but I promise it will bring about a very strong connection to ML. Introduction to Stochastic Optimization: So far, as part of our blogposts, we have discussed Gradient Descent and the Normal Equation Method. These are both Optimization algorithms, but they di…
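To make "optimal by what criterion" concrete, here is a tiny illustration (mine, not the post's): a random-search loop, arguably the simplest stochastic optimizer, minimizing a one-variable cost whose target value is made up:

```python
import random

def cost(x):
    # The criterion we optimize by: squared distance from 3 (a made-up target).
    return (x - 3) ** 2

best_x, best_cost = 0.0, cost(0.0)
for _ in range(10000):
    # Stochastic step: propose a random perturbation of the current best.
    candidate = best_x + random.uniform(-1, 1)
    if cost(candidate) < best_cost:
        best_x, best_cost = candidate, cost(candidate)

print(round(best_x, 3))  # approaches 3
```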

Normal Equation Method: A Quick Overview

Here's a cat - I didn't get a chance to work one into the post. In the previous post, we were discussing Gradient Descent and why it has so many connections to Phase Spaces (check that out over here, if you haven't yet). There, I dropped a hint about another Optimization Method we could use. Did you catch it? When we discussed some of the calculus involved, I mentioned that the derivative or slope of a function at a local minimum or maximum is equal to 0. As it turns out, you can simply set the derivative equal to 0 and solve for the required parameters. This is sometimes referred to as the Normal Equation Method. I will talk about a case-specific simplification here. This method is called Least Squares, and it minimizes the following error function, often known as Mean Squared Error, for reasons that will be apparent in a moment: MSE = (1/m) Σᵢ (ŷᵢ − yᵢ)². This formula may seem complicated, but don't worry, it is actually quite simple. Before we understand it, though, it is important t…
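For a concrete sense of what "set the derivative to zero and solve" looks like in practice, here is a sketch using NumPy on made-up data (the post's own derivation may differ in detail). Setting the gradient of the Mean Squared Error to zero yields the standard normal equations (XᵀX)θ = Xᵀy, which we solve directly:

```python
import numpy as np

# Made-up data: y ≈ 2x + 1 with a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with a bias column, so theta = (intercept, slope).
X = np.column_stack([np.ones_like(x), x])

# Normal equations: setting the MSE gradient to zero gives (X^T X) theta = X^T y.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # roughly [1, 2]
```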

Phase Spaces 2: Math and Gradient Descent

I'm going to start from where we left off in the last part of this series. If you haven't read that yet, check it out first for a more detailed understanding: we explored what a Phase Space is, why it's useful, what it has to do with Machine Learning, and more! I'm assuming you've read the previous article, or you know what I talked about there, so let's get to it. At the end of the last article, we discovered that it was the power of mathematics that would help us find the parameter values that minimize the cost function. Before we get into what the math does, however, we'll need to define a few terms. If you've done Calculus, and in particular partial derivatives, you can skip this section, but otherwise I would suggest at least a cursory glance. I don't go into too much detail on the subject, but that's only because you won't need it. Calculus Interlude: Derivatives: The slope of a graph is a concept you…
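To preview where the math lands, here is a minimal gradient-descent sketch (an illustration of mine, not the article's code) on a one-parameter cost: the derivative tells us which way is uphill, so we repeatedly step the other way.

```python
def cost(theta):
    # A made-up bowl-shaped cost with its minimum at theta = 3.
    return (theta - 3) ** 2

def gradient(theta):
    # d/d(theta) of (theta - 3)^2
    return 2 * (theta - 3)

theta = 0.0
learning_rate = 0.1
for _ in range(100):
    # Step opposite the slope: downhill on the cost surface.
    theta -= learning_rate * gradient(theta)

print(round(theta, 4))  # converges to 3
```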