
Machine Learning and the Philosophy of Physics

[Figure: A depiction of the cost curve in Machine Learning, with blue balls rolling down toward the optimal minimum, where the algorithm finds the best model.]

Science is all about compressing the world. It's a bit like a .zip file: it tries to capture the phenomena and workings of the world around us in a few equations that can be scribbled in the corner of a page. As science progressed and spread, however, it began to encounter the problem of metaphysics. How could we understand things like joy, happiness, and other abstract notions?

These two models of the world, philosophy and physics, were not easily compatible, however. One of the two ways of explaining the world had to dominate, and while both certainly still exist, physics has since taken precedence. That may not seem to be the case, but only because the way I think of physics differs from the standard understanding. To me, the Philosophy of Physics is the idea that all abstract things, ideas, and phenomena are just the result of physical interactions. In other words, these ideas can be both explained and expressed in terms of discrete mathematics.


In many ways, this very same philosophy is expressed in Machine Learning. Intelligence is a very abstract thing to many: the idea that a system could take some input from the world around it and give a complex output that represents a kind of individuality. In fact, even defining intelligence is hard. What Machine Learning does, though, is at least claim to convert this complicated entity into sets of mathematical processes. That conversion provides incredible power, no matter how imperfect ML may be in its interpretation.
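To make that claim a little more concrete, here is a minimal sketch of how "learning" can be nothing but arithmetic: a single-parameter model fitted by gradient descent on a cost curve, the same picture the image above depicts. This is my own toy illustration, not something from the post; the data, parameter, and learning rate are all made up for the example.

```python
# Toy data: the "world" the system is asked to model (here, y = 2x).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w = 0.0              # the model's single parameter, starting from ignorance
learning_rate = 0.01

for step in range(1000):
    # Cost: mean squared error between predictions (w * x) and observations (y).
    # Its gradient with respect to w is (2/N) * sum((w*x - y) * x).
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= learning_rate * grad   # roll a little further down the cost curve

print(f"learned w = {w:.3f}")   # approaches 2.0
```

Nothing in that loop "understands" anything, yet the system ends up modelling its input, which is exactly the reduction of an abstract idea to a mathematical process that this philosophy describes.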


Physics and Machine Learning have far more direct connections than the ones discussed here, but we'll get to those eventually.
