Author Archives: OpenSourcES

Bézier curves in Julia with animations

I think many of you have heard of Bézier curves, but maybe some of you haven't; I had heard of them but wasn't really sure what they are and how they work.
During my Geometric Modelling and Animations course at university we had some lectures on them, and I did some coding for homework and also to understand them a bit better. Over the last few days I published my animations on Twitter and asked whether you're interested in a post. There was quite a bit of feedback, so here it is!

Let us start with a basic Bézier curve:

simple Bézier curve

Before I show you what Bézier curves are, we should probably have a short look at what a basic curve is. A curve can be given by a parameterized description such as:

\[
\mathbf{b}(t) = (x(t),y(t))^T \quad t_1 \leq t \leq t_2\]

where \(\mathbf{b}\) is a vector and \(x, y\) are polynomial functions like

\[
x(t) = a_0 + a_1t + a_2t^2 + \dots + a_nt^n\]

Now the idea is to compute the values on the curve from \(t_1 = 0\) to \(t_2 = 1\) using a different basis, not simply \(t^0, \dots, t^n\).
For Bézier curves this basis is defined as:

\[
B_{i}^{n}(t) := \binom{n}{i} t^{i}(1-t)^{n-i}, \quad 0 \leq i \leq n\]

and is called the Bernstein basis.

This looks a bit random for now, but these functions have some interesting properties:

  • \(B_{0}^{n}(t)+B_{1}^{n}(t)+\cdots+B_{n}^{n}(t)=1\)
  • \(B_0^n(0) = 1, \quad B_n^n(1) = 1\)

Keeping these in mind, we can write down the complete formula to compute \(\mathbf{b}(t)\):

\[
\mathbf{b}(t) = \sum_{i=0}^n B_i^n(t) \mathbf{b}_i\]

where \(\mathbf{b}_i\) are our control points. The second property just means that \(\mathbf{b}(0) = \mathbf{b}_0\) and \(\mathbf{b}(1) = \mathbf{b}_n\), i.e. the curve starts at the first control point and ends at the last one.
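These two properties are easy to check numerically. Here is a small sketch (my addition, not from the original post) for \(n = 3\):

```julia
# Bernstein basis function B_i^n(t)
bernstein(i, n, t) = binomial(n, i) * t^i * (1 - t)^(n - i)

n = 3
# partition of unity: the basis functions sum to 1 for every t
for t in 0:0.1:1
    @assert isapprox(sum(bernstein(i, n, t) for i in 0:n), 1.0)
end
# endpoint values: B_0^n(0) = 1 and B_n^n(1) = 1
@assert bernstein(0, n, 0.0) == 1.0
@assert bernstein(n, n, 1.0) == 1.0
```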

I think it would be nice to actually see the basis functions:

Bernstein basis

Now we can combine it with our control points to obtain:

Bézier using Bernstein basis

Besides the change in color you probably don't see a difference from the first plot, even though it was computed differently.

I know some of you are here for code, so here is the Bernstein code:

using Plots
# for the LaTeX labels in the legend
using LaTeXStrings 

function compute_bernstein(i,n; steps=100)
    return [binomial(n,i)*t^i*(1-t)^(n-i) for t in LinRange(0,1,steps)]
end

function compute_bernstein_poly(px,py; steps=100)
    n = length(px)-1
    # evaluate each basis function on the same parameter grid
    # (pass steps through so the keyword actually has an effect)
    bernsteins = [compute_bernstein(i,n; steps=steps) for i=0:n]
    x_vals = [sum(px[k]*bernsteins[k][t] for k=1:n+1) for t=1:steps]
    y_vals = [sum(py[k]*bernsteins[k][t] for k=1:n+1) for t=1:steps]
    return x_vals, y_vals
end

function plot_with_bernstein(px,py; steps=100, subplot=1)
    x_vals, y_vals = compute_bernstein_poly(px,py; steps=steps)
    plot!(x_vals, y_vals, color=:blue, label="",subplot=subplot)
end

function main()
    px = [0, 3, 7]
    py = [2, 9, 3]

    plot(;size=(700,500), aspect_ratio=:equal, legendfont=font(13))
    plot!(px, py, seriestype=:scatter, label="control points")
    plot_with_bernstein(px,py)
    png("using_bernstein")
end
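As a quick sanity check (my addition, not from the original post), the computed curve should interpolate the first and last control points, matching the second Bernstein property:

```julia
function compute_bernstein(i, n; steps=100)
    return [binomial(n, i) * t^i * (1 - t)^(n - i) for t in LinRange(0, 1, steps)]
end

function compute_bernstein_poly(px, py; steps=100)
    n = length(px) - 1
    bernsteins = [compute_bernstein(i, n; steps=steps) for i in 0:n]
    x_vals = [sum(px[k] * bernsteins[k][t] for k in 1:n+1) for t in 1:steps]
    y_vals = [sum(py[k] * bernsteins[k][t] for k in 1:n+1) for t in 1:steps]
    return x_vals, y_vals
end

px = [0, 3, 7]; py = [2, 9, 3]
x_vals, y_vals = compute_bernstein_poly(px, py)
# the curve starts at the first control point and ends at the last one
@assert (x_vals[1], y_vals[1]) == (px[1], py[1])
@assert (x_vals[end], y_vals[end]) == (px[end], py[end])
```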

Pretty basic so far. Now let's combine the two and animate:

Bernstein gif

Actually I'm not too sure how to interpret the colored animated part in the lower plot with the different Bernstein polynomials, but I think it looks interesting and I've never seen it before. The three dots (red, green and blue) sum up to the black one in both plots.
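That the three colored dots sum to the black one can be checked directly: at any fixed \(t\), the weighted contributions \(B_i^n(t)\,\mathbf{b}_i\) add up to the point \(\mathbf{b}(t)\) on the curve. A small sketch (my addition), using the control points from the example above:

```julia
bernstein(i, n, t) = binomial(n, i) * t^i * (1 - t)^(n - i)

px = [0, 3, 7]; py = [2, 9, 3]   # control points from the example above
n = length(px) - 1
t = 0.4
# the i-th colored dot is the contribution B_i^n(t) * b_i
contribs_x = [bernstein(i, n, t) * px[i+1] for i in 0:n]
contribs_y = [bernstein(i, n, t) * py[i+1] for i in 0:n]
# summed up they give the black point b(t) on the curve
bx, by = sum(contribs_x), sum(contribs_y)
# bx ≈ 2.56, by ≈ 5.52
```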

Anyway I think the interesting things are still missing. First of all we only have 3 control points at…

Random Forest Regression from scratch used on Kaggle competition

I normally write some code and then, days, weeks or months later when everything is done, I write a blog post about it. I think the blog is more structured that way, but maybe it has fewer details and my errors along the way aren't that visible anymore (unless some of you comment afterwards 😉 ). Thanks for that. It really helps improve my code and this blog.
In the past couple of weeks I wanted to understand random forests better, not just how to use them, so I've built my own package.
For me it's not satisfying enough to use a machine learning package and tune the parameters; I really like to build stuff from scratch.
That's basically the reason for this blog: I see too many posts like "How to use random forest?" or "How to achieve an awesome accuracy for MNIST?", but as mentioned in my post about the latter, there aren't many that dive deeper and actually use it for real (well, and blog about it).

I learn a lot doing this, also when I blog about it, and especially when I learn through comments, but it also takes quite some time to write these posts. If you've read several of my posts and learned a good amount, please consider a donation via Patreon to keep it going.
Thank you very much!

Of course I'll keep writing for everyone else as well; you can't pay everyone on the internet just because you read a post from them. It's more about whether you have enjoyed this blog for a longer time and found something here that you didn't find elsewhere.

Back on track. I'm using random forests for my current Kaggle challenge about predicting earthquakes. I might publish an extra article about that one when it's done, but for now I want to use my random forest package on a different challenge which is more for fun (I mean I do the other one for fun as well, but Kaggle/LANL are paying money to the winners) whereas this one is purely for training.
It’s about predicting house prices based on some features.

This post covers several aspects:

  • Creating a Julia package (basics)
  • Explaining random forest
  • Simple features first
  • What can be improved?

I only have my random forest code at the moment, so my score in the competition might (and probably will) turn out quite bad. Anyway, the general goal is to learn random forests (by coding them) and somehow apply them.

Creating a Julia package

First let's create our RandomForestRegression package. Start Julia in your favorite projects folder and press ] to enter the package mode:

(v1.1) pkg> generate RandomForestRegression

That creates a folder RandomForestRegression with:

src/
    - RandomForestRegression.jl
Project.toml

and the RandomForestRegression.jl looks like this:

module RandomForestRegression

greet() = print("Hello World!")

end # module

We'll later add some dependencies to it, but for now we are done. Normally you should also add a test folder, etc.
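For reference, dependencies can later be added from the package mode after activating the project; the package name here is just an illustration, not one the post actually uses:

```
(v1.1) pkg> activate RandomForestRegression
(RandomForestRegression) pkg> add DataFrames
```

This records the dependency in the package's Project.toml.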
You probably also want to have a look at the official documentation: Creating