My other JuliaLang posts and videos of 2019

By: oxinabox.github.io

Re-posted from: https://white.ucc.asn.au/2019/11/08/My-other-JuliaLang-posts-and-videos-of-2019.html

If you follow my blog, it may look like I am blogging a fair bit less this last year.
In fact I am blogging just as much, but collaborating more.
So my posts end up hosted elsewhere.
Also, since I am no longer the only Julia user within 3000 km, I am giving more talks.
For this post I thought I would gather things up so I can find them again.

This list is arranged roughly most recent first.

The Emergent Features of JuliaLang

This was originally a talk I was invited to give at the PyData Meetup Cambridge and the Julia Users Group London.
Unfortunately, neither session was recorded, but it is now a pair of very nice blog posts (even if I do say so myself).
The main reason they are so nice is the great editing from
Cozmin Ududec, Eric Permin Martins, and Eric Davies.

The first is a bit of a grab-bag of random tricks that you mostly shouldn't use,
but that help with understanding how the language works.
The second is about traits, which are actually really useful.
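To give a flavour of the second post, here is a minimal sketch of the "Holy traits" pattern in Julia. The names (`IndexingStyle`, `describe`) are illustrative only, not taken from the post:

```julia
# Trait types: does a collection support fast linear indexing?
abstract type IndexingStyle end
struct FastIndex <: IndexingStyle end
struct SlowIndex <: IndexingStyle end

# Declare the trait for particular types.
IndexingStyle(::Type{<:Array}) = FastIndex()
IndexingStyle(::Type{<:AbstractDict}) = SlowIndex()

# Dispatch on the trait value, not on the type itself.
describe(x) = describe(IndexingStyle(typeof(x)), x)
describe(::FastIndex, x) = "fast linear indexing"
describe(::SlowIndex, x) = "indexing may be slow"

describe([1, 2, 3])      # "fast linear indexing"
describe(Dict(:a => 1))  # "indexing may be slow"
```

The point is that `describe` dispatches on the trait value, so a new type can opt in just by defining the trait function, without having to fit into a type hierarchy.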

JuliaCon 2019 Reflections

Nick Robinson managed to convince a bunch of people at Invenia to contribute to a blog post on this year's JuliaCon.
This post worked out to be a nice little summary of a subset of the talks.
I was the Vice-Program Chair for this JuliaCon (and am again for 2020),
so I am biased and think all the talks we selected were great.
But this post talks about some of them in a little more detail.

Building A Debugger with Cassette

This was my JuliaCon talk on MagneticReadHead, a debugger I wrote.
The talk had a great outcome for me: afterwards, Valentin Churavy gave me some suggestions that resulted in rewriting the core of it to use some of the compiler's tools.
This was a great JuliaCon for me for levelling up my compiler skills.
Nathan Daly and I worked on StagedFunctions.jl at the hackathon, which significantly relaxes the restrictions that @generated functions have.
That also set the stage for my static_hasmethod, which allows for compile-time dispatch on whether or not a method exists.
That code currently lives in Tricks.jl; it may not be the best way to do it, but I think it always works (hard to prove).
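As a rough illustration of what that enables, here is a sketch in the style of the Tricks.jl usage (the trait names here are made up): unlike `hasmethod`, `static_hasmethod` resolves to a compile-time constant, so the branch folds away entirely.

```julia
using Tricks: static_hasmethod

struct Iterable end
struct NonIterable end

# Because `static_hasmethod` is a compile-time constant, this whole
# function constant-folds to either `Iterable()` or `NonIterable()`.
iterableness(::Type{T}) where {T} =
    static_hasmethod(iterate, Tuple{T}) ? Iterable() : NonIterable()

iterableness(Vector{Int})  # Iterable()
iterableness(Nothing)      # NonIterable()
```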

DiffEqFlux: Neural Differential Equations

It is an utter privilege to get to collaborate with Chris Rackauckas, Mike Innes, Yingbo Ma, Jesse Bettencourt, and Vaibhav Dixit.
Chris and I have been playing with various mixes of Machine Learning and Differential Equations for a few years.
This is where some of those ideas and a bunch more ended up.
Fun fact: Jesse Bettencourt’s supervisor is David Duvenaud, who was one of the original founders of Invenia (my employer).

TensorFlow.jl and other tools for ML in JuliaLang

I’m not doing much with TensorFlow.jl these days, its stable, it works.
Though I am now involved in the framework-agnostic TensorBoardLogger.jl,
which is a great project; I strongly encourage anyone who has some iterative method to consider logging with it.
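Logging a training loop with it looks roughly like this (a minimal sketch; the run directory and loss values are made up):

```julia
using TensorBoardLogger, Logging

# Write TensorBoard event files to this directory (path is made up).
lg = TBLogger("runs/example")

with_logger(lg) do
    for step in 1:100
        loss = 1 / step            # stand-in for a real training loss
        @info "train" loss = loss  # shows up as train/loss in TensorBoard
    end
end
```

Then `tensorboard --logdir runs` renders the curves live while the loop runs.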

Anyway, this was a talk about how TensorFlow.jl is used.
TensorFlow.jl remains (in my biased opinion) one of the most comprehensive and idiomatic wrappers of TensorFlow.
I've moved on to using Flux.jl because it is more idiomatic, and, as I found out when preparing this talk,
often faster (limited benchmarks, CPU only).
Though, it is much easier to write slow Flux code than slow TensorFlow.jl code.
The preparation of this talk, and my debugging of those performance issues, is the reason why Flux now has a performance tips section.

I have lost the video for this one; if anyone finds it, please let me know.
I know it was recorded, but I can't find where it ended up.
It was given to a combined meetup of the London Julia Users Group and the London Deep Learning Lab.

I also recommend this video by Jon Malmaud at the TensorFlow Dev Summit.