Julia Ranks Among Top 10 Programming Languages on GitHub

By: Julia Computing, Inc.

Re-posted from: http://juliacomputing.com/press/2017/05/25/github.html

Cambridge, MA – Julia ranks as one of the top 10 programming languages on GitHub as measured by the number of GitHub stars and the number of GitHub repositories.

GitHub users ‘star’ a repository in order to show appreciation and create a bookmark for easy access.

Julia ranks #10 in GitHub stars and #8 in number of repositories

Top GitHub Programming Languages

Rank  Language      GitHub Stars  Number of Repositories
 1    Swift             38,513         5,755
 2    Go                28,230         3,770
 3    TypeScript        22,248         3,230
 4    Rust              21,938         4,072
 5    CoffeeScript      13,972         1,909
 6    Kotlin            13,074         1,219
 7    Ruby              12,212         3,401
 8    PHP               11,989         4,037
 9    Elixir            10,103         1,399
10    Julia              8,651         1,982
11    Scala              8,254         2,092
12    Crystal            8,161           656
13    Python             7,835         1,294
14    Roslyn             7,538         1,851
15    PowerShell         7,010           913

Ranked by Number of GitHub Stars

About Julia Computing and Julia

Julia Computing (JuliaComputing.com) was founded in 2015 by the co-creators of the open source Julia language to develop products and provide support for businesses and researchers who use Julia.

Julia is the fastest modern high performance open source computing language for data, analytics, algorithmic trading, machine learning and artificial intelligence. Julia combines the functionality and ease of use of Python, R, Matlab, SAS and Stata with the speed of Java and C++. Julia delivers dramatic improvements in simplicity, speed, capacity and productivity. With more than 1 million downloads and +161% annual growth, Julia adoption is growing rapidly in finance, energy, robotics, genomics and many other fields.

  1. Julia is lightning fast. Julia provides speed improvements up to
    1,000x for insurance model estimation, 225x for parallel
    supercomputing image analysis and 11x for macroeconomic modeling.

  2. Julia is easy to learn. Julia’s flexible syntax is familiar and
    comfortable for users of Python, R and Matlab.

  3. Julia integrates well with existing code and platforms. Users of
    Python, R, Matlab and other languages can easily integrate their
    existing code into Julia.

  4. Julia code is elegant. Julia was built from the ground up for
    mathematical, scientific and statistical computing, and has advanced
    libraries that make coding simple and fast, and dramatically reduce
    the number of lines of code required – in some cases, by 90%
    or more.

  5. Julia solves the two language problem. Because Julia combines
    the ease of use and familiar syntax of Python, R and Matlab with the
    speed of C, C++ or Java, programmers no longer need to estimate
    models in one language and reproduce them in a faster
    production language. This saves time and reduces error and cost.

Julia users, partners and employers looking to hire Julia programmers in 2017 include: Google, Apple, Amazon, Facebook, IBM, Intel, Microsoft, BlackRock, Capital One, PwC, Ford, Oracle, Comcast, DARPA, Moore Foundation, Federal Reserve Bank of New York (FRBNY), UC Berkeley Autonomous Race Car (BARC), Federal Aviation Administration (FAA), MIT Lincoln Labs, Nobel Laureate Thomas J. Sargent, Brazilian National Development Bank (BNDES), Conning, Berkery Noyes, BestX, Path BioAnalytics, Invenia, AOT Energy, AlgoCircle, Trinity Health, Gambit, Augmedics, Tangent Works, Voxel8, Massachusetts General Hospital, NaviHealth, Farmers Insurance, Pilot Flying J, Lawrence Berkeley National Laboratory, National Energy Research Scientific Computing Center (NERSC), Oak Ridge National Laboratory, Los Alamos National Laboratory, Lawrence Livermore National Laboratory, National Renewable Energy Laboratory, MIT, Caltech, Stanford, UC Berkeley, Harvard, Columbia, NYU, Oxford, NUS, UCL, Nantes, Alan Turing Institute, University of Chicago, Cornell, Max Planck Institute, Australian National University, University of Warwick, University of Colorado, Queen Mary University of London, London Institute of Cancer Research, UC Irvine, University of Kaiserslautern, University of Queensland.

Julia is being used to: analyze images of the universe and research dark matter, drive parallel supercomputing, diagnose medical conditions, provide surgeons with real-time imagery using augmented reality, analyze cancer genomes, manage 3D printers, pilot self-driving racecars, build drones, improve air safety, manage the electric grid, provide analytics for foreign exchange trading, energy trading, insurance, regulatory compliance, macroeconomic modeling, sports analytics, manufacturing, and much, much more.

Filling In The Interop Packages and Rosenbrock

By: JuliaDiffEq

Re-posted from: http://juliadiffeq.org/2017/05/18/Filling_in.html

In the 2.0 state of the ecosystem post it was noted that, now that we have a clearly laid out and expansive common API, the next goal is to fill it in. This set of releases tackles the lowest-hanging fruit in that battle. Specifically, the interop packages were set up to be as complete in their interfaces as possible, and the existing methods which could expand were expanded. Time for specifics.

Exploring Fibonacci Fractions with Julia

By: perfectionatic

Re-posted from: http://perfectionatic.org/?p=367

Recently, I came across a fascinating blog post and video from Mind Your Decisions. It is about a fraction whose decimal expansion shows the Fibonacci numbers in order.

On a spreadsheet and in most standard programming languages, such output cannot be attained due to the limited precision of floating point numbers. If you try this in R or Python, you will get an output of 1e-48.
Wolfram Alpha, however, allows arbitrary precision.

In Julia, by default, we get a little better than R and Python:

julia> 1/999999999999999999999998999999999999999999999999
1.000000000000000000000001000000000000000000000002000000000000000000000003…e-48

julia> typeof(ans)
BigFloat

We observe here that we are getting the first few Fibonacci numbers 1, 1, 2, 3. We need more precision to get more numbers. Julia has arbitrary precision arithmetic baked into the language. We can crank up the precision of the BigFloat type on demand. Of course, the higher the precision, the slower the computation and the greater the memory use. We do that by calling setprecision:

julia> setprecision(BigFloat,10000)

Reevaluating, we get

julia> 1/999999999999999999999998999999999999999999999999
1.000000000000000000000001000000000000000000000002000000000000000000000003000000000000000000000005…e-48

That is looking much better. However, it would be nice if we could extract the Fibonacci numbers that are buried in that long decimal. Using the approach in the original blog, we define a function and use it to calculate the decimal.


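The code for these two steps is not reproduced in this re-post, so here is a minimal sketch (the name fib_gen is my own, not from the original). It uses the closed form of the Fibonacci generating function, 1/(1 - x - x^2) = 1 + x + 2x^2 + 3x^3 + 5x^4 + ..., so that at x = 10^-24 each coefficient lands in its own 24-digit block of the decimal expansion:

```julia
# Closed form of the Fibonacci generating function:
# 1/(1 - x - x^2) = 1 + x + 2x^2 + 3x^3 + 5x^4 + ...
# Evaluated at x = 10^-24, each coefficient occupies its own 24-digit block.
fib_gen(x) = 1 / (1 - x - x^2)

a = fib_gen(big"1e-24")   # big"..." keeps the input at full BigFloat precision
```

With the precision raised to 10,000 bits as above, the decimal expansion of a begins with 1 followed by the 24-digit blocks …001, …002, …003, …005, i.e. the Fibonacci sequence.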
Here we use the non-standard string literal big"..." to ensure proper interpretation of our input. Using BigFloat(1e-24) would first construct a floating point number with limited precision and then do the conversion. The initial loss of precision would not be recovered in the conversion, hence the use of big. Now we extract our Fibonacci numbers with this function:

function extract_fib(a)
    x = replace(split(string(a), 'e')[1], "." => "")  # keep the digits, drop any exponent
    fib = [parse(BigInt, x[1:1])]                     # the leading 1 is the first Fibonacci number
    for i = 1:div(length(x) - 25, 24)                 # leave off the truncated final block
        push!(fib, parse(BigInt, x[(i-1)*24+2 : i*24+1]))
    end
    return fib
end

Here we first convert our very long decimal number to a string, and then we exploit the fact that the Fibonacci numbers occur in blocks that are 24 digits in length. We get our output as an array of BigInt. To compare the output with the exact Fibonacci numbers, we use a quick non-recursive implementation:

function fib(n)
    a = zeros(BigInt, n)
    a[1] = a[2] = 1
    for i = 3:n
        a[i] = a[i-1] + a[i-2]
    end
    return a
end

Now we compare…

fib_frac = extract_fib(a)
fib_exact = fib(length(fib_frac))
for i in eachindex(fib_frac)
    println(fib_exact[i], " ", fib_exact[i] - fib_frac[i])
end

We get a long sequence; here we focus on where the discrepancy happens.

184551825793033096366333 0
298611126818977066918552 0
483162952612010163284885 0
781774079430987230203437 -1
1264937032042997393488322 999999999999999999999998
2046711111473984623691759 1999999999999999999999997

The output shows that a discrepancy occurs just before the extracted Fibonacci number exceeds 24 digits. I am not quite sure why, but this was a fun exploration. Julia allows me to do mathematical explorations that would take one or even two orders of magnitude more effort in any other language.
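As a postscript, one plausible explanation (my own guess, not from the original post): the extraction assumes each Fibonacci number fits inside its own 24-digit block, and once the numbers reach 25 digits the blocks overlap, so carries corrupt the neighboring extracted values. A quick check of where that first happens, using a hypothetical helper first_overflow:

```julia
# Find the index and value of the first Fibonacci number wider than `width`
# digits, with the standard definition F_1 = F_2 = 1.
function first_overflow(width)
    a, b, n = big(1), big(1), 1
    while ndigits(a) <= width
        a, b = b, a + b   # advance one step: a becomes the next Fibonacci number
        n += 1
    end
    return n, a
end

n, F = first_overflow(24)
println("F_", n, " = ", F, " (", ndigits(F), " digits)")
```

This reports F_117 = 1264937032042997393488322, with 25 digits: exactly the row in the output above where the difference jumps from -1 to a 24-digit value, which makes block overlap a likely culprit.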