Tag Archives: julia, puzzle, constraint-programming, optimization

Constraint Solver Part 4: Alldifferent constraint

Series: Constraint Solver in Julia

This is part 4 of the series about: How to build a Constraint Programming solver?

Have you ever thought about how humans solve sudokus?

At the moment we are simply removing values which are already set, but what about a number which appears only once among the possible values?

From the first post

This is the state of the example after the first post, and in the penultimate row we have the following:

Penultimate row

We can clearly set the value to 6 in the second block, as we know that there has to be a six in the row (and also in the block) and this is the only cell where the six can go.
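To make this rule concrete, here is a minimal sketch (not the solver's actual code) of how one could find such values; search_space is an assumed mapping from a cell index to its set of possible values:

# Hedged sketch: find values that appear in exactly one cell's possible
# values within a row/column/block, so they could be fixed there (like the 6 above).
# search_space is assumed to be a Dict{Int, Set{Int}}: cell index => possible values.
function single_position_values(search_space, indices, values=1:9)
    fixable = Tuple{Int,Int}[]   # (cell index, value) pairs we could fix
    for v in values
        cells = [i for i in indices if v in search_space[i]]
        if length(cells) == 1
            push!(fixable, (cells[1], v))
        end
    end
    return fixable
end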

Let’s have a look at another case in this grid. The second row is quite interesting as well:

Second row

Here we have the possibilities 5 & 6 in the right block, which means that we can't use 5 or 6 in the left part. That fixes the leftmost value to 4 and the next one to 9. We still don't know whether it's 5 and then 6 in the last block or 6 and then 5, but we have fixed two additional values.

Now there is a small problem with the first idea: the 6 doesn't necessarily need to be fixed if we only want, say, 9 distinct values out of 10 possible ones (when we aren't solving sudokus). We could check whether the number of possible values matches the number of indices.
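That counting check is easy to sketch under the same assumption about search_space as above:

# Hedged sketch: the rule above is only allowed to fix values when every
# value has to be used, i.e. when the number of possible values equals
# the number of cells.
function values_match_indices(search_space, indices)
    possible_values = Set{Int}()
    for i in indices
        union!(possible_values, search_space[i])
    end
    return length(possible_values) == length(indices)
end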

The second case is not straightforward to implement, I think, and there are more things we could fix by doing some more reasoning.

In this post we will do some graph theory reasoning instead. Some of you might have read the post about Sudoku already, but as this is a new series I want to explain it here again, using the same visualizations as before.

I write about the implementation details at the end. They are a bit messy at the moment in my opinion, but I'm not sure yet how to make them simpler while staying as fast or getting even faster, so let me know if you have some ideas 😉 (actually, writing it down for this blog made it less messy and faster :D)

Okay, let's dive into graph theory. First of all, what do we want to do?

  • Create a graph where we connect our cells to the possible values we have in them.

Search space one row

The top row shows the indices of the cells, labeled a-i, and the bottom row represents the values we can have in general.
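As a small sketch (again assuming search_space maps a cell index to its set of possible values), the edges of this graph are simply all (cell, value) pairs that are still allowed:

# Hedged sketch: build the bipartite graph as a list of (cell, value) edges.
function build_edges(search_space, indices)
    edges = Tuple{Int,Int}[]
    for (ci, i) in enumerate(indices)   # ci = position a-i, i = cell index
        for v in search_space[i]
            push!(edges, (ci, v))
        end
    end
    return edges
end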

We might have some fixed values:

Fixed search space

and then we add all the other edges which are possible:

Full search space

These are the possibilities of the third row in our sudoku:

Third row

I know the graph looks a bit full but we can add some color:

Maximum matching

The green arrows show one possible matching. Now, as mentioned several times in this series, we always want to know as fast as possible whether the current search space is infeasible or if we still…
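For reference, such a matching can be computed with the standard augmenting-path approach. Here is a minimal hedged sketch (not the solver's implementation), where possible[c] is assumed to hold the set of possible values of cell c:

# Hedged sketch: simple augmenting-path maximum matching on the bipartite
# graph of cells vs. values (Kuhn's algorithm).
function maximum_matching(possible, nvalues)
    match_value = fill(0, nvalues)   # value => matched cell (0 = unmatched)

    function try_assign(c, seen)
        for v in possible[c]
            if !seen[v]
                seen[v] = true
                # value v is free or the cell currently using it can move elsewhere
                if match_value[v] == 0 || try_assign(match_value[v], seen)
                    match_value[v] = c
                    return true
                end
            end
        end
        return false
    end

    matched = 0
    for c in eachindex(possible)
        seen = falses(nvalues)
        matched += try_assign(c, seen) ? 1 : 0
    end
    return matched, match_value
end

If the matching is smaller than the number of cells, not every cell can get its own value, so the current search space is infeasible.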

Constraint Solver Part 3: More Pruning & Benchmarking

Series: Constraint Solver in Julia

This is part 3 of the series about: How to build a Constraint Programming solver?

In this part we want to make a minor but important update to the old post, as I missed something, as well as do some benchmarking against other constraint programming solvers and specialized sudoku solvers.

Let’s have a look at the video from the last post:

Explanation:

Last time we set a value while backtracking and then removed values from the search space, and maybe set new values, as a consequence of fixing this single value.

  • black: Given values
  • blue:
    • beginning: pruned before backtracking
    • later: set or pruned during backtracking (might be wrong)
  • green cell: last set by backtracking
  • orange cell: pruned by green cell (or later other orange cells)

At that moment I didn't completely think it through, and only when I looked at the visualization I made did I realize that it's a bit stupid not to further propagate the new changes as well.

That is the only change to the code base for this post. After that I'll show you some benchmarking results, which won't be that awesome yet, as they are more of a spoiler and preparation for the next post, which will be longer.

So after the first pruning in our backtrack function: GitHub Line

we add a part to prune further on the fixed values:

# prune on fixed vals
co_idx = 1
constraint_idxs_dict = Dict{Int, Bool}()
# get all constraints which need to be called (only once)
while co_idx <= length(constraint_outputs)
    constraint_output = constraint_outputs[co_idx]
    for fixed_ind in keys(constraint_output.fixed)
        inner_constraints = com.constraints[com.subscription[fixed_ind]]
        for constraint in inner_constraints
            constraint_idxs_dict[constraint.idx] = true
        end
    end
    co_idx += 1
end
constraint_idxs = collect(keys(constraint_idxs_dict))

There we just check which constraints need to be called again, and then in a while loop we call the constraints and add more constraints to our list if some cell got fixed:

con_counter = 0
# while we haven't called every constraint
while length(constraint_idxs_dict) > 0
    con_counter += 1
    constraint = com.constraints[constraint_idxs[con_counter]]
    delete!(constraint_idxs_dict, constraint.idx)

    constraint_output = constraint.fct(com, constraint.indices; logs = false)
    push!(constraint_outputs, constraint_output)
    push!(constraints, constraint)
    if !constraint_output.feasible
        feasible = false
        break
    end

    # if we fixed another value => add the corresponding constraint to the list
    # iff the constraint will not be called anyway in the list 
    for ind in keys(constraint_output.idx_changed)
        for constraint in com.constraints[com.subscription[ind]]
            if !haskey(constraint_idxs_dict, constraint.idx)
                constraint_idxs_dict[constraint.idx] = true
                push!(constraint_idxs, constraint.idx)
            end
        end
    end
end

Now you might say: that looks awful, why is it so much code, and didn't we do that already? You know what, you're damn right.
Therefore the code gets its own function prune!(com, constraints, constraint_outputs), which returns feasible, constraints, constraint_outputs. It will also be called in our main solve function directly after calling every constraint function once.
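A minimal sketch of that wrapper, assembled from the two snippets above (the exact signature and details in the repository may differ):

# Hedged sketch of prune!: collect the constraints that have to be re-run
# because a variable got fixed and keep calling them until nothing is
# pending anymore or an infeasibility is detected.
function prune!(com, constraints, constraint_outputs)
    feasible = true
    constraint_idxs_dict = Dict{Int, Bool}()
    # gather every constraint subscribed to a newly fixed variable
    for constraint_output in constraint_outputs
        for fixed_ind in keys(constraint_output.fixed)
            for constraint in com.constraints[com.subscription[fixed_ind]]
                constraint_idxs_dict[constraint.idx] = true
            end
        end
    end
    constraint_idxs = collect(keys(constraint_idxs_dict))

    con_counter = 0
    while length(constraint_idxs_dict) > 0
        con_counter += 1
        constraint = com.constraints[constraint_idxs[con_counter]]
        delete!(constraint_idxs_dict, constraint.idx)

        constraint_output = constraint.fct(com, constraint.indices; logs = false)
        push!(constraint_outputs, constraint_output)
        push!(constraints, constraint)
        if !constraint_output.feasible
            feasible = false
            break
        end

        # a newly fixed value => call the constraints subscribed to that variable again
        for ind in keys(constraint_output.idx_changed)
            for inner_constraint in com.constraints[com.subscription[ind]]
                if !haskey(constraint_idxs_dict, inner_constraint.idx)
                    constraint_idxs_dict[inner_constraint.idx] = true
                    push!(constraint_idxs, inner_constraint.idx)
                end
            end
        end
    end
    return feasible, constraints, constraint_outputs
end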

Additionally, I found that we always backtrack from left to right, and then pruning doesn't work that well in sudoku because all the surrounding cells are already fixed.

Therefore…

Constraint Solver Part 2: Pruning

Series: Constraint Solver in Julia

This is part 2 of the series about: How to build a Constraint Programming solver?

The first part got more attention in the first two days than I thought. Thanks for that 😉 If you want to be notified about updates, the best way is to follow the blog on Twitter @Opensourcesblog.

In this post I'll use a different sudoku than last time, as the last one turned out to be relatively easy:

# assumes grid was defined beforehand, e.g. grid = zeros(Int, (9, 9))
grid[1,:] = [0 0 0 0 0 0 0 0 0]
grid[2,:] = [0 1 0 6 2 0 0 9 0]
grid[3,:] = [0 0 2 0 0 9 3 1 0]
grid[4,:] = [0 0 4 0 0 6 0 8 0]
grid[5,:] = [0 0 8 7 0 2 1 0 0]
grid[6,:] = [0 3 0 8 0 0 5 0 0]
grid[7,:] = [0 6 9 1 0 0 4 0 0]
grid[8,:] = [0 8 0 0 7 3 0 5 0]
grid[9,:] = [0 0 0 0 0 0 0 0 0]

In the current version I have the following benchmark results:

julia> @benchmark main(;benchmark=true)
BenchmarkTools.Trial: 
  memory estimate:  686.05 MiB
  allocs estimate:  19347707
  --------------
  minimum time:     882.354 ms (12.26% GC)
  median time:      953.116 ms (12.70% GC)
  mean time:        940.299 ms (13.98% GC)
  maximum time:     981.150 ms (19.88% GC)
  --------------
  samples:          6
  evals/sample:     1

Info: I’ve added the ;benchmark keyword argument to my current script to avoid output while benchmarking.
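In case you want to do the same, it is just a keyword argument; a tiny sketch (the real main does a lot more, of course):

# Hedged sketch of the ;benchmark keyword argument pattern.
function main(; benchmark=false)
    # ... build the grid, add the constraints and solve ...
    if !benchmark
        println("print the solved grid and some statistics here")
    end
end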

In this post we want to bring it down to something around 15 ms, but to do that we first must understand why the current version is slow. You might not call 0.9 s slow, but based on my experience it is, as I know that we can improve it much more.

There are several points:

  • We are currently not solving it like a human being
    • This doesn’t necessarily have to be our goal if we find a better alternative but might give us some hints
  • We only fix the basics and then we do some kind of trial and error

Let’s have a look at the search space without backtracking

Search space without backtracking

We fix a single number and 54 spots are still open. The backtrack function is called 62264 times. That is a crazy amount.
Now we could try to think about how we remove more values from the search space before we start the backtracking and we will do that in the next post.

For this post we try to prune (remove values) while we are doing backtracking.

Before we start with that, I want to address some small things I noticed after publishing the post:

  1. I want to have some information about the solve call like whether backtracking was needed, how many constraint calls we had before backtracking and how often the backtracking function was called.
  2. I sometimes only wrote [] instead of Int[] for example.
  3. At the moment sometimes a constraint is called even though nothing changed since last…