Research Article

A Prefiltered Cuckoo Search Algorithm with Geometric Operators for Solving Sudoku Problems

Ricardo Soto,1,2 Broderick Crawford,1,3 Cristian Galleguillos,1 Eric Monfroy,4 and Fernando Paredes5

1 Pontificia Universidad Católica de Valparaíso, 2362807 Valparaíso, Chile
2 Universidad Autónoma de Chile, 7500138 Santiago, Chile
3 Universidad Finis Terrae, 7501015 Santiago, Chile
4 CNRS, LINA, University of Nantes, 44322 Nantes, France
5 Escuela de Ingeniería Industrial, Universidad Diego Portales, 8370109 Santiago, Chile

Correspondence should be addressed to Ricardo Soto; [email protected]

Received 11 November 2013; Accepted 30 December 2013; Published 23 February 2014

Academic Editors: Z. Cui and X. Yang

Copyright © 2014 Ricardo Soto et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The Sudoku is a famous logic-placement game, originally popularized in Japan and today widely employed as a pastime and as a testbed for search algorithms. The classic Sudoku consists of filling a 9×9 grid, divided into nine 3×3 regions, so that each column, row, and region contains different digits from 1 to 9. This game is known to be NP-complete, and various complete and incomplete search algorithms able to solve different instances of it have been proposed. In this paper, we present a new cuckoo search algorithm for solving Sudoku puzzles that combines prefiltering phases and geometric operations. The geometric operators allow one to correctly move toward promising regions of the combinatorial space, while the prefiltering phases delete in advance from domains the values that do not lead to any feasible solution. This integration leads to more efficient domain filtering and, as a consequence, to a faster solving process. We illustrate encouraging experimental results where our approach noticeably competes with the best approximate methods reported in the literature.

1. Introduction

The Sudoku is a logic-based placement puzzle, widely present as a pastime game in newspapers and magazines. It was initially popularized in Japan during the eighties, but today it is a worldwide popular game and a useful benchmark for testing artificial intelligence solving techniques. A Sudoku puzzle consists of filling a 9 × 9 board, subdivided into subgrids of size 3 × 3, so that each row, column, and subgrid contains different digits from 1 to 9. A Sudoku problem includes prefilled cells, namely, the "givens," which cannot be changed or moved (see Figure 1). Interestingly, the number of givens has limited or no impact on the difficulty of the problem. The difficulty is mostly dependent on the positioning of the givens along the puzzle. A useful difficulty classification including easy, medium, and hard Sudokus has been proposed by Mantere and Koljonen [1].

The literature reports various approaches to solve Sudoku puzzles. For instance, exact methods such as constraint programming [2–4] and Boolean satisfiability [5] are well-known candidates for the efficient handling of such kinds of puzzles. In the context of approximate methods, genetic programming [1] and metaheuristics in general [6–10] have illustrated promising results. Additional, but less traditional, Sudoku solving techniques have been proposed as well, such as Sinkhorn balancing [11], rewriting rules [12], and entropy minimization [13]. In this paper, we focus on approximate methods. We propose a new algorithm for the efficient solving of Sudoku instances based on cuckoo search, geometric operators, and prefiltering phases. The cuckoo search is a relatively modern nature-inspired metaheuristic [14–18] into which we introduce geometric operators in order to correctly move to promising regions of a discrete space of solutions. This combination

is additionally enhanced with a domain reducer component based on local consistencies. The idea is to delete in advance from the search space the values that do not lead to any feasible solution. This integration straightforwardly alleviates the work of the metaheuristic, leading to a faster solving process. We illustrate encouraging experimental results where our approach noticeably competes with the best approximate methods reported in the literature.

This paper is organized as follows. In Section 2, we describe the previous work. Section 3 presents the classic cuckoo search algorithm. The geometric operators and the prefiltering phase employed are illustrated and exemplified in Sections 4 and 5, respectively. The resulting new cuckoo search algorithm is presented in Section 6, followed by the corresponding experimental results. Finally, in Section 8, we conclude and give some directions for future work.

Figure 1: Sudoku puzzle instance.

2. Related Work

Sudoku puzzles have been solved with various techniques during the last decades. For instance, complete methods such as Boolean satisfiability and constraint satisfaction can clearly be used to solve Sudokus [3–5, 19]. In this paper, we focus on incomplete search methods, especially on solving hard instances of such a puzzle. Within this scenario, different solutions have been suggested, mainly based on metaheuristics. For instance, Lewis [7] models the puzzle as an optimization problem where the number of incorrectly placed digits on the board must be minimized. The model is solved by using simulated annealing, but the approach is more focused on producing valid Sudokus than on the performance of the resolution. In [10], an ant colony algorithm is proposed, where the problem is modeled in the opposite form: maximizing the number of correctly filled cells. The best result completes only 76 out of 81 cells of the puzzle. In [8], a particle swarm optimizer (PSO) for solving Sudokus is presented, but the goal of the authors was rather to validate the use of geometric operators in PSO for complex combinatorial spaces. In [6], a hill-climbing algorithm for Sudokus is reported. The approach succeeds in solving easy Sudoku instances, failing for medium and hard ones. In the same work, a genetic algorithm (GA) outperforms the hill-climber previously presented. Such a GA is tuned with geometric operators, in particular Hamming space crossovers and swap space crossovers, reporting solutions for a hard Sudoku. In Mantere and Koljonen [1], another GA is proposed that succeeds for easy and medium instances, but it only reaches the optimum in 2 out of 30 tries for a hard Sudoku. A cultural algorithm is proposed by the same authors [9], but it is generally outperformed by the GA previously reported. In Soto et al. [20], a tabu search is tuned with a prefiltering phase, being capable of solving 30 out of 30 tries for a hard Sudoku.

3. Cuckoo Search

Cuckoo search is a nature-inspired metaheuristic based on the brood parasitism of some cuckoo species. This kind of bird has an aggressive reproduction strategy, which is based on the use of foreign nests for incubation. Cuckoos proceed by laying their eggs in nests of other bird species, removing the other bird's eggs to increase the incubation probability of their own. Eventually, cuckoo eggs may be discovered by the host bird, which might act in two ways: throwing the alien eggs away or simply abandoning its nest and building a new one elsewhere. In practice, an egg represents a solution, and cuckoo eggs represent potentially better solutions than the ones currently in nests. In the simplest form, each nest has only one egg. The cuckoo search procedure for minimization is described in Algorithm 1. The process begins by randomly generating an initial population of host nests. Next, the algorithm iterates until a given stop criterion is reached, which is commonly a maximum number of iterations. At line 3, a new solution is created, normally by employing a random walk via Lévy flights. Equation (1) describes such a random walk, where x_i^{t+1} is the new solution, t corresponds to the iteration number, and the product ⊕ means entrywise multiplication. The α parameter is the step size, with α > 0, and determines how far the process can go in a fixed number of iterations. Then, at line 4, an Egg_j is randomly chosen to be compared with the previous one in order to keep the egg exhibiting the best cost. Finally, the worse nests are abandoned depending on the probability p_a and new solutions are built:

x_i^{t+1} = x_i^t + α ⊕ Lévy(λ),    Lévy ∼ u = t^{−λ},    (1 < λ ≤ 3).    (1)

4. Geometric Operators

The cuckoo search was originally designed for continuous domains, whereas Sudoku cells hold discrete values. Hence, a discretization phase for the CS algorithm is mandatory to correctly explore the potential solutions. The discretization phase applied here is inspired by the work reported in [8], where a particle swarm optimization algorithm is adapted to discrete domains. The idea relies on the use of geometric-based operators able to correctly move to


Input: Nest_size, α, λ
Output: Egg_best
(1) Nests ← GenerateInitialPopulation(Nest_size)
(2) While ¬ StopCondition do
(3)   Egg_i ← GetCuckooByLevyFlight(Nests, α, λ)
(4)   Egg_j ← ChooseRandomlyFrom(Nests)
(5)   If cost(Egg_i) ≤ cost(Egg_j)
(6)     Egg_j ← Egg_i
(7)   End If
(8)   Egg_best ← FindCurrentBest(Nests)
(9)   Nests ← AbandonWorseNests(p_a, Nests)
(10)  Nests ← BuildNewSolutions(Egg_best)
(11) End While

Algorithm 1: Cuckoo search.
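To fix ideas, a minimal continuous sketch of Algorithm 1 can be written in Python. The Pareto draw below is a common heavy-tailed stand-in for the Lévy(λ) step of (1), and all function names and default parameter values are illustrative assumptions of ours, not the authors' implementation (which was written in Octave):

```python
import random

def cuckoo_search(cost, dim, nest_size=15, pa=0.25, alpha=0.01,
                  lam=1.5, iters=200):
    """Minimal continuous cuckoo search (Algorithm 1) minimising `cost`."""
    nests = [[random.uniform(-5, 5) for _ in range(dim)]
             for _ in range(nest_size)]
    best = min(nests, key=cost)
    for _ in range(iters):
        # Line 3: generate a cuckoo egg by a Levy-style random walk;
        # paretovariate gives the heavy-tailed step lengths of Eq. (1).
        i = random.randrange(nest_size)
        step = [alpha * random.choice((-1, 1)) * random.paretovariate(lam - 1)
                for _ in range(dim)]
        egg = [x + s for x, s in zip(nests[i], step)]
        # Lines 4-7: replace a randomly chosen nest if the egg is better.
        j = random.randrange(nest_size)
        if cost(egg) <= cost(nests[j]):
            nests[j] = egg
        # Lines 8-10: abandon a fraction pa of the worst nests.
        nests.sort(key=cost)
        for k in range(nest_size - int(pa * nest_size), nest_size):
            nests[k] = [random.uniform(-5, 5) for _ in range(dim)]
        best = min(nests + [best], key=cost)
    return best
```

The tracked best egg can only improve across iterations, mirroring lines 5-8 of the pseudocode.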

Input: Parent1[], Parent2[]
Output: Child[]
(1) Child ← SelectRandomSegmentFrom(Parent1)
(2) ListVal ← SelectNotCopiedToChild(Parent2, Child)
(3) For Each val ∈ ListVal: lval ← val
(4)   v ← Parent1[IndexOf(val, Parent2)]
(5)   If IndexOf(v, Parent2) ∈ SegmentOf(Child)
(6)     val ← v; Go To (4)
(7)   Else
(8)     Child[IndexOf(v, Parent2)] ← lval
(9)   End If
(10) End For Each
(11) CopyRemaining(Child, Parent2)

Algorithm 2: PMX crossover.

promising regions of a discrete search space. In particular, for this work, we employ the partially matched crossover, the multiparental sorting crossover, and the feasible geometric mutation, which are described in the following.

4.1. Partially Matched Crossover. The partially matched crossover (PMX) works with two parents, creating two crossover points that are selected at random; PMX then proceeds by position swapping. The process is described in Algorithm 2. At the beginning, a segment from Parent1 is randomly selected and copied to the child. Then, looking at the same segment positions in Parent2, each value not yet copied to the child is stored in a list. Then, for each value val in this list, the value v is located in Parent1 at the position given by the index of val in Parent2. An if-else conditional then operates as follows: if the index of v in Parent2 is present in the original segment, v becomes the new val and the process goes back to line 4; otherwise, lval is inserted into the child at the position given by the index of v in Parent2. Finally, the remaining positions from Parent2 are copied to the child. The PMX operator applied to the Sudoku can be seen in Example 1.

Example 1 (PMX crossover). Let two rows located in the same position from different solutions of a Sudoku instance be the parents. The corresponding child produced by the PMX crossover is constructed as follows.

(1) A random segment of consecutive digits from Parent1 is copied to the child. Assuming that 1 corresponds to the index of the first position, the segment has size 5, from index 4 to 8:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   - - - 3 6 2 5 1 -          (2)

(2) "4" is the first value in the observed segment of Parent2 that is not present in the child. Then, val = 4, and the index of val in Parent2 is "5." Hence, the v value corresponds to "6." Next, the index of v in Parent2 is "7." This index exists in the observed segment, so the process comes back to line 4 using "6" as val:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   - - - 3 6 2 5 1 -          (3)


Input: Parent1[], Parent2[], Parent3[]
Output: Child[]
(1) Mask ← InitializeMask(); For Each i ∈ {1, ..., Mask_length}
(2)   If Mask[i] = 1
(3)     Child[i] ← Parent1[i]
(4)     Parent2 ← Swap(Parent1[i], Parent2, i)
(5)     Parent3 ← Swap(Parent1[i], Parent3, i)
(6)   Else If Mask[i] = 2
(7)     Child[i] ← Parent2[i]
(8)     Parent1 ← Swap(Parent2[i], Parent1, i)
(9)     Parent3 ← Swap(Parent2[i], Parent3, i)
(10)  Else
(11)    Child[i] ← Parent3[i]
(12)    Parent1 ← Swap(Parent3[i], Parent1, i)
(13)    Parent2 ← Swap(Parent3[i], Parent2, i)
(14)  End If
(15) End For Each

Algorithm 3: Multiparental sorting crossover.

(3) Now, using "6" as val, the new v is "5." The index of "5" in Parent2 also appears within the segment, so the process comes back again to line 4 using "5" as val:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   - - - 3 6 2 5 1 -          (4)

(4) Then, the v value is "2," and its index in Parent2 does not appear within the segment. Hence, we obtain a position in the child for the value "4" from step 2:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   - - 4 3 6 2 5 1 -          (5)

(5) "7" is the next value from Parent2 in the segment that is not already included in the child. Then, "1" is the v value, whose index does not appear within the segment either. Hence, a position for the value "7" is obtained in the child:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   - 7 4 3 6 2 5 1 -          (6)

(6) Now, everything else from Parent2 is copied down to the child:

Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Child:   9 7 4 3 6 2 5 1 8          (7)
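The mapping procedure of Algorithm 2 can be condensed into a short runnable Python function; the 0-based segment indices and the name `pmx_crossover` are our own choices, not the paper's notation:

```python
def pmx_crossover(p1, p2, start, end):
    """Partially matched crossover (Algorithm 2) on permutations p1, p2.

    The segment p1[start:end] (0-based, end exclusive) is copied to the
    child; each leftover value of p2's segment is placed by following the
    PMX position mapping until it exits the segment, and the remaining
    positions are filled from p2.
    """
    n = len(p1)
    child = [None] * n
    child[start:end] = p1[start:end]
    for val in p2[start:end]:
        if val in child:
            continue
        # Follow the mapping (lines 4-6 of Algorithm 2) until the target
        # index falls outside the copied segment.
        v = p1[p2.index(val)]
        while start <= p2.index(v) < end:
            v = p1[p2.index(v)]
        child[p2.index(v)] = val
    # Line 11: copy the remaining positions from p2.
    for i in range(n):
        if child[i] is None:
            child[i] = p2[i]
    return child
```

With the parents of Example 1 and the segment from index 4 to 8 (0-based 3 to 8), the function reproduces the child 9 7 4 3 6 2 5 1 8.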

4.2. Multiparental Sorting Crossover. This operator may employ multiple parents; our approach, based on [8], uses three, where each one represents a row of a potential solution: one from the best solution of all generations, one from the best solution of the current generation, and one from the current solution. Each parent is associated with a weight w according to (8) in order to control the relevance of each solution in the generation of the new one. The influence of the parents given by the weights is reflected in a mask used in the process. The multiparental crossover is described in Algorithm 3. The three parents and the mask are the input of the procedure, and the child resulting from the crossover is the output. A for-each loop is used to scan the mask, where every entry indicates which parent the other two parents need to be equal to for that specific position. The replacement depends on the value of the mask and is performed by swapping the corresponding values, as indicated in the conditionals stated at lines 2, 6, and 10. The swapping process is described in Algorithm 4:

w1 + w2 + w3 = 1, such that wi > 0 ∀i ∈ {1, 2, 3}.          (8)
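The mask generation of (8) together with Algorithms 3 and 4 can be sketched in runnable Python; the list-based row representation and all function names here are our own assumptions:

```python
import random

def make_mask(length=9, weights=(0.55, 0.33, 0.12)):
    # Eq. (8): entry i names the parent (1, 2, or 3) copied at position i,
    # drawn according to the parent weights w1, w2, w3.
    return random.choices((1, 2, 3), weights=weights, k=length)

def swap_to(parent, value, pos):
    # Algorithm 4: make parent[pos] hold `value` by swapping it in.
    if parent[pos] != value:
        i = parent.index(value)
        parent[i], parent[pos] = parent[pos], parent[i]

def multiparental_sorting_crossover(mask, p1, p2, p3):
    # Algorithm 3: copy the masked parent's value to the child and
    # re-sort the other two parents so that they agree at position i.
    parents = [list(p1), list(p2), list(p3)]
    child = []
    for i, m in enumerate(mask):
        value = parents[m - 1][i]
        child.append(value)
        for k in range(3):
            if k != m - 1:
                swap_to(parents[k], value, i)
    return child
```

With a mask consistent with Example 2 below, [3, 1, 2, 1, 1, 1, 2, 2, 3], the parents 8 4 7 3 6 2 5 1 9, 9 1 2 3 4 5 6 7 8, and 4 7 9 3 6 2 5 1 8 yield the child 4 8 2 3 6 7 9 5 1.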

Example 2 (multiparental sorting crossover). Let Parent1 be a row from the best solution of all generations, Parent2 a row from the best solution of the current generation, and Parent3 a row from the current solution. We employ 0.55 as the weight for Parent1, 0.33 for Parent2, and 0.12 for Parent3. Those weights determine the expected share of appearances of each parent within the mask; in the mask below, Parent1 appears four times, Parent2 three times, and Parent3 twice. The corresponding child produced by the multiparental sorting crossover is constructed as follows.

(1) A mask of parent length is randomly generated according to the parent weights:

Mask:    3 1 2 1 1 1 2 2 3
Parent1: 8 4 7 3 6 2 5 1 9
Parent2: 9 1 2 3 4 5 6 7 8
Parent3: 4 7 9 3 6 2 5 1 8          (9)

(2) The first value of the mask corresponds to parent "3," and then the first values of Parent1 and Parent2 need to be equal to the first value of Parent3, which is "4." To this end, in Parent1 and Parent2, the first cell is swapped with the cell that holds the value "4":

Mask:    3 1 2 1 1 1 2 2 3
Parent1: 4 8 7 3 6 2 5 1 9
Parent2: 4 1 2 3 9 5 6 7 8
Parent3: 4 7 9 3 6 2 5 1 8
Child:   4 - - - - - - - -          (10)

(3) Next, the second value from the mask is "1." The swapping process is analogous:

Mask:    3 1 2 1 1 1 2 2 3
Parent1: 4 8 7 3 6 2 5 1 9
Parent2: 4 8 2 3 9 5 6 7 1
Parent3: 4 8 9 3 6 2 5 1 7
Child:   4 8 - - - - - - -          (11)

(4) Following the same procedure, the last step is shown below, obtaining 4 8 2 3 6 7 9 5 1 as the new child:

Mask:    3 1 2 1 1 1 2 2 3
Parent1: 4 8 2 3 6 7 9 5 1
Parent2: 4 8 2 3 6 7 9 5 1
Parent3: 4 8 2 3 6 7 9 5 1
Child:   4 8 2 3 6 7 9 5 1          (12)

Input: value, Parent[], pos
Output: Parent[]
(1) If Parent[pos] ≠ value
(2)   For Each i ∈ {1, ..., Parent_length}
(3)     If Parent[i] = value
(4)       aux ← Parent[i]
(5)       Parent[i] ← Parent[pos]
(6)       Parent[pos] ← aux
(7)     End If
(8)   End For Each
(9) End If

Algorithm 4: Swap.

4.3. Feasible Geometric Mutation. This is a simple operator used to maintain diversity in the solutions. It swaps two nonfixed elements in a row, guaranteeing that mutation is applied only over the cells with no given value. The procedure is described in Algorithm 5.

Input: initSol[][], sol[][]
Output: sol[][]
(1) rows ← ChooseRowsRandomly(initSol)
(2) For Each row ∈ rows
(3)   pos1, pos2 ← ChooseTwoEmptyCells(row)
(4)   aux ← sol[row][pos1]
(5)   sol[row][pos1] ← sol[row][pos2]
(6)   sol[row][pos2] ← aux
(7) End For Each

Algorithm 5: Feasible geometric mutation.

Example 3 (feasible geometric mutation). Let us consider a given Sudoku row and a solution candidate row, as shown below:

Sudoku problem row:     - - - 3 - - - - -
Solution candidate row: 9 7 4 3 6 2 5 1 8          (13)

The mutation is allowed in any cell except for cell 4, which holds the value 3 as a given of the Sudoku instance. Examples of allowed and forbidden mutations are depicted below:

allowed mutation:   9 6 4 3 7 2 5 1 8
forbidden mutation: 9 7 4 4 6 2 5 1 8          (14)

5. Prefiltering Phase

In the presence of unfeasible solutions, the cuckoo procedure is responsible for detecting and discarding them in order to conduct the search to feasible regions of the space of solutions. The goal of the prefiltering phase is to alleviate the work of the cuckoo algorithm by eliminating those unfeasible values beforehand. This is possible by representing the Sudoku as a constraint network [4] and then applying efficient filtering techniques from the constraint satisfaction domain. In this context, arc-consistency [2] is a widely employed local consistency for filtering algorithms. Arc-consistency was initially defined for binary constraints [21, 22]. We employ here the more general filtering for non-binary constraints named generalized arc-consistency (GAC). The idea is to enforce a local consistency on the problem in a process called constraint propagation. Before detailing this process, let us introduce some necessary definitions [23].

Definition 4 (constraint). A constraint c is a relation defined on a sequence of variables X(c) = (x_i1, ..., x_i|X(c)|), called the scheme of c; c is the subset of Z^|X(c)| that contains the combinations of tuples τ ∈ Z^|X(c)| that satisfy c. |X(c)| is called the arity of c. A constraint c with scheme X(c) = (x_1, ..., x_k) is also noted as c(x_1, ..., x_k).

Definition 5 (constraint network). A constraint network, also known as a constraint satisfaction problem (CSP), is defined by a triple N = ⟨X, D, C⟩, where

(i) X is a finite sequence of integer variables X = (x_1, ..., x_n);
(ii) D is the corresponding set of domains for X; that is, D = D(x_1) × ⋅⋅⋅ × D(x_n), where D(x_i) ⊂ Z is the finite set of values that variable x_i can take;
(iii) C is a set of constraints C = {c_1, ..., c_e}, where the variables in X(c_j) are in X.

Example 6 (the Sudoku as a constraint network). Let ⟨X, D, C⟩ be the constraint network, which is composed of the following.

(i) X = (x_1,1, ..., x_n,m) is the sequence of variables, and x_i,j ∈ X identifies the cell placed in the ith row and jth column of the Sudoku matrix, for i = 1, ..., n and j = 1, ..., m.
(ii) D is the corresponding set of domains, where D(x_i,j) ∈ D is the domain of the variable x_i,j.
(iii) C is the set of constraints defined as follows.
  (a) To ensure that values are different in rows and columns: x_k,i ≠ x_k,j ∧ x_i,k ≠ x_j,k, for all (k ∈ [1, 9], i ∈ [1, 9], j ∈ [i + 1, 9]).
  (b) To ensure that values are different in subgrids: x_(k1−1)∗3+k2,(j1−1)∗3+j2 ≠ x_(k1−1)∗3+k3,(j1−1)∗3+j3, for all (k1, j1, k2, j2, k3, j3 ∈ [1, 3] | k2 ≠ k3 ∧ j2 ≠ j3).

Definition 7 (projection). A projection of c on Y is denoted as π_Y(c), which defines the relation with scheme Y that contains the tuples that can be extended to a tuple on X(c) satisfying c.

Definition 8 ((generalized) arc-consistency). Given a network N = ⟨X, D, C⟩, a constraint c ∈ C, and a variable x_i ∈ X(c),

(i) a value v_i ∈ D(x_i) is consistent with c in D if and only if there exists a valid tuple τ satisfying c such that v_i = τ[{x_i}]; such a tuple is called a support for (x_i, v_i) on c;
(ii) the domain D is (generalized) arc-consistent on c for x_i if and only if all the values in D(x_i) are consistent with c in D; that is, D(x_i) ⊆ π_x_i(c ∩ π_X(c)(D));
(iii) the network N is (generalized) arc-consistent if and only if D is (generalized) arc-consistent for all variables in X on all constraints in C.

The filtering process is achieved by enforcing arc-consistency on the problem. This can be carried out by using Algorithms 6 and 7. The idea is to revise the arcs (the constraint relations between variables) by removing the values from D(x_i) that lead to inconsistencies with respect to a given constraint. Such a revision process is done by Algorithm 6, which takes each value v_i ∈ D(x_i) (line 2) and analyses the space τ ∈ c ∩ π_X(c)(D), searching for a support on constraint c (line 3). If no support exists, the value v_i is eliminated from D(x_i). Finally, the procedure reports whether the domain has been modified by returning the corresponding Boolean value (line 8).

The role of Algorithm 7 is to guarantee that every domain is arc-consistent. This is done by iteratively revising the arcs through calls to Algorithm 6. At the beginning, a list called Q is filled with pairs (x_i, c) such that x_i ∈ X(c). Only the pairs for which D(x_i) is not ensured to be arc-consistent are kept, in order to avoid useless calls to Algorithm 6. This is a main advantage of AC3 with respect to its predecessors, Algorithms AC1 and AC2. Then, at line 2, a while statement controls the calls to Revise3. If a true value is received from Revise3, D(x_i) is verified, and if no value remains within the domain, the algorithm returns false. If there still exist values in D(x_i), normally a value for another variable x_j has lost its support on c. Hence, the list Q must be refilled with the corresponding pairs (x_j, c). The process finishes and returns true when all remaining values within domains are arc-consistent with respect to all constraints.

Example 9 (enforcing AC3 on a Sudoku puzzle). Let us exemplify the work of the prefiltering phase by enforcing AC3 on three constraints of a Sudoku instance. We begin by enforcing AC3 on a given subgrid (enclosed with dashed lines in Figure 2) of the Sudoku puzzle. The subgrid has three variables with no value assigned (x_4,9, x_5,8, and x_6,9).
Then, enforcing AC3 via the GAC3 algorithm with respect to the subgrid constraint (second constraint from Example 6) leads to the elimination of six values from D(x_4,9), D(x_5,8), and D(x_6,9). Those values have no support on the verified constraint; that is, they have already been taken by another cell of the same subgrid. Thus, the original domains of the three variables are reduced to {5, 6, 8}. In Figure 3, AC3 is enforced on a row of the puzzle. This row has four variables (x_5,2, x_5,4, x_5,6, and x_5,8) with no assigned value. The GAC3 algorithm filters from domains five values with no support from D(x_5,2), D(x_5,4), and D(x_5,6). The variable x_5,8 has been refiltered, leaving only two possible values. Finally, in Figure 4, four values are filtered from four variables, leaving only one possible value for variable x_5,8.

Figure 2: Enforcing AC3 to a subgrid constraint.

Input: x_i, c
Output: CHANGE
(1) CHANGE ← false
(2) Foreach v_i ∈ D(x_i) do
(3)   If ∄τ ∈ c ∩ π_X(c)(D) with τ[x_i] = v_i do
(4)     remove v_i from D(x_i)
(5)     CHANGE ← true
(6)   End If
(7) End Foreach
(8) Return CHANGE

Algorithm 6: Revise3.

Input: X, D, C
Output: Boolean
(1) Q ← {(x_i, c) | c ∈ C, x_i ∈ X(c)}
(2) While Q ≠ ∅ do
(3)   select and remove (x_i, c) from Q
(4)   If Revise3(x_i, c) then
(5)     If D(x_i) = ∅ then
(6)       Return false
(7)     Else
(8)       Q ← Q ∪ {(x_j, c′) | c′ ∈ C ∧ c′ ≠ c ∧ x_i, x_j ∈ X(c′) ∧ j ≠ i}
(9)     End If
(10)  End If
(11) End While
(12) Return true

Algorithm 7: AC3/GAC3.
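The constraint network of Example 6 and the propagation loop of Algorithms 6 and 7 can be sketched for the binary ≠ case as follows. The dictionary-of-sets domain representation is an assumption of ours, and the support test is simply Revise3 specialised to a ≠ constraint, where a value v_i loses its support on x_i ≠ y exactly when D(y) is the singleton {v_i}:

```python
def sudoku_constraints():
    """All binary != constraints of Example 6: one pair per two cells
    sharing a row, a column, or a 3x3 subgrid (cells are (row, col), 1-based)."""
    cells = [(i, j) for i in range(1, 10) for j in range(1, 10)]
    def same_unit(a, b):
        return (a[0] == b[0] or a[1] == b[1]
                or ((a[0] - 1) // 3, (a[1] - 1) // 3)
                == ((b[0] - 1) // 3, (b[1] - 1) // 3))
    return [(a, b) for k, a in enumerate(cells)
            for b in cells[k + 1:] if same_unit(a, b)]

def gac3_neq(domains, constraints):
    """AC3 (Algorithm 7) over binary != constraints.

    `domains` maps variable -> set of ints; `constraints` lists pairs
    (x, y) meaning x != y.  Returns False on a domain wipe-out.
    """
    queue = [(x, c) for c in constraints for x in c]
    while queue:
        xi, c = queue.pop()
        other = c[1] if xi == c[0] else c[0]
        # Revise3 for !=: vi is unsupported only when D(other) == {vi}.
        removed = {vi for vi in domains[xi] if domains[other] == {vi}}
        if removed:
            domains[xi] -= removed
            if not domains[xi]:
                return False
            # Line 8: requeue the arcs of xi's neighbouring variables.
            queue += [(z, c2) for c2 in constraints
                      for z in c2 if xi in c2 and z != xi]
    return True
```

On a tiny network with D(a) = {1, 2, 3}, D(b) = {2}, D(c) = {3} and pairwise ≠ constraints, the loop prunes D(a) down to {1}, mirroring the domain reductions of Example 9.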

6. The Prefiltered Cuckoo Search via Geometric Operators

The cuckoo search proposed here combines geometric operators with prefiltering phases. The goal is to enhance the performance of the cuckoo search as well as to allow it to correctly explore a discrete search space. Algorithm 8 illustrates the new hybrid algorithm. The input set is now considerably larger: it considers the size of the nest, the constraint network ⟨X, D, C⟩ representing the Sudoku problem, and two parameters that define probabilities for regulating the usage of the geometric operators. As output, the procedure returns the best egg reached by the algorithm. The prefiltering phase via the AC3 algorithm is triggered at the beginning. The AC3 algorithm receives as input the constraint network ⟨X, D, C⟩ of the Sudoku and reduces, if possible, the set of domains D by deleting the unfeasible values. Then, an initial population of nests is generated but, unlike the classic cuckoo search, the generation is bounded to the reduced set of domains D. Between lines 3 and 13, a while loop manages the iteration process until the stop condition is reached, which for the current implementation corresponds to a maximum number of iterations. At line 4, an Egg_i is randomly chosen from the nests, to which the geometric operators are then applied. The usage of the geometric operators is illustrated in Algorithm 9, where they apply only if the evaluated egg is not the best one. The PMX and multiparental sorting crossovers act depending on a random value and on the P_pmx-multi probability. Analogously, the mutation operates using the P_mutate probability, and D is used as input of the mutation to validate that only feasible mutations are carried out. At line 6, an Egg_j is randomly chosen to be compared with the previous one in order to keep the one exhibiting the best cost. The cost of a solution corresponds to the sum of wrong values in subgrids, columns, and rows (see Figure 5). At line 11, the worse nests are abandoned depending on the probability p_a. Finally, new solutions are built, but again, the set of filtered domains D is considered in order to avoid unfeasible solutions.

Input: Nest_size, X, D, C, P_pmx-multi, P_mutate
Output: Egg_best
(1) D ← AC3(X, D, C)
(2) Nests ← GenerateInitialPopulation(Nest_size, D)
(3) While ¬ StopCondition do
(4)   Egg_i ← GetRandomlyFrom(Nests)
(5)   Egg_i ← GeometricOperators(P_pmx-multi, P_mutate)
(6)   Egg_j ← ChooseRandomlyFrom(Nests)
(7)   If cost(Egg_i) ≤ cost(Egg_j)
(8)     Egg_j ← Egg_i
(9)   End If
(10)  Egg_best ← FindCurrentBest(Nests)
(11)  Nests ← AbandonWorseNests(p_a, Nests)
(12)  Nests ← BuildNewSolutions(Egg_best, D)
(13) End While

Algorithm 8: Prefiltered discrete cuckoo search.

Input: Egg, Egg_best, Egg_lastbest, P_pmx-multi, P_mutate
Output: Egg
(1) Foreach Egg ∈ Nests
(2)   If (Egg ≠ Egg_best)
(3)     If (rand() < P_pmx-multi)
(4)       Egg ← PMXCrossover(Egg, Egg_best)
(5)     Else
(6)       Egg ← MultiParentalSortingCrossover(Egg, Egg_best, Egg_lastbest)
(7)     End If
(8)     If (rand() < P_mutate)
(9)       Egg ← FeasibleGeometricMutation(D, Egg)
(10)    End If
(11)  End If
(12) End Foreach

Algorithm 9: Geometric operators.
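The cost measure described above, the sum of wrong values in rows, columns, and subgrids as in Figure 5, might be sketched as follows; counting duplicates per unit via set sizes is our own choice of implementation:

```python
def cost(board):
    """Cost of a 9x9 candidate Sudoku: for every row, column and 3x3
    subgrid, count the cells that repeat an earlier value in that unit.
    A valid solution has cost 0."""
    rows = [list(r) for r in board]
    cols = [[board[r][c] for r in range(9)] for c in range(9)]
    boxes = [[board[3 * br + i][3 * bc + j]
              for i in range(3) for j in range(3)]
             for br in range(3) for bc in range(3)]
    # Each unit of 9 distinct digits contributes 0; every duplicate adds 1.
    return sum(9 - len(set(unit)) for unit in rows + cols + boxes)
```

Swapping two distinct values inside a row keeps the row valid but breaks the two columns involved, so the cost of such a board becomes positive, which is exactly what the comparison at line 7 of Algorithm 8 exploits.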

7. Experimental Results Different experiments have been performed in order to validate our approach. The Sudoku benchmarks used have been taken from [24], which are organized in three difficulty levels: easy, medium, and hard. All tested Sudokus have a unique solution. The algorithms have been implemented in Octave 3.6.3, and the experiments have been performed on a 2.0 GHz Intel Core2 Duo T5870 with 1 Gb RAM running Fedora 17. The configuration of the proposed cuckoo search is the following, which corresponds to the best one achieved after a tuning phase: 𝑃𝑎 = 0.25, 𝑃𝑝𝑚𝑥-𝑚𝑢𝑙𝑡𝑖 = 0.7, 𝑃𝑚𝑢𝑡𝑎𝑡𝑒 = 0.9, and 𝑁𝑒𝑠𝑡𝑠𝑖𝑧𝑒 = 10. Table 1 illustrates the results of solving 9 problems, 3 from each difficulty level, by using the proposed cuckoo search algorithm considering 10000 iterations. From left to right, the table states the number of tries performed, the number of tries solved, the minimum solving time reached, the average

The Scientific World Journal

9

Table 1: Solving Sudokus with the prefiltered discrete cuckoo search considering 10000 iterations.

Easy a Easy b Easy c Medium a Medium b Medium c Hard a Hard b Hard c

Tries

Solved

Min. solving time (sec)

𝑥 solving time (sec)

Max. solving time (sec)

SD time (sec)

50 50 50 50 84 67 97 62 71

50 50 50 50 50 50 50 50 50

1.302 1.005 1.019 2.501 38.32 26.807 385.652 49.608 9.497

1.310 1.007 1.021 384.341 729.673 800.471 2059.690 1484.692 771.211

1.329 1.029 1.036 1171.312 1932.691 1923.442 3777.340 6810.882 3561.042

0.004 0.003 0.003 291.336 516.867 561.877 984.921 1473.731 825.516

2

5

1

7

4

{1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

2

5 9

4

7

9

8

7

1 3

3

3

4

6

1

3

4

9

1

8 8

8

9

2

2

{5 , 6 , 8 }

4

9

2

1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9}

6

7

{1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

8

Figure 3: Enforcing AC3 to a row constraint.

2 1

{1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

5 7

4

2

5 9

4 8 3

7 3

3

4

6

2

2 9 7

3 {5 , 6 }

9 1

1

8 8

{1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

7

9 1

2

{1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

4 2

6

6 {1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 , 9 }

8

Figure 4: Enforcing AC3 to a column constraint.

solving time, the maximum solving time, and the standard deviation (SD). Regarding the easy Sudokus, the proposed cuckoo search is able to rapidly solve them reaching a 100% of success (50 out of 50 tries solved). Then, increasing one difficulty level, the percentage of success shortly diminishes, keeping the 100% of success for the “medium a” benchmark. Finally, for hard Sudokus, the runtime naturally gets bigger (see Figure 6); however, the performance is reasonable,

reaching a 51% (50 out of 97 tries solved) of success for “hard a,” 80% (50 out of 62 tries solved) of success for “hard b,” and a 70% (50 out of 71 tries solved) of success for “hard c.” In Table 2, the performance of the proposed cuckoo search is compared with the best-performing incomplete methods reported in the literature: a genetic algorithm (GA) [1] and a hybrid tabu search (hybrid TS) [20]. We contrast the number of problems solved from a total of 30 tries

10

The Scientific World Journal

Table 2: Comparing the prefiltered discrete cuckoo search with the best-performing incomplete methods for Sudokus considering 100000 and unlimited iterations. Prefiltered discrete CS Unlimited 100000 iterations iterations

Problem Easy a Easy b Easy c Medium a Medium b Medium c Hard a Hard b Hard c

30 30 30 30 30 30 30 30 30

Hybrid TS

GA

Unlimited iterations

100000 iterations

Unlimited iterations

100000 iterations

30 30 30 30 30 30 30 30 30

30 30 30 30 30 30 30 30 30

30 30 30 30 — — 30 — —

29 30 30 10 — — 2 — —

30 30 30 30 30 30 30 30 30

[Figure: two candidate 9×9 grids; the left one contains conflicting digits and has cost 6, while the right one satisfies all row, column, and region constraints and has cost 0.]

Figure 5: Solution cost of a Sudoku puzzle.
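A plausible way to compute the cost illustrated in Figure 5 is to count, for every row, column, and 3×3 region, how many cells repeat an already used digit; a conflict-free grid then scores 0. The sketch below shows one common fitness definition for Sudoku metaheuristics, not necessarily the exact function used by the authors, and the sample grid is a standard valid solution rather than the one from the figure:

```python
def sudoku_cost(grid):
    """Sum, over all 27 units, the number of duplicated digits."""
    def unit_cost(cells):
        return len(cells) - len(set(cells))  # duplicates beyond the first

    cost = 0
    for r in range(9):                                   # rows
        cost += unit_cost([grid[r][c] for c in range(9)])
    for c in range(9):                                   # columns
        cost += unit_cost([grid[r][c] for r in range(9)])
    for br in range(0, 9, 3):                            # 3x3 regions
        for bc in range(0, 9, 3):
            cost += unit_cost([grid[br + i][bc + j]
                               for i in range(3) for j in range(3)])
    return cost

# A standard valid solution grid (not the one from Figure 5).
solution = [
    [5, 3, 4, 6, 7, 8, 9, 1, 2],
    [6, 7, 2, 1, 9, 5, 3, 4, 8],
    [1, 9, 8, 3, 4, 2, 5, 6, 7],
    [8, 5, 9, 7, 6, 1, 4, 2, 3],
    [4, 2, 6, 8, 5, 3, 7, 9, 1],
    [7, 1, 3, 9, 2, 4, 8, 5, 6],
    [9, 6, 1, 5, 3, 7, 2, 8, 4],
    [2, 8, 7, 4, 1, 9, 6, 3, 5],
    [3, 4, 5, 2, 8, 6, 1, 7, 9],
]
# sudoku_cost(solution) == 0; swapping two distinct digits within a row
# keeps the row conflict-free but introduces column conflicts.
```

Under this definition, the search minimizes the cost until it reaches 0, at which point the candidate grid is a valid solution.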

taking into account both unlimited iterations and 100000 iterations. The results show that all three techniques easily succeed on the first Sudoku level. For medium Sudokus, the GA begins to lose ground, solving one medium benchmark with only 33% success (10 out of 30 tries solved), while the cuckoo search and the hybrid TS keep their 100% success. Finally, for hard Sudokus, the cuckoo search and the hybrid TS perform considerably better than the GA, which solves only 2 out of 30 tries, whereas both our proposal and the hybrid TS reach 100% success.

[Figure: bar chart of the minimum, average, and maximum solving times in seconds for each benchmark, from “easy a” to “hard c,” on a scale up to 8000 s.]

Figure 6: Comparing solving times for Sudoku.

8. Conclusions and Future Work

In this paper, we have presented a prefiltered cuckoo search algorithm tuned with geometric operators. The geometric operators drive the search toward promising regions of a discrete solution space, while the prefiltering phase removes in advance the domain values that cannot lead to any feasible solution. In this way, the work of the metaheuristic is alleviated, leading to a faster solving process. We have performed a set of experiments to compare our approach with the best-performing approximate methods reported in the literature, considering Sudokus of three difficulty levels: easy, medium, and hard. For easy Sudokus, the proposed cuckoo search reaches 100% success.

When solving medium-difficulty Sudokus, the cuckoo search keeps its 100% success rate, while the best reported GA solves only 10 out of 30 tries. Finally, for hard Sudokus, the cuckoo search competes closely with the best incomplete method reported for Sudokus, both reaching 100% success within 100000 iterations. We envision several directions for future work; perhaps the clearest is the introduction of prefiltering phases into additional metaheuristics, such as particle swarm optimization or ant and bee colony algorithms, to solve Sudokus or other combinatorial problems. Evaluating the behaviour of geometric operators in other swarm-based metaheuristics is another interesting line of work. Finally, the use of autonomous search [25–28] for the self-tuning of a metaheuristic interacting with prefiltering phases is also an appealing research direction to pursue.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

Ricardo Soto is supported by Grant CONICYT/FONDECYT/INICIACION/11130459, Broderick Crawford is supported by Grant CONICYT/FONDECYT/REGULAR/1140897, and Fernando Paredes is supported by Grant CONICYT/FONDECYT/REGULAR/1130455.

References

[1] T. Mantere and J. Koljonen, “Solving, rating and generating sudoku puzzles with GA,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC ’07), pp. 1382–1389, IEEE Computer Society, Singapore, September 2007.
[2] F. Rossi, P. van Beek, and T. Walsh, Handbook of Constraint Programming, Elsevier, 2006.
[3] T. K. Moon and J. H. Gunther, “Multiple constraint satisfaction by belief propagation: an example using sudoku,” in Proceedings of the IEEE Mountain Workshop on Adaptive and Learning Systems, pp. 122–126, Logan, Utah, USA, July 2006.
[4] H. Simonis, “Sudoku as a constraint problem,” in Proceedings of the 4th International Workshop on Modelling and Reformulating Constraint Satisfaction Problems, pp. 13–27, Barcelona, Spain, 2005.
[5] I. Lynce and J. Ouaknine, “Sudoku as a SAT problem,” in Proceedings of the International Symposium on Artificial Intelligence and Mathematics (ISAIM ’06), Fort Lauderdale, Fla, USA, 2006.
[6] A. Moraglio, J. Togelius, and S. Lucas, “Product geometric crossover for the sudoku puzzle,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC ’06), pp. 470–476, IEEE Computer Society, Vancouver, Canada, July 2006.
[7] R. Lewis, “Metaheuristics can solve sudoku puzzles,” Journal of Heuristics, vol. 13, no. 4, pp. 387–401, 2007.
[8] A. Moraglio and J. Togelius, “Geometric particle swarm optimization for the sudoku puzzle,” in Proceedings of the 9th Annual Genetic and Evolutionary Computation Conference (GECCO ’07), pp. 118–125, ACM Press, July 2007.
[9] T. Mantere and J. Koljonen, “Solving and analyzing sudokus with cultural algorithms,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC ’08), pp. 4053–4060, IEEE Computer Society, Hong Kong, June 2008.
[10] M. Asif and R. Baig, “Solving NP-complete problem using ACO algorithm,” in Proceedings of the International Conference on Emerging Technologies (ICET ’09), pp. 13–16, IEEE Computer Society, Islamabad, Pakistan, October 2009.
[11] T. K. Moon, J. H. Gunther, and J. J. Kupin, “Sinkhorn solves sudoku,” IEEE Transactions on Information Theory, vol. 55, no. 4, pp. 1741–1746, 2009.
[12] G. Santos-García and M. Palomino, “Solving sudoku puzzles with rewriting rules,” Electronic Notes in Theoretical Computer Science, vol. 176, no. 4, pp. 79–93, 2007.
[13] J. Gunther and T. K. Moon, “Entropy minimization for solving sudoku,” IEEE Transactions on Signal Processing, vol. 60, no. 1, pp. 508–513, 2012.
[14] X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC ’09), pp. 210–214, IEEE, Coimbatore, India, December 2009.
[15] X.-S. Yang and S. Deb, “Multiobjective cuckoo search for design optimization,” Computers & Operations Research, vol. 40, no. 6, pp. 1616–1624, 2013.
[16] P. R. Srivastava, A. Varshney, P. Nama, and X.-S. Yang, “Software test effort estimation: a model based on cuckoo search,” International Journal of Bio-Inspired Computation, vol. 4, no. 5, pp. 278–285, 2012.
[17] M. K. Marichelvam, “An improved hybrid cuckoo search (IHCS) metaheuristics algorithm for permutation flow shop scheduling problems,” International Journal of Bio-Inspired Computation, vol. 4, no. 4, pp. 200–205, 2012.
[18] A. Gherboudj, A. Layeb, and S. Chikhi, “Solving 0-1 knapsack problems by a discrete binary version of cuckoo search algorithm,” International Journal of Bio-Inspired Computation, vol. 4, no. 4, pp. 229–236, 2012.
[19] B. Crawford, M. Aranda, C. Castro, and E. Monfroy, “Using constraint programming to solve sudoku puzzles,” in Proceedings of the 3rd International Conference on Convergence and Hybrid Information Technology (ICCIT ’08), pp. 926–931, IEEE Computer Society, Busan, Republic of Korea, November 2008.
[20] R. Soto, B. Crawford, C. Galleguillos, E. Monfroy, and F. Paredes, “A hybrid AC3-tabu search algorithm for solving sudoku puzzles,” Expert Systems with Applications, vol. 40, no. 15, pp. 5817–5821, 2013.
[21] A. K. Mackworth, “Consistency in networks of relations,” Artificial Intelligence, vol. 8, no. 1, pp. 99–118, 1977.
[22] A. Mackworth, “On reading sketch maps,” in Proceedings of the 5th International Joint Conference on Artificial Intelligence (IJCAI ’77), pp. 598–606, 1977.
[23] C. Bessière, Handbook of Constraint Programming, Elsevier, 2006.
[24] T. Mantere and J. Koljonen, “Sudoku research page,” 2008, http://lipas.uwasa.fi/~timan/sudoku/.
[25] B. Crawford, R. Soto, E. Monfroy, W. Palma, C. Castro, and F. Paredes, “Parameter tuning of a choice-function based hyperheuristic using particle swarm optimization,” Expert Systems with Applications, vol. 40, no. 5, pp. 1690–1695, 2013.
[26] E. Monfroy, C. Castro, B. Crawford, R. Soto, F. Paredes, and C. Figueroa, “A reactive and hybrid constraint solver,” Journal of Experimental and Theoretical Artificial Intelligence, vol. 25, no. 1, pp. 1–22, 2013.
[27] R. Soto, B. Crawford, E. Monfroy, and V. Bustos, “Using autonomous search for generating good enumeration strategy blends in constraint programming,” in Proceedings of the 12th International Conference on Computational Science and Its Applications (ICCSA ’12), vol. 7335 of Lecture Notes in Computer Science, pp. 607–617, Springer, Berlin, Germany, 2012.
[28] B. Crawford, R. Soto, C. Castro, and E. Monfroy, “Extensible CP-based autonomous search,” in Proceedings of the HCI International, vol. 173 of Communications in Computer and Information Science, pp. 561–565, Springer, Berlin, Germany, 2011.
