[gecode-users] Restarting search with a new constrain()

Francesco Santini francesco.santini at dmi.unipg.it
Sat Mar 14 13:00:27 CET 2015


Dear all,

my problem has several partially ordered best solutions, all of which I would like to find exhaustively.
They are partially ordered because they are sets with certain properties, and none of these sets is included in another.
Each element can either be taken into such a set or not: hence, I use Boolean variables (1 means I include this element in my solution, 0 means I do not).

I was thinking of using an array of Boolean variables and posting constraints on them to enforce the properties I require (e.g., "I can't take this if I take that").
Then I would channel these variables to a set variable.
Then I would define the constrain() function so that each successive solution has to be a -superset- of the previous one.
In this way, I rapidly climb up to one of my best solutions.
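To make this concrete, here is a minimal sketch of the model I have in mind (the class name Selection, the problem size, and the example exclusion constraint are placeholders, not my real problem; Gecode 4.x API):

  #include <gecode/int.hh>
  #include <gecode/set.hh>
  #include <gecode/search.hh>

  using namespace Gecode;

  class Selection : public Space {
  public:
    BoolVarArray x;  // x[i] = 1 iff element i is in the solution set
    SetVar s;        // set view, channelled to x

    Selection(int n)
      : x(*this, n, 0, 1), s(*this, IntSet::empty, 0, n-1) {
      channel(*this, x, s);               // s = { i | x[i] = 1 }
      // placeholder exclusion: "I can't take 0 if I take 1"
      rel(*this, x[0], BOT_AND, x[1], 0);
      branch(*this, x, INT_VAR_NONE(), INT_VAL_MAX());
    }

    Selection(bool share, Selection& other) : Space(share, other) {
      x.update(*this, share, other.x);
      s.update(*this, share, other.s);
    }

    virtual Space* copy(bool share) {
      return new Selection(share, *this);
    }

    // each successive solution must be a strict superset of the incumbent
    virtual void constrain(const Space& _b) {
      const Selection& b = static_cast<const Selection&>(_b);
      BoolVarArgs outside;
      for (int i = 0; i < x.size(); i++)
        if (b.x[i].val() == 1)
          rel(*this, x[i], IRT_EQ, 1);    // keep every element already chosen
        else
          outside << x[i];
      linear(*this, outside, IRT_GQ, 1);  // and take at least one new element
    }
  };
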

However, restarting the search with the same space does not help: since the best solutions are pairwise incomparable, none of the other best solutions is a superset of the first one, so the search stops after the first best solution. But, again, I would like to obtain all of them.

I was thinking of -copying- the first clean space, in which constrain() has not yet added any superset constraints, by calling "virtual Space* copy(bool share)".
Then I would run a new search over this clean copy, posting a new constraint that excludes (all subsets of) the first best solution. And so on at each successive step, taking all the previously found best solutions into account.

Is this the right/fastest way to do it?
That is, a loop that runs until s.next() returns NULL, each time starting again from the first clean space s, but also removing from s all (the subsets of) the previous best solutions.
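In code, the loop I am imagining would look roughly like this (reusing the hypothetical Selection class from the sketch above; clone() is what actually invokes copy()):

  #include <vector>

  // sketch of the restart loop: enumerate all maximal solutions by
  // re-running BAB on a clone of the clean master space, each time
  // forbidding (all subsets of) the best solutions found so far
  int main() {
    const int n = 8;                        // placeholder problem size
    Selection* master = new Selection(n);   // clean space, no constrain() yet
    std::vector< std::vector<int> > bests;

    while (true) {
      Selection* root = static_cast<Selection*>(master->clone());
      // a solution avoids being a subset of a previous best iff it
      // takes at least one element outside that best
      for (unsigned int k = 0; k < bests.size(); k++) {
        BoolVarArgs outside;
        for (int i = 0; i < n; i++)
          if (bests[k][i] == 0)
            outside << root->x[i];
        linear(*root, outside, IRT_GQ, 1);
      }
      BAB<Selection> e(root);
      delete root;                          // the engine works on its own clone
      Selection* best = NULL;
      while (Selection* sol = e.next()) {   // the last solution is maximal
        delete best;
        best = sol;
      }
      if (best == NULL)
        break;                              // no further maximal solution
      std::vector<int> v(n);
      for (int i = 0; i < n; i++)
        v[i] = best->x[i].val();
      bests.push_back(v);
      delete best;
    }
    delete master;
    return 0;
  }

One detail that seems to come out right: if a previous best is the full set, the outside array is empty and the linear constraint fails immediately, so the loop terminates.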

Thanks for your help,
Francesco




==================================
Dr. Francesco Santini
Institute of Informatics and Telematics - CNR
Security group: http://security.iit.cnr.it
Via Moruzzi, 1
56124 Pisa, Italy
http://sites.google.com/site/santinifrancesco/
==================================





