Weight range for CQM constraints

The new release of the CQM solver accepts a weight in the cqm.add_constraint(some_constraint, weight=?) method. What is the allowed range for this parameter?

That is, what are the minimum and maximum possible values for the weight parameter of the CQM's add_constraint() method?
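For reference, here is a minimal sketch of the call in question (an illustration only; the variables, objective, label, and weight value are placeholders, not recommendations):

    # Adding a *soft* constraint to a CQM by passing a weight to add_constraint().
    from dimod import Binary, ConstrainedQuadraticModel

    x, y = Binary("x"), Binary("y")

    cqm = ConstrainedQuadraticModel()
    cqm.set_objective(-2 * x - 3 * y)

    # With weight=None (the default) the constraint is hard; a positive float
    # makes it soft, so violations are penalized rather than forbidden.
    cqm.add_constraint(x + y <= 1, label="at_most_one", weight=5.0, penalty="linear")

    # Submission requires Leap access, e.g.:
    # from dwave.system import LeapHybridCQMSampler
    # sampleset = LeapHybridCQMSampler().sample_cqm(cqm, label="soft-constraint demo")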


Comments

2 comments
  • Hi.

    According to Joel's video lecture (Joel has since left D-Wave), you start with some conservative value, say a single-digit one, and work up or down from there. It really depends on how out of balance the objective terms are. For an optimization of two objective functions, for example, you could first try:

    objective = A*obj_1 + B*obj_2

    Here, the goal is to have both terms "compete" against each other. That said, you don't want A = B: you won't get a minimum-energy (ground) state that way, and the eigenspectrum will be bogus.

    For a simple QM, if I didn't have any other way to know where to start, I'd intuitively start with:

    A=1, B=2 

    Submit to the solver, e.g., LeapHybridCQMSolver, and interpret your results. I'd then reverse the numbers:

    A=2, B=1 

    Submit again, compare, and interpret.

    /////////////////////////////////////////////////////

    For my project, I had this simple setup:

    QM = objective + L*constraint,   where L = Lambda, the "fudge" factor.

    I found a "sweet spot", "L"ambda, for one of my problems in the project to be = 5. Ie., Lambda=5.

    A related problem in the same project reached its lowest energy at L = 15.

    I covered the cases L = 1, 2, 3, 6, 8, 15, and 30 for both of my related test suites (a runnable sketch of this kind of sweep appears after the comments).

    The ground states showed L = 5 for case A and L = 15 for case B, so I was happy. :)

    I hope this helps.

    T

  • Hi Aniruddha,

    The weight parameter must be a positive float. There is no maximum value for the parameter, but that said, you do not want to assign an arbitrarily large number.

    Similar to a Lagrange multiplier, the weight parameter acts as a weighting coefficient that balances the objective against the soft constraints of the problem. Setting the weight to a value smaller than the problem biases may lead to violated constraints, while setting it to an arbitrarily high value may hurt performance.

    Perez provided good insights on tuning the weight parameter. A good starting point is to estimate the value that the objective function will take in an optimal solution and choose a weight that puts the constraint penalty on the same order of magnitude. Then increase or decrease the weight iteratively until you get good results.

    I hope this helps!

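As a concrete illustration of the lambda sweep described in the first comment, and of the "put the constraint on the same order as the objective, then iterate" advice above, here is a small self-contained sketch. The toy objective, the equality constraint, and the candidate lambda values are made up for illustration; it uses dimod's ExactSolver on a penalty-form QM so it runs locally without Leap access:

    import dimod

    # Toy objective: prefers turning both binary variables on.
    x, y = dimod.Binary("x"), dimod.Binary("y")
    objective = -3 * x - 2 * y

    # Constraint to enforce: x + y == 1. As a quadratic penalty this is
    # (x + y - 1)**2, which expands to 2*x*y - x - y + 1 for binary variables
    # and equals zero exactly when the constraint holds.
    penalty = 2 * x * y - x - y + 1

    # Sweep candidate "fudge factors" and check when the ground state becomes feasible.
    for lam in [1, 2, 3, 5, 8, 15, 30]:
        qm = objective + lam * penalty              # QM = objective + L*constraint
        best = dimod.ExactSolver().sample(qm).first
        feasible = best.sample["x"] + best.sample["y"] == 1
        print(f"L={lam:>2}  ground state={dict(best.sample)}  "
              f"energy={best.energy:.1f}  constraint satisfied={feasible}")

For this toy problem the ground state satisfies the constraint once L is roughly 2 or larger; on a real problem you would compare feasibility and solution quality across the sweep in the same way, whether the penalty is added by hand as above or supplied through the CQM weight parameter.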
