How to efficiently remove redundant linear constraints for optimization?

numpy


You can think of each constraint as a hyperplane, intercepting each axis at (the constraint's constant term) / (the coefficient for that axis); if the coefficient is 0, the hyperplane is parallel to that axis (i.e. it "intercepts at infinity").
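As a rough sketch of computing those intercepts with numpy (assuming the constraints are stored as a coefficient matrix A and constant vector b for A @ x <= b; those names and the helper are mine, not from the question):

```python
import numpy as np

def axis_intercepts(A, b):
    """Axis intercepts of each constraint's hyperplane.

    A is the (m, n) coefficient matrix and b the (m,) constant vector of
    constraints of the form A @ x <= b.  Where a coefficient is 0 the
    hyperplane is parallel to that axis, so the intercept is +inf.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    intercepts = np.full(A.shape, np.inf)
    # Only divide where the coefficient is nonzero; zeros keep the +inf fill.
    np.divide(b[:, None], A, out=intercepts, where=A != 0)
    return intercepts
```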

Trivially, if the axis-intercepts for one hyperplane are all equal to or greater than the corresponding intercepts for another, that hyperplane is redundant.
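A minimal illustration of that test, using the intercepts from the sketch above (the helper name is my own):

```python
def is_redundant(intercepts_i, intercepts_j):
    """True if hyperplane i is redundant relative to hyperplane j, i.e.
    every axis intercept of i is >= the corresponding intercept of j."""
    return bool(np.all(intercepts_i >= intercepts_j))
```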

In order to cull as many constraints as early as possible, you want to start by comparing against ones whose hyperplane is (a) as close to the origin as possible and (b) parallel to as few axes as possible, since a hyperplane parallel to a given axis can only cull other hyperplanes that are also parallel to that axis. [A hyperplane not parallel to a given axis may be able to cull one that is parallel to it, but the converse is never true.]

I suggest you sort the list by (the number of axes the hyperplane is parallel to), then by (the sum of its finite axis intercepts).
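Putting the pieces together, here is one possible culling pass in that order (again assuming the A, b representation and the helpers sketched above; np.lexsort uses its last key as the primary sort key, so the parallel-axis count comes first):

```python
def remove_redundant(A, b):
    """Drop constraints whose hyperplane is dominated by another one.

    Constraints are visited in order of (fewest parallel axes, smallest
    sum of finite intercepts) so the strongest candidates cull first.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    inter = axis_intercepts(A, b)

    n_parallel = np.isinf(inter).sum(axis=1)          # parallel-axis count
    finite_sum = np.where(np.isinf(inter), 0.0, inter).sum(axis=1)
    order = np.lexsort((finite_sum, n_parallel))      # primary: n_parallel

    keep = np.ones(len(A), dtype=bool)
    for pos, i in enumerate(order):
        if not keep[i]:
            continue
        for j in order[pos + 1:]:
            if keep[j] and is_redundant(inter[j], inter[i]):
                keep[j] = False                       # j is dominated by i
    return A[keep], b[keep]
```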


I think you can speed this up with a kind of backtracking: for example, compare the coefficients in the arrays one by one, and as soon as you find an index where one value is less than the other you can stop checking that pair; otherwise the row is dominated and you can delete it.
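If I read this right, the idea is an element-wise comparison that stops at the first index where the dominance test fails, rather than always comparing whole rows; a rough sketch of that early exit in plain Python (names are mine):

```python
def is_dominated(row_i, row_j):
    """Compare row_i against row_j element by element, stopping at the
    first index where the dominance condition fails."""
    for a, c in zip(row_i, row_j):
        if a < c:       # dominance broken: stop checking this pair early
            return False
    return True         # every element passed, so row_i can be deleted
```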