
Didier Lime wrote:
I guess I should be using arbitrary-length integers anyway! But for now, I will take my chances with 64-bit checked integers (they should still be more efficient, even on 32-bit computers).
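To make the trade-off concrete: a checked 64-bit integer behaves like ordinary machine arithmetic but raises an error instead of silently wrapping around, whereas an arbitrary-precision integer never overflows at the cost of extra overhead. This is not the library's actual implementation (PPL's checked integers are C++ templates); it is only an illustrative sketch of the semantics in Python, whose built-in integers are arbitrary precision and so can be used to detect when a result leaves the 64-bit range:

```python
INT64_MAX = 2**63 - 1
INT64_MIN = -2**63

def checked_add(a, b):
    """Add two signed 64-bit values, raising OverflowError on wraparound."""
    r = a + b  # Python ints are arbitrary precision, so this is exact
    if r > INT64_MAX or r < INT64_MIN:
        raise OverflowError(f"{a} + {b} overflows int64")
    return r

def checked_mul(a, b):
    """Multiply two signed 64-bit values, raising OverflowError on wraparound."""
    r = a * b
    if r > INT64_MAX or r < INT64_MIN:
        raise OverflowError(f"{a} * {b} overflows int64")
    return r
```

For example, `checked_add(INT64_MAX, 1)` raises `OverflowError`, whereas unchecked C arithmetic would silently wrap to `INT64_MIN`; an arbitrary-precision integer would simply grow.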
They should be.
If I remember correctly, in the real application I hit this overflow in remove_space_dimensions() (in the conversion routine, called by some minimize function, if I read my stack frames correctly). Maybe it is due to the same emptiness test.
Both the emptiness test and remove_space_dimensions() (may) call the conversion routine; so, yes, it is the same phenomenon.
Note also that C_Polyhedron objects are significantly more efficient than NNC_Polyhedron objects: so, if you don't need to work with non-closed polyhedra (i.e., don't need strict inequalities), you may consider using the former for efficiency.
Good idea but, in Romeo, I actually need those non-strict inequalities so this is not an option.
Do you mean that you need those _strict_ inequalities?
If so, this may be one application of strict inequalities we do not know. Can you give us some intuition or point us to a paper that explains the role played by strict inequalities in the verification of Time Petri Nets?
Thanks again for your answer. More generally, I must say I really enjoy programming with this library; from my point of view, you have done a great piece of work!
Glad to be useful. Thanks!
Roberto (on behalf of the entire development team)