Re: Benchmarks for PPL 0.6.1 -> 0.7 and HSCC paper

Goran Frehse wrote:
Thank you for another improvement of the PPL! This time it took me a while to upgrade, because I was a little cautious about the necessary renaming, but in the end it wasn't a problem at all. So here are some results after recompiling:
- Verification of a 3-dimensional system, i.e., several tens of thousands of computations with 3- to 6-dimensional polyhedra:
  - PPL 0.6.1: 61.46 s user time, Max VSize = 36724 KB
  - PPL 0.7:   59.24 s user time, Max VSize = 37112 KB
- Verification of a 4-dimensional system, i.e., several tens of thousands of computations with 4- to 8-dimensional polyhedra:
  - PPL 0.6.1: 228.73 s user time, Max VSize = 109888 KB
  - PPL 0.7:   204.39 s user time, Max VSize = 110124 KB
Once again you managed to improve the speed by more than 10% (in the higher dimensions); I'm impressed!
Also, I've attached a paper (to appear in March) on the verification tool that I'm developing. Using the PPL, it performs quite well against tools that use only floating-point computations -- quite astonishing, really, and a testament to the quality of the PPL.
When analyzing systems of higher complexity, the exact computations quickly reach coefficients of 100000 bits or more, and hundreds of constraints. In the paper I address this issue with a guaranteed overapproximation, using the linear programming algorithm of the PPL on top of the normal polyhedral operations. For systems where an exact computation is still possible, the gain in speed is more than tenfold; more complex systems are simply impossible to analyze without limiting the number of bits and constraints (see the section of the paper on "Managing Complexity of Polyhedra").
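To give a rough idea of the LP-based relaxation, here is a minimal sketch built only on the PPL's public C++ interface. It is not my actual implementation: the function name, the max_bits parameter and the coarse coefficient truncation are only placeholders. The soundness of the result relies solely on Polyhedron::maximize(), which bounds the rounded direction exactly over the original polyhedron:

#include <ppl.hh>
#include <gmpxx.h>

namespace PPL = Parma_Polyhedra_Library;

// Return a polyhedron Q containing P whose constraint normals have
// coefficients of at most max_bits bits.
PPL::C_Polyhedron
limit_coefficient_bits(const PPL::C_Polyhedron& P, unsigned long max_bits) {
  const PPL::dimension_type dim = P.space_dimension();
  PPL::C_Polyhedron Q(dim, PPL::UNIVERSE);
  const PPL::Constraint_System& cs = P.minimized_constraints();
  for (PPL::Constraint_System::const_iterator i = cs.begin(),
         cs_end = cs.end(); i != cs_end; ++i) {
    // The constraint a.x + b >= 0 bounds the direction -a from above.
    // Build a truncated direction d ~ -a whose coefficients fit in max_bits.
    PPL::Linear_Expression d;
    for (PPL::dimension_type j = 0; j < dim; ++j) {
      mpz_class a = -i->coefficient(PPL::Variable(j));
      const size_t bits = mpz_sizeinbase(a.get_mpz_t(), 2);
      if (bits > max_bits)
        a >>= (bits - max_bits);  // coarse rounding; any rounding will do
      d += a * PPL::Variable(j);
    }
    // Exact LP over P: if d.x has supremum n/den on P, then d.x <= n/den
    // holds for every point of P, so adding it keeps Q a superset of P.
    // (n and den may themselves be large; rounding the bound outward
    // would also be sound, but is omitted here.)
    PPL::Coefficient n, den;
    bool is_max;
    if (P.maximize(d, n, den, is_max))
      Q.add_constraint(den * d <= n);
    if (i->is_equality() && P.maximize(-d, n, den, is_max))
      Q.add_constraint(den * (-d) <= n);
    // Directions that are unbounded on P contribute no constraint.
  }
  return Q;
}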
In my current implementation of the algorithms that limit the number of bits and constraints, the polyhedra are reconstructed constraint by constraint, which is very costly, and I think there is a lot of overhead because I have no access to the non-public methods and data structures of the PPL. Since these complexity-limiting methods have quite a general range of applications, might it be worthwhile to consider adopting them in the PPL?
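For comparison, limiting the number of constraints is conceptually even simpler, because dropping constraints can only enlarge the described set, so keeping any subset of the minimized constraints is a guaranteed overapproximation. The sketch below is again only a placeholder (the "keep the first max_cs constraints" rule stands in for a real selection heuristic), but it shows the kind of constraint-by-constraint reconstruction through the public interface that I mean:

#include <ppl.hh>

namespace PPL = Parma_Polyhedra_Library;

// Keep at most max_cs of P's minimized constraints; every kept constraint
// is valid for P, so the result Q always contains P.
PPL::C_Polyhedron
limit_constraint_count(const PPL::C_Polyhedron& P, unsigned max_cs) {
  PPL::C_Polyhedron Q(P.space_dimension(), PPL::UNIVERSE);
  const PPL::Constraint_System& cs = P.minimized_constraints();
  unsigned kept = 0;
  for (PPL::Constraint_System::const_iterator i = cs.begin(),
         cs_end = cs.end(); i != cs_end && kept < max_cs; ++i, ++kept)
    Q.add_constraint(*i);
  return Q;
}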
Again, I applaud the ongoing good work you're doing with the PPL, and thank you for the developments.

Dear Goran,
thank you very much for your message. I really want to share it with all the PPL developers, since it is thanks to their hard work that the library is beginning to be applicable to real problems. Your encouragement gives us even more incentive to continue doing our best.
I skimmed through the paper and I think it is very interesting. I have read the sections where you describe the techniques for limiting the number of bits in the coefficients and the number of constraints in polyhedra, and I agree with you that these techniques ought to be added to the library. We will discuss this possibility as soon as possible. While we are on the subject of papers, please help us complete the bibliography at http://www.cs.unipr.it/ppl/Documentation/ppl_citations.bib: we are missing the page numbers for your two previous papers and an entry for the paper you have just sent (by the way, I will pass it on to the other developers).
I have also checked the PHAVer web page at http://www.cs.ru.nl/~goranf/. I was looking for the sources because I wanted to test the impact of some improvements we are making to the library, but I could not find them. Notice that, in order to comply with the GNU General Public License (the license under which you are using the PPL), you should make the sources available; see section 3 of the GPL for the gory details. As far as I know, bundling the full sources with the package is, by far, the most convenient solution.
All the best, and thanks again,
Roberto