Benchmarking Environment: Issues
https://code.cor-lab.de/ (Open Source Collaboration Platform), updated 2013-08-29T14:44:45Z
Redmine Feature #1603 (New): Export benchmark results in JMeter format
https://code.cor-lab.de/issues/1603, 2013-08-29T14:44:45Z, Anonymous
<p>Apache JMeter format:<br /><a class="external" href="http://wiki.apache.org/jmeter/LogAnalysis#The_JMeter_Log_Format">http://wiki.apache.org/jmeter/LogAnalysis#The_JMeter_Log_Format</a></p>

Feature #1010 (In Progress): Valgrind as intrinsic execution engine
https://code.cor-lab.de/issues/1010, 2012-06-19T22:04:23Z, M. Rolf (mrolf@cor-lab.uni-bielefeld.de)
<p>Allow benchmark calls that execute valgrind in the background. Something like<br /><pre>
MyBenchmarkExecutable --engine cpu
Benchcase [mu_suite:my_case]
Total (corrected) time: 1.3 s
Estimated cost per operation: 1.7 us
Operations per second: 5.9e5
</pre><br />vs. <br /><pre>
MyBenchmarkExecutable --engine valgrind
Benchcase [mu_suite:my_case]
Total (corrected) time: 1.3 GCycles
Estimated cost per operation: 1.7 MCycles
Operations per second: 5.9e5
</pre></p>
<p>The "--engine valgrind" option must run valgrind in the background, read and parse its log file, and transfer the results into the internal result data structure.</p>
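A minimal sketch of the log-handling step, assuming the callgrind tool is used and that its output file contains the standard <code>events:</code> header and a <code>summary:</code> totals line (both part of the callgrind profile format); the function and file names here are illustrative, not from the actual benchmark tool:

```python
import subprocess

def parse_callgrind_totals(text):
    """Extract total event counts (e.g. Ir = instructions retired)
    from the text of a callgrind output file.

    Relies on the 'events:' header naming the cost columns and the
    'summary:' line giving the matching totals.
    """
    events, totals = [], []
    for line in text.splitlines():
        if line.startswith("events:"):
            events = line.split(":", 1)[1].split()
        elif line.startswith("summary:"):
            totals = [int(x) for x in line.split(":", 1)[1].split()]
    return dict(zip(events, totals))

def run_under_valgrind(executable, out_file="callgrind.out.bench"):
    # Hypothetical wrapper: re-run the benchmark binary under
    # callgrind and return the parsed event totals.
    subprocess.run(
        ["valgrind", "--tool=callgrind",
         "--callgrind-out-file=" + out_file, executable],
        check=True,
    )
    with open(out_file) as f:
        return parse_callgrind_totals(f.read())
```

The returned totals (e.g. <code>{"Ir": 1300000000}</code>) could then be stored in the internal result structure in place of wall-clock times, yielding the GCycles/MCycles-style output shown above.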
<p>Some ideas:</p>
<ul>
<li>Internally disable warmup and init-count (pointless when using valgrind)</li>
<li>Automatically reduce the repetition count, e.g. by a factor of 100</li>
</ul>
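The ideas above amount to engine-dependent measurement defaults; a rough sketch, with all names and default values purely illustrative (under valgrind, instruction counts are deterministic, so warmup runs add nothing, and each repetition is roughly two orders of magnitude slower):

```python
def engine_defaults(engine, base_repetitions):
    """Pick per-engine measurement settings (illustrative only)."""
    if engine == "valgrind":
        # Deterministic counting: no warmup/init needed, and
        # repetitions are cut by a factor of 100 to offset slowdown.
        return {"warmup": 0, "init_count": 0,
                "repetitions": max(1, base_repetitions // 100)}
    # Timing-based engines keep warmup runs and full repetitions.
    return {"warmup": 10, "init_count": 10,
            "repetitions": base_repetitions}
```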