PerfStats
PerfStats is a framework for the low-overhead, selective collection of internal performance metrics. Results are accessible through ChromeUtils, in Browsertime output, and in select performance tests.
Adding a new PerfStat
Define the new PerfStat by adding it to this list in PerfStats.h.
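For illustration, a minimal sketch of what the new entry might look like, assuming the list is a plain enumeration of metric names (MyMetric is the hypothetical new entry; the exact form of the list in PerfStats.h may differ):

    // PerfStats.h (sketch): append the new metric to the existing list.
    enum class Metric : uint32_t {
      LayerBuilding,
      DisplayListBuilding,
      // ... other existing metrics ...
      MyMetric,  // the new PerfStat
      Max,       // hypothetical end-of-list sentinel, kept last
    };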
Then, in C++ code, wrap execution in a named RAII object (an unnamed temporary would be destroyed immediately and record a near-zero duration), e.g.
PerfStats::AutoMetricRecording<PerfStats::Metric::MyMetric> autoRecording;
or call the following function manually:
PerfStats::RecordMeasurement(PerfStats::Metric::MyMetric, Start, End)
For incrementing counters, use the following:
PerfStats::RecordMeasurementCount(PerfStats::Metric::MyMetric, incrementCount)
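Putting the recording styles together, a minimal sketch (the function names are hypothetical, and the mozilla::TimeStamp type for the manual Start/End arguments is an assumption, since the exact types aren't stated above):

    #include "mozilla/PerfStats.h"
    #include "mozilla/TimeStamp.h"

    using mozilla::PerfStats;
    using mozilla::TimeStamp;

    void BuildSomething() {
      // RAII: records the time spent in this scope under Metric::MyMetric.
      PerfStats::AutoMetricRecording<PerfStats::Metric::MyMetric> autoRecording;
      // ... work to be measured ...
    }

    void BuildSomethingElse() {
      // Manual recording: capture the interval yourself (assumes the Start
      // and End arguments are mozilla::TimeStamp values).
      TimeStamp start = TimeStamp::Now();
      // ... work to be measured ...
      PerfStats::RecordMeasurement(PerfStats::Metric::MyMetric, start,
                                   TimeStamp::Now());

      // Counter: bump the metric by a count rather than a duration.
      PerfStats::RecordMeasurementCount(PerfStats::Metric::MyMetric, 1);
    }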
Here’s an example patch where a new PerfStat was added and used.
Enabling collection
To enable collection, call ChromeUtils.setPerfStatsFeatures(metrics), passing an array of metric names corresponding to the Metric enum values, e.g. ["LayerBuilding", "DisplayListBuilding"]. Passing an empty array disables all metrics; to enable all metrics at once, use ChromeUtils.enableAllPerfStatsFeatures().
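For example, from privileged (chrome) code such as the Browser Console:

    // Enable only the named metrics; names match the Metric enum values.
    ChromeUtils.setPerfStatsFeatures(["LayerBuilding", "DisplayListBuilding"]);

    // Enable every metric.
    ChromeUtils.enableAllPerfStatsFeatures();

    // Pass an empty array to disable all metrics again.
    ChromeUtils.setPerfStatsFeatures([]);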
Accessing results
Results can be accessed with ChromeUtils.collectPerfStats().
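A minimal sketch, assuming collectPerfStats() returns a promise that resolves to a JSON string of the collected measurements (an assumption; the exact return shape isn't specified above):

    // Collect what has been recorded since collection was enabled.
    const stats = await ChromeUtils.collectPerfStats();
    console.log(JSON.parse(stats));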
The Browsertime test framework will sum results across processes and report them in its output.
The raptor-browsertime Windows essential pageload tests also collect all PerfStats.