Open Bug 1774619 Opened 2 years ago Updated 2 years ago

Validation failure when using timestamps

Categories

(Testing :: mozperftest, defect, P2)

Tracking

(Not tracked)

People

(Reporter: sfink, Unassigned)

Details

I was experimenting with adding an option --browsertime-existing-results=/tmp/browsertime-results/ to mach perftest --perfherder, and I'm getting a validation failure:

jsonschema.exceptions.ValidationError: 1651717265847.32 is greater than the maximum of 1000000000000.0

Failed validating 'maximum' in schema['properties']['suites']['items']['properties']['subtests']['items']['properties']['value']:
    {'description': 'Summary value for subtest',
     'maximum': 1000000000000.0,
     'minimum': -1000000000000.0,
     'title': 'Subtest value',
     'type': 'number'}

On instance['suites'][0]['subtests'][8]['value']:
    1651717265847.32
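
Here's a minimal sketch that reproduces the failure outside of mach, using just the bounds from the error above (this assumes the jsonschema package and is not the full perfherder schema):

    import jsonschema

    # Same bounds as the "Subtest value" property in the schema above.
    subtest_value_schema = {
        "title": "Subtest value",
        "description": "Summary value for subtest",
        "type": "number",
        "minimum": -1000000000000.0,
        "maximum": 1000000000000.0,
    }

    # An epoch-milliseconds timestamp exceeds the 1e12 maximum, so this
    # raises the same jsonschema.exceptions.ValidationError shown above.
    jsonschema.validate(1651717265847.32, subtest_value_schema)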

It appears that the test in question here is statistics.timings.navigationTiming.startTime. In other words, it's recording a wall-clock timestamp, which falls outside the sane range of a schema that is presumably expecting durations in milliseconds. It also probably isn't something we want to display in Perfherder ("when you ran this test 3 days later, its start times were significantly larger than the first run!").
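
As a quick sanity check, if you interpret the failing value as milliseconds since the Unix epoch (an assumption, but the magnitude fits), it decodes to a date in early May 2022 rather than to any plausible duration:

    from datetime import datetime, timezone

    # Treating the failing value as epoch milliseconds yields a
    # wall-clock date, not a duration.
    print(datetime.fromtimestamp(1651717265847.32 / 1000, tz=timezone.utc))
    # -> 2022-05-05 02:21:05.847320+00:00 (approximately)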

I don't think my new option does anything that would mess this up.

I see this happening from time to time as well, with metrics whose values are too large to fit within the bounds of the performance-artifact schema: https://searchfox.org/mozilla-central/source/testing/mozharness/external_tools/performance-artifact-schema.json

If you use --perfherder-metrics, you can filter the results down to just the metrics you want to keep: https://searchfox.org/mozilla-central/search?q=perfherder-metrics&path=&case=false&regexp=false

There are various ways of specifying the metrics you want through that option; a rough example follows.
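
For instance, something along these lines (a hypothetical invocation; the exact metric names and spec syntax should be checked against the in-tree usages found by the search above):

    ./mach perftest --perfherder \
        --perfherder-metrics name:firstPaint \
        testing/performance/perftest_example.js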

Severity: -- → S3
Priority: -- → P2