λ. Thoughts -> Text

The Hitchhiker's Guide to Servo Contributor - Part II

August 27, 2019

Preface

In software engineering, writing tests to verify business logic is indispensable. The same goes for browser engines! In the Web Platform world, the test suites for the specs are called web-platform-tests, a.k.a. wpt, hosted at https://github.com/web-platform-tests/wpt.

Introduction to Web Platform Tests

The Web Platform Tests project is maintained by the W3C, the WHATWG, browser vendors, and others (e.g. Bocoup and Igalia). Most browser engines run the wpt test suites to verify their implementations, and Servo is one of those engines.

Web Platform Tests contains a test suite for each standard spec, including html, url, dom, and so on.

For example, if you’d like to find the test for sort in URLSearchParams from the url spec, look under the url folder in wpt: it’s url/urlsearchparams-sort.any.js.

I won’t go into how to write WPT tests or what .any.js means in this post, because the WPT members already provide a pretty awesome document on their official website.

WPT in Servo

As mentioned above, Servo also runs the wpt test suites. The upstream wpt tests (web-platform-tests/wpt) are synced into tests/wpt/web-platform-tests in Servo.

So, how can contributors run the corresponding test(s) after implementation?

Let’s take my Implement URLSearchParams.prototype.sort() PR as an example.

After reading the spec for sort() and implementing it accordingly, I ran the command below to verify the change:

./mach test-wpt tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js

When you execute ./mach test-wpt, mach launches wpt servers on ports 8000, 8001, and 8443, then runs the test(s) given as the second argument (tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js in this case).

The output should look like this:

→ ./mach test-wpt tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js
 0:03.72 INFO Using 8 client processes
 0:09.06 INFO Starting http server on 127.0.0.1:8000
 0:09.07 INFO Starting http server on 127.0.0.1:8001
 0:09.07 INFO Starting https server on 127.0.0.1:8443
 0:11.28 SUITE_START: web-platform-test - running 2 tests
 0:11.28 INFO Running reftest tests
 0:11.29 INFO No reftest tests to run
 0:11.29 INFO Running wdspec tests
 0:11.29 INFO No wdspec tests to run
 0:11.29 INFO Running testharness tests
 0:11.29 INFO Starting runner
 0:11.30 INFO Starting runner
 0:11.30 INFO No more tests
 0:11.30 INFO No more tests
 0:11.30 INFO No more tests
 0:11.31 INFO No more tests
 0:11.31 TEST_START: /url/urlsearchparams-sort.any.html
 0:11.32 INFO No more tests
 0:11.32 INFO No more tests
 0:11.32 TEST_START: /url/urlsearchparams-sort.any.worker.html
 0:11.44 pid:68576 Full command: /servo/target/debug/servo --hard-fail -u Servo/wptrunner -Z replace-surrogates -z http://web-platform.test:8000/url/urlsearchparams-sort.any.worker.html --user-stylesheet /servo/resources/ahem.css --certificate-path /var/folders/dx/l5pn75zx5v9_cwstvgwc5qyc0000gn/T/tmpMS_ZlW/cacert.pem
pid:68576 VMware, Inc.
 0:11.44 pid:68575 Full command: /servo/target/debug/servo --hard-fail -u Servo/wptrunner -Z replace-surrogates -z http://web-platform.test:8000/url/urlsearchparams-sort.any.html --user-stylesheet /servo/resources/ahem.css --certificate-path /var/folders/dx/l5pn75zx5v9_cwstvgwc5qyc0000gn/T/tmpMS_ZlW/cacert.pem
pid:68575 VMware, Inc.
 0:11.44 pid:68576 softpipe
 0:11.44 pid:68575 softpipe
 0:11.44 pid:68576 3.3 (Core Profile) Mesa 18.3.0-devel
 0:11.44 pid:68575 3.3 (Core Profile) Mesa 18.3.0-devel
 0:12.97 TEST_END: Test OK. Subtests passed 16/17. Unexpected 16
UNEXPECTED-PASS Parse and sort: z=b&a=b&z=a&a=a
UNEXPECTED-PASS URL parse and sort: z=b&a=b&z=a&a=a
UNEXPECTED-PASS Parse and sort: �=x&&�=a
UNEXPECTED-PASS URL parse and sort: �=x&&�=a
UNEXPECTED-PASS Parse and sort: ffi&🌈
UNEXPECTED-PASS URL parse and sort: ffi&🌈
UNEXPECTED-PASS Parse and sort: é&e�&é
UNEXPECTED-PASS URL parse and sort: é&e�&é
UNEXPECTED-PASS Parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
UNEXPECTED-PASS URL parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
UNEXPECTED-PASS Parse and sort: bbb&bb&aaa&aa=x&aa=y
UNEXPECTED-PASS URL parse and sort: bbb&bb&aaa&aa=x&aa=y
UNEXPECTED-PASS Parse and sort: z=z&=f&=t&=x
UNEXPECTED-PASS URL parse and sort: z=z&=f&=t&=x
UNEXPECTED-PASS Parse and sort: a🌈&a💩
UNEXPECTED-PASS URL parse and sort: a🌈&a💩
 0:12.97 INFO No more tests
 0:12.99 INFO Closing logging queue
 0:12.99 INFO queue closed
 0:13.13 TEST_END: Test OK. Subtests passed 16/17. Unexpected 16
UNEXPECTED-PASS Parse and sort: z=b&a=b&z=a&a=a
UNEXPECTED-PASS URL parse and sort: z=b&a=b&z=a&a=a
UNEXPECTED-PASS Parse and sort: �=x&&�=a
UNEXPECTED-PASS URL parse and sort: �=x&&�=a
UNEXPECTED-PASS Parse and sort: ffi&🌈
UNEXPECTED-PASS URL parse and sort: ffi&🌈
UNEXPECTED-PASS Parse and sort: é&e�&é
UNEXPECTED-PASS URL parse and sort: é&e�&é
UNEXPECTED-PASS Parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
UNEXPECTED-PASS URL parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
UNEXPECTED-PASS Parse and sort: bbb&bb&aaa&aa=x&aa=y
UNEXPECTED-PASS URL parse and sort: bbb&bb&aaa&aa=x&aa=y
UNEXPECTED-PASS Parse and sort: z=z&=f&=t&=x
UNEXPECTED-PASS URL parse and sort: z=z&=f&=t&=x
UNEXPECTED-PASS Parse and sort: a🌈&a💩
UNEXPECTED-PASS URL parse and sort: a🌈&a💩
 0:13.13 INFO No more tests
 0:13.17 INFO Closing logging queue
 0:13.17 INFO queue closed
 0:13.17 INFO Got 32 unexpected results
 0:13.17 SUITE_END

web-platform-test
~~~~~~~~~~~~~~~~~
Ran 36 checks (2 tests, 34 subtests)
Expected results: 4
Unexpected results: 32
  subtest: 32 (32 pass)

Unexpected Results
------------------
/url/urlsearchparams-sort.any.html
  UNEXPECTED-PASS Parse and sort: z=b&a=b&z=a&a=a
  UNEXPECTED-PASS URL parse and sort: z=b&a=b&z=a&a=a
  UNEXPECTED-PASS Parse and sort: �=x&&�=a
  UNEXPECTED-PASS URL parse and sort: �=x&&�=a
  UNEXPECTED-PASS Parse and sort: ffi&🌈
  UNEXPECTED-PASS URL parse and sort: ffi&🌈
  UNEXPECTED-PASS Parse and sort: é&e�&é
  UNEXPECTED-PASS URL parse and sort: é&e�&é
  UNEXPECTED-PASS Parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
  UNEXPECTED-PASS URL parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
  UNEXPECTED-PASS Parse and sort: bbb&bb&aaa&aa=x&aa=y
  UNEXPECTED-PASS URL parse and sort: bbb&bb&aaa&aa=x&aa=y
  UNEXPECTED-PASS Parse and sort: z=z&=f&=t&=x
  UNEXPECTED-PASS URL parse and sort: z=z&=f&=t&=x
  UNEXPECTED-PASS Parse and sort: a🌈&a💩
  UNEXPECTED-PASS URL parse and sort: a🌈&a💩
/url/urlsearchparams-sort.any.worker.html
  UNEXPECTED-PASS Parse and sort: z=b&a=b&z=a&a=a
  UNEXPECTED-PASS URL parse and sort: z=b&a=b&z=a&a=a
  UNEXPECTED-PASS Parse and sort: �=x&&�=a
  UNEXPECTED-PASS URL parse and sort: �=x&&�=a
  UNEXPECTED-PASS Parse and sort: ffi&🌈
  UNEXPECTED-PASS URL parse and sort: ffi&🌈
  UNEXPECTED-PASS Parse and sort: é&e�&é
  UNEXPECTED-PASS URL parse and sort: é&e�&é
  UNEXPECTED-PASS Parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
  UNEXPECTED-PASS URL parse and sort: z=z&a=a&z=y&a=b&z=x&a=c&z=w&a=d&z=v&a=e&z=u&a=f&z=t&a=g
  UNEXPECTED-PASS Parse and sort: bbb&bb&aaa&aa=x&aa=y
  UNEXPECTED-PASS URL parse and sort: bbb&bb&aaa&aa=x&aa=y
  UNEXPECTED-PASS Parse and sort: z=z&=f&=t&=x
  UNEXPECTED-PASS URL parse and sort: z=z&=f&=t&=x
  UNEXPECTED-PASS Parse and sort: a🌈&a💩
  UNEXPECTED-PASS URL parse and sort: a🌈&a💩
 0:13.20 INFO Closing logging queue
 0:13.20 INFO queue closed

‼️ Notice the UNEXPECTED-PASS tests ‼️

The Test Expectation Configuration

As I mentioned in my previous post, there are a bunch of ini files under tests/wpt/metadata.

Given an upstream wpt path, you can find the corresponding synced test and its ini file. For example, the wpt test for sort() in URLSearchParams lives at url/urlsearchparams-sort.any.js upstream, so the synced test in Servo is tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js and its configuration is tests/wpt/metadata/url/urlsearchparams-sort.any.js.ini.
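The mapping is purely mechanical, so it can be sketched as a tiny helper. Note that servoPaths is a hypothetical name of mine for illustration; mach doesn't expose such a function:

```javascript
// Hypothetical helper: map an upstream wpt test path to Servo's synced
// test path and its expectation (ini) path.
function servoPaths(upstreamPath) {
  return {
    test: `tests/wpt/web-platform-tests/${upstreamPath}`,
    ini: `tests/wpt/metadata/${upstreamPath}.ini`,
  };
}

console.log(servoPaths("url/urlsearchparams-sort.any.js"));
// → { test: 'tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js',
//     ini: 'tests/wpt/metadata/url/urlsearchparams-sort.any.js.ini' }
```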

These ini files record the current expected status of each test.

The possible statuses are:

FAIL
PASS
OK
ERROR
NOTRUN
TIMEOUT
CRASH

So, what is UNEXPECTED-PASS?

Let’s check the tests/wpt/metadata/url/urlsearchparams-sort.any.js.ini in servo/servo#22638.

Here’s the relevant part of the diff, to make it easier to follow (lines prefixed with - were removed by the patch):

[urlsearchparams-sort.any.html]
-  [Parse and sort: a🌈&a💩]
-    expected: FAIL
...
  [Sorting non-existent params removes ? from URL]
    expected: FAIL

-  [Parse and sort: é&e�&é]
-    expected: FAIL
...

For example, the Parse and sort: a🌈&a💩 test was expected to FAIL because, before the patch, sort() was not implemented. After implementing sort(), the test started to PASS.
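As an aside, the behavior those subtests check can be reproduced with any conforming URLSearchParams implementation. A quick sketch using the first subtest's input (the expected output follows from the URL spec: a stable sort by parameter name):

```javascript
// URLSearchParams.prototype.sort() performs a stable sort by parameter name
// (comparing code units), so pairs with equal names keep their relative order.
const params = new URLSearchParams("z=b&a=b&z=a&a=a");
params.sort();
console.log(params.toString()); // → "a=b&a=a&z=b&z=a"
```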

However, the expected status was still set to FAIL in the ini file, so when I ran ./mach test-wpt, mach yelled at me that the test was an UNEXPECTED-PASS.

Unexpected Results
------------------
/url/urlsearchparams-sort.any.html
  UNEXPECTED-PASS Parse and sort: z=b&a=b&z=a&a=a
  UNEXPECTED-PASS Parse and sort: ffi&🌈
  ...
  UNEXPECTED-PASS Parse and sort: a🌈&a💩

Then, all I needed to do was record the correct expectation. In this case the tests now pass, and PASS is the default expectation, so I simply removed their entries from the ini configuration.
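The logic behind UNEXPECTED-PASS can be sketched in a few lines. This is my own simplified model of the check wptrunner performs, not its actual code: a result is "unexpected" when it differs from the ini entry, and PASS is assumed when no entry exists:

```javascript
// Hypothetical sketch of wptrunner's expectation check:
// compare the actual result against the recorded expectation,
// defaulting to PASS when the ini file has no entry for the subtest.
function classify(actual, expected = "PASS") {
  return actual === expected ? actual : `UNEXPECTED-${actual}`;
}

console.log(classify("PASS", "FAIL")); // → "UNEXPECTED-PASS" (stale ini entry)
console.log(classify("FAIL", "FAIL")); // → "FAIL" (expected failure)
console.log(classify("PASS"));         // → "PASS" (no ini entry needed)
```

This is why deleting a stale `expected: FAIL` entry is the right fix: the default then matches the new behavior.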

Test Expectations Update

After removing those UNEXPECTED-PASS entries, I ran the tests again to verify:

→ ./mach test-wpt tests/wpt/web-platform-tests/url/urlsearchparams-sort.any.js
 0:03.44 INFO Using 8 client processes
 0:08.72 INFO Starting http server on 127.0.0.1:8000
 0:08.72 INFO Starting http server on 127.0.0.1:8001
 0:08.73 INFO Starting https server on 127.0.0.1:8443
 0:10.95 SUITE_START: web-platform-test - running 2 tests
 0:10.95 INFO Running reftest tests
 0:10.96 INFO No reftest tests to run
 0:10.96 INFO Running wdspec tests
 0:10.96 INFO No wdspec tests to run
 0:10.96 INFO Running testharness tests
 0:10.97 INFO Starting runner
 0:10.98 INFO Starting runner
 0:10.98 INFO No more tests
 0:10.98 INFO No more tests
 0:10.99 INFO No more tests
 0:10.99 INFO No more tests
 0:10.99 INFO No more tests
 0:10.99 TEST_START: /url/urlsearchparams-sort.any.html
 0:10.99 INFO No more tests
 0:11.00 TEST_START: /url/urlsearchparams-sort.any.worker.html
 0:11.18 pid:70796 Full command: /servo/target/debug/servo --hard-fail -u Servo/wptrunner -Z replace-surrogates -z http://web-platform.test:8000/url/urlsearchparams-sort.any.html --user-stylesheet /servo/resources/ahem.css --certificate-path /var/folders/dx/l5pn75zx5v9_cwstvgwc5qyc0000gn/T/tmpRe9gag/cacert.pem
pid:70796 VMware, Inc.
 0:11.18 pid:70797 Full command: /servo/target/debug/servo --hard-fail -u Servo/wptrunner -Z replace-surrogates -z http://web-platform.test:8000/url/urlsearchparams-sort.any.worker.html --user-stylesheet /servo/resources/ahem.css --certificate-path /var/folders/dx/l5pn75zx5v9_cwstvgwc5qyc0000gn/T/tmpRe9gag/cacert.pem
pid:70797 VMware, Inc.
 0:11.18 pid:70796 softpipe
 0:11.18 pid:70797 softpipe
 0:11.18 pid:70796 3.3 (Core Profile) Mesa 18.3.0-devel
 0:11.18 pid:70797 3.3 (Core Profile) Mesa 18.3.0-devel
 0:12.74 TEST_END: Test OK. Subtests passed 16/17. Unexpected 0
 0:12.74 INFO No more tests
 0:12.75 INFO Closing logging queue
 0:12.75 INFO queue closed
 0:12.76 TEST_END: Test OK. Subtests passed 16/17. Unexpected 0
 0:12.76 INFO No more tests
 0:12.77 INFO Closing logging queue
 0:12.77 INFO queue closed
 0:12.77 INFO Got 0 unexpected results
 0:12.77 SUITE_END

web-platform-test
~~~~~~~~~~~~~~~~~
Ran 36 checks (2 tests, 34 subtests)
Expected results: 36
OK
 0:12.80 INFO Closing logging queue
 0:12.80 INFO queue closed

‼️ Notice the UNEXPECTED-PASS tests disappear ‼️

The logs above show no UNEXPECTED test results, which is exactly what I wanted. So I could push the expectation updates along with the implementation to the PR.

What if there are a large number of test expectations to update?

You can find the detailed documentation in the Updating test expectations section of Servo’s wpt README.

The commands look like this:

$ ./mach test-wpt --log-raw=/tmp/servo.log <path>
$ ./mach update-wpt /tmp/servo.log

1. Run test-wpt with the --log-raw argument so mach writes the raw test results to the given log path.
2. Run update-wpt with that log path to update the ini files automatically.

You can see a PR of mine that used these commands at servo/servo#22278.

Adding / Updating WPT tests

Beyond running existing tests, you can also add new tests to upstream wpt from within Servo if necessary. Likewise, if an existing upstream test is wrong, you can fix it in Servo and the change will be synchronized to the upstream repo.

Adding WPT tests as an example

Let’s look at the servo/servo#23073 PR as an example.

The issue was found by @KwanEsq while updating insertRule in CSSStyleSheet.webidl. After the second argument of insertRule was changed to optional unsigned long index = 0, the css/cssom/insertRule-namespace-no-index.html test case started to CRASH. A follow-up issue was filed at servo/servo#23028 and fixed by @sbansal3096 in servo/servo#23073.

The root cause was that the rule was inserted with no owner (ref), so Servo should instead pass a loader from the corresponding owner.

@sbansal3096’s patch fixed the CRASH, but the existing test case only covered namespaces, so @jdm suggested adding a specific test case for insertRule("@import url(xxx)").

Once @sbansal3096 updated the PR with upstream tests, the @servo-wpt-sync bot, which listens for GitHub webhooks, detected the change and sent a PR upstream. It also left a comment on the Servo PR to notify the author.

Reviewers need to review the added WPT tests as well. After the Servo PR is merged, @servo-wpt-sync notices and helps merge the upstream WPT PR.

Upstream WPT Synchronization

Speaking of @servo-wpt-sync, it has another important mission: upstream WPT synchronization.

Most browser engines have a bot to synchronize WPT tests between their own codebase and the web-platform-tests repo.

It’s important that each browser engine keeps its copy of the upstream WPT tests in sync. Think of the wpt tests inside each engine as forked repositories: whenever the upstream wpt repo is updated, the forks want to synchronize to pick up the latest tests.

For example, if Gecko engineers find a bug in a test and fix it in the Gecko codebase, other browser vendors will want to run the fixed test. That’s why all of them synchronize with upstream every day.

In Servo, @servo-wpt-sync not only syncs tests updated or added in Servo to upstream, but also syncs upstream wpt tests into Servo every day.

You can check those Servo daily synchronization PRs here.

Conclusion

Tests are an important part of understanding features and implementations in software engineering, and for the Web Platform, WPT plays that role. When you implement a feature in a browser engine, you can verify your implementation against the WPT tests.

Sometimes there might be existing buggy tests. Don’t be afraid to file an issue with WPT! WPT itself is also a good open-source project to contribute to. Try its good first issues to start!

Thanks to @jdm and @xu3u4 for their reviews.