Switch integration tests runner from Jest to Mocha #6517
Force-pushed from 9384c01 to 8f216c3
Otherwise it may be purely an effect of teardown not being propagated successfully.
It was broken when smart service name resolution was introduced with 868c8a3
Force-pushed from 8f216c3 to 1d2810e
Just looked through the code and it looks good 👍
Great to see only one test runner being used. While testing, the log format was a little bit off and the logs were only shown once the test was done. I'm fine with that, but I really enjoyed the live log-streaming of Jest (with lines in the codebase). This helped a lot when debugging the integration tests, since I could e.g. immediately `cat` the `serverless.yml` file in the temporary directory without waiting for the test to finish.
Since this is more of a Jest vs. Mocha matter, I'd say that we're good and there's no need to complicate things via monkey-patching.
```diff
@@ -12,7 +12,8 @@ const { createRestApi, deleteRestApi, getResources } = require('../../utils/api-

 const CF = new AWS.CloudFormation({ region });

-describe('AWS - API Gateway Integration Test', () => {
+describe('AWS - API Gateway Integration Test', function() {
+  this.timeout(1000 * 60 * 10); // Involves time-taking deploys
```
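The switch from an arrow function to `function()` is what makes `this.timeout()` possible: Mocha invokes the suite callback with a context object bound as `this`, and an arrow function ignores that binding. A minimal stand-in (plain Node.js, not Mocha's actual internals) sketching the mechanism:

```javascript
'use strict';

// Minimal stand-in for how Mocha binds a context to the suite callback.
// runSuite and its context object are illustrative, not Mocha's real API.
function runSuite(callback) {
  const context = {
    _timeout: 2000, // Mocha's default timeout
    timeout(ms) { this._timeout = ms; },
  };
  callback.call(context); // Mocha calls the suite function this way
  return context._timeout;
}

// A regular function picks up the bound context, so timeout() applies
const applied = runSuite(function () {
  this.timeout(1000 * 60 * 10);
});
console.log(applied); // 600000

// An arrow function keeps the enclosing `this`, so no override happens
const ignored = runSuite(() => {
  // calling this.timeout(...) here would hit the wrong object
});
console.log(ignored); // 2000
```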
Is there any way to set the timeout globally for all integration tests?
There's nothing in Mocha that allows setting directory-specific settings, and I don't think it's good to have this value set globally (e.g. that's how it was set with Jest).
Still, I think it's not that big a drawback; also, timeouts across integration tests may differ. Here we deal with CF deploys, which are very long, but in the case of packaging, where there are only SDK calls, such a long timeout is not needed.
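For reference, Mocha does allow a global default timeout via a config file; a minimal sketch of a hypothetical `.mocharc.js` (which illustrates the concern above, since the single value would apply to every test):

```javascript
// .mocharc.js (hypothetical) — one global default timeout for all tests.
// Per-suite this.timeout(...) calls still override this value.
'use strict';

module.exports = {
  timeout: 1000 * 60 * 10, // 10 minutes — far too long for quick packaging tests
};
```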
Jest by default doesn't output anything (apart from a progress bar) until a test finalizes (we actually kinda hacked Jest around that). Still, there's a problem with this output even then: we're not notified of individual test failures while the run is in progress (no live stream of that), and what's worse, it mixes output from all tests being run. While in our current case that's not that prominent (yet), in some scenarios it prevents reliably debugging a test failure, as we can't clearly see which logs came from which test run.
With the newly configured Mocha version, in a parallel run the output for a given test is flushed once the test finalizes, so you can easily see which log output is from which test, as they appear one after another. When a single test file is run, all output is live-streamed on the go, together with test result info (that part is missing on the Jest side, no matter what -> jestjs/jest#6616). So if you want output on the go, just run the one test file that focuses on the functionality you're working on, and you'll have even better output than Jest gave. Alternatively, you may also run multiple test files sequentially.
Providing output on the go with parallelization involved would, to be reliable, have to involve some window splitting, I believe. I'm not sure the ability to consume mixed output from different test runs can be considered a worthwhile feature.
Yes, this is how Jest decorates
Thanks for the in-depth explanation @medikoo 👍
There are various problems with Jest:

- `stdout` and `stderr` are output out of sync -> stdout and stderr are written to console out of sync jestjs/jest#6718 (even with the `useStderr` option)
- Errors in `beforeAll` do not prevent execution of tests -> Tests are run even when beforeAll throws error jestjs/jest#2713, failing beforeAll() causes even passing tests in the scope to fail jestjs/jest#6695
- Errors in `afterAll` are not exposed -> afterAll inside a describe hides errors jestjs/jest#6692
- `only` and `skip` are not fully respected -> Nested BeforeAlls execute when marking higher level tests as only jestjs/jest#4166

Mocha is also not free from problems (still, all known to me are patchable). It also doesn't provide test run parallelization and proper isolation, but we have achieved that through `script/test-isolated.js`
Additionally:

- `script/test-isolated.js` was updated to handle the integration tests case well
- The test service name is now `aws-nodejs` (previously it reflected the folder name). Also uncommented the test run in Travis CI

Is it a breaking change?: NO