
perf: remove ts resolved module cache file #4293

Merged · 1 commit · May 19, 2024

Conversation

@jsaguet (Contributor) commented May 4, 2024

Summary

I noticed that running ts-jest was very slow in one of my projects, so I dug into why and found that the performance bottleneck comes from the getCacheKey function, and in particular the writeFileSync call that stores the dependency graph.

Writing this file is what takes most of the time in this process, as the content we write is very heavy: the whole file content and its resolved module names.

I ran a few tests on a project with different configurations; here are the results:

| Case | 1st run (clean jest cache) | 2nd run (with jest cache) |
| --- | --- | --- |
| Current ts-jest (writing dependency graph file) | 218s | 16s |
| No cache (`--no-cache`) | 44s | 44s |
| This PR (no dependency graph file) | 47s | 14s |

As the results show, the fastest first run is with jest's cache disabled (`--no-cache`), but we get pretty close to that timing when we skip writing the dep graph file, and cached subsequent runs are far faster.

What's even more interesting is that on subsequent runs, even reading the dep graph file is slower than recomputing the graph (2s slower in this case).

So I'm opening this PR to completely remove the dependency graph file, since it seems to only slow down the process without any gain. It should largely improve ts-jest performance when not using isolated modules.
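To make the tradeoff concrete, here is a rough sketch of the two cache-key strategies being compared. This is an illustrative simplification, not ts-jest's actual implementation: the `DepGraph` shape and function names are hypothetical.

```typescript
import { createHash } from 'crypto'

// Hypothetical, simplified shape of what goes into the cache key.
interface DepGraph {
  fileContent: string
  resolvedModuleNames: string[]
}

// Slow path (what this PR removes), shown only as a comment:
// serialize the whole graph and write it to the cache directory
// on every cache-key computation.
//
//   writeFileSync(join(cacheDir, 'dep-graph'), JSON.stringify(graph)) // heavy I/O
//
// Fast path: fold the same information straight into the hash,
// recomputing it each run and never touching the file system.
function getCacheKey(graph: DepGraph): string {
  return createHash('sha256')
    .update(graph.fileContent)
    .update('\0') // separator so content/module boundaries can't collide
    .update(graph.resolvedModuleNames.join('\0'))
    .digest('hex')
}
```

The key stays deterministic (same content and resolved modules produce the same key), so jest's transform cache still works; only the redundant disk write goes away.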

Fixes #2696

Test plan

Does this PR introduce a breaking change?

  • Yes
  • No

Other information

@jsaguet jsaguet requested a review from kulshekhar as a code owner May 4, 2024 18:17
@jsaguet jsaguet force-pushed the perf/remove-resolved-modules-cache branch from a17731b to d9faba0 on May 10, 2024 19:22
@ahnpnl (Collaborator) commented May 19, 2024

In the past, we introduced that dep graph check and included related files in the cache key because we had an issue in watch mode.

You can check #2436 for more information.

Reverting that PR is fine; it will solve problems for some users but reintroduce past problems for others.

I haven’t found a way to do type checking properly either. Doing type checking inside a Jest transformer performs very poorly. If we could somehow do it at the top level before Jest runs, that would be ideal.

@jsaguet-betclic commented May 19, 2024

The dep graph itself and how it's computed are not removed or changed by this PR; it will just be recomputed every time the cache key is computed.

The only change is that it is no longer stored on the file system between runs, because that's the part that slows ts-jest down a lot.

Re-computing it is a lot faster.

@ahnpnl (Collaborator) commented May 19, 2024

Ah I see, lgtm

@ahnpnl (Collaborator) commented May 19, 2024

Something is already wrong with the tests on main; we can merge this since it’s not related.

@ahnpnl ahnpnl merged commit 4c88da5 into kulshekhar:main May 19, 2024
7 of 13 checks passed
@decline commented May 21, 2024

Just wanted to leave a big THANK YOU for this fix! We have a very large codebase where we couldn't even run all tests at once anymore without setting `isolatedModules: true`, because of the long run times. Now everything is super fast and we can have type-safety again by removing the `isolatedModules` setting. Really happy 🎉

@jsaguet jsaguet deleted the perf/remove-resolved-modules-cache branch May 21, 2024 19:34
Successfully merging this pull request may close these issues.

Massive performance degradation in getCacheKey()