
Use React.createContext() with observed bits #1021

Closed
wants to merge 20 commits

Conversation

theKashey

This is an extension of #1000, enhanced with "observed bits", specifically for #1018.

Key findings:

  1. It works out of the box!
  2. If the changed bits are zero (nothing changed), the context will not propagate anything. This is probably what we want, but at least some tests fail because of it. I am adding a new prop to connect, observer, to let it "observe" any changes in state, and I am always setting the first bit of the change to true. pure is probably a better fit for this, or we could just remove those tests.
  3. If the observed bits are 0 but mapStateToProps is defined, the component is treated as an observer and will not use observedBits. This covers cases like state => state (no slice used). It would probably be better to scan the result in search of the state, but right now mappedProps are not accessible from connect.
  4. Normal elements wait for changes in bit 2 (store) and bit 3 (hashing function); the rest of the bits are used to represent changes among slices.
  5. mapStateToProps is given a mocked state, and the observed bits are calculated using ES5 getters (a sketch follows below). A Proxy is not needed here, but it's better to double-check the speed.
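
As a rough illustration of point 5, the getter-based tracking could look like the following sketch; trackStateAccess and keyToBit are hypothetical names (not the PR's actual code), and store and mapStateToProps are assumed to be in scope:

// Wrap each top-level slice in an ES5 getter so that running mapStateToProps
// records which slices it actually reads.
function trackStateAccess(state) {
  const usedKeys = []
  const tracked = {}
  Object.keys(state).forEach(key => {
    Object.defineProperty(tracked, key, {
      enumerable: true,
      get() {
        usedKeys.push(key)
        return state[key]
      }
    })
  })
  return { tracked, usedKeys }
}

// Run mapStateToProps once against the tracked state, then fold the recorded
// keys into a bitmask (the first bits are reserved, as described later in this thread).
const { tracked, usedKeys } = trackStateAccess(store.getState())
mapStateToProps(tracked)
const observedBits = usedKeys.reduce((bits, key) => bits | keyToBit(key), 0)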

cellog and others added 16 commits August 14, 2018 11:57
also clean up unused canary values
This will allow third party apps to use these in their code
* export Context instead of just Consumer/Provider
* fix error messages for removed functionality
* minor displayName change
when the time is right, one need only
change the BABEL_ENV for build:umd:min back to "rollup-production"
* React.forwardRef is VERY slow, on the order of 2x slower in our benchmarks, so we only enable it if withRef="forwardRef". Folks using withRef=true will get an error telling them to update and not rely on getWrappedInstance(), but just to use the ref directly.
* renderCountProp is removed, as this is natively supported in React dev tools now
* all usages of shallowEqual are removed for pure components, as they were unnecessary.
* instead of allowing a custom Context consumer to be passed in via props, it is now required to be passed in via connect options at declaration time.
@markerikson
Contributor

Cool! I'll try to play around with this and get a better understanding of how it works, and see how it performs in the benchmarks.

One initial question: how well does this handle the top-level state shape changing at runtime, such as dynamically adding slice reducers from code-splitting?

@theKashey
Author

How well does this handle the top-level state shape changing at runtime?
Quite well :) The first three bits in the mask are reserved (see the sketch at the end of this comment):

  • b1. Any changes. All impure connects have this bit set so they are always informed about any change.
  • b2. Store updates. I am not sure we need this bit; it sounds like it's impossible to change the store without changing the state shape.
  • b3. hashingFunction updates. The Provider passes this function to Connect for consistent hash calculations, and updates it every time the state shape changes.

PS: I actually forgot to add a shallow check to the Provider to update the hashing function; I'm going to add those 3 lines right now :)
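
A minimal sketch of how these reserved bits could be wired into React.createContext's second argument (calculateChangedBits); the context value shape ({ store, storeState, keyToBit }) and the keyToBit hashing function are my assumptions, not the PR's exact code:

import React from 'react'

// calculateChangedBits runs whenever the Provider's value changes and returns
// a bitmask; Consumers re-render only if their observedBits overlap with it.
const StoreContext = React.createContext(null, (prev, next) => {
  let bits = 0b001                                      // b1: always set, for "observe everything" connects
  if (prev.store !== next.store) bits |= 0b010          // b2: the store instance itself changed
  if (prev.keyToBit !== next.keyToBit) bits |= 0b100    // b3: the hashing function changed (state shape changed)
  Object.keys(next.storeState).forEach(key => {
    if (prev.storeState[key] !== next.storeState[key]) {
      bits |= next.keyToBit(key)                        // remaining bits: which top-level slices changed
    }
  })
  return bits
})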

@markerikson
Contributor

markerikson commented Sep 18, 2018

Awright, ran this through the benchmark suite, and... uh... it performs far worse than #995 or #1000 :

Results for benchmark stockticker:
+------------------+---------+-----------+-----------+----------+
¦ Version          ¦ Avg FPS ¦ Scripting ¦ Rendering ¦ Painting ¦
+------------------+---------+-----------+-----------+----------+
¦ 5.0.7            ¦ 17.76   ¦ 16775.78  ¦ 10057.74  ¦ 1974.30  ¦
¦ 6.0-1000-4855c84 ¦ 15.33   ¦ 18308.03  ¦ 8906.81   ¦ 1719.75  ¦
¦ 6.0-1021-129ad67 ¦ 1.63    ¦ 23771.81  ¦ 1969.22   ¦ 335.66   ¦
¦ 6.0-995-9210282  ¦ 14.61   ¦ 19251.80  ¦ 8157.55   ¦ 1581.22  ¦
+------------------+---------+-----------+-----------+----------+


Results for benchmark twitter-lite:
+------------------+---------+-----------+-----------+----------+
¦ Version          ¦ Avg FPS ¦ Scripting ¦ Rendering ¦ Painting ¦
+------------------+---------+-----------+-----------+----------+
¦ 5.0.7            ¦ 58.37   ¦ 20583.79  ¦ 6644.07   ¦ 763.82   ¦
¦ 6.0-1000-4855c84 ¦ 55.94   ¦ 22541.77  ¦ 5413.20   ¦ 674.57   ¦
¦ 6.0-1021-129ad67 ¦ 48.36   ¦ 23830.79  ¦ 4338.93   ¦ 596.82   ¦
¦ 6.0-995-9210282  ¦ 55.17   ¦ 23124.32  ¦ 4980.42   ¦ 648.05   ¦
+------------------+---------+-----------+-----------+----------+

And at the moment, I can't even get it to finish the tree-view benchmark with the current settings I'm using (4000 components, 50ms updates). The page just locks up, both in Puppeteer and the browser.

Looking at the code, I think I can see a couple possibilities why.

First, this looks like it's going to call setState() in Connect every time the component re-renders and the state changed in some way:

componentDidUpdate(prevProps) {
  // Report the freshly calculated bits back up after commit, but only when they changed.
  if (prevProps.observedBits !== this.props.observedBits) {
    this.props.setObservedBits(this.props.observedBits)
  }
}

setObservedBits(bits) {
  if (!connectOptions.observer && (bits || !selectorFactory)) {
    this.setState(state => {
      if (state.observedBits !== bits) {
        return {
          // 6 === 0b110 keeps the store and hashing-function bits always observed
          observedBits: 6 | bits
        }
      }
      return null
    })
  }
}

Second, it's doing at least a little bit of extra work for each memoization check:

const couldProxyState = typeof state === 'object' && !Array.isArray(state)
if (couldProxyState) {
  const stateProxy = {}
  Object.defineProperties(stateProxy, stateGetter)
  lastDerivedProps = sourceSelector(stateProxy, props)
  return {
    derivedProps: lastDerivedProps,
    observedBits
  }
}

We'd have to dig further and see exactly where the bottleneck is here, I guess.

@theKashey , can you clarify why that component is calling this.props.setObservedBits() like that?

@theKashey
Author

OK, I have a few other options to try. The one I did is just the most vanilla one, and it passes all the tests.
Actually, I was pretty sure this should be fast, as long as we are removing much more expensive React operations.
Could you point me to the performance tests to try? Also, there are a few better options for handling setObservedBits.

@theKashey , can you clarify why that component is calling this.props.setObservedBits() like that?

So, it's complex. It's a problem.

  • Connect renders the Consumer, providing a render prop.
  • Inside the render prop we can calculate observedBits, but there is no way to call setState from a render.
  • Even componentDidUpdate will not work here, because it does not fire; the Consumer just calls a class method as its render prop.
  • So the only way to get a synchronous update is to pass the observed bits to the underlying component, which calls a prop passed from the parent in its componentDidUpdate.
  • It should be called only if the bits have changed.
  • This sets new observedBits on the Consumer and re-renders Connect (a double render, which is what's expensive!).
  • But as long as derivedProps have not changed, and the observed bits have not changed (this time), the PureComponent wrapped around WrappedComponent blocks the update of the real component.
  • We should probably skip the derivedProps calculation here, which is easy to do. (A condensed sketch of this flow follows below.)
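
A condensed sketch of that flow; the names ObservedBitsReporter, reportBits, computeDerivedProps, StoreContext and WrappedComponent are mine, not the PR's actual code:

import React from 'react'

// The pure wrapper renders the real component and reports bits upward after commit.
class ObservedBitsReporter extends React.PureComponent {
  componentDidUpdate(prevProps) {
    // Report only when the bits actually changed; on the second (double) render
    // nothing changes, so PureComponent blocks the update and the loop stops.
    if (prevProps.observedBits !== this.props.observedBits) {
      this.props.reportBits(this.props.observedBits)
    }
  }
  render() {
    const { reportBits, observedBits, component: Wrapped, ...derivedProps } = this.props
    return <Wrapped {...derivedProps} />
  }
}

class Connect extends React.Component {
  state = { observedBits: undefined }
  // setState cannot be called from within the Consumer's render prop,
  // so the child calls this after commit, triggering the second render.
  reportBits = bits => this.setState({ observedBits: bits })

  render() {
    return (
      <StoreContext.Consumer unstable_observedBits={this.state.observedBits}>
        {contextValue => {
          // Bits are computed here, in render, where setState is illegal,
          // so they are handed to the child to be reported after commit.
          const { derivedProps, observedBits } = computeDerivedProps(contextValue, this.props)
          return (
            <ObservedBitsReporter
              {...derivedProps}
              observedBits={observedBits}
              reportBits={this.reportBits}
              component={WrappedComponent}
            />
          )
        }}
      </StoreContext.Consumer>
    )
  }
}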

I would try to change 2 places:

  • how the state is wrapped with the observer (I've got 3 different versions)
  • how derivedProps are generated, to skip any computation when only the bits change.

@theKashey
Author

Actually, this is quite strange, but both examples use flat arrays as a store. The logic behind my change does not work for arrays at all. The slowdown was probably driven by the double render of components.

PS: I could not run the benchmarks; package installation is failing. It only works with yarn.

@markerikson
Contributor

Yeah, you're right about the array state shape being unusual.

We ought to rework the benchmarks so they use something that looks a bit more typical, even if it's just splitting the root state into four different slices or something.
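
Something along these lines, as a sketch (the slice names and no-op reducers are placeholders, not the actual benchmark code):

import { combineReducers, createStore } from 'redux'

// Placeholder no-op reducers; the real benchmark logic would live in each slice.
const slice = initial => (state = initial, action) => state

const rootReducer = combineReducers({
  stocks: slice([]),
  orders: slice([]),
  users: slice({}),
  ui: slice({ theme: 'light' })
})

const store = createStore(rootReducer)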

@markerikson
Contributor

Tried to do some perf investigations yesterday. It looks like the calculation of observedBits in every call to the memoizer is relatively expensive, although I'm kinda struggling to figure out what else is slowing things down if anything.

I think what I'd like to see for now is a new connectAdvanced option that basically says "I confirm this component's data access patterns don't change over time", so that we can precalculate the observedBits in the constructor of Connect.

Normally we wouldn't have access to the store state in the constructor there. However, 16.5 now has Context.unstable_read(), which means we can imperatively pull the value out of the context in the constructor, and use that to calculate observedBits once on creation.

@theKashey
Author

Context.unstable_read will solve half of the perf issues, but I don't think it's possible to precalculate the used slices. To be more concrete: it is possible if you are using reselect (first select, then combine), but with normal code, where select and combine are done together, it's not stable.

Right now the implementation of observing is not great (two function calls). I will try to improve it.

What do you think about:

  • enabling this feature only for modern browsers with Proxy support, and disabling it for the rest (IE11, React Native)
  • adding a special function with predeclared slice usage. It's quite easy to store this information alongside the reselect selector and calculate it once at selector creation (sketched below).
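
A sketch of the second option; withSlices, dependsOnSlices and keyToBit are hypothetical names, and the selector is just an example:

import { createSelector } from 'reselect'

// Record the used top-level slices once, at selector creation time,
// so connect can turn them into an observedBits mask without runtime tracking.
function withSlices(slices, selector) {
  selector.dependsOnSlices = slices
  return selector
}

const selectVisibleTodos = withSlices(
  ['todos', 'visibilityFilter'],
  createSelector(
    state => state.todos,
    state => state.visibilityFilter,
    (todos, filter) => (filter === 'SHOW_ALL' ? todos : todos.filter(t => t.completed))
  )
)

// connect could then compute the mask once:
// const observedBits = selectVisibleTodos.dependsOnSlices
//   .reduce((bits, key) => bits | keyToBit(key), 0b110) // keep the store/hashing-func bits set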

@theKashey
Author

I made some performance measurements:

  • https://jsperf.com/obj-vs-set/1, about using a Set or an Object for recording property access. Set is the clear winner (25% faster). Using it only to store a "seen" flag, and storing the result in a separate array, doubles the speed (73% faster), especially on Safari (90% faster).

  • https://jsperf.com/proxy-vs-decorator, about using a Proxy or an ES5 decorator (defineProperty getters). There is no clear winner: Proxies are faster in Chrome (13%) but slower in Safari (25% slower).
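
For context, the fastest variant from the first jsperf (flag object plus a separate result array) looks roughly like this sketch:

// The object only answers "have we seen this key?"; the keys themselves
// are collected in a cheap flat array instead of a Set.
function createKeyTracker() {
  const seen = {}
  const keys = []
  return {
    record(key) {
      if (!seen[key]) {
        seen[key] = true
        keys.push(key)
      }
    },
    keys
  }
}

// Each ES5 getter would call tracker.record(key); tracker.keys is later
// folded into the observedBits mask.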

@theKashey
Author

Updated with the faster key tracker I spiked in https://github.com/theKashey/with-known-usage, and with Context.unstable_read, which is not yet supported anyway :)
@markerikson, could you remeasure the speed?

@theKashey
Author

facebook/react#13739: changing the context value itself causes a whole-tree re-render, as long as you are rendering the parent component.
We probably need a propagation blocker next to the Provider (sketched below), or to wait for a better solution.

PS: Actually, we could use observedBits even with the "old" redux.
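
A sketch of such a propagation blocker; the component and context names are mine:

import React from 'react'

// Re-renders of the Provider's parent stop here, while context updates still
// reach Consumers deeper in the tree (context propagation bypasses
// shouldComponentUpdate).
class PropagationBlocker extends React.Component {
  shouldComponentUpdate() {
    return false
  }
  render() {
    return this.props.children
  }
}

// Usage:
// <StoreContext.Provider value={contextValue}>
//   <PropagationBlocker>{children}</PropagationBlocker>
// </StoreContext.Provider>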

@timdorr changed the title from '6.x with "observed bits"' to 'Use React.createContext() with observed bits' on Oct 28, 2018
@timdorr
Member

timdorr commented Nov 6, 2018

#1000 merged in, so I'm going to close this out. If we want to try more experiments with bitmasks, we should probably start over.
