Range is calculated incorrectly for masonry layouts, causing items to be missed #936

Open · 2 tasks done
soren121 (Contributor) opened this issue Feb 26, 2025 · 0 comments · May be fixed by #937

Describe the bug

I think this is likely another issue related to #573.

While debugging an issue in my project, I noticed that calculateRange does not properly account for masonry layouts. It assumes that visible items will be directly adjacent in the measurements array; while this is true for linear layouts, it's not true for masonry layouts.

For example, imagine a masonry layout with 3 lanes (the same as the attached reproduction). What if the first lane has an item that is much longer than the items in the other two lanes? Let's examine what will happen:

  1. calculateRange will find the item in measurements that has a start offset closest to the scroll offset.
  2. Then, it will continue incrementing indices until it finds an item whose end offset is past the virtualizer window.

The problem is, items from different lanes are interleaved in measurements. If I scroll past the start of the long item, it's very likely that items in the other lanes will have a start offset that's closer to the scroll offset. If that happens, the long item probably won't be included in the returned range, and so it will disappear too early.
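To make this concrete, here is a toy model of the interleaving problem. This is NOT TanStack Virtual's actual calculateRange implementation; the fixture data, Measurement shape, and calculateRangeLinear function are all simplified illustrations of the two steps described above.

```typescript
// Toy model: three lanes, lane 0 holds one tall item spanning 0-500,
// lanes 1 and 2 hold short 100px items. Measurements are interleaved
// by item index, as they are in a masonry layout.
interface Measurement {
  index: number
  lane: number
  start: number
  end: number
}

const measurements: Measurement[] = [
  { index: 0, lane: 0, start: 0, end: 500 }, // the tall item
  { index: 1, lane: 1, start: 0, end: 100 },
  { index: 2, lane: 2, start: 0, end: 100 },
  { index: 3, lane: 1, start: 100, end: 200 },
  { index: 4, lane: 2, start: 100, end: 200 },
  { index: 5, lane: 1, start: 200, end: 300 },
  { index: 6, lane: 2, start: 200, end: 300 },
]

// Step 1: pick the item whose start offset is nearest the scroll offset.
// Step 2: walk forward until an item's end offset passes the window.
function calculateRangeLinear(
  ms: Measurement[],
  scrollOffset: number,
  windowSize: number,
): { startIndex: number; endIndex: number } {
  let startIndex = 0
  for (let i = 1; i < ms.length; i++) {
    if (
      Math.abs(ms[i].start - scrollOffset) <
      Math.abs(ms[startIndex].start - scrollOffset)
    ) {
      startIndex = i
    }
  }
  let endIndex = startIndex
  while (
    endIndex < ms.length - 1 &&
    ms[endIndex].end < scrollOffset + windowSize
  ) {
    endIndex++
  }
  return { startIndex, endIndex }
}

// Scroll to 150 with a 200px window: the viewport is [150, 350].
const range = calculateRangeLinear(measurements, 150, 200)
console.log(range) // { startIndex: 3, endIndex: 6 }
// The tall item (index 0) still overlaps the viewport (end 500 > 150),
// but it falls before startIndex, so it disappears too early.
```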

There are a couple of approaches I can think of to solve this:

  1. We could calculate ranges per lane, and return an array of ranges. This would be a breaking change, because we expose range and rangeExtractor in the public API.
  2. We could maintain a running list of visible items, adding/evicting items as the scroll offset changes. Compared to the current approach, where the range is always calculated "from scratch", this would avoid removing items too early, but I think it could be more challenging to implement. This would probably be a breaking change as well.
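A rough sketch of approach 1, using the same toy fixture as above. The function name calculateRangesPerLane and the Measurement shape are hypothetical, not part of the library; the point is only that within a single lane, items really are adjacent, so the linear forward walk becomes safe again.

```typescript
// Same toy fixture: lane 0 has one tall item, lanes 1 and 2 have short ones.
interface Measurement {
  index: number
  lane: number
  start: number
  end: number
}

const measurements: Measurement[] = [
  { index: 0, lane: 0, start: 0, end: 500 }, // the tall item
  { index: 1, lane: 1, start: 0, end: 100 },
  { index: 2, lane: 2, start: 0, end: 100 },
  { index: 3, lane: 1, start: 100, end: 200 },
  { index: 4, lane: 2, start: 100, end: 200 },
  { index: 5, lane: 1, start: 200, end: 300 },
  { index: 6, lane: 2, start: 200, end: 300 },
]

// Hypothetical per-lane range calculation: one range per lane.
function calculateRangesPerLane(
  ms: Measurement[],
  scrollOffset: number,
  windowSize: number,
  laneCount: number,
): Array<{ startIndex: number; endIndex: number }> {
  const ranges: Array<{ startIndex: number; endIndex: number }> = []
  for (let lane = 0; lane < laneCount; lane++) {
    const laneItems = ms.filter((m) => m.lane === lane)
    // First item in this lane that still overlaps the viewport.
    const first = laneItems.findIndex((m) => m.end > scrollOffset)
    if (first === -1) continue // whole lane is above the viewport
    let last = first
    while (
      last < laneItems.length - 1 &&
      laneItems[last].end < scrollOffset + windowSize
    ) {
      last++
    }
    ranges.push({
      startIndex: laneItems[first].index,
      endIndex: laneItems[last].index,
    })
  }
  return ranges
}

// Same scroll position as before: viewport [150, 350].
const ranges = calculateRangesPerLane(measurements, 150, 200, 3)
console.log(ranges)
// [ { startIndex: 0, endIndex: 0 },   <- lane 0 keeps the tall item visible
//   { startIndex: 3, endIndex: 5 },
//   { startIndex: 4, endIndex: 6 } ]
```

The breaking-change cost is visible here too: the return type is an array of ranges rather than a single range, which is what would leak into range and rangeExtractor.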

Your minimal, reproducible example

https://stackblitz.com/edit/tanstack-virtual-wg3vpdoo

Steps to reproduce

  1. Open the StackBlitz reproduction.
  2. Scroll down in the preview pane.

Items will disappear from view before they are fully outside the viewport. Very large items are especially noticeable: they disappear from view much too early.

Expected behavior

Masonry items should be visible until they are fully outside the viewport.

How often does this bug happen?

Every time. It's most obvious when the items vary greatly in size, and less obvious if the items are all similar sizes.

Screenshots or Videos

The background color of the example is black. Areas that appear black are gaps where masonry items should be visible, but aren't.

Screen.Recording.2025-02-25.at.4.26.53.PM.mov

Platform

  • OS: macOS 15.3.1
  • Browser: Firefox 135.0.1

tanstack-virtual version

3.13.2

TypeScript version

5.7.2

Additional context

This bug is most apparent when overscan is set to 0.

One way to work around it is to increase overscan, but this only masks the problem; depending on how large your items are, you may still have "missed" items.
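A quick sketch of why overscan only masks the bug (applyOverscan here is my own illustration, not library code): overscan widens the computed range by a fixed number of indices, but the missed item can sit arbitrarily many indices before the computed startIndex, depending on lane count and item heights.

```typescript
// Suppose the linear range calculation picked [3, 6] for the viewport,
// while the still-visible tall item sits at index 0 (as in the toy
// fixture above). Overscan just widens the index range by a constant:
function applyOverscan(
  range: { startIndex: number; endIndex: number },
  overscan: number,
  count: number,
): { startIndex: number; endIndex: number } {
  return {
    startIndex: Math.max(0, range.startIndex - overscan),
    endIndex: Math.min(count - 1, range.endIndex + overscan),
  }
}

const withOverscan = applyOverscan({ startIndex: 3, endIndex: 6 }, 2, 7)
console.log(withOverscan) // { startIndex: 1, endIndex: 6 }
// Even with overscan: 2, index 0 is still excluded. With more lanes or
// taller items, no fixed overscan is guaranteed to cover the gap.
```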

Terms & Code of Conduct

  • I agree to follow this project's Code of Conduct
  • I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.