React Virtualization: A Developer's Guide to Optimizing Large Lists
From Manual Implementation to Modern Solutions: Exploring React-Virtualized and React-Window
Introduction
Some time ago, as a React developer, I worked on a project where an API response returned an array of over 5,000 objects. I had only recently started consuming APIs, so I was all about calling endpoints and looping over the responses. I noticed a huge performance drop, but I didn't care much since I had done what was needed. I pushed my code to production and, honestly, felt like the man of the hour. That lasted until my team lead came to me with some harsh feedback. I felt bad, and that was when the idea of loading specific data per scroll was born for me - there was a real performance problem to fix.
Think Optimization, think performance:
From that scenario, my mind became fixated on performance and optimization. I quickly started researching and eventually found a way to load components per scroll. I wrote custom functions for all of it, and boom - the project became faster. My code wasn't great, though, because I handled every iteration myself.
Doing this showed me why it's important to load data in chunks. It doesn't just improve performance; it also makes the UI feel better.
In this article, I am going to share my first custom implementation of this concept and how things have changed over time. So what exactly is React virtualization?
React Virtualization:
React Virtualization is a technique used to optimize the rendering of large lists and grids in React applications. The idea is to only render the items that are currently visible in the user's viewport, thereby reducing the number of DOM nodes that React needs to manage.
The basic concept of virtualization is windowing. Instead of rendering the entire list or grid, React virtualization creates a 'window' of visible content and renders only the components within this window. As the user scrolls, the window moves, and React intelligently unmounts the components that have scrolled out of view and mounts the ones that have scrolled into view.
In simpler terms - as in the short story about my code - instead of loading and displaying tons of data at once, we only load what you can see, and as you scroll down, more content is loaded. That is the idea behind virtualization/windowing.
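To make the windowing idea concrete, here is a minimal sketch of the calculation behind it. It is plain JavaScript with illustrative names (itemHeight, overscan, and so on are not from any library) and it assumes every item has the same height:

// Minimal sketch: derive the range of items that should be in the DOM.
// Assumes fixed-height items (itemHeight, in pixels).
const getVisibleRange = ({ scrollTop, viewportHeight, itemHeight, itemCount, overscan = 3 }) => {
  // First item whose top edge could still be on screen, minus a small buffer.
  const startIndex = Math.max(0, Math.floor(scrollTop / itemHeight) - overscan);
  // Last item that could be on screen, plus the same buffer.
  const endIndex = Math.min(
    itemCount - 1,
    Math.ceil((scrollTop + viewportHeight) / itemHeight) + overscan
  );
  return { startIndex, endIndex };
};

// Example: 5,000 rows of 40px each in a 600px viewport scrolled to 800px
// only needs roughly items 17 to 38 in the DOM.
getVisibleRange({ scrollTop: 800, viewportHeight: 600, itemHeight: 40, itemCount: 5000 });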
Implementation:
As previously mentioned, there are both manual and library-based approaches to implementing this method. In my initial experience, I opted for the manual route, which provided valuable insights into the process. Additionally, there are two established libraries available that simplify the implementation.
In this section, we will explore all available options. We'll begin by learning how to implement the solution manually and then proceed to leveraging the libraries for a more streamlined approach.
Manual Implementation:
Let's look at creating this from scratch. In a fully windowed list, the core idea is to calculate which items should be visible from the scroll position and viewport size; the custom hook below takes a simpler route and loads the data in chunks, using an IntersectionObserver sentinel to detect when the user has scrolled near the bottom.
First, let's look at our imports and initial setup:
import { useEffect, useRef, useState } from "react";

const useVirtualization = (initialData = []) => {
  const ITEMS_PER_PAGE = 20;

  const [items, setItems] = useState([]);
  const [loading, setLoading] = useState(false);
  const [hasMore, setHasMore] = useState(true);
  const loadingRef = useRef(null);
  const currentPage = useRef(1);
What's happening here? We're creating a custom hook and setting up our essential state management:
ITEMS_PER_PAGE: This is like our window size - we're saying "show 20 items at a time"
items: This holds our currently visible items
loading: A flag to show when we're fetching more items
hasMore: Tells us if there's more data to load
loadingRef: This is our secret weapon - we'll use it with the Intersection Observer
currentPage: Keeps track of where we are in the data
Now, let's look at our initialization:
useEffect(() => {
  if (initialData?.length > 0) {
    setItems(initialData.slice(0, ITEMS_PER_PAGE));
    currentPage.current = 1;
    setHasMore(initialData.length > ITEMS_PER_PAGE);
  }
}, [initialData]);
This part is like setting up our initial view:
When we get our data, we take the first chunk (20 items)
Reset our page counter to 1
Check if there's more data to show later
Here's where the magic happens - loading more items:
const loadMoreItems = async () => {
  if (loading || !hasMore) return;
  setLoading(true);
  try {
    await new Promise((resolve) => setTimeout(resolve, 500));
    const startIndex = currentPage.current * ITEMS_PER_PAGE;
    const endIndex = startIndex + ITEMS_PER_PAGE;
    const newItems = initialData?.slice(startIndex, endIndex);
    if (!newItems?.length) {
      setHasMore(false);
    } else {
      setItems((prev) => [...prev, ...newItems]);
      currentPage.current += 1;
      setHasMore(endIndex < initialData?.length);
    }
  } finally {
    setLoading(false);
  }
};
This is our workhorse function:
First, we check if we're already loading or if we've run out of items
We calculate where to start and end based on our current page
Grab the next chunk of items
If we got new items, add them to our list and move to next page
If no new items, we've reached the end!
The really cool part is how we detect when to load more:
useEffect(() => {
  const observer = new IntersectionObserver(
    (entries) => {
      if (entries[0].isIntersecting && hasMore && !loading) {
        loadMoreItems();
      }
    },
    { threshold: 0.1 }
  );
  if (loadingRef.current) {
    observer.observe(loadingRef.current);
  }
  return () => observer.disconnect();
}, [hasMore, loading]);
This is like having a sensor at the bottom of your list:
When you scroll near the bottom, it triggers more items to load
The threshold of 0.1 means it triggers when the loading element is 10% visible
We clean up our observer when the component unmounts
Here's the complete custom hook for this implementation:
import { useEffect, useRef, useState } from "react";

const useVirtualization = (initialData = []) => {
  const ITEMS_PER_PAGE = 20;

  const [items, setItems] = useState([]);
  const [loading, setLoading] = useState(false);
  const [hasMore, setHasMore] = useState(true);
  const loadingRef = useRef(null);
  const currentPage = useRef(1);

  // Show the first chunk whenever the source data changes.
  useEffect(() => {
    if (initialData?.length > 0) {
      setItems(initialData.slice(0, ITEMS_PER_PAGE));
      currentPage.current = 1;
      setHasMore(initialData.length > ITEMS_PER_PAGE);
    }
  }, [initialData]);

  // Append the next chunk of items.
  const loadMoreItems = async () => {
    if (loading || !hasMore) return;
    setLoading(true);
    try {
      // Small artificial delay to simulate a network request.
      await new Promise((resolve) => setTimeout(resolve, 500));
      const startIndex = currentPage.current * ITEMS_PER_PAGE;
      const endIndex = startIndex + ITEMS_PER_PAGE;
      const newItems = initialData?.slice(startIndex, endIndex);
      if (!newItems?.length) {
        setHasMore(false);
      } else {
        setItems((prev) => [...prev, ...newItems]);
        currentPage.current += 1;
        setHasMore(endIndex < initialData?.length);
      }
    } finally {
      setLoading(false);
    }
  };

  // Load more items when the sentinel element scrolls into view.
  useEffect(() => {
    const observer = new IntersectionObserver(
      (entries) => {
        if (entries[0].isIntersecting && hasMore && !loading) {
          loadMoreItems();
        }
      },
      { threshold: 0.1 }
    );
    if (loadingRef.current) {
      observer.observe(loadingRef.current);
    }
    return () => observer.disconnect();
  }, [hasMore, loading]);

  // Small helper used by the demo UI below; falls back when the price is missing.
  const formatPrice = (price) => price ?? "N/A";

  return {
    items,
    loading,
    hasMore,
    loadingRef,
    loadMoreItems,
    formatPrice,
  };
};

export default useVirtualization;
Usage in a component.
import { Card } from "@/components/ui/card";
import useVirtualization from "@/hooks/useVirtualization";
import { Loader2 } from "lucide-react";

const MoreProducts = ({ productsDemo }) => {
  const { items, loading, hasMore, loadingRef, formatPrice } =
    useVirtualization(productsDemo);

  return (
    <div className="container mx-auto px-4">
      <h2 className="text-3xl font-bold my-8 text-gray-800 dark:text-gray-100">
        More to love
      </h2>
      <div className="grid grid-cols-1 sm:grid-cols-2 md:grid-cols-3 lg:grid-cols-4 xl:grid-cols-5 gap-6">
        {items?.map((product) => (
          <Card
            key={product?.asin}
            className="group bg-white dark:bg-gray-800 rounded-2xl overflow-hidden hover:shadow-xl transition-all duration-300 relative border-0">
            <div className="relative aspect-[4/3] p-3 flex items-center justify-center overflow-hidden bg-gray-50 dark:bg-gray-900">
              <img
                src={product?.product_photo}
                alt={product?.product_title}
                className="w-full h-full object-contain transform group-hover:scale-105 transition-transform duration-500"
              />
            </div>
            <div className="p-5 space-y-4">
              <h3 className="font-medium text-gray-800 dark:text-gray-100 line-clamp-2 min-h-[2.5rem] text-sm">
                {product?.product_title}
              </h3>
              <div className="flex items-baseline gap-2">
                <span className="text-xl font-bold text-gray-900 dark:text-white">
                  {formatPrice(product?.product_price)}
                </span>
              </div>
            </div>
          </Card>
        ))}
      </div>
      {/* Sentinel element observed by the hook's IntersectionObserver */}
      <div ref={loadingRef} className="w-full py-12 text-center">
        {loading && (
          <div className="flex flex-col items-center gap-3">
            <Loader2 className="w-8 h-8 animate-spin text-gray-500 dark:text-gray-400" />
            <p className="text-gray-500 dark:text-gray-400 font-medium">
              Loading more products...
            </p>
          </div>
        )}
        {!hasMore && items?.length > 0 && (
          <p className="text-gray-600 dark:text-gray-400 font-medium">
            You've reached the end of the list
          </p>
        )}
      </div>
    </div>
  );
};

export default MoreProducts;
Key Concepts in Manual Implementation:
The hook above takes the simpler incremental-loading route: it keeps chunks of ITEMS_PER_PAGE items in state, and an IntersectionObserver sentinel at the bottom of the list triggers the next chunk. A fully windowed manual implementation builds on three further ideas (a minimal sketch follows below):
Scroll Position Tracking:
Use useState to track the current scroll position
An onScroll event handler updates this value as the user scrolls
Viewport Calculations:
Calculate which items should be visible based on the scroll position
Include a buffer zone (overscan) to prevent blank spaces during fast scrolling
Position Management:
Use absolute positioning to place items correctly
Maintain a container with the full height of all items
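Here is a minimal sketch of what that fully windowed approach can look like. The component name, itemHeight, renderItem, and the other props are illustrative (they are not part of the hook above), and fixed-height rows are assumed:

import { useState } from "react";

// Minimal sketch of a fully windowed list with fixed-height rows.
const WindowedList = ({ items, itemHeight = 40, height = 400, overscan = 3, renderItem }) => {
  const [scrollTop, setScrollTop] = useState(0);

  // Which slice of the data is currently (or nearly) on screen.
  const startIndex = Math.max(0, Math.floor(scrollTop / itemHeight) - overscan);
  const endIndex = Math.min(
    items.length - 1,
    Math.ceil((scrollTop + height) / itemHeight) + overscan
  );
  const visibleItems = items.slice(startIndex, endIndex + 1);

  return (
    <div
      style={{ height, overflowY: "auto", position: "relative" }}
      onScroll={(e) => setScrollTop(e.currentTarget.scrollTop)}
    >
      {/* Spacer keeps the scrollbar sized for the full list */}
      <div style={{ height: items.length * itemHeight }}>
        {visibleItems.map((item, i) => {
          const index = startIndex + i;
          return (
            <div
              key={index}
              style={{ position: "absolute", top: index * itemHeight, height: itemHeight, width: "100%" }}
            >
              {renderItem ? renderItem(item, index) : item}
            </div>
          );
        })}
      </div>
    </div>
  );
};

Only the visible slice (plus the overscan buffer) exists in the DOM at any moment; everything else is represented by the spacer's height.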
Advantages of Manual Implementation:
Complete Control
Full control over rendering logic
Can be optimized for specific use cases
Easy to add custom features
No Dependencies
No need to install additional packages
Smaller bundle size
No version compatibility issues
Learning Opportunity
Better understanding of virtualization concepts
Valuable experience in performance optimization
Enhanced debugging skills
Disadvantages of Manual Implementation:
Complex Implementation
Requires careful handling of scroll events
Need to manage edge cases manually
More prone to bugs
Maintenance Overhead
Updates and fixes must be handled internally
May need regular optimization
Testing requirements are higher
Limited Features
Basic functionality only
Advanced features need to be built from scratch
May miss optimizations present in established libraries
React-Virtualized Implementation:
React-Virtualized is a comprehensive library for implementing virtualization. Here's a detailed look at its implementation:
import { List, AutoSizer, WindowScroller, InfiniteLoader } from 'react-virtualized';
const rowHeight = 50;
const ITEMS_PER_LOAD = 20;
The imports give us our tools:
List: The main virtualized list component
AutoSizer: Handles responsive sizing
WindowScroller: Manages window scrolling
InfiniteLoader: Handles loading more data
Here's our row renderer:
// `items` here is the loaded data held in your component's state.
const rowRenderer = ({ key, index, style }) => {
  const item = items[index];

  if (!item) {
    return (
      <div key={key} style={style} className="loading-row">
        Loading...
      </div>
    );
  }

  return (
    <div key={key} style={style} className="row">
      {item.content}
    </div>
  );
};
This is like our template for each row:
Gets called for each visible item
Shows a loading state if the item isn't loaded yet
Style comes from React-Virtualized and handles positioning
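The snippet above only shows the row renderer. Wiring the four imported components together looks roughly like this - a sketch based on react-virtualized's render-prop API, where list, remoteRowCount, and loadMoreRows are placeholders for your own data and fetching logic:

import { List, AutoSizer, WindowScroller, InfiniteLoader } from "react-virtualized";
import "react-virtualized/styles.css"; // default row/list styles

// Sketch: InfiniteLoader decides when to fetch, WindowScroller syncs with the
// page scroll, AutoSizer provides the width, and List does the virtualization.
const VirtualizedList = ({ list, remoteRowCount, loadMoreRows, rowRenderer, rowHeight = 50 }) => (
  <InfiniteLoader
    isRowLoaded={({ index }) => !!list[index]}
    loadMoreRows={loadMoreRows}
    rowCount={remoteRowCount}
  >
    {({ onRowsRendered, registerChild }) => (
      <WindowScroller>
        {({ height, isScrolling, onChildScroll, scrollTop }) => (
          <AutoSizer disableHeight>
            {({ width }) => (
              <List
                autoHeight
                height={height}
                width={width}
                isScrolling={isScrolling}
                onScroll={onChildScroll}
                scrollTop={scrollTop}
                rowCount={remoteRowCount}
                rowHeight={rowHeight}
                rowRenderer={rowRenderer}
                onRowsRendered={onRowsRendered}
                ref={registerChild}
              />
            )}
          </AutoSizer>
        )}
      </WindowScroller>
    )}
  </InfiniteLoader>
);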
Advantages of React-Virtualized:
Rich Feature Set
Multiple components (List, Grid, Table)
Advanced features like variable heights
Built-in AutoSizer and CellMeasurer
Production Ready
Well-tested in production environments
Active community support
Regular updates and bug fixes
Flexibility
Highly customizable
Supports complex layouts
Handles edge cases automatically
Disadvantages of React-Virtualized:
Bundle Size
Larger package size
May impact initial load time
Includes unused features
Learning Curve
Complex API
Many configuration options
Requires understanding of concepts
Performance Overhead
Additional wrapper components
More complex rendering cycle
Memory usage with large datasets
React-Window Implementation:
React-Window is a modern, lightweight alternative to React-Virtualized. Here's a detailed implementation:
import React, { useState, useCallback } from 'react';
import { FixedSizeList as List } from 'react-window';
import InfiniteLoader from 'react-window-infinite-loader';
import AutoSizer from 'react-virtualized-auto-sizer';
Let's talk about what each import does:
FixedSizeList: This is our main component for fixed-height items
InfiniteLoader: Handles loading more data as we scroll
AutoSizer: Makes our list responsive to the container size
Here's our basic setup:
const WindowList = ({ data }) => {
  const [items, setItems] = useState([]);
  const [loading, setLoading] = useState(false);

  const itemCount = 1000;
  const itemSize = 50;
Now, let's break down each major component:
- First, our loading function:
// fetchMoreItems is a placeholder for your own data-fetching call (API request, etc.).
const loadMoreItems = async (startIndex, stopIndex) => {
  if (loading) return;
  setLoading(true);
  try {
    const newItems = await fetchMoreItems(startIndex, stopIndex);
    setItems((prev) => {
      const updated = [...prev];
      newItems.forEach((item, index) => {
        updated[startIndex + index] = item;
      });
      return updated;
    });
  } finally {
    setLoading(false);
  }
};
This function:
Takes start and end indexes from InfiniteLoader
Updates our items array at specific positions
Handles loading state
- ItemLoaded check:
const isItemLoaded = useCallback((index) => {
  return !!items[index];
}, [items]);
This tells InfiniteLoader:
Which items are already loaded
When to trigger more loading
- Our Item renderer:
const Item = ({ index, style }) => {
  const item = items[index];
  return (
    <div style={style} className="list-item">
      {item ? item.content : 'Loading...'}
    </div>
  );
};
This component:
Renders each item
Shows loading state when item isn't loaded
Uses style from React-Window for positioning
- Now, putting it all together with all our components:
return (
  <div style={{ height: '100vh' }}>
    <AutoSizer>
      {({ height, width }) => (
        <InfiniteLoader
          isItemLoaded={isItemLoaded}
          itemCount={itemCount}
          loadMoreItems={loadMoreItems}
        >
          {({ onItemsRendered, ref }) => (
            <List
              height={height}
              width={width}
              itemCount={itemCount}
              itemSize={itemSize}
              onItemsRendered={onItemsRendered}
              ref={ref}
            >
              {Item}
            </List>
          )}
        </InfiniteLoader>
      )}
    </AutoSizer>
  </div>
);
Let's break down what each component is doing in this structure:
AutoSizer:
Wraps everything
Provides the width and height based on the parent container
Makes our list responsive
InfiniteLoader:
Manages the loading of new items
Uses isItemLoaded to check what's loaded
Calls loadMoreItems when needed
Provides onItemsRendered and ref to the List
List (FixedSizeList):
Handles the actual virtualization
Takes dimensions from AutoSizer
Uses itemSize for each item's height
Renders only visible items using our Item component
Here's how to use this component:
const MyListComponent = () => {
  const data = Array.from({ length: 1000 }, (_, index) => ({
    id: index,
    content: `Item ${index}`
  }));

  return (
    <div style={{ height: '500px' }}>
      <WindowList data={data} />
    </div>
  );
};
The data flow works like this:
AutoSizer measures available space
InfiniteLoader checks what items need loading
List renders only visible items
As you scroll, InfiniteLoader triggers new loads
New items are added to the state
List re-renders with new items
Key advantages of this setup:
Smooth scrolling with minimal memory usage
Automatic loading of new items
Responsive to container size
Clean loading states
Efficient updates
Common gotchas to watch for:
Remember to set a height on the container
Make sure itemSize matches your actual item heights
Handle loading errors appropriately
Memoize callbacks if needed for performance (see the sketch below)
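On that last point: isItemLoaded above is already wrapped in useCallback, and loadMoreItems can be memoized the same way so it isn't re-created on every render. A sketch, still using the fetchMoreItems placeholder from earlier:

// Memoized version of loadMoreItems; fetchMoreItems remains a placeholder
// for your own data-fetching function.
const loadMoreItems = useCallback(async (startIndex, stopIndex) => {
  if (loading) return;
  setLoading(true);
  try {
    const newItems = await fetchMoreItems(startIndex, stopIndex);
    setItems((prev) => {
      const updated = [...prev];
      newItems.forEach((item, index) => {
        updated[startIndex + index] = item;
      });
      return updated;
    });
  } finally {
    setLoading(false);
  }
}, [loading]);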
This implementation is perfect when you need:
Fast scrolling performance
Simple, fixed-height items
Infinite loading capability
Responsive list sizing
Conclusion:
React virtualization is a powerful technique for handling large lists of data efficiently. Whether you choose a manual implementation, React-Virtualized, or React-Window depends on your specific needs and constraints. Understanding the advantages and disadvantages of each approach will help you make the right choice for your project.
Remember that performance optimization is an iterative process, and the best solution often depends on your specific use case. Start with the simplest approach that meets your needs and optimize further as required.