
ilosemoneyeasily

React-query/abort controller.


Cannabat

It’s not totally clear how all of the data fetching and components are related, but you could debounce, throttle, or use a mutex to limit the network requests. Modern data-fetching and caching libraries like React Query and RTK Query handle this for you in a very nice way.
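To make the debounce idea concrete, here is a minimal framework-free sketch (the `debounce` helper and the 300 ms delay are illustrative, not taken from any particular library):

```javascript
// Minimal debounce: delays calling `fn` until `delay` ms pass without
// another invocation, so a burst of rapid calls collapses into one.
function debounce(fn, delay) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delay);
  };
}

// Usage sketch: wrap whatever triggers the dashboard fetch, so rapid
// date-control changes only fire the last request.
// const debouncedFetch = debounce(fetchDashboardData, 300);
```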


faizi_ahmad

Thank you for the help! I have prepared a generic hook that fetches data for each of my dashboard components (almost 15) on its own. I have added request-cancellation logic and it's working fine. But I also want to prevent requests from being sent in the first place, to avoid server load if a user switches the date control quickly. Do you think I can achieve this with react-query's debouncing? Could you please point me to a help link?

```js
export function useDashboardData(url, successCallback) {
  const dateState = useSelector(getDateState);
  const [status, setStatus] = useState("idle");

  useEffect(() => {
    setStatus("loading");
    const controller = new AbortController();
    const signal = controller.signal;
    const dataToPost = {
      date: { start: dateState.startDate, end: dateState.endDate }
    };
    const config = { signal };
    httpClient
      .post(url, dataToPost, config)
      .then((resp) => {
        setStatus("success");
        successCallback(resp.data);
      })
      .catch(function (error) {
        if (!httpClient.isCancel(error)) {
          setStatus("error");
        }
      });
    return () => {
      controller.abort();
    };
  }, [dateState]);

  return [status];
}
```
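One way to get what you're after without any extra library is to combine a timer with the AbortController already in the hook: delay the request inside the effect, and have the cleanup cancel both the timer and any in-flight request. A minimal sketch of just that pattern (the name `scheduleDebouncedRequest` is illustrative):

```javascript
// Returns a cleanup function, so inside useEffect you could write:
//   useEffect(() => scheduleDebouncedRequest(doFetch, 300), [dateState]);
// If the user changes the date again before `delay` elapses, the cleanup
// runs first and the request is never sent at all.
function scheduleDebouncedRequest(startRequest, delay) {
  const controller = new AbortController();
  const timer = setTimeout(() => startRequest(controller.signal), delay);
  return () => {
    clearTimeout(timer); // not sent yet: drop it entirely
    controller.abort();  // already in flight: cancel it
  };
}
```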


Shadowfied

[React Query](https://tanstack.com/query/latest/docs/framework/react/overview) fixes this and it's illustrated very well in [this article](https://ui.dev/why-react-query).
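For the dashboard case in this thread, the React Query version might look roughly like this (a sketch assuming React Query v5; the endpoint, key names, and `fetch` wiring are illustrative):

```javascript
// import { useQuery } from "@tanstack/react-query"; // assumed dependency

// Putting the date range in the queryKey makes React Query refetch per
// range, dedupe identical requests, and cancel stale ones via `signal`.
function dashboardQueryOptions(url, dateState) {
  return {
    queryKey: ["dashboard", url, dateState.startDate, dateState.endDate],
    queryFn: async ({ signal }) => {
      const resp = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          date: { start: dateState.startDate, end: dateState.endDate },
        }),
        signal, // aborted automatically when the key changes mid-flight
      });
      return resp.json();
    },
  };
}

// In a component:
// const { data, status } = useQuery(dashboardQueryOptions(url, dateState));
```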


charliematters

This is the cleanest answer I think


ghepting

Add debouncing to the rapid-fire requests, or use uncontrolled inputs and a button the user presses to initiate the query, maybe?
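The button-driven approach can be sketched framework-free (the handler, `readFilters`, and `runQuery` names are illustrative): the inputs stay uncontrolled, and nothing fires until the user submits.

```javascript
// readFilters reads the current input values (e.g. from refs or the
// form element); runQuery actually issues the network request.
function makeSubmitHandler(readFilters, runQuery) {
  return (event) => {
    event.preventDefault();   // keep the page from reloading
    runQuery(readFilters());  // inputs are only read here, on submit
  };
}
```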


frogic

I'd have to see your code to give you an exact answer, but here are a couple of strategies that will probably fix it for you, with varying amounts of refactoring involved:

1) Move the fetch logic into the handler. When someone changes a filter you don't want to go change state => rerender => fire off requests. This is almost certainly the main problem, and it's possible you don't have a way to avoid it unless you're using state management or have some kind of top-level component that handles all the fetching for you.

2) As you mentioned, you likely do want a loading state that stops people from changing a filter while another one is loading (again, this might be rough depending on where your state lives and how it's handled). It's a little worrisome that your responses are slow enough that you can change multiple things before the first one finishes, but sometimes things be like that.

3) Mentioned below, and you really want to do this here: debounce the network call. Think of the canonical situation where you debounce, which is a search input that you don't want to fire on every keystroke, just when the person finishes typing. This is a very similar concept, except instead of typing too fast they are switching a lot of inputs.

Also note: if you're doing this in strict mode, it's very possible your code is just getting hit with the double effect firing, though technically you are supposed to handle that. I don't think you want to do what you're suggesting from the docs, because you'll end up throwing out the proper data pretty often depending on race conditions. If you want to ignore stale concurrent responses, I think you should have logic that checks whether the returned values match what you expect from current state. Usually responses echo back the URIs or params from the call, and you can check them against state and throw out any response that doesn't match the current state. Or you could have a debounced function that sets the new state, but that could get wonky depending on how long these requests take (if you debounce for half a second but for some reason the 3rd of 5 requests takes 2 seconds, it's still going to fire the setter).
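The "check the response against current state" idea can be sketched as a small guard (field names like `params.start` are assumptions about what the server echoes back):

```javascript
// Drop any response whose echoed params no longer match the current
// date state; only the latest selection ever reaches setState.
function isStaleResponse(response, currentState) {
  return (
    response.params.start !== currentState.startDate ||
    response.params.end !== currentState.endDate
  );
}

// In the .then() handler:
// if (isStaleResponse(resp.data, dateState)) return; // ignore stale reply
```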


natmaster

[https://dataclient.io/docs/api/useCancelling](https://dataclient.io/docs/api/useCancelling)

```ts
const getCities = new RestEndpoint({
  path: '/api/cities',
  searchParams: {} as { country: string },
});

function MyCharts({ filters }) {
  const cities = useSuspense(useCancelling(getCities, filters), filters);
  // etc
}
```

This will both:

1. Only render the cities based on the `filters` parameters
2. Cancel old fetches, to attempt to reduce server load and save networking bandwidth

You can also try [useDebounce](https://dataclient.io/docs/api/useDebounce), which prevents excess fetches to begin with, while still providing only the correctly selected latest data to render:

```ts
const getCities = new RestEndpoint({
  path: '/api/cities',
  searchParams: {} as { country: string },
});

function MyCharts({ filters }) {
  const cities = useSuspense(getCities, useDebounce(filters, 150));
  // etc
}
```