Ever since writing about them, the generator has become my favorite JavaScript hammer. I'll wield it nearly any chance I get. Usually, that looks like rolling through a finite batch of items over time. For example, doing something with a bunch of leap years:
```js
function* generateYears(start = 1900) {
  const currentYear = new Date().getFullYear();

  for (let year = start + 1; year <= currentYear; year++) {
    if (isLeapYear(year)) {
      yield year;
    }
  }
}

for (const year of generateYears()) {
  console.log('the next leap year is:', year);
}
```
...or lazily processing some files:
```js
const csvFiles = ["file1.csv", "file2.csv", "file3.csv"];

function* processFiles(files) {
  for (const file of files) {
    // load and process files
    yield `the result for: ${file}`;
  }
}

for (const result of processFiles(csvFiles)) {
  console.log(result);
}
```
In both examples, the pool of items is exhausted once and never replenished. The for loop stops, and the final item returned by the iterator contains done: true. C’est fini.
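That permanence is easy to see with a bare-bones generator:

```javascript
function* countToTwo() {
  yield 1;
  yield 2;
}

const iterator = countToTwo();

console.log(iterator.next()); // { value: 1, done: false }
console.log(iterator.next()); // { value: 2, done: false }
console.log(iterator.next()); // { value: undefined, done: true }

// No amount of further next() calls will bring it back:
console.log(iterator.next()); // { value: undefined, done: true }
```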
That behavior makes sense – a generator wasn’t designed to be resurrected after it's completed. It’s a one-way street. But on at least one occasion, I've wanted it to be possible. Most recently, it happened while building a file upload tool for PicPerf. I wanted (read: demanded) to use a generator to power a replenishable, first-in-first-out (FIFO) queue. I did some tinkering, and liked where the effort ended up.
First, a bit more on what I mean by “replenishable.” A generator can't be turned on again, but we can get around that by holding it open when the queue of items becomes depleted. A great job for promises!
Let's start with this setup: dots in a queue that are individually processed every 500ms.
```html
<ul id="queue">
  <li class="item"></li>
  <li class="item"></li>
  <li class="item"></li>
</ul>

total processed: <span id="totalProcessed">0</span>

<script type="module">
  async function* go() {
    // A queue with some initial items.
    const queue = Array.from(document.querySelectorAll("#queue .item"));

    for (const item of queue) {
      yield item;
    }
  }

  // Iterate over each one, removing along the way.
  for await (const value of go()) {
    await new Promise((res) => setTimeout(res, 500));

    value.remove();
    totalProcessed.textContent = Number(totalProcessed.textContent) + 1;
  }
</script>
```
Here's our one-way queue:
If we had a button for pushing items to the queue, and it were clicked after the generator had completed, nothing would happen. It's dead. So, let's do some refactoring.
The one remaining problem is that return statement. We'll replace it with a promise to pause the loop until we have more items to process:
```js
let resolve = () => {};
const queue = Array.from(document.querySelectorAll('#queue .item'));

async function* go() {
  while (true) {
    // Create a promise & set our
    // resolver for this iteration of the generator.
    const promise = new Promise((res) => (resolve = res));

    // No items... wait until our promise is resolved.
    if (!queue.length) await promise;

    yield queue.shift();
  }
}
```
```js
addToQueueButton.addEventListener("click", () => {
  const newElement = document.createElement("li");
  newElement.classList.add("item");
  queueElement.appendChild(newElement);

  // Add new item and reignite the queue!
  queue.push(newElement);
  resolve();
});

// ...the rest of the code.
```
This time around, a new promise is created for every item. If there aren't any items to process, that promise will be await-ed until some indeterminate point in the future. For us, that's whenever the button is clicked, adding a new item to the queue.
For some finishing touches, let's put it behind a prettier API.
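The snippet for that API isn't reproduced here, but judging from how the queue gets used later in the post (a `queue` property, a `push()` that accepts a batch of items, and an async `go()` generator), a `buildQueue()` factory might be sketched like this (treat the exact shape as an assumption):

```javascript
// A hypothetical buildQueue() factory. Names and shape are assumptions
// based on how the queue is used elsewhere in the post.
function buildQueue() {
  let resolve = () => {};
  const queue = [];

  return {
    queue,

    // Accept a batch of new items and wake up the
    // generator if it's waiting for more work.
    push(items) {
      queue.push(...items);
      resolve();
    },

    async *go() {
      while (true) {
        // Park here until push() resolves the promise.
        if (!queue.length) {
          await new Promise((res) => (resolve = res));
        }

        yield queue.shift();
      }
    },
  };
}
```

With that in place, the consuming loop from earlier becomes something like `for await (const item of queue.go()) { ... }`.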
One quick call-out about this: you don't have to discard every item from the queue. If you'd like them all to stick around, just pivot to a version using a pointer:
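Here's a hedged sketch of that pointer-based variant: instead of shift()-ing items off the array, we advance an index past everything that's been processed, so the full history sticks around:

```javascript
// Hypothetical pointer-based variant: nothing is removed from the
// queue — the generator just walks an index forward past the items
// it has already yielded.
function buildQueue() {
  let resolve = () => {};
  let pointer = 0;
  const queue = [];

  return {
    queue,

    push(items) {
      queue.push(...items);
      resolve();
    },

    async *go() {
      while (true) {
        // Wait whenever the pointer catches up to the end of the queue.
        if (pointer >= queue.length) {
          await new Promise((res) => (resolve = res));
        }

        yield queue[pointer++];
      }
    },
  };
}
```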
Like I mentioned, PicPerf allows you to upload a bunch of images to be optimized, hosted, and cached. The UI follows a common pattern: drag things in and they'll get progressively uploaded.
This is where I wanted my first-in-first-out queue. If the pool of "pending" images has been depleted, I should still be able to drag in more images and see the process continue. The queue would simply pick back up with the new set of items.
First, let's try our hand with a React-first approach. We'll lean hard into React's state + render lifecycle, depending on two pieces of state:
- `files: UploadedFile[]` - this represents every file dragged into the UI. Each of these items manages its own status: pending, uploading, or completed.
- `isUploading: boolean` - a flag for storing whether we're currently uploading a file. This'll be used as a lock, preventing another upload loop from beginning while there's one already in progress.
This version of the component works by watching whether anything has been added to files. As soon as it gets something, useEffect() kicks off the upload process. Toggling isUploading back to false will trigger another effect, causing the next image in the queue to be handled.
Here's a stripped down, contrived example of how it's structured.
```jsx
import { processUpload } from './wherever';

export default function MediaUpload() {
  const [files, setFiles] = useState([]);
  const [isUploading, setIsUploading] = useState(false);

  const updateFileStatus = useEffectEvent((id, status) => {
    setFiles((prev) =>
      prev.map((file) => (file.id === id ? { ...file, status } : file))
    );
  });

  useEffect(() => {
    if (isUploading) return;

    const nextPending = files.find((f) => f.status === 'pending');

    if (!nextPending) return;

    setIsUploading(true);
    updateFileStatus(nextPending.id, 'uploading');

    processUpload(nextPending).then(() => {
      updateFileStatus(nextPending.id, 'complete');
      setIsUploading(false);
    });
  }, [files, isUploading]);

  return <UploadComponent files={files} setFiles={setFiles} />;
}
```
While there's an outstanding upload, we're still fully able to add new files as we wait. They'll just be tacked onto the overall list, and processed incrementally:
In terms of React component design, this isn't the worst tactic in the world. It's common to listen for state changes like this, and then respond accordingly.
Still... I think you'd be hard-pressed to find an honest person who finds the approach intuitive. The useEffect() hook is intended to synchronize a component with an external system. But this is acting more like an event-driven state machine orchestration thing. That hook is core to the behavior of the component.
Let's remedy that by swapping out all those effects for a generator-powered queue.
Rather than allowing React to own the entire list of files and their statuses, we'll pull them out and signal re-renders to occur from a different location. That'll make our component a little more "dumb" and focused on what its ultimate purpose is: render some UI.
To do this, React comes with a tool that fits our circumstances nicely: useSyncExternalStore(). This enables a component to listen for changes to data managed elsewhere. In a way, the "React-ness" of the component takes a bit of a backseat and waits for instructions from afar, rather than wholly owning the state itself. In our case, the "external store" will be a separate module responsible for processing our files.
At bare minimum, useSyncExternalStore() requires two functions: one for listening for changes to the relevant data (used to know when a component relying on the store should be re-rendered), and another for returning the latest version of that data. Here's our skeleton:
```ts
// store.ts
let listeners: Function[] = [];
let files: UploadableFile[] = [];

// *Must* return a function for unsubscribing the listener.
// (Used internally by React.)
export function subscribe(listener: Function) {
  listeners.push(listener);

  return () => {
    listeners = listeners.filter((l) => l !== listener);
  };
}

export function getSnapshot() {
  return files;
}
```
Now, let's quickly fill in the other functions needed to make this work:
- `updateStatus()` - Used to set whether a file is waiting, currently being uploaded, or finished.
- `add()` - Places new files onto the queue.
- `process()` - Kicks everything off and runs through the queue.
- `emitChange()` - Tells React's listeners that a change has occurred and components should be updated.
In all, here's the state of the store:
```ts
// store.ts
import { buildQueue, processUpload } from './whatever';

let listeners: Function[] = [];
let files: any[] = [];
const queue = buildQueue();

function emitChange() {
  // Quirk of using an external store:
  // our `files` must point to a new
  // reference when a change occurs.
  files = [...queue.queue];

  for (let listener of listeners) {
    listener();
  }
}

function updateStatus(file: any, status: string) {
  file.status = status;
  emitChange();
}

// ===
// "Public" functions:
// ===

export function getSnapshot() {
  return files;
}

export function subscribe(listener: Function) {
  listeners.push(listener);

  return () => {
    listeners = listeners.filter((l) => l !== listener);
  };
}

export function add(newFiles: any[]) {
  queue.push(newFiles);
  emitChange();
}

export async function process() {
  for await (const file of queue.go()) {
    updateStatus(file, 'uploading');
    await processUpload(file);
    updateStatus(file, 'complete');
  }
}
```
There's just one piece we're missing: proper clean-up. In the event that the component unmounts, we don't want any lingering upload processes. Let's add an abort() method to force the generator to wrap up, and stick it into our useEffect():
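The abort() code itself isn't reproduced in this excerpt. One plausible sketch (the flag-plus-resolve approach here is my assumption, not necessarily the post's exact implementation) is to wake up the parked generator and have it bail out:

```javascript
// Hypothetical buildQueue() with an abort() escape hatch:
// flip a flag and resolve any parked promise so the generator
// can notice it should stop, instead of waiting forever.
function buildQueue() {
  let resolve = () => {};
  let aborted = false;
  const queue = [];

  return {
    queue,

    push(items) {
      queue.push(...items);
      resolve();
    },

    abort() {
      aborted = true;
      resolve();
    },

    async *go() {
      while (!aborted) {
        if (!queue.length) {
          await new Promise((res) => (resolve = res));
        }

        // We may have been woken up by abort() rather than push().
        if (aborted) return;

        yield queue.shift();
      }
    },
  };
}
```

On the component side, the effect that kicks off `process()` could then return the abort function as its cleanup, e.g. `useEffect(() => { process(); return abort; }, [])` (again, assumed wiring).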
There are some bold assumptions we're making here for the sake of simplicity, by the way. Among them: the upload process will never fail, process() will only ever be called once at a time, and there's only one user of the store. Forgive all those things and whatever else I might've missed. The point is the bag of gains we get from this approach:
- The component's behavior no longer relies on repeated useEffect() triggers.
- All the file upload business is abstracted away into its own, React-free module.
- You finally had a reason to leverage useSyncExternalStore().
- We can gloat about how we implemented a replenishable queue with an async generator in React.
To some, this probably feels way more complicated than that "React-ish" route we initially took. I totally get that. But consider this: the more complicated we make our code now, the longer we hold off AI agents from wholly displacing us, killing our futures, and harvesting our organs. Build with that in mind!
But for real: for AI-assisted engineering to continue to be valuable, it'll need humans to help it understand the purposes, trade-offs, and futures of underlying primitives. There'll always be value in mastering that.
might be a bit more elegant? it's a bit more verbose, but your example ("extracting" the resolve function from the promise body) had me confused for a second. i think Promise.withResolvers might just be the right tool for that job; at least it doesn't feel like a hack :D
(btw, i think i never really used javascript generators before. your buildQueue function might have just convinced me to try using one next time i get the chance!)
Ondřej Velíšek
An inspiring article. Thanks for sharing it!
Alex MacArthur
Thank you, Ondřej!