The 16 mm film format mentioned is a motion-picture rather than still-camera format, generally shot at 24 frames per second (FPS), though rates from 8 to 48 FPS were fairly common.
Presumably stacking 100s of frames would involve combining several seconds of footage (200 frames at 24 FPS is about 8.33 seconds) to generate a higher-resolution image. Success would depend on how much camera and subject movement occurred over that interval, though image stabilisation should help somewhat with the former.
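As a back-of-the-envelope sketch (synthetic data, not the article's actual footage), the frame arithmetic and the basic payoff of mean-stacking pre-aligned frames look like this; random noise falls roughly as the square root of the frame count:

```python
import numpy as np

FPS = 24
N_FRAMES = 200
print(N_FRAMES / FPS)              # ≈ 8.33 seconds of footage

# Synthetic example: mean-stack N noisy, already-registered frames.
rng = np.random.default_rng(0)
truth = rng.uniform(0, 1, (64, 64))                       # "scene"
frames = truth + rng.normal(0, 0.2, (N_FRAMES, 64, 64))   # noisy frames
stacked = frames.mean(axis=0)

# Noise std drops roughly by sqrt(N): from ~0.2 to ~0.014 here.
print(np.std(frames[0] - truth), np.std(stacked - truth))
```

Real film frames would of course need registration first, which is where the camera/subject movement problem bites.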
I suspect a clearer explanation of the process was omitted from the article, perhaps due to poor editing.
Oh, and "image stacking" is the technique of combining multiple frames into a single image, allowing for greater clarity (as here), noise removal (frequently applied in astrophotography), or depth-of-field / focal-plane stacking.
This almost always implies digital image processing, and depending on the goal, various filters or masks may be applied. De-noising typically keeps only visual elements that appear in most (but not all) frames of the stack, eliminating transients such as satellite flares, meteors, terrestrial light sources, aircraft, sensor noise, or even radiation spots.
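A minimal sketch of that rejection idea, using a median stack on synthetic frames with a fake "satellite trail" injected into one of them; the mean keeps a ghost of the trail, the median discards it:

```python
import numpy as np

rng = np.random.default_rng(1)
n, h, w = 30, 32, 32
frames = rng.normal(100, 5, (n, h, w))   # sky background + sensor noise
frames[3, 10, :] += 500                  # bright "satellite trail" in one frame

mean_stack = frames.mean(axis=0)
median_stack = np.median(frames, axis=0) # keeps only values seen in most frames

# The mean of row 10 is lifted by ~500/30 everywhere; the median is not.
print(mean_stack[10].min(), median_stack[10].max())
```

Dedicated astrophotography stackers use fancier variants (sigma-clipped means and the like), but the principle is the same.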
"Take Better Night Sky Photos with Image Stacking"
Again, from TFA the source was motion-picture footage from fixed-position cameras, which could be used to generate images with much greater resolution than any individual frame. Note that 16 mm film grain is typically fairly large, but with stacking and post-processing, smaller details can be inferred.
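A toy illustration of why stacking can exceed single-frame resolution. This is idealised (pure decimation at known half-pixel offsets, no blur or registration error, unlike real film scans), but it shows the core shift-and-add idea: frames sampled at different sub-pixel offsets interleave onto a finer grid:

```python
import numpy as np

rng = np.random.default_rng(2)
fine = rng.uniform(0, 1, (64, 64))            # "true" fine-detail scene

# Four low-res frames, each decimated 2x from a different half-pixel offset.
frames = {(dy, dx): fine[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)}

# Place each frame's samples back at its offset on the fine grid.
recon = np.zeros_like(fine)
for (dy, dx), f in frames.items():
    recon[dy::2, dx::2] = f

print(np.allclose(recon, fine))               # → True in this idealised case
```

With real footage the offsets come from camera jitter and must be estimated, and optical/grain blur means the result needs deconvolution rather than falling out exactly, but that sub-pixel sampling diversity is what the stacking exploits.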