YouTube user daveachuk took over 400,000 high-resolution images captured by the Spitzer Space Telescope and combined them into an impressive simulation of the Milky Way Galaxy from the telescope's point of view. Over on reddit, they explained in detail how the simulation, which took five months to create, came together.
There were two techniques. The first used an astrophotography program called PixInsight, which is entirely scriptable and automatable (handy when dealing with many gigapixels of data!). It has functions that work by converting an image into wavelets (equations instead of pixels). You select features by size (e.g. objects 50 pixels or more across, with edge detection to find their full extent), and then, by manipulating the wavelet equations, the program stretches the surrounding area to fill the gap left by removing those features (stars, etc.). There are about 5-10 parameters for each operation, and it took many days of experimenting to get decent results. The whole process was then repeated for objects of different sizes and types, and scripted to run through the entire dataset, which itself took several days of processing time.
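PixInsight's exact wavelet algorithms aren't spelled out in the post, but the general idea can be sketched in Python with NumPy/SciPy: decompose the image into wavelet layers by scale (a starlet-style "à trous" decomposition via successive Gaussian blurs), mask the significant coefficients at the small scales where stars live, and resynthesize the image without them, so the surrounding large-scale structure fills the gap. All function names, scales, and thresholds below are illustrative assumptions, not PixInsight's actual parameters.

```python
import numpy as np
from scipy import ndimage

def starlet_layers(img, n_scales=4):
    """Split an image into detail layers of increasing feature size
    (starlet-style decomposition via successive Gaussian blurs).
    Summing all layers reconstructs the original image exactly."""
    layers = []
    current = img.astype(float)
    for i in range(n_scales):
        smoothed = ndimage.gaussian_filter(current, sigma=2 ** i)
        layers.append(current - smoothed)  # detail at this scale
        current = smoothed
    layers.append(current)  # residual: large-scale structure only
    return layers

def remove_small_features(img, n_scales=4, small_scales=2, thresh=3.0):
    """Suppress small features (e.g. stars) by zeroing significant wavelet
    coefficients at the finest scales, then resynthesizing. The coarse
    layers naturally fill in the area the removed features occupied."""
    layers = starlet_layers(img, n_scales)
    for i in range(small_scales):
        layer = layers[i]
        sigma = np.std(layer)
        # mask star-like coefficients at this scale...
        mask = np.abs(layer) > thresh * sigma
        # ...and grow the mask a little to cover each feature's extent
        mask = ndimage.binary_dilation(mask, iterations=2)
        layer[mask] = 0.0
    return sum(layers)  # resynthesize without the small features
```

This is only a toy version of the approach: a production pipeline would tune the thresholds per scale and per object type, which is presumably where those many days of parameter experimentation went.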
The other technique, which worked well for objects that only appeared in one wavelength (e.g. the distant background stars visible only in the blue channel), was to apply a noise reduction filter to the blue part of the image and subtract the result from the original. What remains is exactly the detail the noise filter deleted: your distant stars.
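That subtraction trick is simple enough to show directly. The post doesn't say which noise reduction filter was used; assuming a median filter as a stand-in, a minimal NumPy/SciPy sketch looks like this:

```python
import numpy as np
from scipy import ndimage

def faint_point_sources(channel, size=5):
    """Isolate small-scale detail (faint stars) in a single channel.
    A median filter wipes out point sources smaller than its window,
    so original - filtered leaves exactly those sources behind."""
    smooth = ndimage.median_filter(channel, size=size)
    return channel - smooth
```

The same residual image then serves as a star map for that wavelength: smooth background structure cancels out in the subtraction, while anything smaller than the filter window survives.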