Subject: Add an option to parallelize the fetching of content
Modern CPUs have 4, 8, 16 or 32 cores. If wallflower were able to process its queue of links through several worker subprocesses (with the main process keeping the list of URLs to fetch), the total runtime of generating a site could be greatly reduced.
A simple solution like `xargs -P` or `parallel` isn't enough, because wallflower maintains a list of already-visited links, and that state can't easily be shared between independent processes. Whatever does the parallel fetching still needs a single coordinator that owns the queue and the visited set; a sketch of that split follows.
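Below is a minimal sketch of the idea, written in Python purely for illustration (it is not wallflower code, and `fetch_and_extract`, the batch loop and the seed URL are all assumptions): the main process is the only owner of the queue and the `seen` set, and only the per-URL fetching is farmed out to a pool of workers, which report back the links they found.

```python
# Illustrative sketch only: main process owns the queue and the visited set,
# worker processes only fetch pages and return the links they contain.
import re
import urllib.request
from multiprocessing import Pool
from urllib.parse import urljoin

HREF_RE = re.compile(rb'href="([^"]+)"')

def fetch_and_extract(url):
    """Worker: fetch one URL, return (url, links found on that page)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
    except OSError:
        return url, []
    links = [urljoin(url, m.group(1).decode("utf-8", "replace"))
             for m in HREF_RE.finditer(body)]
    return url, links

def crawl(seed, workers=8):
    seen = {seed}        # shared state lives only in the main process
    queue = [seed]
    with Pool(workers) as pool:
        while queue:
            batch, queue = queue, []
            # fan a batch of URLs out to the workers, then fold the
            # discovered links back into the single queue
            for url, links in pool.imap_unordered(fetch_and_extract, batch):
                for link in links:
                    # crude same-site filter, enough for the sketch
                    if link.startswith(seed) and link not in seen:
                        seen.add(link)
                        queue.append(link)
    return seen

if __name__ == "__main__":
    print(len(crawl("https://example.com/")))
```

The point of the sketch is the division of labour rather than the crawling details: because deduplication happens only in the main process, the workers never need to agree on what has already been visited, which is exactly what `xargs -P` or `parallel` can't provide.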