On a fun #DestinationLinux #podcast 241! 😂🐧🐧

We discuss the best privacy-focused search engine.
Who reigns supreme?

And we head to Jill’s Museum to see what treasure of hardware history Jill has to show us! 💻🐧❤

+our #Linux tips, tricks & picks!

destinationlinux.org/episode-2…

in reply to Destination Linux (Podcast)

I had a machine with 128GB, not MB, and it didn't run for shit with the default settings. It needed considerably more memory than that to allow enough threads to decently utilize the CPU and crawl websites as fast as possible: closer to 64GB, half the machine's memory. Otherwise it took weeks to crawl a sizeable site. Look at that chart you have: every time it drops from high to low, it's going through a garbage collection cycle, which means you're spending more CPU on garbage collection than on web crawling. That's no good; it's so inefficient that you couldn't crawl even a tiny percentage of the Internet in a lifetime at that rate.
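The "half the machine's memory" rule of thumb above can be sketched as a couple of lines of shell. This is a hedged illustration, not the crawler's actual startup script: it assumes the crawler runs on the JVM (where `-Xms`/`-Xmx` are the standard heap flags and the GC sawtooth in the chart comes from heap pressure), and the 128 figure is just the machine size from the post.

```shell
# Sketch: size the JVM heap to half of total RAM so the crawler's
# threads have headroom, instead of thrashing in GC on a tiny default heap.
# Assumption: 128 GB machine, as in the post; adjust for your hardware.
total_gb=128
heap_gb=$((total_gb / 2))

# -Xms and -Xmx are standard JVM flags: initial and maximum heap size.
# Setting them equal avoids heap-resize pauses during a long crawl.
echo "-Xms${heap_gb}g -Xmx${heap_gb}g"
```

The printed flags would then be passed to the crawler's JVM at startup (how exactly depends on the crawler's launch script or config).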