In this section, we will introduce scrapyd. Scrapyd is an application that lets us deploy spiders to a server and schedule crawling jobs with them. Let's get a feel for how easy it is to use. It comes preinstalled on our dev machine, so we can try it right away by going back to the code from Chapter 3, Basic Crawling. The exact same process we used back then works here with just a single change.
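That single change typically amounts to pointing the project's scrapy.cfg at the scrapyd server. The following is only a sketch: the project name properties and the localhost target are assumptions for illustration, so substitute your own values (scrapyd listens on port 6800 by default):

```ini
# scrapy.cfg -- sketch only; project name and URL are placeholders
[settings]
default = properties.settings

[deploy]
url = http://localhost:6800/
project = properties
```

With this in place, running scrapyd-deploy (from the scrapyd-client package) packages the project as an egg and uploads it to the server, after which its spiders can be scheduled there.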
Let's first have a look at scrapyd's web interface, which we can find at http://localhost:6800/ (scrapyd's default port).
You can see that it has different sections for Jobs, Items, Logs and Documentation. It also gives us some instructions ...
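Those instructions boil down to scheduling crawls through scrapyd's JSON API. A sketch, reusing the assumed project name from the config above and a hypothetical spider name basic (replace both with your own):

```sh
# Ask scrapyd to run a spider; it replies with JSON containing a job id
$ curl http://localhost:6800/schedule.json -d project=properties -d spider=basic

# Check the state of pending, running, and finished jobs for the project
$ curl "http://localhost:6800/listjobs.json?project=properties"
```

The jobs these calls create are what appear under the Jobs section of the web interface, with their scraped items and logs linked from the Items and Logs sections.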