Crawly.DataStorage.Worker (Crawly v0.16.0)
A worker process that stores items for an individual spider. All items are pre-processed by the configured item pipelines.
Pipelines use the state of this process for their internal needs (persistence).
The DataStorage.Worker does not write anything to the filesystem itself; instead, it expects the pipelines to do that work.
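To make the division of labour concrete, here is a sketch of a pipeline that does the writing on the worker's behalf. The module name, output path, and the use of Jason for encoding are illustrative assumptions; the `run/3` callback shape follows the `Crawly.Pipeline` behaviour, and the `state` argument is the DataStorage.Worker state that pipelines may use for persistence between calls.

```elixir
defmodule MyApp.Pipelines.WriteJsonLines do
  # Hypothetical pipeline: appends each item as a JSON line.
  # The file handle is cached in the worker's state, so it is
  # opened once and reused across items ("persistence" above).
  @behaviour Crawly.Pipeline

  @impl Crawly.Pipeline
  def run(item, state, _opts \\ []) do
    {fd, state} =
      case state do
        %{json_fd: fd} ->
          {fd, state}

        _ ->
          fd = File.open!("/tmp/items.jl", [:append, :utf8])
          {fd, Map.put(state, :json_fd, fd)}
      end

    IO.write(fd, [Jason.encode!(item), "\n"])

    # Return the (unchanged) item and the updated state; returning
    # {false, state} would drop the item instead.
    {item, state}
  end
end
```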
Summary
Functions
Returns a specification to start this module under a supervisor.
Callback implementation for
Inspect the inner state of the given data worker.
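A minimal usage sketch of inspecting a worker's state. The exact start options are assumptions for illustration; only the presence of a `stats/1`-style inspection call is implied by this page.

```elixir
# Start a data worker for a spider (option names here are hypothetical)
# and then ask it for its inner state.
{:ok, pid} =
  Crawly.DataStorage.Worker.start_link(spider_name: MySpider, crawl_id: "example")

Crawly.DataStorage.Worker.stats(pid)
```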