Crawly.Spider behaviour (Crawly v0.17.0)

A behaviour module for implementing a Crawly Spider.

A Spider is a module which is responsible for defining:

  1. init/0 function, which must return a keyword list containing a start_urls or start_requests list
  2. init/1, the same as init/0, but also takes a list of options sent from the Engine
  3. base_url/0 function responsible for filtering out requests not related to the given website
  4. override_settings/0, an optional callback that is consulted each time a setting is referenced internally. It should return a list of custom, spider-specific settings and their values; these values take precedence over the global settings defined in the config
  5. parse_item/1 function which is responsible for parsing the downloaded response and converting it into items which can be stored and new requests which can be scheduled (a minimal example spider is sketched after this list)
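
For illustration, here is a minimal sketch of a spider implementing these callbacks. The module name, domain and start URL are placeholders, and parse_item/1 is deliberately left empty here; fuller per-callback sketches follow in the Callbacks section below.

    defmodule ExampleSpider do
      use Crawly.Spider

      @impl Crawly.Spider
      def base_url(), do: "https://books.example.com"

      @impl Crawly.Spider
      def init() do
        [start_urls: ["https://books.example.com/catalogue/page-1.html"]]
      end

      @impl Crawly.Spider
      def parse_item(_response) do
        # A real spider would extract items and follow-up requests from
        # the response body here (see parse_item/1 below).
        %{items: [], requests: []}
      end
    end

Once such a module exists, the spider can be started with Crawly.Engine.start_spider(ExampleSpider).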

Callbacks

base_url/0

base_url() :: binary()
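
Used by Crawly to filter out requests leading outside the crawled website. A one-line sketch with a placeholder domain:

    @impl Crawly.Spider
    def base_url(), do: "https://books.example.com"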

init/0

init() :: [start_urls: list(), start_requests: list()]
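
Returns the initial URLs (or prebuilt requests) the crawl starts from. A sketch with placeholder URLs; the commented alternative shows the start_requests form:

    @impl Crawly.Spider
    def init() do
      [start_urls: ["https://books.example.com/catalogue/page-1.html"]]
      # Alternatively, prebuilt requests can be returned:
      # [start_requests: [Crawly.Utils.request_from_url("https://books.example.com/")]]
    end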

init/1

init([{:options, keyword()}]) :: [start_urls: list(), start_requests: list()]
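
The argument is a list with an :options key holding the options the Engine was given when the spider was started. The :start_page option below is hypothetical, purely to show how such options might be used:

    @impl Crawly.Spider
    def init(args) do
      opts = Keyword.get(args, :options, [])
      page = Keyword.get(opts, :start_page, 1)
      [start_urls: ["https://books.example.com/catalogue/page-#{page}.html"]]
    end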

override_settings/0

override_settings() :: Crawly.Settings.t()
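
Only the settings being overridden need to be returned; everything omitted falls back to the global configuration. A sketch with illustrative values, lowering the per-domain concurrency and the close-spider timeout for this spider only:

    @impl Crawly.Spider
    def override_settings() do
      [
        concurrent_requests_per_domain: 2,
        closespider_timeout: 10
      ]
    end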

parse_item/1

parse_item(response :: HTTPoison.Response.t()) :: Crawly.ParsedItem.t()
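
The callback receives the HTTPoison response and must return a map with :items and :requests keys (a Crawly.ParsedItem). A sketch assuming Floki is available for HTML parsing; the CSS selectors and item fields are placeholders:

    @impl Crawly.Spider
    def parse_item(response) do
      {:ok, document} = Floki.parse_document(response.body)

      # Items extracted from the page (stored via the item pipelines).
      items =
        document
        |> Floki.find("article.product h3 a")
        |> Enum.map(fn link ->
          %{title: Floki.text(link), url: response.request_url}
        end)

      # Follow-up requests (scheduled for further crawling).
      requests =
        document
        |> Floki.find("li.next a")
        |> Floki.attribute("href")
        |> Enum.map(fn href ->
          response.request_url
          |> URI.merge(href)
          |> to_string()
          |> Crawly.Utils.request_from_url()
        end)

      %{items: items, requests: requests}
    end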