Crawly v0.9.0 Crawly.Spider behaviour
A behaviour module for implementing a Crawly spider.

A spider is a module responsible for defining:
- init/0 - a function which must return a keyword list containing a start_urls list.
- base_url/0 - a function responsible for filtering out requests not related to the given website.
- parse_item/1 - a function responsible for parsing a downloaded response and converting it into items which can be stored and new requests which can be scheduled.
- override_settings/0 - an optional callback which can be used to provide custom spider-specific settings. It should define a list of custom settings and their values; these values take precedence over the global settings defined in the config.
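Taken together, a minimal spider implementing these callbacks might look like the sketch below. The module name, site URL, and CSS selectors are illustrative assumptions, and Floki is assumed as the HTML parsing dependency (it is not bundled with Crawly):

```elixir
defmodule MySpider do
  @behaviour Crawly.Spider

  @impl Crawly.Spider
  def base_url(), do: "https://example.com"

  @impl Crawly.Spider
  def init() do
    # Hypothetical starting point for the crawl
    [start_urls: ["https://example.com/blog"]]
  end

  @impl Crawly.Spider
  def parse_item(response) do
    # Extract items from the downloaded page body.
    # The ".article", "h2", and "a" selectors are illustrative.
    items =
      response.body
      |> Floki.find(".article")
      |> Enum.map(fn article ->
        %{title: article |> Floki.find("h2") |> Floki.text()}
      end)

    # Schedule follow-up requests discovered on the page
    requests =
      response.body
      |> Floki.find("a.next-page")
      |> Floki.attribute("href")
      |> Enum.map(&Crawly.Utils.request_from_url/1)

    %{items: items, requests: requests}
  end
end
```

The map returned by parse_item/1 matches the Crawly.ParsedItem.t() shape: items to be sent through the pipelines and requests to be scheduled.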
Callbacks
base_url()

base_url() :: binary()
init()

init() :: [{:start_urls, list()}]
override_settings()

override_settings() :: Crawly.Settings.t()
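A hedged sketch of an override_settings/0 implementation inside a spider module; the setting values are illustrative, and concurrent_requests_per_domain and closespider_timeout are examples of global Crawly settings that a spider can override:

```elixir
# Fragment of a spider module implementing Crawly.Spider.
# Keys returned here take precedence over the global config.
@impl Crawly.Spider
def override_settings() do
  [concurrent_requests_per_domain: 5, closespider_timeout: 60]
end
```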
parse_item(response)

parse_item(response :: HTTPoison.Response.t()) :: Crawly.ParsedItem.t()