PgLargeObjects (PgLargeObjects v0.2.1)
High-level API for managing large objects.
This exposes commonly-used functionality for streaming data into or out of
the database using the functions import/3 and export/3.
See PgLargeObjects.LargeObject for a lower-level API which exposes more
functionality for individual large objects.
Summary
Functions
@spec export(Ecto.Repo.t(), pos_integer(), keyword()) :: :ok | {:ok, binary()} | {:error, :not_found}
Export data out of a large object.
This exports the data in the large object referenced by the object ID oid.
Depending on the :into option, the data is either returned as a single binary or fed
into a given Collectable.
This function needs to be executed as part of a transaction.
Options
* :bufsize - number of bytes to transfer per chunk. Defaults to 65536 bytes.
* :into - can be nil to download all data into a single binary, or any Collectable. Defaults to nil.
Return value
* :ok in case the :into option references a Collectable.
* {:ok, data} in case the :into option is nil.
* {:error, :not_found} in case there is no large object with the given oid.
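As a minimal usage sketch of both return shapes: MyApp.Repo, the oid variable, and the file paths are placeholders, not part of the library. Note that Ecto's Repo.transaction/1 wraps the callback's result in its own {:ok, result} tuple.

    # Download the whole object into a single binary (:into defaults to nil).
    {:ok, result} =
      MyApp.Repo.transaction(fn ->
        PgLargeObjects.export(MyApp.Repo, oid)
      end)

    case result do
      {:ok, data} -> File.write!("object.bin", data)
      {:error, :not_found} -> :no_such_object
    end

    # Alternatively, stream the object into a Collectable in 64 KiB chunks.
    MyApp.Repo.transaction(fn ->
      PgLargeObjects.export(MyApp.Repo, oid,
        into: File.stream!("object.bin"),
        bufsize: 65_536
      )
    end)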
@spec import(Ecto.Repo.t(), binary() | Enumerable.t(), keyword()) :: {:ok, pos_integer()}
Import data into a large object.
This imports the data in data into a new large object in the database
referenced by repo.
data can either be a binary, which will be uploaded in multiple chunks, or
an arbitrary Enumerable.
This function needs to be executed as part of a transaction.
Options
* :bufsize - number of bytes to transfer per chunk. Defaults to 65536 bytes.
Return value
* {:ok, object_id} in case of success.
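A minimal sketch of importing from a binary and from an Enumerable; MyApp.Repo and the file path are placeholders. As with export/3, the call runs inside a transaction, so Ecto wraps the result in an additional {:ok, ...} tuple.

    # Upload a binary; it is written to the new object in 64 KiB chunks.
    {:ok, {:ok, oid}} =
      MyApp.Repo.transaction(fn ->
        PgLargeObjects.import(MyApp.Repo, File.read!("upload.bin"), bufsize: 65_536)
      end)

    # An Enumerable works as well, e.g. a file stream read in chunks.
    {:ok, {:ok, oid}} =
      MyApp.Repo.transaction(fn ->
        PgLargeObjects.import(MyApp.Repo, File.stream!("upload.bin", [], 65_536))
      end)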