PgLargeObjects (PgLargeObjects v0.2.0)


High-level API for managing large objects.

This exposes commonly-used functionality for streaming data into or out of the database using the functions import/3 and export/3.

See PgLargeObjects.LargeObject for a lower-level API which exposes more functionality for individual large objects.

Summary

Functions

Export data out of a large object.

Import data into a large object.

Functions

export(repo, oid, opts \\ [])

@spec export(Ecto.Repo.t(), pos_integer(), keyword()) ::
  :ok | {:ok, binary()} | {:error, :not_found}

Export data out of a large object.

This exports the data in the large object referenced by the object ID oid. Depending on the :into option, the data is either returned as a single binary or fed into a given Collectable.

This function needs to be executed as part of a transaction.

Options

  • :bufsize - number of bytes to transfer per chunk. Defaults to 65536 bytes.
  • :into - can be nil to download all data into a single binary or any Collectable. Defaults to nil.

Return value

  • :ok in case the :into option references a Collectable.
  • {:ok, data} in case the :into option is nil.
  • {:error, :not_found} in case there is no large object with the given oid.
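A sketch of both return shapes, assuming a MyApp.Repo Ecto repo, a bound oid from an earlier import, and an output path of our choosing (all placeholders, not part of this library's API):

```elixir
MyApp.Repo.transaction(fn ->
  # With :into left at its default of nil, the whole object is
  # returned as a single binary.
  {:ok, data} = PgLargeObjects.export(MyApp.Repo, oid)

  # With :into set to a Collectable (here a File.Stream), the data is
  # written out chunk by chunk and only :ok is returned.
  :ok = PgLargeObjects.export(MyApp.Repo, oid, into: File.stream!("object.bin"))
end)
```

Note the wrapping Repo.transaction/1 call: as stated above, export/3 must run inside a transaction.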

import(repo, data, opts \\ [])

@spec import(Ecto.Repo.t(), binary() | Enumerable.t(), keyword()) ::
  {:ok, pos_integer()}

Import data into a large object.

This imports data into a new large object in the database referenced by repo.

data can either be a binary which will be uploaded in multiple chunks, or an arbitrary Enumerable.

This function needs to be executed as part of a transaction.

Options

  • :bufsize - number of bytes to transfer per chunk. Defaults to 65536 bytes.

Return value

  • {:ok, object_id} in case of success.
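A sketch of both accepted input shapes, assuming a MyApp.Repo Ecto repo and a local input file (both placeholders):

```elixir
{:ok, object_id} =
  MyApp.Repo.transaction(fn ->
    # A binary is uploaded in chunks of :bufsize bytes (default 65536).
    {:ok, oid} = PgLargeObjects.import(MyApp.Repo, File.read!("photo.jpg"))

    # Alternatively, any Enumerable works, which avoids loading the
    # whole payload into memory at once.
    {:ok, _oid} =
      PgLargeObjects.import(MyApp.Repo, File.stream!("photo.jpg", [], 65_536))

    oid
  end)
```

Because Kernel.import/2 is an Elixir special form, the function is called fully qualified as PgLargeObjects.import/3, as shown. Like export/3, it must run inside a transaction.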