# Cheatsheet

## Transactions required
References to large objects are only valid for the duration of a
transaction. In practice, all operations on large objects need to run inside
a Repo.transaction/1 or Repo.transact/1 call.
Any large object value is closed automatically at the end of the transaction.
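Because of this, a typical call site wraps the operation in a transaction, for example (a minimal sketch; `MyApp.Repo` and the file path are placeholders for your own repo module and data):

```elixir
# All large-object calls must happen inside a transaction;
# the object reference is invalid once the transaction ends.
{:ok, object_id} =
  MyApp.Repo.transaction(fn ->
    stream = File.stream!("/tmp/bigfile.dat")
    {:ok, object_id} = MyApp.Repo.import_large_object(stream)
    object_id
  end)
```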
## Operating on objects as a whole

### Streaming API

#### Writing a local file to an object
```elixir
stream = File.stream!("/tmp/bigfile.dat")

{:ok, object_id} = Repo.import_large_object(stream)
```

#### Reading an object to a local file
```elixir
stream = File.stream!("/tmp/bigfile.dat")

:ok = Repo.export_large_object(object_id, into: stream)
```

### Buffered API
#### Writing data to an object
```elixir
large_binary = "This is a large binary."

{:ok, object_id} = Repo.import_large_object(large_binary)
```

#### Reading data from an object
```elixir
{:ok, data} = Repo.export_large_object(object_id)
```

## Granular access to objects

### Create a new object
```elixir
{:ok, object_id} = Repo.create_large_object(mode: :write)
```

### Open an existing object
```elixir
# For reading
{:ok, object} = Repo.open_large_object(object_id)

# For writing
{:ok, object} = Repo.open_large_object(object_id, mode: :write)
```

### Read from an object
```elixir
# Read 1024 bytes
{:ok, data} = PgLargeObjects.LargeObject.read(object, 1024)
```

### Write to an object
```elixir
binary = "Some data to store."

:ok = PgLargeObjects.LargeObject.write(object, binary)
```

### Get object size
```elixir
{:ok, size} = PgLargeObjects.LargeObject.size(object)
```