# Bulk Actions
Bulk actions are ways to create, update or destroy many records at once, backed by scalable patterns.
Currently, only bulk creates are implemented. Bulk updates and bulk destroys will come next.
## Bulk Creates
Bulk creates take a list or stream of inputs for a given action and batch calls to the underlying data layer. For example, given an action like this:
```elixir
create :create do
  accept [:title, :subtitle]
end
```
You could then call `YourApi.bulk_create` like so:
```elixir
YourApi.bulk_create([%{title: "foo", subtitle: "bar"}, %{title: "baz", subtitle: "buz"}], Resource, :action)
```
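By default, a call like this returns a summary of what happened rather than the inserted records themselves. As a rough sketch of inspecting that summary (assuming the `%Ash.BulkResult{}` struct with `status`, `records`, and `errors` fields found in recent Ash versions):

```elixir
# A sketch, not a definitive API reference: unless return_stream?: true is
# passed, bulk_create returns an %Ash.BulkResult{} summarizing the operation.
case YourApi.bulk_create(inputs, Resource, :action, return_records?: true) do
  %Ash.BulkResult{status: :success, records: records} ->
    # every batch was inserted; records are present because of return_records?: true
    {:ok, records}

  %Ash.BulkResult{errors: errors} ->
    # at least one batch failed
    {:error, errors}
end
```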
## Considerations
Generally speaking, all regular Ash create actions are compatible (or can be made compatible) with bulk create actions. However, there are some important considerations:

- `Ash.Resource.Change` modules can be optimized for bulk actions by implementing `batch_change/3`, `before_batch/3` and `after_batch/3`. If you implement `batch_change/3`, the `change` function will no longer be called, and you should swap any behavior implemented with `before_action` and `after_action` hooks to logic in the `before_batch` and `after_batch` callbacks. See the sketch after this list for what that can look like.
- Actions that reference arguments in changes, e.g. `change set_attribute(:attr, ^arg(:arg))`, will prevent us from using the `batch_change/3` behavior. This is usually not a problem; a change like that is lightweight and would not benefit from being optimized with `batch_change/3`.
- If your action uses `after_action` hooks, or has `after_batch/3` logic defined for any of its changes, then we must ask the data layer to return the records it inserted. Since we throw away the results of each batch by default, this is extra work, but it is not generally a problem; if you are using `return_records?: true` then you are already requesting all of the results anyway.
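For illustration, here is a minimal sketch of a change module implementing the batch callbacks. The module name and the `:title`/`:slug` attributes are hypothetical; only the callback names come from `Ash.Resource.Change`:

```elixir
defmodule MyApp.Changes.SlugifyTitle do
  # A sketch under the assumptions above, not a definitive implementation.
  use Ash.Resource.Change

  # Fallback for single creates; bulk creates call batch_change/3 instead.
  @impl true
  def change(changeset, _opts, _context) do
    slugify(changeset)
  end

  # Called once per batch of changesets in a bulk create, replacing change/3.
  @impl true
  def batch_change(changesets, _opts, _context) do
    Enum.map(changesets, &slugify/1)
  end

  # Runs once before each batch reaches the data layer,
  # replacing per-record before_action hooks.
  @impl true
  def before_batch(changesets, _opts, _context) do
    changesets
  end

  # Runs once after each batch, with {changeset, result} pairs,
  # replacing per-record after_action hooks.
  @impl true
  def after_batch(changesets_and_results, _opts, _context) do
    Enum.map(changesets_and_results, fn {_changeset, record} -> {:ok, record} end)
  end

  defp slugify(changeset) do
    title = Ash.Changeset.get_attribute(changeset, :title)
    Ash.Changeset.change_attribute(changeset, :slug, slugify_string(title))
  end

  defp slugify_string(nil), do: nil
  defp slugify_string(title), do: title |> String.downcase() |> String.replace(~r/\s+/, "-")
end
```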
## Returning a Stream
Returning a stream allows you to work with a bulk action as an Elixir Stream. For example:
```elixir
input_stream()
|> YourApi.bulk_create(Resource, :action, return_stream?: true, return_records?: true)
|> Stream.map(fn
  {:ok, result} ->
    # process results
    {:ok, result}

  {:error, error} ->
    # process errors
    {:error, error}
end)
|> Enum.reduce(%{results: [], errors: []}, fn
  {:ok, result}, acc ->
    # accumulate results
    Map.update!(acc, :results, &[result | &1])

  {:error, error}, acc ->
    # accumulate errors
    Map.update!(acc, :errors, &[error | &1])
end)
```
## Considerations
Because streams are lazily evaluated, if you were to do something like this:
```elixir
[input1, input2, ...] # has 300 things in it
|> YourApi.bulk_create(Resource, :action, return_stream?: true, return_records?: true, batch_size: 100) # the default is 100
|> Enum.take(150)
```
What would happen is that we would insert 200 records (assuming no errors were emitted), because the stream ends after we process the first two batches: taking 150 elements forces the first batch of 100 and then a second batch of 100, and only then is the stream halted. If you want to make sure that everything happens, be sure you aren't using functions like `Stream.take/2` or `Enum.take/2` to limit the number of elements pulled from the stream.
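If you don't need the results at all, one way to guarantee that every batch is processed is to force the whole stream eagerly, for example:

```elixir
# Consume the entire stream for its side effects so every batch is inserted.
input_stream()
|> YourApi.bulk_create(Resource, :action, return_stream?: true)
|> Stream.run()
```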