Glific.BigQuery.BigQueryWorker (Glific v5.1.6)

Processes the message table for each organization. Messages are chunked into groups of 128, and a BigQuery worker job is created to deliver each chunk to the BigQuery servers.

We centralize both the cron job and the worker job in this one module.
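As a rough illustration of that flow, the sketch below (not Glific's actual code) chunks pending rows into groups of 128 and inserts one Oban job per chunk; load_unsynced_rows/1 and Sketch.BigQueryWorker are hypothetical stand-ins.

    defmodule Sketch.BigQuerySweep do
      @chunk_size 128

      # Chunk pending rows and enqueue one Oban job per chunk of up to 128 rows.
      def sweep(organization_id) do
        organization_id
        |> load_unsynced_rows()
        |> Enum.chunk_every(@chunk_size)
        |> Enum.each(fn chunk ->
          %{organization_id: organization_id, rows: chunk}
          |> Oban.Job.new(queue: :bigquery, worker: Sketch.BigQueryWorker)
          |> Oban.insert()
        end)

        :ok
      end

      # Hypothetical helper standing in for the Ecto query against the messages table.
      defp load_unsynced_rows(_organization_id), do: []
    end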

Summary

Functions

Standard perform callback for the Oban worker

Called from the cron job on a regular schedule. We sweep the messages table and queue the rows for delivery to BigQuery

Called from the cron job on a regular schedule. We update existing tables in BigQuery

This logic lives in its own function so that it can also be reused by the GCS worker

Functions

perform(job)

@spec perform(Oban.Job.t()) :: :ok | {:error, :string}

Standard perform callback for the Oban worker
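For orientation, a minimal hypothetical worker showing the shape of that callback is sketched below; Glific's real perform/1 dispatches on the args the periodic sweep enqueued.

    defmodule Sketch.BigQueryWorker do
      use Oban.Worker, queue: :bigquery

      @impl Oban.Worker
      def perform(%Oban.Job{args: %{"organization_id" => org_id} = args}) do
        # Job args round-trip through JSON, so keys arrive as strings.
        deliver_to_bigquery(org_id, Map.get(args, "rows", []))
      end

      # Hypothetical stand-in for the call that streams rows to the BigQuery API;
      # it should return :ok, or {:error, reason} to let Oban retry the job.
      defp deliver_to_bigquery(_org_id, _rows), do: :ok
    end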

perform_periodic(org_id)
@spec perform_periodic(non_neg_integer()) :: :ok

Called from the cron job on a regular schedule. We sweep the messages table and queue the rows for delivery to BigQuery
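One common way to trigger such a sweep on a schedule is Oban's cron plugin. The configuration below is an illustrative assumption, not Glific's actual scheduling setup: MyApp.Repo, Sketch.CronWorker, and the five-minute interval are placeholders, and a cron-triggered worker could then call perform_periodic/1 for each active organization.

    import Config

    config :my_app, Oban,
      repo: MyApp.Repo,
      plugins: [
        {Oban.Plugins.Cron,
         crontab: [
           # Hypothetical: sweep pending rows for BigQuery every five minutes.
           {"*/5 * * * *", Sketch.CronWorker, args: %{"action" => "bigquery_sweep"}}
         ]}
      ]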

periodic_updates(organization_id)
@spec periodic_updates(non_neg_integer()) :: :ok

Called from the cron job on a regular schedule. We update existing tables in BigQuery
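As a hedged illustration of what updating existing tables implies, the query below selects rows touched since the last successful sync; the table, columns, and MyApp.Repo are assumptions rather than Glific's actual schema.

    defmodule Sketch.PeriodicUpdates do
      import Ecto.Query

      # Rows changed since the last sync are the candidates for re-sync to BigQuery.
      def rows_to_update(organization_id, last_synced_at) do
        from(m in "messages",
          where: m.organization_id == ^organization_id and m.updated_at > ^last_synced_at,
          select: %{id: m.id, body: m.body, updated_at: m.updated_at}
        )
        |> MyApp.Repo.all()
      end
    end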

queue_message_media_data(media_list, organization_id, attrs)
@spec queue_message_media_data(list(), non_neg_integer(), map()) :: :ok

This logic lives in its own function so that it can also be reused by the GCS worker.
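Because the helper is a plain public function, another worker can call it directly. The call site below is hypothetical: only the arity and argument types come from the spec above, and the media list and attrs keys are placeholders.

    organization_id = 1
    media_list = [%{id: 1, url: "https://example.com/media/a.png"}]

    Glific.BigQuery.BigQueryWorker.queue_message_media_data(media_list, organization_id, %{
      # Placeholder attrs; the real keys are whatever the calling worker needs.
      "action" => "insert"
    })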