API Reference Glific v5.1.6

Modules

Glific keeps the contexts that define your domain and business logic.

This is a module to convert speech to text

The AccessControl context.

A pipe for managing the role flows

A pipe for managing the role groups

The minimal wrapper for the base Access Control Permission structure

The minimal wrapper for the base Access Control Role structure

A pipe for managing the role triggers

A pipe for managing the user roles

A simple interface that connects Oban job status to Appsignal

Glific BigQuery Dataset and table creation

Bookkeeping table to keep track of the last job that we processed from the messages belonging to the organization

Process the message table for each organization. Chunk the messages in groups of 128 and create a BigQuery worker job to deliver them to the BigQuery servers
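
The chunk-and-enqueue pattern described above can be sketched roughly as follows. This is a minimal illustration with hypothetical module and argument names, not Glific's actual worker; it only assumes Oban (which Glific uses) and standard Enum functions.

```elixir
# Minimal sketch of the chunk-and-enqueue pattern (hypothetical names).
defmodule Sketch.BigQueryWorker do
  use Oban.Worker, queue: :bigquery

  @chunk_size 128

  # Enqueue one Oban job per chunk of 128 message ids for an organization.
  def queue_messages(message_ids, organization_id) do
    message_ids
    |> Enum.chunk_every(@chunk_size)
    |> Enum.each(fn chunk ->
      %{organization_id: organization_id, message_ids: chunk}
      |> __MODULE__.new()
      |> Oban.insert()
    end)
  end

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"message_ids" => _ids, "organization_id" => _org_id}}) do
    # Deliver this chunk of messages to the BigQuery dataset here (omitted).
    :ok
  end
end
```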

Schema for tables to be created for a dataset

Represent a menu interpreted from the CSV. Each Menu item either sends a content message or sends a sub-menu. A menu is an array of menu items

First implementation to convert sheets to flows using a menu structure and UUID

Given a CSV model, and a tracking shortcode, generate the JSON flow for the CSV, incorporating the UUIDs used in previous conversions. Store the latest UUID mapping back in the database

Represent a menu interpreted from the CSV. Each Menu item either sends a content message or sends a sub-menu. A menu is an array of menu items

Wrapper to allow each organization to modify how the templates are assembled. We will store this in the DB and/or in the Flow CSV.

Glific Cache management

The cache API behaviour
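
As an illustration only, a cache behaviour of this kind can be expressed with Elixir callbacks; the callback names and signatures below are assumptions, not Glific's actual API.

```elixir
# Hypothetical callback shapes for an organization-scoped cache behaviour.
defmodule Sketch.CacheBehaviour do
  @callback get(organization_id :: non_neg_integer, key :: any) :: {:ok, any} | {:error, any}
  @callback set(organization_id :: non_neg_integer, keys :: any, value :: any) :: {:ok, any}
  @callback remove(organization_id :: non_neg_integer, keys :: list()) :: :ok
end
```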

Wrapper module that allows us to invoke organization specific callback functions to tweak the way the system handles things. This allows clients to override functionality in a manner similar to WordPress plugins.
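
The dispatch idea can be sketched like this, assuming a static map from organization id to client module; all names here are hypothetical.

```elixir
# Hypothetical per-organization callback dispatch.
defmodule Sketch.Clients do
  @client_modules %{1 => Sketch.Clients.ExampleClient}

  # Call `function` on the organization's client module when it exports it,
  # otherwise return the default value unchanged.
  def plugin(organization_id, function, args, default) do
    with module when not is_nil(module) <- Map.get(@client_modules, organization_id),
         true <- function_exported?(module, function, length(args)) do
      apply(module, function, args)
    else
      _ -> default
    end
  end
end
```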

Custom code extension for ArogyaWorld

Fetches data from the Plio BigQuery dataset to send reports to users

Custom webhook implementation specific to balajanaagraha use case

Tweak GCS Bucket name based on the group that the contact is in (if any)

Tweak GCS Bucket name based on the group that the contact is in (if any)

Custom webhook implementation specific to DigitalGreen Jharkhand use case

Tweak GCS Bucket name based on the group that the contact is in (if any)

Custom webhook implementation specific to Lahi use case

Custom webhook implementation specific to MukkaMaar use case

Custom webhook implementation specific to NayiDisha use case

Custom webhook implementation specific to NayiDisha use case

Custom webhook implementation specific to OBLF use case

Custom webhook implementation specific to PehlayAkshar use case

Custom webhook implementation specific to QuestAlliance

Tweak GCS Bucket name based on the group that the contact is in (if any)

Tweak GCS Bucket name based on the group that the contact is in (if any)

Example implementation of survey computation for STiR: Glific.Clients.Stir.compute_art_content(res)

This module will focus on the suno sunao use case

Tweak GCS Bucket name based on the group that the contact is in (if any)

Glific Cloak migration management when changing encryption keys

Glific interface for all provider communication

This module provides a simple interface for sending emails.

The Message Communication Context, which encapsulates and manages tags and the related join tables.

The Contacts context.

The minimal wrapper for the base Contact structure

The minimal wrapper for the base Contact structure

The minimal wrapper for the base Contact structure

The Contact Importer Module

Current location of a contact

The main Glific abstraction that exposes the data in a structured manner as a set of conversations. For now each contact is associated with one and only one conversation. We will keep the API simple for now, but it is likely to become more complex and will require a fair number of iterations to get right

The Glific Abstraction to represent the conversation with a user. This unifies a vast majority of the glific data types including: message, contact, and tag

The main Glific abstraction that exposes the group conversation data in a structured manner as a set of conversations. For now each group is associated with a set of outgoing messages. We will keep the API simple for now, but it is likely to become similar to the contact conversations API

Module to communicate with DialogFlow v2. This module was taken directly from: https://github.com/resuelve/flowex/

The intent object which has the intent information from any classifiers

A worker to handle send message processes

Helper to manage intents

Cloak for encrypting strings

Cloak for encrypting maps

The Enum provides a location for all enum related macros. All the constants that Ecto/Elixir use are exposed here as macros, so other files can invoke them as simple functions

The Enum constants are where all enum values across our entire application should be defined. This is the source of truth for all enums
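
For illustration, exposing enum values as macros can look like the sketch below; the module name and value lists are assumptions, not the authoritative enums.

```elixir
# Hypothetical constants exposed as macros so callers can use them in
# guards, pattern matches and Ecto queries at compile time.
defmodule Sketch.Enums do
  defmacro message_flow_const, do: [:inbound, :outbound]
  defmacro contact_status_const, do: [:blocked, :failed, :invalid, :processing, :valid]
end

# Usage: `import Sketch.Enums` and then `status in contact_status_const()`
```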

A simple module to periodically delete old data to clean up the DB

The table structure for all our extensions

Centralizing all the code we need to handle flags across Glific. For now, we'll also put operational code on flags here, as we figure out the right structure

The Flows context.

The Action object which encapsulates one action in a given node.

Start a flow to a group so we can blast it out as soon as possible and ensure we are under the rate limits.

A worker to handle send message processes

The Case object which encapsulates one case in a given node.

The Category object which encapsulates one category in a given node.

Since many of the functions also perform a few actions, like sending a message, we centralize that code here

Since many of the functions set/update fields in contact and related tables, let's centralize all the code here for now

Since many of the functions set/update fields in contact and related tables, let's centralize all the code here for now

The Exit object which encapsulates one exit in a given node.

The flow object which encapsulates the complete flow as emitted by https://github.com/nyaruka/floweditor

When we are running a flow, we are running it in the context of a contact and/or a conversation (or other Glific data types). Let's encapsulate this in a module and isolate the flow from the other aspects of Glific

The flow count object

The flow label object

Table which stores the flow results for each run of a contact through a flow

The flow revision object which encapsulates the complete flow as emitted by https://github.com/nyaruka/floweditor

The Localization object which stores all the localizations for all languages for a flow

When we are running a flow, we are running it in the context of a contact and/or a conversation (or other Glific data types). Let's encapsulate this in a module and isolate the flow from the other aspects of Glific

When we are running a flow, we are running it in the context of a contact and/or a conversation (or other Glific data types). Let's encapsulate this in a module and isolate the flow from the other aspects of Glific

Substitute the contact fields and result sets in the messages
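
A minimal sketch of the substitution idea, handling only "@contact.<field>" tokens; the module name and API are hypothetical, and the real parser covers many more variables and result sets.

```elixir
# Hypothetical variable substitution for message bodies.
defmodule Sketch.MessageVarParser do
  # Replace "@contact.<field>" tokens with values from the given contact map.
  def parse(body, %{"contact" => contact}) do
    Regex.replace(~r/@contact\.(\w+)/, body, fn whole, field ->
      Map.get(contact, field, whole)
    end)
  end
end

# Sketch.MessageVarParser.parse("Hello @contact.name", %{"contact" => %{"name" => "Asha"}})
# => "Hello Asha"
```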

The Node object which encapsulates one node in a given flow

A central place to define and execute all periodic flows. The current periodic flows in priority order are

The Router object which encapsulates the router in a given node.

The Case object which encapsulates one case in a given node.

The Wait object which encapsulates the wait for a router

Let's wrap all webhook functionality here as we try and get a better handle on the breadth and depth of webhooks.

The webhook log object

Glific GCS Manager

Bookkeeping table to keep track of the last job that we processed from the messages belonging to the organization

Process the media table for each organization. Chunk the message media in groups of 128 and create a GCS worker job to deliver the message media URLs to the GCS servers

The Groups context.

A pipe for managing the contact groups

Simple container to hold all the contact groups we associate with one contact

The minimal wrapper for the base Group structure

Simple container to hold all the group contacts we associate with one group

Simple container to hold all the group users we associate with one group

A pipe for managing the user groups

Simple container to hold all the user groups we associate with one user

The Jobs context.

Module for checking remaining balance

Processes the tasks that need to be handled on a minute schedule

This module is used to send an email to the user when their balance is low.

CriticalNotificationMail is a mail that is sent to the org admin when a critical error occurs.

The mail log object

NewPartnerOnboardedMail will have the content for formatting for the new partner onboarded email.

This is an auto-generated file from Waffle that is used to control storage behavior

The Messages Conversations context.

The Messages context.

Message conversations are mapped with a message

Message media are mapped with a message

Wrapper for various statistical tables which we can cache and write to in batch. For now, we are managing the flow_counts table

Simple worker which caches all the counts for a specific flow and writes them out in batches. This allows us to amortize multiple requests into one DB write.
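
One way to amortize the writes, sketched with an Agent-based accumulator; this is hypothetical and not Glific's actual implementation, which uses its own cache and worker modules.

```elixir
# Hypothetical batching of flow counts: increment in memory, flush in one pass.
defmodule Sketch.FlowCountBatcher do
  use Agent

  def start_link(_opts), do: Agent.start_link(fn -> %{} end, name: __MODULE__)

  # Bump an in-memory counter instead of writing to the DB on every message.
  def incr(flow_id) do
    Agent.update(__MODULE__, fn state ->
      Map.update(state, flow_id, 1, fn count -> count + 1 end)
    end)
  end

  # Called periodically (e.g. from a cron job) to drain all counts in one DB pass.
  def flush(write_fun) do
    counts = Agent.get_and_update(__MODULE__, fn state -> {state, %{}} end)
    Enum.each(counts, fn {flow_id, count} -> write_fun.(flow_id, count) end)
  end
end
```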

Glific Navanatech module for all API calls to Navanatech

The notifications manager and API to interface with the notification sub-system

The Partners context. This is the gateway for the application to access/update all the organization and Provider information.

We will use this as the main context interface for all billing subscriptions and the Stripe interface.

Organization's credentials

Invoice model wrapper

Organizations are groups of users who will access the system

The Glific abstraction to represent the organization's out-of-office settings

The Glific abstraction to represent the out-of-office enabled day schema

The Glific abstraction to represent the regular expression flow

Providers are the third-party Business Service Providers who will give access to the WhatsApp API

Saas is the DB table that holds the various parameters we need to run the service.

Given a message, run it through the flow engine. This is an auxiliary module to help consumer_worker, which is the main workhorse

Process all messages of type consumer and run them through the various in-built taggers.

Process all messages of type consumer and run them through the various in-built taggers. At a later stage, we will also do translation and dialogflow queries as an offshoot from this GenStage

A mock for the consumer worker for poolboy

Helper functions for all processing modules. Might promote this up at a later stage

The Profiles context.

The schema for profile

HTTPS API client to interact with Gupshup

Message API layer between application and Airtel

Module for handling responses from the provider end or handling responses for simulators

Module for handling template operations specific to Airtel

Module for checking Airtel remaining balance

A worker to handle send message processes

Contacts API layer between application and Airtel

The contact behaviour which all the providers need to implement for communication

HTTPS API client to interact with Gupshup

HTTP API client to interact with Gupshup Enterprise

Message API layer between application and Gupshup

Module for handling responses from the provider end or handling responses for simulators

A worker to handle send message processes.

Module for checking Gupshup remaining balance

Message API layer between application and Gupshup

A module to handle fetching tier-related information like quality rating and app rating using the Partner API

Module for handling responses from the provider end or handling responses for simulators

Module for handling template operations specific to Gupshup

A worker to handle send message processes

Contacts API layer between application and Gupshup

Module for handling template operations specific to Gupshup

Contacts API layer between application and Gupshup Enterprise

The message behaviour which all the providers need to implement for communication

The message behaviour which all the providers need to implement for communication

A common worker to handle send message processes irrespective of BSP

This file will be handling production database migrations. This is a standard Elixir/Ecto release file. Copied from: https://hexdocs.pm/phoenix/releases.html

A repository that maps to an underlying data store, controlled by the Postgres adapter.

The table structure to record consulting hours

For now, we will build this on top of the organization table, and have a group of helper functions here to manage global operations across all organizations. At some later point, we might decide to have a separate onboarding table and management structure

Let's keep all the onboarding queries and validation here

Glific interface to Postgres's full text search

The Searches context.

Module for checking collection count

The minimal wrapper for the base Saved Search structure

The Glific Abstraction to represent the conversation with a user. This unifies a vast majority of the glific data types including: message, contact, and tag

First experiments with PhilColumns. Hopefully it will work

Our first attempt at a deployment seeder script. Wish us luck

Script for populating the database. We can call this from tests and/or /priv/repo

One shot migration of data to add simulators and saas admin. We use the functions in this file to add simulators for new organizations as they are created

Script for importing optin contacts for an organization

Script for populating the database at scale

One shot migration of data to seed the stats table

The Settings context. This includes language for now.

Ecto schema and minimal interface for the languages table

The Sheets context

HTTP API client to interact with Gupshup

The minimal wrapper for the base Sheet structure

The minimal wrapper for the base Sheet structure

Manage simulators and flows, managing state and allocation to ensure we can have multiple simulators and flows running at the same time

Manage flow state and allocation to ensure we only have one user modify a flow at a time

Manage simulator state and allocation to ensure we can have multiple simulators run at the same time

The stats manager and API to interface with the stat sub-system

The API for a generic tagging system on messages that coordinates with different types of taggers. The proposed taggers are: Numeric, Keyword, Emojis

This module is user driven via keywords associated with tags. It reads in all the keywords associated with each tag in the DB and matches them to the input text.

The numeric tagger which takes the message body and checks if the body is mainly a number in different ways, including: Ordinal Numbers (0..19), Cardinal Numbers (Zero - Ten), Emojis (0..9), Ordinal Hindi Numbers, Cardinal Hindi Numbers
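
A minimal sketch of the normalization step, handling only digit strings and English cardinal words; the real tagger also covers emojis and Hindi ordinals/cardinals, and the names here are hypothetical.

```elixir
# Hypothetical numeric normalization of a message body.
defmodule Sketch.NumericTagger do
  @cardinals %{
    "zero" => 0, "one" => 1, "two" => 2, "three" => 3, "four" => 4,
    "five" => 5, "six" => 6, "seven" => 7, "eight" => 8, "nine" => 9, "ten" => 10
  }

  # Return {:ok, number} if the body is a digit string or a cardinal word.
  def tag_body(body) do
    cleaned = body |> String.trim() |> String.downcase()

    cond do
      Regex.match?(~r/^\d+$/, cleaned) -> {:ok, String.to_integer(cleaned)}
      Map.has_key?(@cardinals, cleaned) -> {:ok, Map.fetch!(@cardinals, cleaned)}
      true -> :error
    end
  end
end
```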

This module will be responsible for all the contact and message status tagging, like the new contact tag and unread

The Tags Context, which encapsulates and manages tags and the related join tables.

A pipe for managing the contact tags

Simple container to hold all the contact tags we associate with one contact

A file for managing the join table message tags

Simple container to hold all the message tags we associate with one message

The minimal wrapper for the base Tag structure

A pipe for managing the template tags

Simple container to hold all the template tags we associate with one template

The Templates context.

The InteractiveTemplate Context, which encapsulates and manages interactive templates

Using this module to bulk apply templates to Gupshup

The trigger manager for all the trigger system that starts flows within Glific

The trigger helper for the trigger system that deals with the complexity of time queries

This file has been copied (and modified a wee bit) from https://github.com/jerel/ecto_fields/blob/master/lib/fields/url.ex

The Users context.

Cloak Vault

The entrypoint for defining your web interface, such as controllers, views, channels and so on.

The Glific Onboarding Controller

The Pow User Registration Controller

The Pow User Session Controller

Pow error handler for API Authentication

Setting the absinthe context, so we can store the current user there

This is a basic plug that ensures the organization is loaded.

This is a struct that holds the configuration for GlificWeb.EnsurePlug.

Conveniences for translating and building error messages.

The controller to process events received from exotel

The Flow Editor Controller

A module providing Internationalization with a gettext-based API.

Simple macro to conditionally load Oban.Web only if already loaded. This allows us to include it only in the production release and hence make it a lot easier on potential open source contributors. We thus avoid the problem of sharing the Oban key and/or having them hack the code to get it working
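
The conditional-load trick can be sketched as below. The macro name is an assumption; Oban.Web does ship Oban.Web.Router with an oban_dashboard macro, and Code.ensure_loaded?/1 is the compile-time check.

```elixir
# Hypothetical macro: emit the dashboard route only when Oban.Web is available,
# so open source builds compile without the Oban Web key.
defmodule Sketch.ObanWebRoute do
  defmacro oban_dashboard_route do
    if Code.ensure_loaded?(Oban.Web) do
      quote do
        import Oban.Web.Router
        oban_dashboard("/oban")
      end
    end
  end
end
```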

Code to cache the raw body in a conn variable before being processed by Phoenix. Used to validate the signature

Verify that the signature matches from the incoming webhook
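
A hedged sketch of the two pieces above: a body reader that stashes the raw payload (wired in via the :body_reader option of Plug.Parsers) and a plug that compares an HMAC of it against a signature header. The header name, secret handling and hashing scheme are assumptions, not the actual scheme used by the webhook provider.

```elixir
# Hypothetical raw-body caching and signature verification.
defmodule Sketch.CacheBodyReader do
  # plug Plug.Parsers, body_reader: {Sketch.CacheBodyReader, :read_body, []}, ...
  def read_body(conn, opts) do
    {:ok, body, conn} = Plug.Conn.read_body(conn, opts)
    {:ok, body, Plug.Conn.assign(conn, :raw_body, body)}
  end
end

defmodule Sketch.VerifySignature do
  import Plug.Conn

  def init(opts), do: opts

  def call(conn, _opts) do
    signature = List.first(get_req_header(conn, "x-signature")) || ""

    expected =
      :crypto.mac(:hmac, :sha256, "replace-with-signing-secret", conn.assigns[:raw_body] || "")
      |> Base.encode16(case: :lower)

    if Plug.Crypto.secure_compare(signature, expected),
      do: conn,
      else: conn |> send_resp(401, "invalid signature") |> halt()
  end
end
```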

This file and the below files have been "borrowed and modified" from triplex: https://github.com/ateliware/triplex The original copyright and license - MIT belong to the authors and contributors of Triplex

Dedicated controller to handle billing events from Airtel

Dedicated controller to handle different types of inbound message from Airtel

Dedicated controller to handle all the message status requests like read, delivered, etc.

Dedicated controller to handle different types of user event requests like opt-in and opt-out

An Airtel shunt which will redirect all the incoming requests to the Airtel router based on their event type.

An Airtel router which will redirect all the incoming Airtel requests to their controller actions.

Dedicated controller to handle billing events from Gupshup

Dedicated controller to handle different types of inbound message from Gupshup

Dedicated controller to handle all the message status requests like read, delivered, etc.

Dedicated controller to handle different types of user event requests like opt-in and opt-out

Dedicated controller to handle different types of inbound messages from Gupshup

Dedicated controller to handle all the message status requests like sent, delivered, etc.

A Gupshup shunt which will redirect all the incoming requests to the Gupshup router based on their event type.

A Gupshup router which will redirect all the incoming Gupshup requests to their controller actions.

A Gupshup shunt which will redirect all the incoming requests to the Gupshup router based on their event type.

A Gupshup router which will redirect all the incoming Gupshup requests to their controller actions.

Enforcing rate limits on our APIs, both authenticated and non-authenticated

Billing Resolver which sits between the GraphQL schema and Glific Billing Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Consulting Hours Resolver which sits between the GraphQL schema and Glific Consulting Hour Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Contact Resolver which sits between the GraphQL schema and Glific Contact Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Contact Field Resolver which sits between the GraphQL schema and Glific Contact Field Context API.

Tag Resolver which sits between the GraphQL schema and Glific Conversation Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Extensions Resolver which sits between the GraphQL schema and Glific Extensions API.

Flow Labels Resolver which sits between the GraphQL schema and Glific Flow Label Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Flow Resolver which sits between the GraphQL schema and Glific Flow Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Group Resolver which sits between the GraphQL schema and Glific Group Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Helper functions for GQL resolvers

Interactives Resolver which sits between the GraphQL schema and Glific Interactives Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Location Resolver which sits between the GraphQL schema and Glific Location Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Resolver to deal with file uploads, which we send directly to GCS

Message Resolver which sits between the GraphQL schema and Glific Message Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Notification Resolver which sits between the GraphQL schema and Glific Notification Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Partners Resolver which sits between the GraphQL schema and Glific Partners Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Profile Resolver which sits between the GraphQL schema and Glific Profile Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Roles Resolver which sits between the GraphQL schema and Glific role Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Searches Resolver which sits between the GraphQL schema and Glific saved_search Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Settings Resolver which sits between the GraphQL schema and Glific Settings Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Sheets Resolver which sits between the GraphQL schema and Glific Sheets Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Tag Resolver which sits between the GraphQL schema and Glific Tag Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Templates Resolver which sits between the GraphQL schema and Glific Templates Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

Trigger Resolver which sits between the GraphQL schema and Glific Trigger Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

User Resolver which sits between the GraphQL schema and Glific User Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

WebhookLog Resolver which sits between the GraphQL schema and Glific WebhookLog Context API. This layer basically stitches together one or more calls to resolve the incoming queries.

A default gateway for all the external requests

This is the container for the top level Absinthe GraphQL schema which encapsulates the entire Glific Public API. This file is primarily a container and pulls in the relevant information for data type specific files.
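
As an illustration of the pattern, a data-type module defines objects with Absinthe.Schema.Notation and the top-level schema pulls them in with import_types; the type, field and resolver below are invented, not Glific's schema.

```elixir
# Hypothetical data-type module and top-level schema wiring.
defmodule Sketch.Schema.ContactTypes do
  use Absinthe.Schema.Notation

  object :contact do
    field :id, :id
    field :name, :string
    field :phone, :string
  end
end

defmodule Sketch.Schema do
  use Absinthe.Schema

  import_types(Sketch.Schema.ContactTypes)

  query do
    field :contact, :contact do
      arg(:id, non_null(:id))
      # A stub resolver; real resolvers delegate to the context APIs.
      resolve(fn %{id: id}, _resolution -> {:ok, %{id: id, name: "Asha", phone: ""}} end)
    end
  end
end
```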

GraphQL Representation of Glific's Billing DataType

GraphQL Representation of Glific's Consulting Hours DataType

GraphQL Representation of Glific's Contact Group DataType

GraphQL Representation of Glific's Contact Tag DataType

GraphQL Representation of Glific's Contact DataType

GraphQL Representation of Glific's Contact Field DataType

GraphQL Representation of Glific's Organization Credential DataType

Representing our enums in the style Absinthe expects them. We can now use these atoms in the object definitions within the GraphQL Schema

GraphQL Representation of Glific's Extension DataType

GraphQL Representation of FlowLabel DataType

GraphQL Representation of Flow DataType

GraphQL Representation of common data representations used across different Glific's DataType

GraphQL Representation of Glific's Group DataType

GraphQL Representation of Glific's Interactive DataType

GraphQL Representation of Glific's Language DataType

GraphQL Representation of Glific's Location DataType

GraphQL Representation of Glific's Location DataType

GraphQL Representation of Glific's MessageMedia DataType

GraphQL Representation of Glific's Message Tag DataType

GraphQL Representation of Glific's Message DataType

Implementing middleware functions to transform errors from Ecto Changeset into a format consumable and displayable to the API user. This version is specifically for mutations.

Implementing middleware functions to transform errors from Ecto Changeset into a format consumable and displayable to the API user. This version is specifically for mutations.

Implementing middleware functions to transform errors from Ecto Changeset into a format consumable and displayable to the API user. This version is specifically for mutations.

Implementing middleware functions to transform errors from Elixir and friends into a format consumable and displayable to the API user. This version is specifically for queries.
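
The error-transformation idea can be sketched as Absinthe middleware that walks Ecto.Changeset errors into plain strings; the module name and message format below are assumptions.

```elixir
# Hypothetical middleware converting changeset errors into readable strings.
defmodule Sketch.ChangesetErrors do
  @behaviour Absinthe.Middleware

  @impl true
  def call(%{errors: errors} = resolution, _config) do
    %{resolution | errors: Enum.flat_map(errors, &transform/1)}
  end

  defp transform(%Ecto.Changeset{} = changeset) do
    changeset
    |> Ecto.Changeset.traverse_errors(fn {msg, _opts} -> msg end)
    |> Enum.map(fn {field, messages} -> "#{field}: #{Enum.join(messages, ", ")}" end)
  end

  defp transform(error), do: [error]
end
```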

GraphQL Representation of Glific's Notification DataType

GraphQL Representation of Glific's Organization DataType

GraphQL Representation of Glific's Profile

GraphQL Representation of Glific's Provider DataType

GraphQL Representation of Role DataType

GraphQL Representation of Glific's Search DataType

GraphQL Representation of Glific's Session Template DataType

GraphQL Representation of Glific's Sheet DataType

GraphQL Representation of Glific's Tag DataType

GraphQL Representation of Glific's Template Tag DataType

GraphQL Representation of Glific's Trigger DataType

GraphQL Representation of Glific's User Group DataType

GraphQL Representation of Glific's User DataType

GraphQL Representation of Glific's WebhookLog DataType

StatsLive uses Phoenix LiveView to show current stats

The controller for all events received from Stripe

Simple plug to handle and authenticate incoming webhook calls from Stripe

This is a basic plug that loads the current organization assign from a given value set on the subdomain.

This is a struct that holds the configuration for GlificWeb.SubdomainPlug.

This is the main module of multi-tenancy in Glific. It has been borrowed from Triplex (https://github.com/ateliware/triplex). However, we are going to use Postgres row-level security instead, and hence are copying the code from there. The original copyright and license (MIT) belong to the contributors to Triplex.
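
A hedged sketch of the subdomain-to-organization step; Glific's SubdomainPlug reads its configuration from the struct described above, whereas the names here are hypothetical.

```elixir
# Hypothetical subdomain resolution for multi-tenancy.
defmodule Sketch.SubdomainPlug do
  import Plug.Conn

  def init(opts), do: opts

  # "ngo.glific.example" -> assign the "ngo" shortcode for a later organization lookup.
  def call(%Plug.Conn{host: host} = conn, _opts) do
    case String.split(host, ".") do
      [subdomain, _domain, _tld | _] -> assign(conn, :organization_shortcode, subdomain)
      _ -> conn
    end
  end
end

# With Postgres row-level security, the resolved organization id is then exposed
# per session/query, e.g. (setting name assumed, not Glific's actual key):
#   Ecto.Adapters.SQL.query!(Repo, "SELECT set_config('app.organization_id', $1, false)", ["1"])
```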