EXLA.Op (EXLA v0.4.1)
Wrapper around XLA's ops.
Summary
Functions
Unary abs.
Unary acos.
Unary acosh.
Element-wise add with broadcasting.
Unary asin.
Unary asinh.
Element-wise atan2 with broadcasting.
Unary atan.
Unary atanh.
Element-wise bitwise_and with broadcasting.
Unary bitwise_not.
Element-wise bitwise_or with broadcasting.
Element-wise bitwise_xor with broadcasting.
Broadcasts the tensor to the given shape.
Unary cbrt.
Unary ceil.
Creates an n-dimensional constant from binary data with the given shape.
Creates a numeric constant.
Unary cos.
Unary cosh.
Unary count_leading_zeros.
Element-wise divide with broadcasting.
Element-wise equal with broadcasting.
Unary erf.
Unary erf_inv.
Unary erfc.
Unary exp.
Unary expm1.
Unary floor.
The XLA gather operation stitches together several slices of an input array.
Gets the shape of an operator.
Element-wise greater with broadcasting.
Element-wise greater_equal with broadcasting.
Creates an iota tensor.
Element-wise left_shift with broadcasting.
Element-wise less with broadcasting.
Element-wise less_equal with broadcasting.
Unary log1p.
Unary log.
Element-wise max with broadcasting.
Element-wise min with broadcasting.
Element-wise multiply with broadcasting.
Unary negate.
Element-wise not_equal with broadcasting.
Pads the tensor with value and padding config.
Specifies a parameter at position i with the given shape and name.
Unary population_count.
Element-wise power with broadcasting.
Element-wise remainder with broadcasting.
Reshapes the tensor to the given shape.
Element-wise right_shift_arithmetic with broadcasting.
Element-wise right_shift_logical with broadcasting.
Creates a tensor with normal distribution.
Creates a tensor with uniform distribution.
Unary round.
Unary rsqrt.
Unary sigmoid.
Unary sign.
Unary sin.
Unary sinh.
Unary sqrt.
Element-wise subtract with broadcasting.
Unary tanh.
Builds a tuple with the given elements.
Functions
Unary abs.
Unary acos.
Unary acosh.
Element-wise add with broadcasting.
Unary asin.
Unary asinh.
Element-wise atan2 with broadcasting.
Unary atan.
Unary atanh.
Element-wise bitwise_and with broadcasting.
Unary bitwise_not.
Element-wise bitwise_or with broadcasting.
Element-wise bitwise_xor with broadcasting.
Broadcasts the tensor to the given shape.
Unary cbrt.
Unary ceil.
Creates an n-dimensional constant from binary data with the given shape.
Creates a numeric constant.
conv_general_dilated(op1, op2, strides, padding, lhs_dilation, rhs_dilation, dim_nums, feature_group_count, batch_group_count, precision_config)
Unary cos.
Unary cosh.
Unary count_leading_zeros.
Element-wise divide with broadcasting.
Element-wise equal with broadcasting.
Unary erf.
Unary erf_inv.
Unary erfc.
Unary exp.
Unary expm1.
Unary floor.
gather(op1, op2, index_vector_dim, slice_sizes, offset_dims, collapsed_slice_dims, start_index_map)
The XLA gather operation stitches together several slices of an input array.
Note that this operation is extremely generic and far from intuitive for regular usage. However, it can be used to implement many specific operations that have to do with combining multiple tensor slices.
Parameters
The XLA docs are rather cryptic unless you already understand the operation, so here's an attempt at a more intuitive description.
index_vector_dim
Determines which dimension contains index vectors. In most cases we want to set this to the last dimension.
given
start_indices = [[0, 1], [1, 1]]
and given
index_vector_dim = 1
then
index vectors are [0, 1] and [1, 1]
Note that we can set this to last_dimension + 1, in which case start_indices are implicitly reshaped to have a trailing dimension of 1.
given
start_indices = [[0, 1], [1, 1]]
and given
index_vector_dim = 2
then
start_indices <- [[[0], [1]], [[1], [1]]]
index vectors are [0], [1], [1], [1]
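The two cases above can be sketched in plain NumPy (an illustration of the semantics, not the EXLA API):

```python
import numpy as np

# Illustrative sketch: extracting index vectors from start_indices
# for a given index_vector_dim.
start_indices = np.array([[0, 1], [1, 1]])

# index_vector_dim = 1 (the last dimension): each row is an index vector.
vectors = start_indices.tolist()
print(vectors)  # [[0, 1], [1, 1]]

# index_vector_dim = 2 (last_dimension + 1): start_indices is implicitly
# reshaped with a trailing dimension of 1, so every scalar becomes its
# own one-element index vector.
reshaped = start_indices.reshape(2, 2, 1).tolist()
vectors = [v for row in reshaped for v in row]
print(vectors)  # [[0], [1], [1], [1]]
```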
start_index_map
Note: though given as a list, it can be treated as a map of list_idx -> value.
An index vector may have fewer elements than the operand tensor has dimensions. For example:
given
operand = [[1, 2], [3, 4]]
start_indices = [[1], [0]]
index_vector_dim = 1
As described above, in this case the index vectors are [1] and [0], and they have length 1. However, the operand has rank 2, so we need vectors of the form [_, _] to point to a specific element in the operand. The start_index_map determines where indices go into this template:
and given
start_index_map = [0] # effectively %{0 => 0}
then
actual index vectors are [1, _] and [0, _]
and given
start_index_map = [1] # effectively %{0 => 1}
then
actual index vectors are [_, 1] and [_, 0]
Finally, the missing elements (_) are assumed to be 0.
Complete examples:
given
operand = [[1, 2], [3, 4]]
start_indices = [[0], [1]]
index_vector_dim = 1
and given
start_index_map = [1] # effectively %{0 => 1}
then
actual index vectors are [0, 0], [0, 1] (leading 0 is inserted)
given
operand = [[1, 2], [3, 4]]
start_indices = [[0, 1], [1, 1]]
index_vector_dim = 1
and given
start_index_map = [0, 1] # effectively %{0 => 0, 1 => 1}
then
actual index vectors are [0, 1], [1, 1] (as expected)
given
operand = [[1, 2], [3, 4]]
start_indices = [[0, 1], [1, 1]]
index_vector_dim = 1
and given
start_index_map = [1, 0] # effectively %{0 => 1, 1 => 0}
then
actual index vectors are [1, 0], [1, 1] (see how the first vector is reversed)
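The template-filling rule can be written out as a small hypothetical helper (expand_index is not part of EXLA, just an illustration):

```python
# Hypothetical helper showing how start_index_map places index-vector
# components into a full operand-rank index, with the missing
# positions (_) defaulting to 0.
def expand_index(index_vector, start_index_map, operand_rank):
    full = [0] * operand_rank
    for pos, dim in enumerate(start_index_map):
        full[dim] = index_vector[pos]
    return full

print(expand_index([1], [0], 2))        # [1, 0]
print(expand_index([0], [1], 2))        # [0, 0] (leading 0 inserted)
print(expand_index([0, 1], [1, 0], 2))  # [1, 0] (components swapped)
```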
slice_sizes
For every starting point (as described above) we take a slice given by slice_sizes. Naturally, slice_sizes must have the same length as the operand rank, so that we have one size per dimension.
given
operand = [[1, 2], [3, 4]]
actual index vector [1, 0]
and given
slice_sizes = [1, 2]
then
slice for actual index vector is [[3, 4]]
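In NumPy terms the slicing step is just a start-plus-extent slice per dimension (again an illustration, not the EXLA API):

```python
import numpy as np

# Starting at an actual index vector, take a slice of extent slice_sizes.
operand = np.array([[1, 2], [3, 4]])
start = [1, 0]        # actual index vector
slice_sizes = [1, 2]  # one extent per operand dimension
sl = operand[tuple(slice(s, s + z) for s, z in zip(start, slice_sizes))]
print(sl.tolist())  # [[3, 4]]
```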
collapsed_slice_dims
A list of dimensions that are collapsed (effectively removed) in the slice shape. Only dimensions of size 1 can be collapsed.
given
slice is [[3, 4]] # shape: [1][2]
and given
collapsed_slice_dims = [0]
then
actual slice is [3, 4] # shape [2]
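Collapsing a size-1 dimension is the same idea as NumPy's squeeze (a NumPy analogy, not the EXLA API):

```python
import numpy as np

# Collapsing removes the listed size-1 dimensions from the slice shape.
sl = np.array([[3, 4]])             # shape (1, 2)
collapsed = np.squeeze(sl, axis=0)  # collapsed_slice_dims = [0]
print(collapsed.tolist())  # [3, 4], shape (2,)
```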
offset_dims
A list of dimensions in the output tensor corresponding to the non-collapsed dimensions in slice tensors. In other words, these dimensions are used for indexing elements of the slice tensors.
given
operand = [[1, 2], [3, 4]]
start_indices = [[1, 0], [0, 0], [1, 0]]
index_vector_dim = 1
start_index_map = [0, 1] # effectively %{0 => 0, 1 => 1}
slice_sizes = [1, 2]
collapsed_slice_dims = [0]
and given
offset_dims = [1]
then
result is [[3, 4], [1, 2], [3, 4]]
In the above example the collapsed slices are [3, 4], [1, 2], [3, 4] and have rank 1. Using offset_dims we specify that the first dimension in each slice corresponds to the second dimension in the output tensor.
If we use the first output dimension instead, we get:
and given
offset_dims = [0]
then
result is [[3, 1, 3], [4, 2, 4]]
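Putting the pieces together, here is a hand-rolled NumPy sketch reproducing the running example. It assumes index_vector_dim is the last axis of start_indices and handles only the simple offset_dims cases shown above; it is an illustration of the semantics, not EXLA's implementation:

```python
import numpy as np

def gather(operand, start_indices, start_index_map, slice_sizes,
           collapsed_slice_dims, offset_dims):
    operand = np.asarray(operand)
    slices = []
    for vec in np.asarray(start_indices):
        # Fill the index template; unmapped positions default to 0.
        full = [0] * operand.ndim
        for pos, dim in enumerate(start_index_map):
            full[dim] = vec[pos]
        # Take the slice and collapse the listed size-1 dimensions.
        sl = operand[tuple(slice(s, s + z) for s, z in zip(full, slice_sizes))]
        slices.append(np.squeeze(sl, axis=tuple(collapsed_slice_dims)))
    out = np.stack(slices)  # batch dimension first
    if offset_dims == [0]:  # slice dim goes to output dim 0 instead
        out = out.T
    return out.tolist()

operand = [[1, 2], [3, 4]]
start_indices = [[1, 0], [0, 0], [1, 0]]
print(gather(operand, start_indices, [0, 1], [1, 2], [0], [1]))
# [[3, 4], [1, 2], [3, 4]]
print(gather(operand, start_indices, [0, 1], [1, 2], [0], [0]))
# [[3, 1, 3], [4, 2, 4]]
```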
Docs
A more formal specification can be found in the XLA Gather docs.
Gets the shape of an operator.
Element-wise greater with broadcasting.
Element-wise greater_equal with broadcasting.
Creates an iota tensor.
Element-wise left_shift with broadcasting.
Element-wise less with broadcasting.
Element-wise less_equal with broadcasting.
Unary log1p.
Unary log.
Element-wise max with broadcasting.
Element-wise min with broadcasting.
Element-wise multiply with broadcasting.
Unary negate.
Element-wise not_equal with broadcasting.
Pads the tensor with value and padding config.
Specifies a parameter at position i with the given shape and name.
Unary population_count.
Element-wise power with broadcasting.
Element-wise remainder with broadcasting.
Reshapes the tensor to the given shape.
Element-wise right_shift_arithmetic with broadcasting.
Element-wise right_shift_logical with broadcasting.
Creates a tensor with normal distribution.
Creates a tensor with uniform distribution.
Unary round.
Unary rsqrt.
scatter(op1, op2, op3, computation, indices_rank, update_window_dims, inserted_window_dims, index_dims_to_window_dims)
select_and_scatter(op1, computation1, window_dimensions, window_strides, padding_config, op2, op3, computation2)
Unary sigmoid.
Unary sign.
Unary sin.
Unary sinh.
Unary sqrt.
Element-wise subtract with broadcasting.
Unary tanh.
triangular_solve(op1, op2, left_side, lower, unit_diagonal, transpose_a)
Builds a tuple with the given elements.