# `Mob.Camera`
[🔗](https://github.com/genericjam/mob/blob/main/lib/mob/camera.ex#L1)

Native camera capture for photos and videos.

Requires `:camera` permission (and `:microphone` for video). iOS
additionally needs `NSCameraUsageDescription` (and
`NSMicrophoneUsageDescription` for video) in `Info.plist`;
Android needs `CAMERA` (and `RECORD_AUDIO` for video) in
`AndroidManifest.xml`. The default `mix mob.new` templates ship
both. See the [permissions guide](permissions.html) for the
cross-platform table.

Opens the native OS camera UI. Results arrive as:

    handle_info({:camera, :photo, %{path: path, width: w, height: h}}, socket)
    handle_info({:camera, :video, %{path: path, duration: seconds}},   socket)
    handle_info({:camera, :cancelled},                                   socket)

The `path` points to a local temp file; copy it elsewhere before the next capture overwrites it.

Backed by `UIImagePickerController` on iOS and the `TakePicture` / `CaptureVideo` activity result contracts on Android.
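The three messages above can be handled as sketched below. This is a minimal sketch, assuming a hypothetical `MyApp.CameraScreen` module and a plain map-like socket; the only real work is copying each capture out of the temp location before the next capture can overwrite it:

```elixir
defmodule MyApp.CameraScreen do
  # Hypothetical screen module: stash each capture under a path we own.

  def handle_info({:camera, :photo, %{path: path, width: _w, height: _h}}, socket) do
    {:noreply, Map.put(socket, :photo_path, keep(path))}
  end

  def handle_info({:camera, :video, %{path: path, duration: _seconds}}, socket) do
    {:noreply, Map.put(socket, :video_path, keep(path))}
  end

  def handle_info({:camera, :cancelled}, socket), do: {:noreply, socket}

  # Copy the temp file to a unique destination, preserving the extension.
  defp keep(tmp_path) do
    dest =
      Path.join(
        System.tmp_dir!(),
        "capture-#{System.unique_integer([:positive])}#{Path.extname(tmp_path)}"
      )

    File.cp!(tmp_path, dest)
    dest
  end
end
```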

## Live frame stream

For real-time work (object detection, AR, custom filters) `start_frame_stream/2`
delivers per-frame pixel data as messages:

    handle_info({:camera, :frame, %{bytes: bin, width: w, height: h,
                                     format: :rgb_f32,
                                     timestamp_ms: t, dropped: n}}, socket)

The native side handles resizing and format conversion (vImage on iOS),
so the BEAM never sees raw camera buffers. Late frames are dropped on
the native side, so the BEAM mailbox cannot grow without bound if your
receiver lags behind the camera's 30 fps cadence.

# `capture_photo`

```elixir
@spec capture_photo(
  Mob.Socket.t(),
  keyword()
) :: Mob.Socket.t()
```

Open the camera to capture a photo.

Options:
  - `quality: :high | :medium | :low` (default `:high`) — JPEG compression level
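A typical call site might look like the following. This sketch assumes a LiveView-style `handle_event/3` callback and a hypothetical `"take_photo"` event name:

```elixir
def handle_event("take_photo", _params, socket) do
  # Opens the native camera UI; the result arrives later as a
  # {:camera, :photo, ...} or {:camera, :cancelled} message.
  {:noreply, Mob.Camera.capture_photo(socket, quality: :medium)}
end
```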

# `capture_video`

```elixir
@spec capture_video(
  Mob.Socket.t(),
  keyword()
) :: Mob.Socket.t()
```

Open the camera to record a video.

Options:
  - `max_duration: integer` — maximum clip length in seconds (default `60`)
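As with photos, a sketch of a call site, assuming a LiveView-style `handle_event/3` callback and a hypothetical `"record"` event name:

```elixir
def handle_event("record", _params, socket) do
  # Cap clips at 15 seconds; the result arrives later as a
  # {:camera, :video, ...} or {:camera, :cancelled} message.
  {:noreply, Mob.Camera.capture_video(socket, max_duration: 15)}
end
```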

# `frame_stream_opts`

```elixir
@spec frame_stream_opts(keyword()) :: map()
```

Build the option map passed to `camera_start_frame_stream/1`. A pure
function, exposed so tests can pin defaults and serialisation without
going through the NIF.
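A sketch of the kind of default-pinning test this enables. The specific keys and defaults asserted here (`:format`, `:width`, `:height`) are assumptions about the current defaults, not guaranteed values:

```elixir
defmodule Mob.CameraOptsTest do
  use ExUnit.Case, async: true

  test "frame stream defaults survive serialisation" do
    opts = Mob.Camera.frame_stream_opts([])

    # Assumed defaults; pin whatever your installed version documents.
    assert opts[:format] == :rgb_f32
    assert opts[:width] == 640 and opts[:height] == 640
  end
end
```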

# `start_frame_stream`

```elixir
@spec start_frame_stream(
  Mob.Socket.t(),
  keyword()
) :: Mob.Socket.t()
```

Start streaming camera frames to the calling process. Frames arrive
as messages of shape:

    handle_info({:camera, :frame, %{
      bytes:        binary(),      # pixel data, format-dependent
      width:        non_neg_integer(),
      height:       non_neg_integer(),
      format:       :rgb_f32 | :bgra_u8,
      timestamp_ms: non_neg_integer(),
      dropped:      non_neg_integer()  # frames skipped since last delivery
    }}, socket)

## Options

  * `:width`, `:height` — target frame size in pixels. Defaults to
    `640` × `640` (YOLO-friendly). Pass `nil` for both to receive the
    camera's native resolution. Mismatched aspect ratios are
    center-cropped on the long axis before scaling. Capped at ~4 MP
    to keep the BEAM mailbox bounded.

  * `:format` — pixel format. One of:
    - `:rgb_f32` (default) — interleaved RGB floats normalised to
      `[0.0, 1.0]`. Byte size: `width * height * 3 * 4`. Ready for
      `Nx.from_binary(bin, :f32, ...) |> Nx.reshape({1, h, w, 3})`.
    - `:bgra_u8` — raw 32-bit BGRA bytes, native iOS pixel layout.
      Byte size: `width * height * 4`, a third the size of
      `:rgb_f32`; useful for forwarding to another NIF or doing
      custom preprocessing.

  * `:facing` — `:back` (default) or `:front`. Same camera the
    preview uses; calling `start_frame_stream/2` alone will activate
    the capture session without a visible preview.

  * `:throttle_ms` — minimum interval between deliveries (default
    `0`). A native-side throttle, complementary to the late-frame
    drop described above. Use `throttle_ms: 100` for 10 Hz delivery
    when full camera-rate inference isn't needed.
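The per-frame payload sizes above reduce to simple arithmetic; `FrameMath` below is an illustrative helper (not part of `Mob.Camera`) that pins them down:

```elixir
defmodule FrameMath do
  # Payload size in bytes for each documented pixel format.
  def frame_bytes(:rgb_f32, w, h), do: w * h * 3 * 4  # 3 channels x 4-byte floats
  def frame_bytes(:bgra_u8, w, h), do: w * h * 4      # 4 channels x 1 byte each
end

FrameMath.frame_bytes(:rgb_f32, 640, 640)  # => 4_915_200 (~4.9 MB per frame)
FrameMath.frame_bytes(:bgra_u8, 640, 640)  # => 1_638_400
```

At the default 640 × 640, `:rgb_f32` frames are roughly 4.9 MB each, which is why throttling matters for slow receivers.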

## Notes

Returns the socket immediately; frames begin arriving asynchronously
once the OS has activated the capture session (typically <100 ms).
The receiver is the **calling process** at the time of invocation, so
call this from a `Mob.Screen` callback (`mount`, `handle_info`), not
from a task or GenServer running elsewhere.
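Putting the pieces together, a screen might start the stream on mount and feed `:rgb_f32` frames into Nx. This is a sketch: the 2-arity `mount` shape and `run_inference/2` are hypothetical, and it assumes Nx is a dependency:

```elixir
def mount(_params, socket) do
  # Called from the screen process, so frames are delivered here.
  {:ok, Mob.Camera.start_frame_stream(socket, throttle_ms: 100)}
end

def handle_info({:camera, :frame, %{bytes: bin, width: w, height: h, format: :rgb_f32}}, socket) do
  # Interleaved RGB floats in [0.0, 1.0] reshape directly into NHWC.
  tensor =
    bin
    |> Nx.from_binary(:f32)
    |> Nx.reshape({1, h, w, 3})

  {:noreply, run_inference(socket, tensor)}  # run_inference/2 is hypothetical
end
```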

# `start_preview`

```elixir
@spec start_preview(
  Mob.Socket.t(),
  keyword()
) :: Mob.Socket.t()
```

Start a live camera preview session. Pair with a `:camera_preview` component
in your render tree to display the feed.

Options:
  - `facing: :back | :front` (default `:back`)
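A minimal sketch of starting the preview on mount (the 2-arity `mount` shape is an assumption); the feed itself is displayed by a `:camera_preview` component in the render tree:

```elixir
def mount(_params, socket) do
  # Activates the capture session; pair with a :camera_preview
  # component in render/1 to show the feed on screen.
  {:ok, Mob.Camera.start_preview(socket, facing: :front)}
end
```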

# `stop_frame_stream`

```elixir
@spec stop_frame_stream(Mob.Socket.t()) :: Mob.Socket.t()
```

Stop the camera frame stream. Safe to call when no stream is
active. The visible preview (if `start_preview/2` was called
separately) is left untouched.

# `stop_preview`

```elixir
@spec stop_preview(Mob.Socket.t()) :: Mob.Socket.t()
```

Stop the active camera preview session.

---

*Consult [api-reference.md](api-reference.md) for the complete listing*
