
Overview

Mux is a video infrastructure platform for building video experiences. By connecting Mux to Mixpeek, you can automatically process your video assets through feature extraction pipelines — enabling search, classification, and analysis across your entire video library.

Prerequisites

  • A Mux account with video assets
  • A Mux access token with Mux Video read permissions
  • A Mixpeek account with an active namespace
  • Static renditions or MP4 support enabled on your Mux assets (see below)
Static renditions are required. By default, Mux assets only have HLS streaming and no downloadable MP4 files. Mixpeek needs MP4 renditions to download and process your videos, so assets without static renditions are skipped during sync. Enable static renditions when creating assets:
{
  "input": [{"url": "https://example.com/video.mp4"}],
  "playback_policy": ["public"],
  "static_renditions": [{"resolution": "720p"}]
}
Or enable them on existing assets in the Mux Dashboard under each asset’s settings.
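If you create assets from code rather than the dashboard, the same body can be POSTed to Mux's asset-creation endpoint (https://api.mux.com/video/v1/assets, HTTP Basic auth with the token ID and secret). A minimal standard-library sketch; build_asset_payload and create_asset are illustrative helper names, not part of any SDK:

```python
import base64
import json
import urllib.request

MUX_ASSETS_URL = "https://api.mux.com/video/v1/assets"

def build_asset_payload(video_url: str, resolution: str = "720p") -> dict:
    """Asset-creation body with one static rendition, mirroring the JSON above."""
    return {
        "input": [{"url": video_url}],
        "playback_policy": ["public"],
        "static_renditions": [{"resolution": resolution}],
    }

def create_asset(token_id: str, token_secret: str, video_url: str) -> None:
    # Mux authenticates with HTTP Basic auth: the access token ID is the
    # username and the token secret is the password.
    auth = base64.b64encode(f"{token_id}:{token_secret}".encode()).decode()
    req = urllib.request.Request(
        MUX_ASSETS_URL,
        data=json.dumps(build_asset_payload(video_url)).encode(),
        headers={
            "Authorization": f"Basic {auth}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # raises HTTPError on a non-2xx response
```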

Configuration

Connection-level fields

| Field | Required | Description |
| --- | --- | --- |
| access_token_id | Yes | Mux access token ID (UUID format) |
| access_token_secret | Yes | Mux access token secret |
| webhook_secret | No | Shared signing secret for inbound Mux webhooks. Required only if you want Mixpeek to cascade-delete objects when assets are deleted in Mux (see Cascade delete via webhooks). |

Sync-level fields

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| source_path | No | mux:// | Sync all assets, or mux://{asset_id} for a specific asset |
| sync_mode | No | continuous | initial_only or continuous |
| polling_interval_seconds | No | 300 | How often to check for new assets (continuous mode) |
| file_filters.metadata_filters | No | | Array of metadata filter rules applied to Mux asset fields (see Metadata filtering) |
| reconcile_on_sync | No | false | Re-check previously indexed assets on each sync and unindex any that no longer match metadata filters (see Reconciliation) |

Setup

1

Create a Mux access token

  1. Go to Mux Dashboard → Settings → Access Tokens
  2. Click Generate new token
  3. Select Mux Video with Read permissions
  4. Copy the Token ID and Token Secret immediately — the secret is only shown once
2

Create the connection in Mixpeek

In the Mixpeek Studio, go to Settings → Storage Connections → Add Connection and select Mux. Or via the API:
from mixpeek import Mixpeek

client = Mixpeek(api_key="YOUR_API_KEY")

connection = client.organizations.connections.create(
    name="Mux Production",
    provider_type="mux",
    provider_config={
        "credentials": {
            "type": "access_token",
            "access_token_id": "a3b02074-dbca-47b6-...",
            "access_token_secret": "Am9+RGn2mhmz..."
        }
    },
    test_before_save=True
)
3

Create a bucket with Mux sync

Create a bucket and configure it to sync from your Mux connection:
Python
bucket = client.buckets.create(
    namespace_id="your-namespace-id",
    name="mux-videos",
    description="Video assets from Mux",
    blob_type="video"
)

sync = client.buckets.syncs.create(
    namespace_id="your-namespace-id",
    bucket_id=bucket.bucket_id,
    connection_id=connection.connection_id,
    source_path="mux://",
    sync_mode="continuous",
    polling_interval_seconds=300
)
4

Verify ingestion

Monitor the sync status in Studio or via the API. Video assets in ready status will be automatically downloaded and processed through your collection’s feature extractors.

Mux metadata on objects and retriever results

When the connector syncs a Mux asset into a bucket, it captures the Mux identifiers and exposes them as source_metadata on the resulting bucket object. This metadata flows through unchanged to downstream documents and is returned alongside every retriever match — so your application can render results directly from Mux without falling back to the S3 mirror.
| Field | Description |
| --- | --- |
| asset_id | Mux asset ID — use to look the asset up via the Mux API |
| playback_id | Primary public/signed playback ID |
| playback_ids | Full list of playback IDs on the asset |
| playback_url | HLS manifest URL (https://stream.mux.com/{playback_id}.m3u8) ready for <mux-player> or any HLS client |
| thumbnail_url | Default Mux thumbnail (https://stream.mux.com/{playback_id}/thumbnail.jpg) |
| duration | Asset duration in seconds |
| resolution_tier | Source quality tier (e.g. 1080p, 1440p) |
| status | Mux asset status at sync time |
| aspect_ratio | Asset aspect ratio (e.g. 16:9) |
Example object payload:
{
  "object_id": "obj_7fa70fd60a0547152df57cff",
  "bucket_id": "bkt_648b4be8",
  "source_metadata": {
    "asset_id": "5sCF1Yk3rC0102zMVEh74jJE9pT6KFeZ1N8gcQZ3l53FI",
    "playback_id": "YGDmuLw7AOiQKwvCZbHAGjdZRZwxMqqg1iBELyJig5E",
    "playback_url": "https://stream.mux.com/YGDmuLw7AOiQKwvCZbHAGjdZRZwxMqqg1iBELyJig5E.m3u8",
    "thumbnail_url": "https://stream.mux.com/YGDmuLw7AOiQKwvCZbHAGjdZRZwxMqqg1iBELyJig5E/thumbnail.jpg",
    "duration": 23.86,
    "resolution_tier": "1080p",
    "status": "ready",
    "aspect_ratio": "16:9"
  }
}
In retriever results, the same source_metadata is attached to each match. A typical UI flow is:
// Each search result carries the original Mux IDs — no S3 hop needed
results.forEach((match) => {
  const { playback_id, playback_url, thumbnail_url } = match.source_metadata;
  renderMuxPlayer({ playbackId: playback_id, poster: thumbnail_url });
});

Webhooks: delete and update

Mixpeek handles two Mux webhook event types to keep your index in sync with Mux automatically.

How it works

  1. Mux POSTs events to a Mixpeek endpoint that includes your connection_id.
  2. Mixpeek verifies the Mux-Signature header against the webhook_secret stored on that connection.
  3. video.asset.deleted — Mixpeek finds every object synced from that Mux asset and deletes them, cascading through OBJECT_DELETED to remove derived documents from your collections.
  4. video.asset.updated — Mixpeek re-fetches the asset’s current metadata from Mux and re-evaluates it against each sync config’s metadata_filters. If a previously non-matching asset now matches, it becomes eligible for indexing on the next sync. If the asset no longer matches and reconcile_on_sync is enabled, the corresponding objects are unindexed. Without reconcile_on_sync, the event is acknowledged but no objects are removed.

Setup

1

Generate a signing secret and attach it to the connection

Pick any high-entropy string (32+ chars) and store it as webhook_secret in the Mux connection’s credentials. You can set it at create time or patch an existing connection.
cURL
curl -X PATCH https://api.mixpeek.com/v1/organizations/connections/{connection_id} \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "provider_config": {
      "credentials": {
        "type": "access_token",
        "access_token_id": "...",
        "access_token_secret": "...",
        "webhook_secret": "your-32-char-random-string"
      }
    }
  }'
2

Register the webhook in the Mux dashboard

In the Mux Dashboard → Settings → Webhooks, add a new endpoint:
  • URL: https://api.mixpeek.com/v1/webhooks/mux/{connection_id} — replace {connection_id} with the Mixpeek connection ID returned when you created the connection.
  • Signing secret: the same value you put in webhook_secret.
  • Events: at minimum video.asset.deleted and video.asset.updated. You can subscribe to additional events; Mixpeek will acknowledge them and ignore any it doesn’t act on.
3

Verify with a test delete

Delete a synced asset in Mux (or use the dashboard’s “Send test event” button) and confirm the corresponding object disappears from your bucket. The webhook returns a JSON body documenting what it did:
{
  "received": true,
  "event_type": "video.asset.deleted",
  "handled": true,
  "asset_id": "5sCF1Yk3rC0102zMVEh74jJE9pT6KFeZ1N8gcQZ3l53FI",
  "deleted_objects": 1
}

Signature format

Mixpeek follows the same scheme Mux documents for outgoing webhooks:
  • Header: Mux-Signature: t=<unix_ts>,v1=<hex_hmac_sha256>
  • HMAC computed as HMAC_SHA256(webhook_secret, f"{t}.{raw_body}") over the exact raw bytes of the request body.
  • Signatures are compared with hmac.compare_digest to avoid timing attacks.
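A hedged sketch of that check in Python (verify_mux_signature is our illustrative name; your web framework must hand you the exact raw body bytes), useful if you ever need to debug a 401:

```python
import hashlib
import hmac

def verify_mux_signature(header: str, raw_body: bytes, webhook_secret: str) -> bool:
    """Check a Mux-Signature header ("t=<ts>,v1=<hex>") against the raw body."""
    parts = dict(p.split("=", 1) for p in header.split(",") if "=" in p)
    t, v1 = parts.get("t"), parts.get("v1")
    if not t or not v1:
        return False
    # HMAC-SHA256 over "<timestamp>.<raw body>", keyed by the shared secret
    expected = hmac.new(
        webhook_secret.encode(), f"{t}.".encode() + raw_body, hashlib.sha256
    ).hexdigest()
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, v1)
```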

Response codes

| Status | Meaning |
| --- | --- |
| 200 | Event received. handled is true for video.asset.deleted, false for ignored event types. |
| 400 | Connection has no webhook_secret configured, or payload is not valid JSON. |
| 401 | Mux-Signature is missing, malformed, or the HMAC doesn’t match. |
| 404 | connection_id in the URL is not a Mux connection. |

Source path format

| Path | Behavior |
| --- | --- |
| mux:// | Sync all video assets in ready status |
| mux://{asset_id} | Sync a specific asset by ID |

Asset filtering

Only assets in ready status are synced. Assets that are preparing or errored are skipped automatically. You can also use file filters to narrow sync scope:
sync = client.buckets.syncs.create(
    namespace_id="your-namespace-id",
    bucket_id=bucket.bucket_id,
    connection_id=connection.connection_id,
    source_path="mux://",
    file_filters={
        "min_size": 1000000,  # Skip assets under ~1MB (estimated)
    }
)

Metadata filtering

You can filter which Mux assets get synced based on asset-level metadata: passthrough, meta.external_id, or any field captured from the Mux asset. This is useful when you want a single Mux environment connected to Mixpeek but only certain assets indexed. Metadata filters are evaluated against the asset’s metadata dict (which includes passthrough, duration, resolution_tier, status, etc.). Each filter specifies a field, operator, and value.

Supported operators

| Operator | Description | Example |
| --- | --- | --- |
| contains | Field value contains substring | passthrough contains vi:1 |
| equals | Exact match | status equals ready |
| not_equals | Does not match | resolution_tier not_equals audio_only |
| not_contains | Does not contain substring | passthrough not_contains skip |
| gt, lt, gte, lte | Numeric comparison | duration gt 10 |
| exists | Field is present and non-empty | passthrough exists true |
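As an illustration of the operators above (our sketch, not Mixpeek's actual implementation), a single rule can be evaluated against an asset's metadata dict roughly like this:

```python
import operator as ops

# Numeric comparison operators from the table
CMP = {"gt": ops.gt, "lt": ops.lt, "gte": ops.ge, "lte": ops.le}

def matches(metadata: dict, field: str, operator: str, value) -> bool:
    """Evaluate one metadata filter rule against an asset's metadata dict."""
    actual = metadata.get(field)
    if operator == "exists":
        # True when the field is present and non-empty (matches value's truthiness)
        return bool(actual) == bool(value)
    if actual is None:
        return False
    if operator == "equals":
        return actual == value
    if operator == "not_equals":
        return actual != value
    if operator == "contains":
        return str(value) in str(actual)
    if operator == "not_contains":
        return str(value) not in str(actual)
    if operator in CMP:
        return CMP[operator](float(actual), float(value))
    raise ValueError(f"unknown operator: {operator!r}")
```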

Example: sync only flagged assets

A common pattern is to use the Mux passthrough field as a sync selector. Your application sets a flag on assets that should be indexed, and Mixpeek only syncs matching assets.
sync = client.buckets.syncs.create(
    namespace_id="your-namespace-id",
    bucket_id=bucket.bucket_id,
    connection_id=connection.connection_id,
    source_path="mux://",
    file_filters={
        "metadata_filters": [
            {
                "field": "passthrough",
                "operator": "contains",
                "value": "vi:1"
            }
        ]
    }
)
With this configuration, an asset with passthrough: "myapp|vi:1|v:abc" would be synced, while passthrough: "myapp|vi:0|v:abc" would be skipped.
Metadata filters work with any sync provider, not just Mux. The same metadata_filters syntax applies to S3, GCS, and all other storage connections — the filters are evaluated against whatever metadata the provider exposes for each file.

Reconciliation

When reconcile_on_sync is enabled, each sync cycle re-checks all previously indexed assets against the current metadata filters. Assets that no longer match are automatically unindexed. This is useful when your application changes asset metadata after initial sync — for example, removing the visual-index flag from a Mux asset’s passthrough. Without reconciliation, the already-indexed object would remain in the bucket. With reconciliation, it gets cleaned up on the next sync.
curl -X PATCH "https://api.mixpeek.com/v1/buckets/$BUCKET_ID/syncs/$SYNC_ID" \
  -H "Authorization: Bearer $MIXPEEK_API_KEY" \
  -H "X-Namespace: $NAMESPACE_ID" \
  -H "Content-Type: application/json" \
  -d '{"reconcile_on_sync": true}'
Reconciliation queries the source provider for each indexed object on every sync cycle. For large libraries (10k+ assets), this adds API calls proportional to the number of indexed objects. Use it when metadata changes are common and stale objects are unacceptable.

Sync modes

| Mode | Description |
| --- | --- |
| initial_only | Sync all current assets once |
| continuous | Poll for new assets at the configured interval |

Troubleshooting

Verify your access token ID and secret are correct. The token ID is a UUID (e.g., a3b02074-dbca-47b6-...), not the environment ID. Tokens can be regenerated in the Mux Dashboard.
Only assets with ready status and at least one playback ID are synced. Check your asset status in the Mux Dashboard. Assets that are still encoding (preparing) will be picked up on the next sync cycle.
Assets need a public or signed playback ID and either static renditions or MP4 support enabled. By default, Mux assets only have HLS streaming; MP4 downloads require explicit configuration. To enable static renditions when creating an asset via the Mux API:
{
  "input": [{"url": "https://example.com/video.mp4"}],
  "playback_policy": ["public"],
  "static_renditions": [{"resolution": "720p"}]
}
You can also enable static renditions on existing assets in the Mux Dashboard under the asset’s settings. Mixpeek will try multiple rendition qualities (low, medium, high, highest) and use the first available one.
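That fallback can be pictured as generating candidate download URLs and taking the first that resolves. A sketch assuming static renditions are served at stream.mux.com/{playback_id}/{quality}.mp4, with the quality names in the order listed above (candidate_mp4_urls is our helper; the preference order Mixpeek actually uses is not specified here):

```python
# Quality names in the order listed above; which ones exist depends on the
# asset's static-rendition configuration.
QUALITIES = ["low", "medium", "high", "highest"]

def candidate_mp4_urls(playback_id: str) -> list[str]:
    """Candidate static-rendition download URLs for a playback ID."""
    return [f"https://stream.mux.com/{playback_id}/{q}.mp4" for q in QUALITIES]
```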