
    Building an Exploratory Multimodal Retriever with the National Gallery

    8:20
    Use Cases
    Ethan
    January 25, 2026

    Summary

    Discover how to build a powerful exploratory image board using multimodal search across 120,000 images from the National Gallery. This walkthrough demonstrates combining text search, reverse image search, and document-based queries into a unified retrieval experience using hybrid search with Reciprocal Rank Fusion (RRF).
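The Reciprocal Rank Fusion step described above can be sketched in a few lines: each retriever (text, reverse image, document reference) returns a ranked list, and RRF scores each item by the sum of 1/(k + rank) across the lists. The document IDs and lists below are illustrative, not the actual National Gallery data or the Mixpeek API; k = 60 is the conventional constant from the original RRF paper.

```python
def rrf(rankings, k=60):
    """Fuse ranked lists of doc IDs (best first) with Reciprocal Rank Fusion.

    Each list contributes 1 / (k + rank) to a document's score, so items
    that rank well in several lists rise to the top of the fused result.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical results from a text query and a reverse image query:
text_hits = ["portrait_42", "landscape_7", "still_life_3"]
image_hits = ["landscape_7", "sketch_9", "portrait_42"]

fused = rrf([text_hits, image_hits])
# "landscape_7" ranks 2nd and 1st, edging out "portrait_42" (1st and 3rd).
```

Because RRF only uses ranks, not raw similarity scores, it fuses retrievers whose score scales are incomparable (e.g. BM25 text scores vs. cosine similarity over image embeddings) without any calibration.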

    hybrid-search · exploratory-search · image-search · retrievers

    About this video

    👉 Live Demo: https://mxp.co/r/npg

    What you'll learn:

    ⚡ Building exploratory search interfaces for visual content
    ⚡ Combining text, image, and document reference queries
    ⚡ Implementing hybrid search with RRF for optimal results
    ⚡ Using Google SigLIP embeddings for image understanding
    ⚡ Creating multi-stage retriever pipelines with feature search
    ⚡ Capturing user signals for recommendation systems
    ⚡ Architecture patterns: Objects → Buckets → Collections → Retrievers

    Real-world demo: visual curation across 120,000 images (12 GB of data) with hybrid text + image + document queries. Full source code is available in the Mixpeek showcase repository.
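    The multi-stage retriever idea mentioned in the list above can be sketched as a chain of stages, each transforming a candidate list before handing it to the next (for example: feature search, then fusion, then filtering). This is a minimal illustration of the pattern, not the Mixpeek pipeline API; the stage names and toy data are assumptions.

    ```python
    from typing import Callable, List

    # A stage takes a candidate list and returns a (possibly reordered,
    # filtered, or truncated) candidate list.
    Stage = Callable[[List[str]], List[str]]

    class RetrieverPipeline:
        """Run candidates through an ordered list of stages."""

        def __init__(self, stages: List[Stage]):
            self.stages = stages

        def run(self, candidates: List[str]) -> List[str]:
            for stage in self.stages:
                candidates = stage(candidates)
            return candidates

    # Illustrative stages: remove duplicates (preserving order), then
    # keep only the top 3 candidates for the final result page.
    def dedupe(items: List[str]) -> List[str]:
        return list(dict.fromkeys(items))

    def top_k(items: List[str]) -> List[str]:
        return items[:3]

    pipeline = RetrieverPipeline([dedupe, top_k])
    result = pipeline.run(["a", "b", "a", "c", "d"])
    ```

    In a real deployment the early stages would call out to vector and keyword indexes, and the later stages would fuse and re-rank; composing them as plain functions keeps each stage independently testable.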
