
    Building an Exploratory Multimodal Retriever with the National Gallery

    8:20 · Use Cases · Ethan

    About this video

    Discover how to build a powerful exploratory image board using multimodal search across 120,000 images from the National Gallery. This walkthrough demonstrates how to combine text search, reverse image search, and document-based queries into a unified retrieval experience using hybrid search with Reciprocal Rank Fusion (RRF).

    👉 Live Demo: https://mxp.co/r/npg

    What you'll learn:

    ⚡ Building exploratory search interfaces for visual content
    ⚡ Combining text, image, and document reference queries
    ⚡ Implementing hybrid search with RRF for optimal results (see the sketch after this list)
    ⚡ Using Google SigLIP embeddings for image understanding (a second sketch follows below)
    ⚡ Creating multi-stage retriever pipelines with feature search
    ⚡ Capturing user signals for recommendation systems
    ⚡ Architecture patterns: Objects → Buckets → Collections → Retrievers

    Real-world demo: visual curation across 120k images and 12 GB of data, with text + image + document hybrid queries. Full source code is available in the Mixpeek showcase repository.
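
    To make the RRF step concrete, here is a minimal sketch of Reciprocal Rank Fusion, not Mixpeek's actual implementation: it merges ranked result lists from separate retrievers (e.g. text search and image similarity) by scoring each document with 1/(k + rank). The function name, the sample IDs, and the k=60 constant are illustrative assumptions.

    # Minimal Reciprocal Rank Fusion (RRF) sketch -- illustrative only,
    # not Mixpeek's actual implementation. Each input is a ranked list of
    # document IDs from one retriever (e.g. text search, image similarity).
    def rrf_fuse(ranked_lists, k=60):
        scores = {}
        for ranking in ranked_lists:
            for rank, doc_id in enumerate(ranking, start=1):
                # A document earns 1 / (k + rank) from every list it appears in;
                # k dampens the influence of any single retriever's top hits.
                scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
        # Highest fused score first
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical usage: fuse text-search hits with reverse-image-search hits
    text_hits = ["img_042", "img_118", "img_007"]
    image_hits = ["img_118", "img_500", "img_042"]
    print(rrf_fuse([text_hits, image_hits]))  # img_042 and img_118 rise to the top

    Documents that rank well in more than one list accumulate score from each, which is why hybrid text + image + document queries surface results that no single retriever would rank highly on its own.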
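
    For the SigLIP step, the following is a rough sketch of producing an image embedding with the Hugging Face transformers library; the checkpoint name, file path, and output size are assumptions and may differ from what the Mixpeek pipeline actually uses.

    # Rough sketch of embedding an image with Google SigLIP via Hugging Face
    # transformers -- checkpoint and details are assumptions, not necessarily
    # what the Mixpeek pipeline uses.
    import torch
    from PIL import Image
    from transformers import AutoProcessor, AutoModel

    model_name = "google/siglip-base-patch16-224"  # assumed checkpoint
    processor = AutoProcessor.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)

    image = Image.open("painting.jpg")  # placeholder path
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        # get_image_features returns one embedding vector per image, which can
        # be indexed in a vector store for reverse image search.
        embedding = model.get_image_features(**inputs)
    print(embedding.shape)  # e.g. torch.Size([1, 768])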