    Text Generation · vllm · apache-2.0

    Mistral-Small-24B-Instruct-2501-AWQ

    by stelterlab

    456K downloads/month
    29 likes
    Model ID
    stelterlab/Mistral-Small-24B-Instruct-2501-AWQ

    Tags

    vllm · safetensors · mistral · text-generation · transformers · conversational · en · fr · de · es · it · pt · zh · ja · ru · ko · base_model:mistralai/Mistral-Small-24B-Instruct-2501 · base_model:quantized:mistralai/Mistral-Small-24B-Instruct-2501 · license:apache-2.0 · text-generation-inference · 4-bit · awq · region:us
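The tags above mark this checkpoint as a 4-bit AWQ quantization of Mistral-Small-24B-Instruct-2501 intended for serving with vLLM. A minimal sketch of loading it with vLLM's offline Python API (assumes `vllm` is installed and a CUDA GPU with enough memory for the 4-bit weights; the sampling settings are illustrative, not recommendations from the model card):

```python
# Hedged sketch: requires `pip install vllm` and a CUDA GPU large
# enough to hold the 4-bit 24B weights plus KV cache.
from vllm import LLM, SamplingParams

# quantization="awq" tells vLLM to load the AWQ 4-bit weights
# instead of expecting full-precision safetensors.
llm = LLM(
    model="stelterlab/Mistral-Small-24B-Instruct-2501-AWQ",
    quantization="awq",
)
params = SamplingParams(temperature=0.15, max_tokens=256)

# Chat-style generation; the base model is instruction-tuned,
# so the chat template from the tokenizer is applied automatically.
outputs = llm.chat(
    [{"role": "user", "content": "Summarize AWQ quantization in one sentence."}],
    params,
)
print(outputs[0].outputs[0].text)
```

The same checkpoint can also be exposed over an OpenAI-compatible endpoint with `vllm serve stelterlab/Mistral-Small-24B-Instruct-2501-AWQ --quantization awq`.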

    Use Mistral-Small-24B-Instruct-2501-AWQ on Mixpeek

    Build multimodal processing pipelines with this model and others. Extract features, run inference, and set up retrieval, all through the Mixpeek pipeline builder.

    Open Pipeline Builder