Model ID: HPAI-BSC/Qwen2.5-Aloe-Beta-7B
Tags: transformers, safetensors, qwen2, text-generation, biology, medical, healthcare, question-answering, en, text-generation-inference, endpoints_compatible, region:us
Datasets: HPAI-BSC/Aloe-Beta-General-Collection, HPAI-BSC/chain-of-diagnosis, HPAI-BSC/MedS-Ins, HPAI-BSC/ultramedical, HPAI-BSC/pubmedqa-cot-llama31, HPAI-BSC/medqa-cot-llama31, HPAI-BSC/medmcqa-cot-llama31, HPAI-BSC/headqa-cot-llama31, HPAI-BSC/MMLU-medical-cot-llama31, HPAI-BSC/Polymed-QA
arXiv: 2505.04388
License: cc-by-nc-4.0
Use Qwen2.5-Aloe-Beta-7B on Mixpeek
Build multimodal processing pipelines with this model and others. Extract features, run inference, and set up retrieval, all through the Mixpeek pipeline builder.
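Outside the pipeline builder, the model can also be loaded directly with the Hugging Face `transformers` library named in the specification below. This is a minimal sketch, not Mixpeek's API: the prompt wording and generation settings are illustrative assumptions, and running it requires the model weights and enough GPU/CPU memory for a 7B model.

```python
# Minimal sketch: querying HPAI-BSC/Qwen2.5-Aloe-Beta-7B for medical QA
# with Hugging Face transformers. Generation parameters are assumptions.
MODEL_ID = "HPAI-BSC/Qwen2.5-Aloe-Beta-7B"


def build_messages(question: str) -> list[dict]:
    """Wrap a medical question in the chat-format message list
    expected by the tokenizer's chat template."""
    return [{"role": "user", "content": question}]


def answer(question: str, max_new_tokens: int = 256) -> str:
    # Lazy import: keeps the heavy dependency out of module import time.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Render the chat messages into a single prompt string.
    prompt = tokenizer.apply_chat_template(
        build_messages(question), tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


if __name__ == "__main__":
    print(answer("What are common first-line treatments for hypertension?"))
```

Note the cc-by-nc-4.0 license listed below: this restricts the model to non-commercial use.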
Open Pipeline Builder

Specification
Organization: HPAI-BSC
Task: Question Answering
Library: transformers
License: cc-by-nc-4.0
Downloads/mo: 810
Likes: 10
View on HuggingFace
See model card, files, and community discussion
Related Question Answering Models
ahotrod/electra_large_discriminator_squad2_512 (865K downloads/mo)
deepset/roberta-base-squad2 (604K downloads/mo)
google-bert/bert-large-uncased-whole-word-masking-finetuned-squad (377K downloads/mo)
deepset/roberta-large-squad2 (284K downloads/mo)
distilbert/distilbert-base-cased-distilled-squad (233K downloads/mo)
deepset/bert-large-uncased-whole-word-masking-squad2 (186K downloads/mo)
timpal0l/mdeberta-v3-base-squad2 (163K downloads/mo)
monologg/koelectra-small-v2-distilled-korquad-384 (155K downloads/mo)