AI PC: Question Answering
Question Answering models that have been validated to run on the AI PC Intel® Core™ Ultra CPU and iGPU.
Note: Each model in this collection was tested with OpenVINO version 2024.1.0 using the OVModelForQuestionAnswering class from Optimum Intel. To convert a model to OpenVINO, follow the installation instructions at https://huggingface.co/docs/optimum/main/en/intel/installation.
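As a minimal sketch of that workflow (assuming Optimum Intel is installed with OpenVINO support, e.g. via pip install optimum[openvino]), one of the listed models can be converted and queried as follows; the model choice, question, and context are illustrative only:

from transformers import AutoTokenizer, pipeline
from optimum.intel import OVModelForQuestionAnswering

# Any model from the list below works the same way; roberta-base-squad2
# is used here purely as an example.
model_id = "deepset/roberta-base-squad2"

# export=True converts the original PyTorch checkpoint to OpenVINO IR on the fly.
model = OVModelForQuestionAnswering.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# The OpenVINO model is a drop-in replacement in the standard transformers pipeline.
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
print(qa(
    question="What hardware were these models validated on?",
    context="The models were validated on the AI PC Intel Core Ultra CPU and iGPU.",
))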
deepset/bert-base-cased-squad2
Question Answering • 22.5k downloads • 19 likes
deepset/roberta-base-squad2
Question Answering • 908k downloads • 802 likes
distilbert/distilbert-base-uncased-distilled-squad
Question Answering • 159k downloads • 97 likes
google-bert/bert-large-uncased-whole-word-masking-finetuned-squad
Question Answering • 114k downloads • 172 likes
mrm8488/bert-small-finetuned-squadv2
Question Answering • 428 downloads • 1 like
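To target the Intel® Core™ Ultra iGPU rather than the CPU, the OpenVINO device can be switched after loading. The sketch below assumes the OpenVINO GPU plugin and Intel GPU drivers are available on the machine; the local output directory name is made up for the example:

from transformers import AutoTokenizer, pipeline
from optimum.intel import OVModelForQuestionAnswering

model_id = "distilbert/distilbert-base-uncased-distilled-squad"

# Export once and keep the OpenVINO IR locally so later runs skip the conversion.
# "distilbert-squad-ov" is just an illustrative directory name.
model = OVModelForQuestionAnswering.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.save_pretrained("distilbert-squad-ov")
tokenizer.save_pretrained("distilbert-squad-ov")

# Recompile the model for the integrated GPU instead of the default CPU device.
model.to("GPU")
model.compile()

qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
print(qa(
    question="Which device runs inference?",
    context="The model is compiled for the integrated GPU of the AI PC.",
))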