This is the official model checkpoint for Asclepius-Llama3-8B (arXiv). It is an enhanced version of Asclepius-7B, obtained by replacing the base model with Llama-3 and increasing the maximum sequence length to 8192.
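As a quick check of the advertised 8192-token context window, the value can be read from the model config. This is an illustrative sketch, not part of the official usage example; max_position_embeddings is the standard field name on Llama-style configs.

from transformers import AutoConfig

# Inspect the configured context length (expected to be 8192 for this checkpoint)
config = AutoConfig.from_pretrained("starmpcc/Asclepius-Llama3-8B")
print(config.max_position_embeddings)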
This model can perform eight clinical NLP tasks on clinical notes.
ONLY USE THIS MODEL FOR RESEARCH PURPOSES!
prompt = """You are an intelligent clinical language model.
Below is a snippet of a patient's discharge summary and a following instruction from a healthcare professional.
Write a response that appropriately completes the instruction.
The response should provide the accurate answer to the instruction, while being concise.
[Discharge Summary Begin]
{note}
[Discharge Summary End]
[Instruction Begin]
{question}
[Instruction End]
"""
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("starmpcc/Asclepius-Llama3-8B", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("starmpcc/Asclepius-Llama3-8B")

# Example discharge-summary snippet and instruction
note = "This is a sample note"
question = "What is the diagnosis?"

# Fill the prompt template and tokenize
model_input = prompt.format(note=note, question=question)
input_ids = tokenizer(model_input, return_tensors="pt").input_ids

# Generate a response and decode it (the prompt is echoed in the output)
output = model.generate(input_ids)
print(tokenizer.decode(output[0]))
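The decode above prints the prompt together with the generated text. A minimal follow-up sketch for running on a GPU and trimming the echoed prompt; greedy decoding with max_new_tokens=256 is an illustrative choice, not the authors' recommended generation setting.

import torch

# Move model and inputs to a GPU when available
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
input_ids = input_ids.to(device)

with torch.no_grad():
    output = model.generate(input_ids, max_new_tokens=256)

# Decode only the tokens generated after the prompt
response = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(response)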
The synthetic training data is available at https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes
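A minimal loading sketch, assuming the repository can be loaded with the datasets library; the available splits and columns are not specified here, so inspect the returned object or the dataset card.

from datasets import load_dataset

# Download the synthetic clinical-note instruction data from the Hub
ds = load_dataset("starmpcc/Asclepius-Synthetic-Clinical-Notes")
print(ds)  # shows the available splits and column names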
BibTeX:
@article{kweon2023publicly,
  title={Publicly Shareable Clinical Large Language Model Built on Synthetic Clinical Notes},
  author={Kweon, Sunjun and Kim, Junu and Kim, Jiyoun and Im, Sujeong and Cho, Eunbyeol and Bae, Seongsu and Oh, Jungwoo and Lee, Gyubok and Moon, Jong Hak and You, Seng Chan and others},
  journal={arXiv preprint arXiv:2309.00237},
  year={2023}
}