
Model card for phikon-finetuned-lora-kather2016

This model is a LoRA adapter fine-tuned from owkin/phikon on the 1aurent/Kather-texture-2016 colorectal histology dataset (eight tissue texture classes).
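
For reference, a minimal sketch of how such an adapter can be attached to the Phikon backbone with peft. The LoRA hyperparameters below (r, lora_alpha, target_modules, lora_dropout) are illustrative assumptions, not the values used to train this adapter; the actual settings are stored in adapter_config.json.

from transformers import AutoModelForImageClassification
from peft import LoraConfig, get_peft_model

# load the Phikon backbone with a fresh 8-class classification head
base_model = AutoModelForImageClassification.from_pretrained(
  pretrained_model_name_or_path="owkin/phikon",
  num_labels=8,
)

# illustrative LoRA settings targeting the ViT attention projections
lora_config = LoraConfig(
  r=16,                               # assumed rank, not the adapter's actual value
  lora_alpha=16,
  target_modules=["query", "value"],  # attention projections of the ViT blocks
  lora_dropout=0.1,
  bias="none",
  modules_to_save=["classifier"],     # train the new head alongside the adapters
)

peft_model = get_peft_model(base_model, lora_config)
peft_model.print_trainable_parameters()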

Model Usage

Image Classification

from transformers import AutoModelForImageClassification, AutoImageProcessor
from peft import PeftConfig, PeftModel
from urllib.request import urlopen
from PIL import Image

# get example histology image
img = Image.open(
  urlopen(
    "https://datasets-server.huggingface.co/assets/1aurent/Kather-texture-2016/--/default/train/0/image/image.jpg"
  )
)

# load config, image_processor, base_model and lora_model from the hub
model_name = "1aurent/phikon-finetuned-lora-kather2016"
config = PeftConfig.from_pretrained(
  pretrained_model_name_or_path=model_name
)
image_processor = AutoImageProcessor.from_pretrained(
  pretrained_model_name_or_path=config.base_model_name_or_path
)
model = AutoModelForImageClassification.from_pretrained(
  pretrained_model_name_or_path=config.base_model_name_or_path,
  num_labels=8,  # the 8 Kather-2016 tissue classes
)
lora_model = PeftModel.from_pretrained(
  model=model,
  model_id=model_name
)

# preprocess the image and run a forward pass through the adapted model
inputs = image_processor(img, return_tensors="pt")
outputs = lora_model(**inputs)
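
The forward pass returns raw logits over the eight classes. A short follow-up sketch to pick the most likely class; note that, loaded this way, model.config.id2label only contains generic LABEL_0 … LABEL_7 names unless the adapter repository provides its own label mapping.

# convert the logits into a predicted class index
predicted_class = outputs.logits.argmax(dim=-1).item()
print(predicted_class, model.config.id2label[predicted_class])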

Citation

@article{Filiot2023.07.21.23292757,
  author       = {Alexandre Filiot and Ridouane Ghermi and Antoine Olivier and Paul Jacob and Lucas Fidon and Alice Mac Kain and Charlie Saillard and Jean-Baptiste Schiratti},
  title        = {Scaling Self-Supervised Learning for Histopathology with Masked Image Modeling},
  elocation-id = {2023.07.21.23292757},
  year         = {2023},
  doi          = {10.1101/2023.07.21.23292757},
  publisher    = {Cold Spring Harbor Laboratory Press},
  url          = {https://www.medrxiv.org/content/early/2023/09/14/2023.07.21.23292757},
  eprint       = {https://www.medrxiv.org/content/early/2023/09/14/2023.07.21.23292757.full.pdf},
  journal      = {medRxiv}
}