---
license: apache-2.0
language:
- en
pipeline_tag: image-text-to-text
tags:
- chat
---

# mPLUG-DocOwl2

## Introduction
mPLUG-DocOwl2 is a state-of-the-art Multimodal LLM for OCR-free Multi-page Document Understanding. 

Through a compression module named High-resolution DocCompressor, each page is encoded with just 324 tokens.
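
As a rough sense of what this means for longer documents, here is a back-of-the-envelope sketch (the 324-tokens-per-page figure is from above; the page count matches the 6-page Quickstart example below):

```python
# Visual token budget: 324 visual tokens per page after compression.
TOKENS_PER_PAGE = 324
num_pages = 6                        # e.g. the 6-page example in the Quickstart below
print(TOKENS_PER_PAGE * num_pages)   # 1944 visual tokens for the whole document
```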


Github: [mPLUG-DocOwl](https://github.com/X-PLUG/mPLUG-DocOwl)

## Quickstart


```python
import torch
from transformers import AutoTokenizer, AutoModel

class DocOwlInfer:
    """Thin wrapper around the DocOwl2 checkpoint for multi-page document chat."""

    def __init__(self, ckpt_path):
        self.tokenizer = AutoTokenizer.from_pretrained(ckpt_path, use_fast=False)
        self.model = AutoModel.from_pretrained(
            ckpt_path,
            trust_remote_code=True,
            low_cpu_mem_usage=True,
            torch_dtype=torch.float16,
            device_map='auto',
        )
        # Attach the image processor (base image size 504, 'grid_12' crop anchors).
        self.model.init_processor(tokenizer=self.tokenizer, basic_image_size=504, crop_anchors='grid_12')

    def inference(self, images, query):
        # One <|image|> placeholder per page, followed by the text query.
        messages = [{'role': 'USER', 'content': '<|image|>' * len(images) + query}]
        answer = self.model.chat(messages=messages, images=images, tokenizer=self.tokenizer)
        return answer


docowl = DocOwlInfer(ckpt_path='mPLUG/DocOwl2')

# Pages of a multi-page document, in reading order.
images = [
    './examples/docowl2_page0.png',
    './examples/docowl2_page1.png',
    './examples/docowl2_page2.png',
    './examples/docowl2_page3.png',
    './examples/docowl2_page4.png',
    './examples/docowl2_page5.png',
]

# Ask about the document as a whole.
answer = docowl.inference(images, query='what is this paper about? provide detailed information.')
print(answer)

# Ask about a specific page.
answer = docowl.inference(images, query='what is the third page about? provide detailed information.')
print(answer)

```
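
The same `inference` helper also accepts a single page; a minimal sketch reusing the `docowl` instance from above (the image path and query are illustrative):

```python
# Single-page query, reusing the DocOwlInfer instance defined above.
answer = docowl.inference(['./examples/docowl2_page0.png'],
                          query='Extract the title of this page.')
print(answer)
```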