Upload folder using huggingface_hub
- README.md +52 -0
- config.json +1 -1
- configuration_intern_vit.py +1 -1
- configuration_internvl_chat.py +1 -1
- modeling_intern_vit.py +1 -1
README.md CHANGED
@@ -74,6 +74,8 @@ Limitations: Although we have made efforts to ensure the safety of the model dur
 
 We provide an example code to run InternVL-Chat-V1-5 using `transformers`.
 
+We also welcome you to experience the InternVL2 series models in our [online demo](https://internvl.opengvlab.com/). Currently, due to the limited GPU resources with public IP addresses, we can only deploy models up to a maximum of 26B. We will expand soon and deploy larger models to the online demo.
+
 > Please use transformers==4.37.2 to ensure the model works normally.
 
 ```python
@@ -421,6 +423,56 @@ sess = pipe.chat('What is the woman doing?', session=sess, gen_config=gen_config
 print(sess.response.text)
 ```
 
+#### Service
+
+LMDeploy's `api_server` enables models to be easily packed into services with a single command. The provided RESTful APIs are compatible with OpenAI's interfaces. Below is an example of service startup:
+
+```shell
+lmdeploy serve api_server OpenGVLab/InternVL-Chat-V1-5 --model-name InternVL-Chat-V1-5 --backend turbomind --server-port 23333
+```
+
+To use the OpenAI-style interface, you need to install OpenAI:
+
+```shell
+pip install openai
+```
+
+Then, use the code below to make the API call:
+
+```python
+from openai import OpenAI
+
+client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')
+model_name = client.models.list().data[0].id
+response = client.chat.completions.create(
+    model="InternVL-Chat-V1-5",
+    messages=[{
+        'role':
+        'user',
+        'content': [{
+            'type': 'text',
+            'text': 'describe this image',
+        }, {
+            'type': 'image_url',
+            'image_url': {
+                'url':
+                'https://modelscope.oss-cn-beijing.aliyuncs.com/resource/tiger.jpeg',
+            },
+        }],
+    }],
+    temperature=0.8,
+    top_p=0.8)
+print(response)
+```
+
+### vLLM
+
+TODO
+
+### Ollama
+
+TODO
+
 ## License
 
 This project is released under the MIT license, while InternLM is licensed under the Apache-2.0 license.
config.json CHANGED
@@ -91,7 +91,7 @@
 "tie_word_embeddings": false,
 "tokenizer_class": null,
 "top_k": 50,
-"top_p":
+"top_p": 1.0,
 "torch_dtype": "bfloat16",
 "torchscript": false,
 "transformers_version": "4.37.2",
configuration_intern_vit.py CHANGED
@@ -1,6 +1,6 @@
 # --------------------------------------------------------
 # InternVL
-# Copyright (c)
+# Copyright (c) 2024 OpenGVLab
 # Licensed under The MIT License [see LICENSE for details]
 # --------------------------------------------------------
 import os
configuration_internvl_chat.py CHANGED
@@ -1,6 +1,6 @@
 # --------------------------------------------------------
 # InternVL
-# Copyright (c)
+# Copyright (c) 2024 OpenGVLab
 # Licensed under The MIT License [see LICENSE for details]
 # --------------------------------------------------------
 
modeling_intern_vit.py CHANGED
@@ -1,6 +1,6 @@
 # --------------------------------------------------------
 # InternVL
-# Copyright (c)
+# Copyright (c) 2024 OpenGVLab
 # Licensed under The MIT License [see LICENSE for details]
 # --------------------------------------------------------
 from typing import Optional, Tuple, Union