Unable to call this model (1) · #70 opened 7 months ago by Abhi012024
Fix TL;DR TOC link · #69 opened 8 months ago by ZennyKenny
Classification probabilities · #68 opened 10 months ago by SaraAmd
Authorization header is correct, but the token seems invalid (24) · #67 opened 11 months ago by Srisowmya
[AUTOMATED] Model Memory Requirements · #65 opened about 1 year ago by model-sizer-bot
Validation error · #64 opened about 1 year ago by kishoress1963
While trying to use google/flan-t5-xxl inference deployed on AWS SageMaker, answers are truncated (3) · #62 opened about 1 year ago by haizamir
Can you use it in 'text-generation' instead of 'text-to-text-generation'? · #61 opened about 1 year ago by surya-narayanan
Autotrain Failures · #60 opened about 1 year ago by Velhic
Flan-T5-XL and 500 server error (1) · #59 opened about 1 year ago by wendyjboss
Low score and wrong answer for "question-answering" task · #58 opened about 1 year ago by ybensaid
Config does not align with original paper · #57 opened over 1 year ago by cbock90
Diversity in beam search · #56 opened over 1 year ago by ayseozgun
How can we pass System Text? · #55 opened over 1 year ago by rehan02
maxTokens (2) · #54 opened over 1 year ago by islamn25
ValueError: Need either a `state_dict` or a `save_folder` containing offloaded weights. (5) · #53 opened over 1 year ago by tuannguyends
Cannot finish download of model (1) · #52 opened over 1 year ago by Dipe00
Not getting proper output from this model as before (3) · #50 opened over 1 year ago by nisha7
Hey community, has the google/flan-t5-xxl model been updated yesterday? · #48 opened over 1 year ago by Aksha
Classification (1) · #47 opened over 1 year ago by SaraAmd
Input prompt (1) · #46 opened over 1 year ago by SaraAmd
Model google/flan-t5-xxl does not exist (1) · #45 opened over 1 year ago by phdykd
ValueError: Error raised by inference API: Input validation error: `inputs` tokens + `max_new_tokens` must be <= 1512. Given: 190761 `inputs` tokens and 20 `max_new_tokens` (11) · #44 opened over 1 year ago by phdykd
Error raised by inference API: Model google/flan-t5-xl time out (21) · #43 opened over 1 year ago by phdykd
Quoting source from given text for Q&A prompt (3) · #42 opened over 1 year ago by tristanchambers-bids
Any plans on increasing the model's model_max_length? (17) · #41 opened over 1 year ago by lovodkin93
<Response [422]> · #40 opened over 1 year ago by skrishna
Output seems overly truncated, and max_length doesn't seem to matter (12) · #39 opened over 1 year ago by joekr552
Minimum number of tokens in generate · #38 opened over 1 year ago by rachith
Does the team have a plan to release the multilingual version (60)? When? · #37 opened over 1 year ago by xin1111
samkenxstream · #34 opened over 1 year ago by samkenxstream
Flan-T5 tokenizer supports neither Chinese nor many code-related tokens despite being advertised as such (1) · #33 opened over 1 year ago by michaelroyzen
Looking for 'mps' support for Flan models? · #30 opened almost 2 years ago by abhishekmamdapure
Confusion about sampling for `flan-t5` models (2) · #29 opened almost 2 years ago by ArthurConmy
Add Portuguese in model card tags (1) · #28 opened almost 2 years ago by pierreguillou
File not found error (1) · #25 opened almost 2 years ago by pradeepmohans
Fine-tune XXL using 24GB GPU? (7) · #23 opened almost 2 years ago by gameveloster
One-GPU version (3) · #22 opened almost 2 years ago by joorei
Float16 and Int8 Produce Wrong Results (7) · #21 opened almost 2 years ago by egeozsoy
Slovak tag instead of Slovenian? · #20 opened almost 2 years ago by matejklemen
Long variants · #18 opened almost 2 years ago by fastandthefinetuned
Support Chinese? · #17 opened almost 2 years ago by Hines
Clean tokenizer_config.json · #16 opened almost 2 years ago by skwon
Deploying on Amazon SageMaker (3) · #14 opened almost 2 years ago by ivoschaper
Is there a way to feed my domain-specific articles into the currently trained model? (3) · #13 opened almost 2 years ago by Stromal
Run model in Colab using 8-bit (8) · #8 opened about 2 years ago by kabalanresearch
Support Japanese? (4) · #7 opened about 2 years ago by kosukekurimoto
Load model automatically in inference API · #6 opened about 2 years ago by micole66
ValueError: `decoder_start_token_id` or `bos_token_id` has to be defined for encoder-decoder generation (2) · #2 opened about 2 years ago by kosukekurimoto