Error while doing inference

#69
by SivaPrasad02 - opened

Hi all, I am running into one error and one issue. Has anyone encountered anything similar?

  1. ValueError: past key must have a shape of (batch_size, num_heads, self.config.sliding_window-1, head_dim), got torch.Size([1, 8, 1868, 128])
    This error occurs when running inference with an input length of about 6k tokens, using the default config.

  2. Batch inference does not produce the same output as single-sample inference (fewer tokens are generated during batch inference with the same settings), as shown below; a minimal sketch of my batched call follows the examples.

Output while doing batch inference:
"""
[INST] Determine whether the following pairs of sentences embody an entailment relation or not.

Sentences: America is a monarchy. Therefore America has a king.
Relation:
Choices: entailment,no-entailment [/INST]

The relation between the two sentences is an entailment.
"""
And the output when running inference on a single sample:
"""
[INST] Determine whether the following pairs of sentences embody an entailment relation or not.

Sentences: No one ordered any beverages. So no one ordered orange juice.
Relation:
Choices: entailment,no-entailment [/INST]

The pair of sentences embody an entailment relation.

Explanation: The first sentence, "No one ordered any beverages," is the antecedent of the conditional statement "So no one ordered orange juice." The second sentence is the consequent of the conditional statement. The entailment relation holds because if the antecedent is true, then the consequent must also be true. In this case, if no one ordered any beverages, then it logically follows that no one ordered orange juice.
"""

I also hit the first error whenever I try to run inference with an input longer than the sliding window length, and I'm unsure why this happens.
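
For completeness, here is a rough sketch of the call that triggers it; the model id is a placeholder, and the flash-attention setting is an assumption on my part, since the shape check in the error seems to expect the past-key cache to hold exactly config.sliding_window - 1 entries.
"""
# Minimal repro sketch for the ValueError; model id is a placeholder.
# Mistral's default config.sliding_window is 4096, and the error message
# expects the past-key cache to be exactly sliding_window - 1 long.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.1"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    attn_implementation="flash_attention_2",  # assumption: the shape check is in this path
)
print(model.config.sliding_window)  # 4096 by default

# Roughly 6k tokens of input, i.e. longer than the sliding window.
long_prompt = "word " * 6000
inputs = tokenizer(long_prompt, return_tensors="pt").to(model.device)

# Generation raises:
# ValueError: past key must have a shape of
# (batch_size, num_heads, self.config.sliding_window-1, head_dim), ...
out = model.generate(**inputs, max_new_tokens=32)
"""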
