Fix: potential negative max_tokens in RAPTOR (#10701)
### What problem does this PR solve?

Fix potential negative `max_tokens` in RAPTOR. #10235.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
This commit is contained in:
parent
544c9990e3
commit
cd77425b87
1 changed file with 1 addition and 1 deletion
```diff
@@ -114,7 +114,7 @@ class RecursiveAbstractiveProcessing4TreeOrganizedRetrieval:
                     ),
                 }
             ],
-            {"max_tokens": self._max_token},
+            {"max_tokens": max(self._max_token, 512)},  # fix issue: #10235
         )
         cnt = re.sub(
             "(······\n由于长度的原因,回答被截断了,要继续吗?|For the content length reason, it stopped, continue?)",
```
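The one-line change clamps the token budget before it reaches the LLM call: if `self._max_token` has gone negative or too small (for example, after subtracting a long prompt's token count from a fixed budget), the request would otherwise be made with an invalid `max_tokens`. A minimal sketch of the clamping pattern, with an illustrative helper name (`effective_max_tokens` and the `floor` default are assumptions for this sketch, not names from the RAPTOR code):

```python
def effective_max_tokens(max_token: int, floor: int = 512) -> int:
    """Clamp a possibly-negative token budget to a usable minimum.

    Mirrors the PR's fix: max(self._max_token, 512) guarantees the
    value passed as max_tokens is always at least `floor`.
    """
    return max(max_token, floor)

# A negative budget is lifted to the floor; a healthy budget passes through.
print(effective_max_tokens(-128))  # → 512
print(effective_max_tokens(2048))  # → 2048
```

Clamping at the call site is a small, non-breaking guard; callers with a sufficient budget are unaffected, and only the degenerate case changes behavior.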