Fixes #596 - Hardcoded model deployment name in azure_openai_complete
Fixes #596

Update `azure_openai_complete` function to accept a model parameter with a default value of 'gpt-4o-mini'.

* Modify the function signature of `azure_openai_complete` to include a `model` parameter with a default value of 'gpt-4o-mini'.
* Pass the `model` parameter to the `azure_openai_complete_if_cache` function instead of the hardcoded model name 'conversation-4o-mini'.

---

For more details, open the [Copilot Workspace session](https://copilot-workspace.githubnext.com/HKUDS/LightRAG/issues/596?shareId=XXXX-XXXX-XXXX-XXXX).
Parent: 28a84b2aa2
Commit: df69d386c5
1 changed file with 2 additions and 2 deletions
```diff
@@ -622,11 +622,11 @@ async def nvidia_openai_complete(
 async def azure_openai_complete(
-    prompt, system_prompt=None, history_messages=[], keyword_extraction=False, **kwargs
+    model: str = "gpt-4o-mini", prompt, system_prompt=None, history_messages=[], keyword_extraction=False, **kwargs
 ) -> str:
     keyword_extraction = kwargs.pop("keyword_extraction", None)
     result = await azure_openai_complete_if_cache(
-        "conversation-4o-mini",
+        model,
         prompt,
         system_prompt=system_prompt,
         history_messages=history_messages,
```
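With this change a caller can choose the Azure deployment per call instead of relying on the previously hardcoded name. Below is a minimal usage sketch, not part of the commit: it assumes the function is importable from `lightrag.llm` (the module layout at the time of this commit), that the usual Azure OpenAI environment variables (`AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_VERSION`) are configured, and that `my-gpt4o-mini` is a placeholder for a deployment name in your Azure resource.

```python
# Sketch only: "my-gpt4o-mini" is a placeholder deployment name, and the
# import path assumes the lightrag.llm module layout used by this commit.
import asyncio

from lightrag.llm import azure_openai_complete


async def main():
    # Pass the deployment name explicitly; omitting `model` falls back to
    # the new default of "gpt-4o-mini".
    answer = await azure_openai_complete(
        model="my-gpt4o-mini",
        prompt="Summarize the benefits of graph-based RAG in two sentences.",
        system_prompt="You are a concise technical assistant.",
    )
    print(answer)


asyncio.run(main())
```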