ragflow/rag/llm
Latest commit: ad6f7fd4b0 by Yongteng Lei — Fix: pipeline ignore MinerU backend config and vllm module is missing (#11955)
### What problem does this PR solve?

Fixes the pipeline ignoring the configured MinerU backend, and the failure when the vllm module is missing.
Resolves #11944 and #11947.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
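The "vllm module is missing" symptom is the classic optional-dependency failure: an unconditional import crashes the whole pipeline even when another backend was configured. A minimal sketch of the usual guarded-import pattern (hypothetical code, not ragflow's actual implementation; the `pick_backend` helper and `HAS_VLLM` flag are illustrative names):

```python
# Guard an optional heavy dependency (here vllm) so the pipeline
# degrades with a clear error instead of crashing at import time.
try:
    import vllm  # optional MinerU backend; may not be installed
    HAS_VLLM = True
except ImportError:
    vllm = None
    HAS_VLLM = False


def pick_backend(configured: str) -> str:
    """Return the backend to use, honoring the user's configuration.

    Hypothetical helper: raises a descriptive error only when the
    configured backend genuinely cannot be used, rather than silently
    falling back and ignoring the config.
    """
    if configured == "vllm" and not HAS_VLLM:
        raise RuntimeError(
            "Backend 'vllm' was configured but the vllm module is not "
            "installed; install it or select a different backend."
        )
    return configured
```

With this shape, a missing vllm only matters when the user actually asked for it, and the configured choice is never silently overridden.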
Committed: 2025-12-15 18:03:34 +08:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | Fix: Asure-OpenAI resource not found (#11934) | 2025-12-13 11:32:46 +08:00 |
| chat_model.py | Fix: Asure-OpenAI resource not found (#11934) | 2025-12-13 11:32:46 +08:00 |
| cv_model.py | Refa: migrate CV model chat to Async (#11828) | 2025-12-09 13:08:37 +08:00 |
| embedding_model.py | Refactor: Improve the logic to calculate embedding total token count (#11943) | 2025-12-15 11:33:57 +08:00 |
| ocr_model.py | Fix: pipeline ignore MinerU backend config and vllm module is missing (#11955) | 2025-12-15 18:03:34 +08:00 |
| rerank_model.py | fix cohere rerank base_url default (#11353) | 2025-11-20 09:46:39 +08:00 |
| sequence2txt_model.py | Feat: new api /sequence2txt and update QWenSeq2txt (#11643) | 2025-12-02 11:17:31 +08:00 |
| tts_model.py | Move token related functions to common (#10942) | 2025-11-03 08:50:05 +08:00 |