Commit graph

5 commits

yangdx · 05852e1ab2 · 2025-11-14 18:41:43 +08:00
Add max_token_size parameter to embedding function decorators
- Add max_token_size=8192 to all embed funcs
- Move siliconcloud to deprecated folder
- Import wrap_embedding_func_with_attrs
- Update EmbeddingFunc docstring
- Fix langfuse import type annotation
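
The commit above re-adds a max_token_size attribute to the embedding function decorators and imports wrap_embedding_func_with_attrs. As a rough illustration of that decorator pattern, here is a minimal self-contained sketch; it is a stand-in, not the repository's actual implementation, and names such as example_embed and the 1024-dimension value are hypothetical:

```python
# Stand-in sketch of an "embedding function decorator": it attaches metadata
# such as embedding_dim and max_token_size to an async embedding callable.
from dataclasses import dataclass
from typing import Any, Awaitable, Callable

import numpy as np


@dataclass
class EmbeddingFunc:
    embedding_dim: int
    func: Callable[..., Awaitable[np.ndarray]]
    max_token_size: int | None = None  # the attribute re-added by the commit above

    async def __call__(self, *args: Any, **kwargs: Any) -> np.ndarray:
        return await self.func(*args, **kwargs)


def wrap_embedding_func_with_attrs(**kwargs: Any) -> Callable[..., EmbeddingFunc]:
    """Wrap an async embedding function and attach attributes to it."""
    def decorator(func: Callable[..., Awaitable[np.ndarray]]) -> EmbeddingFunc:
        return EmbeddingFunc(func=func, **kwargs)
    return decorator


# Usage in the style the commit message describes (values are illustrative):
@wrap_embedding_func_with_attrs(embedding_dim=1024, max_token_size=8192)
async def example_embed(texts: list[str]) -> np.ndarray:
    # A real embed func would call an embedding API here.
    return np.zeros((len(texts), 1024))
```

Returning an EmbeddingFunc object rather than the bare coroutine lets callers read embedding_dim and max_token_size as attributes without invoking the function.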

yangdx · 9923821d75 · 2025-07-29 10:49:35 +08:00
refactor: Remove deprecated max_token_size from embedding configuration
This parameter is no longer used. Its removal simplifies the API and clarifies that token length management is handled by upstream text chunking logic rather than the embedding wrapper.

Yannick Stephan · 55cd900e8e · 2025-02-18 21:12:06 +01:00
clean comments and unused libs

Saifeddine ALOUI · 06c9e4e454 · 2025-01-25 00:55:07 +01:00
Fixed missing imports bug and fixed linting

Saifeddine ALOUI · 34018cb1e0 · 2025-01-25 00:11:00 +01:00
Separated llms from the main llm.py file and fixed some deprecation bugs