| Topic | Replies | Views | Activity |
|---|---|---|---|
| Request for Multi-Input State-Oriented Inference | 0 | 23 | October 1, 2024 |
| Is there a single sampling method used during inference, or is there logic to choose different sampling methods based on a given input? | 1 | 568 | April 28, 2024 |
| How does the Assistant API (ChatGPT System) handle long context (aggregation of prompts and responses) in a Thread? | 0 | 534 | April 20, 2024 |
| Is an LLM a must in retrieval-augmented generation? | 1 | 1881 | December 25, 2023 |