| Commit message | Author | Age | Files | Lines |
There is no point in using Provide more than once, or anywhere other than as the first action of a flow. So it's not really an action, but rather a flow property. Add a Flow.Consts field to handle this case better. Also make the syntax slightly less verbose by using a map instead of a struct, and add tests.
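A minimal sketch of the idea, assuming hypothetical type and field names (Flow, Consts, Run are illustrative, not taken from the actual codebase): constants become a map-valued flow property that seeds the state before any action runs, instead of a special Provide action.

```go
package main

import "fmt"

// Hypothetical sketch: Consts holds named constant values seeded into
// the flow state before the first action runs, replacing a leading
// Provide action. The map form is less verbose than a struct.
type Flow struct {
	Consts map[string]any
	// Actions would follow here in the real type.
}

// Run seeds the state with the constants; actions would execute after.
func (f *Flow) Run() map[string]any {
	state := make(map[string]any)
	for k, v := range f.Consts {
		state[k] = v
	}
	return state
}

func main() {
	f := &Flow{Consts: map[string]any{"repo": "linux", "budget": 3}}
	fmt.Println(f.Run()["repo"]) // → linux
}
```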
|
Introduce an abstract "task type" for LLM agents instead of specifying the temperature explicitly for each agent. This has two advantages:
- we don't hardcode it everywhere, and can change it centrally as our understanding of the right temperature evolves
- we can control other LLM parameters (top-n/top-k) via the task type as well

Update #6576
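A sketch of the centralization this describes, with hypothetical names and illustrative parameter values (TaskType, LLMParams, and the concrete numbers are assumptions, not from the source): agents declare a task type, and one function maps it to sampling parameters.

```go
package main

import "fmt"

// Hypothetical sketch: a TaskType enum mapped centrally to LLM
// sampling parameters, instead of hardcoding temperature per agent.
type TaskType int

const (
	TaskPreciseAnalysis TaskType = iota // deterministic, factual work
	TaskCreativeSearch                  // exploratory, diverse output
)

type LLMParams struct {
	Temperature float64
	TopK        int
}

// Params is the single place the numbers live; changing our
// understanding of the right temperature means editing one switch.
func (t TaskType) Params() LLMParams {
	switch t {
	case TaskCreativeSearch:
		return LLMParams{Temperature: 1.0, TopK: 40}
	default:
		return LLMParams{Temperature: 0.2, TopK: 1}
	}
}

func main() {
	fmt.Println(TaskPreciseAnalysis.Params().Temperature) // → 0.2
}
```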
|
Handle LLM tool input token overflow by removing the last tool reply and replacing it with an instruction to answer right now. I've seen an LLM tool go too deep into research and eventually overflow the input token limit. It could have provided at least a partial answer instead.
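The recovery step can be sketched like this, under assumed types (Message, handleOverflow, and the instruction wording are all hypothetical): walk the conversation backwards, drop the last tool reply, and substitute a direct "answer now" message.

```go
package main

import "fmt"

// Hypothetical sketch of the overflow recovery described above.
type Message struct {
	Role    string // "user", "assistant", or "tool"
	Content string
}

// handleOverflow replaces the last (usually largest) tool reply with
// an instruction to answer immediately from what the model already knows.
func handleOverflow(msgs []Message) []Message {
	for i := len(msgs) - 1; i >= 0; i-- {
		if msgs[i].Role == "tool" {
			msgs[i] = Message{
				Role:    "user",
				Content: "The context is full. Answer right now with what you already know.",
			}
			break
		}
	}
	return msgs
}

func main() {
	msgs := []Message{
		{Role: "user", Content: "find the bug"},
		{Role: "tool", Content: "huge search result ..."},
	}
	fmt.Println(handleOverflow(msgs)[1].Role) // → user
}
```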
|
I've added a NewPipeline constructor for slightly nicer syntax, but failed to use it in actual workflows. Unexport Pipeline and rename NewPipeline to Pipeline. This slightly improves the workflow definition syntax.
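The rename follows a common Go pattern: an exported constructor over an unexported type, so call sites read Pipeline(a, b) with no New prefix. A sketch with assumed signatures (Action and the string-transform shape are illustrative, not the project's real types):

```go
package main

import (
	"fmt"
	"strings"
)

// Hypothetical sketch: Action is whatever a workflow step looks like.
type Action func(string) string

// pipeline is unexported; callers only see the constructor.
type pipeline struct{ actions []Action }

// Pipeline takes over the old type's name, so workflow definitions
// read as Pipeline(a, b, c) rather than NewPipeline(a, b, c).
func Pipeline(actions ...Action) *pipeline {
	return &pipeline{actions: actions}
}

// Run applies the actions in order.
func (p *pipeline) Run(input string) string {
	for _, a := range p.actions {
		input = a(input)
	}
	return input
}

func main() {
	p := Pipeline(strings.ToUpper, func(s string) string { return s + "!" })
	fmt.Println(p.Run("done")) // → DONE!
}
```

One trade-off: exported functions returning unexported types draw a lint warning, which is why the constructor has to be the only way to build one.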
|
LLMTool acts like a tool for the parent LLM, but is itself implemented as an LLM agent. It can have its own tools, different from those of the parent LLM agent. It can do complex multi-step research and provide a concise answer to the parent LLM without polluting its context window.
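A structural sketch of that nesting, with an assumed Tool interface (the interface, llmTool, and the summary format are hypothetical): the sub-agent satisfies the same interface as an ordinary tool, keeps its own tool set private, and returns only a condensed answer.

```go
package main

import "fmt"

// Hypothetical sketch: the interface the parent agent invokes.
type Tool interface {
	Name() string
	Call(query string) string
}

// llmTool is a Tool whose implementation is itself an LLM agent
// with its own tool set, invisible to the parent.
type llmTool struct {
	name  string
	tools []Tool // the sub-agent's own tools
}

func (t *llmTool) Name() string { return t.name }

// Call stands in for a multi-step agent loop: consult sub-tools,
// then hand only a condensed answer back to the parent. The
// intermediate findings never enter the parent's context window.
func (t *llmTool) Call(query string) string {
	var findings []string
	for _, sub := range t.tools {
		findings = append(findings, sub.Call(query))
	}
	return fmt.Sprintf("summary (%d findings): %s", len(findings), query)
}

func main() {
	var research Tool = &llmTool{name: "research"}
	fmt.Println(research.Call("why does the test flake?"))
}
```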
|