LLMClient.AskAsync Method
Makes an asynchronous, streaming LLM call over a WebSocket connection.
Invokes onChunk for each text fragment received from the stream.
Returns the complete concatenated response when the stream ends.
By default, the server's currently active backend and model are used.
Definition
Namespace: DWSIM.SharedClasses
Assembly: DWSIM.SharedClasses (in DWSIM.SharedClasses.dll) Version: 10.0.0.0
C#
public Task<string> AskAsync(
string prompt,
string model = "",
string lang = "",
Action<string> onChunk = null,
CancellationToken ct = default
)
VB
Public Function AskAsync (
prompt As String,
Optional model As String = "",
Optional lang As String = "",
Optional onChunk As Action(Of String) = Nothing,
Optional ct As CancellationToken = Nothing
) As Task(Of String)
Parameters
- prompt String
  The prompt text to send to the LLM.
- model String (Optional)
  Model to use; when empty, the server's currently active model is used.
- lang String (Optional)
- onChunk Action<String> (Optional)
  Callback invoked for each received text fragment.
- ct CancellationToken (Optional)
  Token used to cancel the streaming call.
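A usage sketch for the streaming pattern described above. The helper method name and the assumption that an already-connected LLMClient instance is passed in are illustrative, not part of the documented API:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using DWSIM.SharedClasses;

public static class AskAsyncExample
{
    // Sketch: stream a response, printing each fragment as it arrives.
    // Assumes `client` is an already-constructed, connected LLMClient.
    public static async Task RunAsync(LLMClient client)
    {
        // Cancel the call if the stream takes longer than 60 seconds.
        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(60));

        string full = await client.AskAsync(
            prompt: "Summarize the advantages of rigorous distillation models.",
            onChunk: fragment => Console.Write(fragment), // called per text fragment
            ct: cts.Token);

        // `full` holds the complete concatenated response once the stream ends.
        Console.WriteLine();
        Console.WriteLine($"Received {full.Length} characters.");
    }
}
```

Because model and lang are left at their defaults here, the server's currently active backend and model are used; pass a model name to override.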