StyledLines from testedlines.com: C# docs 1.0.1
A GPT-2-based text-stylization LLM, wrapped with llama.cpp in C# for compatibility across platforms, including iOS and WebGL. The model is designed to transform generic text into stylized, game- or user-tailored dialogue.
Inheritance diagram for LlamaLibrary.LlamaInfrence:
Collaboration diagram for LlamaLibrary.LlamaInfrence:

Public Member Functions
| virtual void | Dispose () |
| void | Echo (string arg0) |
| bool | Generate (string prompt) |
| string | GetGenerated () |
| | LlamaInfrence (global::System.IntPtr cPtr, bool cMemoryOwn) |
| | LlamaInfrence (LoggingContext logging, gpt_params cfg_params) |
| void | Stop () |

Static Public Member Functions
| static gpt_params | GetParameters (string llamacpp_cmd_args) |

Protected Attributes
| bool | swigCMemOwn |

Static Package Functions
| static global::System.Runtime.InteropServices.HandleRef | getCPtr (LlamaInfrence obj) |

Properties
| SWIGTYPE_p_LlamaInfrence__LlamaImpl | pImpl [get, set] |

Private Member Functions
| | ~LlamaInfrence () |

Private Attributes
| global::System.Runtime.InteropServices.HandleRef | swigCPtr |
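Taken together, the members above map onto a configure → construct → generate → read → tear down flow. The following is a minimal quickstart sketch of that flow, not code shipped with the library: the model path and argument flags passed to GetParameters, the parameterless LoggingContext constructor, and the assumption that GetGenerated() returns the text produced for the last prompt are illustrative guesses rather than documented behavior.

```csharp
using System;
using LlamaLibrary;

public static class StyledLinesQuickstart
{
    public static void Main()
    {
        // Assumed llama.cpp-style argument string; the accepted flags depend on
        // the llama.cpp build bundled with StyledLines.
        gpt_params cfg = LlamaInfrence.GetParameters("-m models/styledlines.gguf -n 128");

        var logging = new LoggingContext();      // assumption: default constructor exists
        var llm = new LlamaInfrence(logging, cfg);
        try
        {
            // Illustrative stylization prompt.
            if (llm.Generate("Rewrite as a gruff dwarven blacksmith: 'Welcome to my shop.'"))
            {
                // Assumption: GetGenerated() returns the text produced for the last prompt.
                Console.WriteLine(llm.GetGenerated());
            }
        }
        finally
        {
            llm.Stop();      // stop any in-flight generation before tearing down
            llm.Dispose();   // release the native llama.cpp resources
        }
    }
}
```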
LlamaLibrary.LlamaInfrence.LlamaInfrence (global::System.IntPtr cPtr, bool cMemoryOwn)  [private]
LlamaLibrary.LlamaInfrence.LlamaInfrence (LoggingContext logging, gpt_params cfg_params)

References LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, and LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve().
void LlamaLibrary.LlamaInfrence.Dispose ()  [virtual]

References LlamaLibrary.libllama_libPINVOKE.delete_LlamaInfrence(), LlamaLibrary.LlamaInfrence.swigCMemOwn, and LlamaLibrary.LlamaInfrence.swigCPtr.

Referenced by LlamaLibrary.LlamaInfrence.~LlamaInfrence().
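The references above show Dispose() releasing the native object through delete_LlamaInfrence() when swigCMemOwn is set, and the "Referenced by ~LlamaInfrence()" note means the finalizer acts only as a safety net. Waiting for finalization keeps the native llama.cpp context alive until the garbage collector runs, so deterministic cleanup is preferable. Below is an optional wrapper, purely a sketch, that adds using-statement support without assuming the SWIG-generated class itself declares IDisposable.

```csharp
using System;

// Sketch: scoped cleanup for a LlamaInfrence instance. Calling Stop() before
// Dispose() is an assumption about correct shutdown order, not documented behavior.
public sealed class ScopedLlama : IDisposable
{
    public LlamaLibrary.LlamaInfrence Llm { get; }

    public ScopedLlama(LlamaLibrary.LlamaInfrence llm) => Llm = llm;

    public void Dispose()
    {
        Llm.Stop();     // halt any in-flight generation first
        Llm.Dispose();  // then release the native handle via the Dispose() documented above
    }
}
```

With this in place, `using (var scoped = new ScopedLlama(llm)) { ... }` stops and disposes the instance when the block exits.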
void LlamaLibrary.LlamaInfrence.Echo (string arg0)

References LlamaLibrary.libllama_libPINVOKE.LlamaInfrence_Echo(), LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve(), and LlamaLibrary.LlamaInfrence.swigCPtr.

bool LlamaLibrary.LlamaInfrence.Generate (string prompt)

References LlamaLibrary.libllama_libPINVOKE.LlamaInfrence_Generate(), LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve(), and LlamaLibrary.LlamaInfrence.swigCPtr.
static global::System.Runtime.InteropServices.HandleRef LlamaLibrary.LlamaInfrence.getCPtr (LlamaInfrence obj)  [static, package]

References LlamaLibrary.LlamaInfrence.swigCPtr.
string LlamaLibrary.LlamaInfrence.GetGenerated ()

References LlamaLibrary.libllama_libPINVOKE.LlamaInfrence_GetGenerated(), and LlamaLibrary.LlamaInfrence.swigCPtr.
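Generate() and GetGenerated() both funnel native failures through SWIGPendingException.Pending / Retrieve(), so errors from the llama.cpp side surface as managed exceptions. The helper below sketches one way to pair the two calls with that in mind; the meaning of Generate's bool return and the exact exception type thrown are assumptions, as neither is spelled out on this page.

```csharp
using System;
using LlamaLibrary;

public static class GenerationHelper
{
    // Pairs Generate() with GetGenerated(). Assumptions (not documented here):
    // Generate's bool return signals whether the request succeeded, and
    // GetGenerated() returns the text produced for that prompt.
    public static bool TryGenerate(LlamaInfrence llm, string prompt, out string text)
    {
        text = string.Empty;
        try
        {
            if (!llm.Generate(prompt))
                return false;
            text = llm.GetGenerated();
            return true;
        }
        catch (Exception ex) // native-side failures surface via SWIGPendingException.Retrieve()
        {
            Console.Error.WriteLine($"Generation failed: {ex.Message}");
            return false;
        }
    }
}
```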
static gpt_params LlamaLibrary.LlamaInfrence.GetParameters (string llamacpp_cmd_args)  [static]

References LlamaLibrary.libllama_libPINVOKE.LlamaInfrence_GetParameters(), LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, and LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve().
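The llamacpp_cmd_args parameter name suggests GetParameters() parses a llama.cpp-style command-line string into a gpt_params value. A sketch under that assumption:

```csharp
using LlamaLibrary;

// The flag names below (-m, -n, --temp) follow the upstream llama.cpp CLI and are
// assumptions; the set actually honored depends on the llama.cpp build bundled
// with StyledLines.
string modelPath = "models/styledlines.gguf";          // hypothetical model location
string cmdArgs = $"-m {modelPath} -n 256 --temp 0.8";
gpt_params cfg = LlamaInfrence.GetParameters(cmdArgs);
// cfg is what the LlamaInfrence(LoggingContext, gpt_params) constructor expects.
```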
void LlamaLibrary.LlamaInfrence.Stop ()

References LlamaLibrary.libllama_libPINVOKE.LlamaInfrence_Stop(), and LlamaLibrary.LlamaInfrence.swigCPtr.
bool LlamaLibrary.LlamaInfrence.swigCMemOwn  [protected]

Referenced by LlamaLibrary.LlamaInfrence.Dispose(), and LlamaLibrary.LlamaInfrence.LlamaInfrence().
global::System.Runtime.InteropServices.HandleRef LlamaLibrary.LlamaInfrence.swigCPtr  [private]
SWIGTYPE_p_LlamaInfrence__LlamaImpl LlamaLibrary.LlamaInfrence.pImpl  [get, set]