StyledLines from testedlines.com: C# docs 1.0.1
A GPT-2-based text-stylization LLM, wrapped with llama.cpp in C# to ensure compatibility across platforms, including iOS and WebGL. The model is designed to transform generic text into stylized, game- or user-tailored dialogue.
LlamaLibrary.AsyncLlamaInfrence Class Reference

Public Member Functions

 AsyncLlamaInfrence ()
 
void ClearAllTasks ()
 
virtual void Dispose ()
 
int GenerateAsync (string prompt)
 
string GetGenerationResults (int task_id)
 
bool GetSetupResults (int task_id)
 
bool IsGenerationReady (int task_id)
 
int SetupAsync (LoggingContext logging, gpt_params cfg_params)
 

Protected Attributes

bool swigCMemOwn
 

Package Functions

 AsyncLlamaInfrence (global::System.IntPtr cPtr, bool cMemoryOwn)
 

Static Package Functions

static global::System.Runtime.InteropServices.HandleRef getCPtr (AsyncLlamaInfrence obj)
 

Properties

SWIGTYPE_p_AsyncLlamaInfrence__LlamaAsyncImpl pImpl [get, set]
 

Private Member Functions

 ~AsyncLlamaInfrence ()
 

Private Attributes

global::System.Runtime.InteropServices.HandleRef swigCPtr
 

Constructor & Destructor Documentation

◆ AsyncLlamaInfrence() [1/2]

LlamaLibrary.AsyncLlamaInfrence.AsyncLlamaInfrence ( global::System.IntPtr cPtr, bool cMemoryOwn )
package

◆ ~AsyncLlamaInfrence()

LlamaLibrary.AsyncLlamaInfrence.~AsyncLlamaInfrence ( )
private

References LlamaLibrary.AsyncLlamaInfrence.Dispose().


◆ AsyncLlamaInfrence() [2/2]

LlamaLibrary.AsyncLlamaInfrence.AsyncLlamaInfrence ( )
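The default constructor presumably allocates the underlying native async-inference object. A minimal lifecycle sketch, assuming the int values returned by SetupAsync() and GenerateAsync() are task ids for the polling methods documented below, and that gpt_params and LoggingContext expose SWIG-generated parameterless constructors (model path and sampling settings would be configured on cfg before setup):

    using LlamaLibrary;

    var llama = new AsyncLlamaInfrence();

    // Assumed SWIG-generated default constructors; configure cfg (model path, sampling, ...)
    // through its generated fields before submitting setup.
    var logging = new LoggingContext();
    var cfg = new gpt_params();

    int setupTask = llama.SetupAsync(logging, cfg);      // returned int assumed to be a task id
    // ... poll GetSetupResults(setupTask) until setup has finished ...

    int genTask = llama.GenerateAsync("Rewrite this line as sarcastic banter.");
    // ... poll IsGenerationReady(genTask), then read the text ...
    string styled = llama.GetGenerationResults(genTask);

    llama.Dispose();    // release the native handle when done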

Member Function Documentation

◆ ClearAllTasks()

void LlamaLibrary.AsyncLlamaInfrence.ClearAllTasks ( )

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_ClearAllTasks(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

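A one-line sketch, assuming ClearAllTasks() discards any queued or completed task state held by the wrapper (the exact semantics are not documented on this page); llama is an existing AsyncLlamaInfrence instance:

    // Assumed: drop pending/finished task state, e.g. before reusing the instance.
    llama.ClearAllTasks();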

◆ Dispose()

virtual void LlamaLibrary.AsyncLlamaInfrence.Dispose ( )
virtual

References LlamaLibrary.libllama_libPINVOKE.delete_AsyncLlamaInfrence(), LlamaLibrary.AsyncLlamaInfrence.swigCMemOwn, and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

Referenced by Assets.StyledLines.Runtime.LlamaInfrenceUnity.RunModelAsync.OnDestroy(), and LlamaLibrary.AsyncLlamaInfrence.~AsyncLlamaInfrence().

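The documented caller is Assets.StyledLines.Runtime.LlamaInfrenceUnity.RunModelAsync.OnDestroy(); a minimal Unity-flavoured sketch of that pattern (the component and field names here are illustrative, not the package's actual RunModelAsync):

    using LlamaLibrary;
    using UnityEngine;

    public class LlamaHost : MonoBehaviour     // hypothetical host component
    {
        private AsyncLlamaInfrence llama = new AsyncLlamaInfrence();

        private void OnDestroy()
        {
            // Release the native AsyncLlamaInfrence handle when the component is destroyed.
            llama?.Dispose();
        }
    }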

◆ GenerateAsync()

int LlamaLibrary.AsyncLlamaInfrence.GenerateAsync ( string prompt)

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_GenerateAsync(), LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

Referenced by Assets.StyledLines.Runtime.LlamaInfrenceUnity.RunModelAsync.Generate().

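A short sketch, assuming the returned int is the task id consumed by IsGenerationReady() and GetGenerationResults(); llama is an existing, already set-up AsyncLlamaInfrence instance:

    // Submit a prompt for asynchronous generation and keep the task id for polling.
    // Native-side failures surface as managed exceptions via SWIGPendingException
    // (see the references above).
    int genTask = llama.GenerateAsync("Make this line sound like a grizzled dwarf blacksmith.");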

◆ getCPtr()

static global::System.Runtime.InteropServices.HandleRef LlamaLibrary.AsyncLlamaInfrence.getCPtr ( AsyncLlamaInfrence obj)
static package

◆ GetGenerationResults()

string LlamaLibrary.AsyncLlamaInfrence.GetGenerationResults ( int task_id)

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_GetGenerationResults(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

Referenced by Assets.StyledLines.Runtime.LlamaInfrenceUnity.RunModelAsync.Update().

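The documented caller is RunModelAsync.Update(); a hedged sketch of that per-frame polling pattern (the component, field names, and the -1 "no pending task" convention are this sketch's own, not the package's):

    using LlamaLibrary;
    using UnityEngine;

    public class DialogueStyler : MonoBehaviour      // hypothetical component
    {
        private AsyncLlamaInfrence llama = new AsyncLlamaInfrence();
        private int genTask = -1;                    // -1 used here to mean "no pending task"

        private void Update()
        {
            if (genTask >= 0 && llama.IsGenerationReady(genTask))
            {
                // Fetch the generated text once the task reports ready.
                string styled = llama.GetGenerationResults(genTask);
                Debug.Log(styled);
                genTask = -1;
            }
        }
    }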

◆ GetSetupResults()

bool LlamaLibrary.AsyncLlamaInfrence.GetSetupResults ( int task_id)

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_GetSetupResults(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

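A short sketch, assuming the bool reports whether the setup task identified by task_id has completed successfully (whether false means "still running" or "failed" is not documented here); llama and setupTask are the instance and task id from a prior SetupAsync() call:

    // Check the outcome of a previously submitted SetupAsync() task.
    bool setupOk = llama.GetSetupResults(setupTask);
    if (!setupOk)
    {
        UnityEngine.Debug.LogWarning("Model setup not finished (or not successful) yet.");
    }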

◆ IsGenerationReady()

bool LlamaLibrary.AsyncLlamaInfrence.IsGenerationReady ( int task_id)

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_IsGenerationReady(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

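A coroutine-style alternative to polling in Update(), assuming IsGenerationReady() is cheap enough to call once per frame; the coroutine lives on a component that holds the llama instance (names are illustrative):

    private System.Collections.IEnumerator WaitForGeneration(int taskId)
    {
        // Yield one frame at a time until the generation task reports ready.
        while (!llama.IsGenerationReady(taskId))
            yield return null;

        string styled = llama.GetGenerationResults(taskId);
        UnityEngine.Debug.Log(styled);
    }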

◆ SetupAsync()

int LlamaLibrary.AsyncLlamaInfrence.SetupAsync ( LoggingContext logging, gpt_params cfg_params )

References LlamaLibrary.libllama_libPINVOKE.AsyncLlamaInfrence_SetupAsync(), LlamaLibrary.gpt_params.getCPtr(), LlamaLibrary.LoggingContext.getCPtr(), LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Pending, LlamaLibrary.libllama_libPINVOKE.SWIGPendingException.Retrieve(), and LlamaLibrary.AsyncLlamaInfrence.swigCPtr.

Referenced by Assets.StyledLines.Runtime.LlamaInfrenceUnity.RunModelAsync.DoSetup(), and Assets.StyledLines.Runtime.LlamaInfrenceUnity.ModelController.ModelController().

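The documented callers are RunModelAsync.DoSetup() and the ModelController constructor; a hedged sketch of submitting setup, assuming SWIG-generated parameterless constructors for LoggingContext and gpt_params and that the returned int is a task id later checked with GetSetupResults():

    using LlamaLibrary;

    var llama = new AsyncLlamaInfrence();
    var logging = new LoggingContext();   // assumed default constructor
    var cfg = new gpt_params();           // assumed; set model path and sampling options
                                          // through its generated fields before calling SetupAsync

    // Submit asynchronous model setup; poll GetSetupResults(setupTask) later.
    int setupTask = llama.SetupAsync(logging, cfg);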

Member Data Documentation

◆ swigCMemOwn

bool LlamaLibrary.AsyncLlamaInfrence.swigCMemOwn
protected

◆ swigCPtr

global::System.Runtime.InteropServices.HandleRef LlamaLibrary.AsyncLlamaInfrence.swigCPtr
private

Property Documentation

◆ pImpl

SWIGTYPE_p_AsyncLlamaInfrence__LlamaAsyncImpl LlamaLibrary.AsyncLlamaInfrence.pImpl
get, set
