Overview
Initialization
Arguments:
- model_name (string, PreTrainedModel, or None) ("cerebras/Cerebras-GPT-111M"): A Hugging Face model ID or a path to a local model directory. You can also pass an already loaded model.
- trust_remote_code (bool) (optional): Set to True to allow custom code from the model repository to execute. Only enable this for repositories you trust.
from erictransformer import EricGeneration
# We recommend a larger model if you have enough memory; Cerebras GPT models
# range from 111M to 13B parameters:
# https://huggingface.co/collections/cerebras/cerebras-gpt
eric_gen = EricGeneration(model_name="cerebras/Cerebras-GPT-111M", trust_remote_code=False)
Call
Arguments:
- text (string): The text prompt for the model.
- args (GENCallArgs) (GENCallArgs()): Optional generation settings; see the GENCallArgs documentation for more detail.
from erictransformer import EricGeneration, GENCallArgs
eric_gen = EricGeneration(model_name="cerebras/Cerebras-GPT-111M")
args = GENCallArgs(
    # Min/max number of tokens to produce during generation.
    min_len=1,
    max_len=1024,
    # Sampling settings.
    temp=0.6,
    top_k=32,
    top_p=0.8,
)
result = eric_gen("Hello world", args=args)
print(result.text) # str
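To build intuition for what the sampling settings above do, here is a minimal self-contained sketch of the standard temperature / top-k / top-p (nucleus) filtering pipeline. This is an illustration of the general technique, not EricGeneration's internals; the function name `sampling_distribution` is hypothetical and not part of erictransformer.

```python
import math


def sampling_distribution(logits, temp=0.6, top_k=32, top_p=0.8):
    """Return the filtered, renormalized token distribution that a
    temperature/top-k/top-p sampler would draw from (a sketch, not
    erictransformer's actual implementation)."""
    # Temperature scaling: lower temp sharpens the distribution.
    scaled = [l / temp for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # top_k: keep only the k highest-probability tokens.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept = order[:top_k]
    # top_p (nucleus): keep the smallest prefix whose cumulative
    # probability reaches top_p.
    cum, nucleus = 0.0, []
    for i in kept:
        nucleus.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    # Renormalize over the surviving tokens.
    z = sum(probs[i] for i in nucleus)
    return {i: probs[i] / z for i in nucleus}
```

With a small temp or top_p, only the most likely tokens survive, so generation becomes more deterministic; larger values admit more candidates and more variety.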