You can use the Hugging Face Transformers library to run models such as GPT-2 in Python.
pip install transformers tensorflow
Then, in Python, download the model and generate text:
import tensorflow as tf
from transformers import GPT2Tokenizer, TFGPT2LMHeadModel

# Load the pretrained tokenizer and the TensorFlow model weights
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = TFGPT2LMHeadModel.from_pretrained("gpt2", pad_token_id=tokenizer.eos_token_id)

# Encode the prompt into token IDs as a TensorFlow tensor
prompt = "what is general relativity?"
input_ids = tokenizer.encode(prompt, return_tensors="tf")

# Generate up to 50 tokens (greedy decoding by default)
output_ids = model.generate(input_ids, max_length=50, pad_token_id=tokenizer.eos_token_id)

# Decode the generated token IDs back into text
generated_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(generated_text)