I’m trying to use the GPT-2 model from Hugging Face. I have tried everything, but although the code runs, the output is just wrong. Below is my code with a simple sample prompt, followed by the output. (PS: I picked GPT-2 to learn how Hugging Face works; I want to use different models in the future.)
Code:
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Solve 2x + 3 = 7 ", max_length=50, num_return_sequences=2)
for i, sequence in enumerate(result):
    print(f"Result {i+1}: {sequence['generated_text']}")
Output:
Result 1: solve 2x + 3 = 7
If 4 is large, there is a large number of ways to generate two integers at once. The following shows the various options for generating 64 bit integers:
Option:
Result 2: solve 2x + 3 = 7
Then, by a natural order, we can compute the total amount of money that a given person would put into their pocket as a reward
Example [ edit ]
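Since the `text-generation` pipeline samples by default, I also tried fixing the random seed so runs are reproducible (a minimal sketch; `set_seed` comes from `transformers`):

```python
from transformers import pipeline, set_seed

# Fix the sampling randomness so every run produces the same sequences
set_seed(42)

generator = pipeline("text-generation", model="gpt2")
result = generator("Solve 2x + 3 = 7 ", max_length=50, num_return_sequences=2)
for i, sequence in enumerate(result):
    print(f"Result {i+1}: {sequence['generated_text']}")
```

The output still looks like the above, unrelated to the equation.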
Maybe the problem is that GPT-2 simply isn’t able to solve it, but I don’t think so; there must be some problem with the code. It would be great if someone could explain the mistake and correct it.
Best
Julius
I have read through the quicktour at Hugging Face, but still couldn’t find the mistake.