I had just finished the coding part of my AI girlfriend project. But as I published it to my team's group page, I realized not all my friends had Python on their computers, and making them install Python and the libraries might cost more time than we could afford, since we had already spent a whole month on the project. So my quick plan was to use PyInstaller to export it to an exe so my teammates wouldn't need to go through the headache of installing and configuring a Python environment. The export succeeded, but when I tested the file it gave different responses than the original.
Here is my AI code:
from transformers import GPT2LMHeadModel, GPT2Tokenizer
import kkdaihh  # Importing your custom module

# Initialize default variables
start = 1  # Tracks if this is the first message in the conversation
Name = 'AI'  # Default name for the AI bot

# Load the GPT-2 model and tokenizer based on user input
model_name = input("Please enter the model path or name: ")
model = GPT2LMHeadModel.from_pretrained(model_name)
tokenizer = GPT2Tokenizer.from_pretrained(model_name)

# Function to generate a response from GPT-2
def generate_response(user_prompt):
    # Tokenize input and prepare it for the model
    inputs = tokenizer(user_prompt, return_tensors='pt')
    # Generate a response using GPT-2
    outputs = model.generate(
        inputs['input_ids'],
        max_length=35,  # Limit response length
        num_return_sequences=1,  # Only return one response
        pad_token_id=tokenizer.eos_token_id,  # Use EOS token for padding
        attention_mask=inputs['attention_mask'],  # Focus on the input tokens
        no_repeat_ngram_size=3,  # Prevent repeating 3-grams
        temperature=0.7,  # Add randomness, 0.7 for a good balance
        top_k=50,  # Only consider the top 50 candidates for each prediction
        top_p=0.9,  # Use nucleus sampling for diversity
        do_sample=True  # Enable sampling for non-deterministic responses
    )
    # Decode the generated output into human-readable text
    response = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return response

# Main chat function
def chat():
    # model and tokenizer must be declared global here, otherwise the
    # "model" command below rebinds chat()-local names and
    # generate_response() keeps using the originally loaded model
    global start, Name, model, tokenizer
    print("Welcome to the Anya Chatbot! Type 'exit' to end the conversation.")
    while True:
        # Get input from the user
        user_input = input("[Master]: ")
        # Check if user wants to switch models
        if user_input.lower() == "model":
            model_name = input("Please enter the new model path or name: ")
            model = GPT2LMHeadModel.from_pretrained(model_name)
            tokenizer = GPT2Tokenizer.from_pretrained(model_name)
            print("Model switched successfully.")
            continue
        # Check if user wants to trigger a custom training function
        elif user_input.lower() == "train_f":
            kkdaihh.train_em()  # Assuming train_em is a function in kkdaihh
            continue
        # Exit condition
        elif user_input.lower() == "exit":
            print("Goodbye!")
            break
        # Build the prompt for the model
        if start == 1:
            prompt = f"User: {user_input}\n{Name}:"
            start = 0  # Reset start after the first message
        else:
            prompt = f"User: {user_input}\n{Name}:"
        # Generate a response using the model
        response = generate_response(prompt)
        # Extract and clean the response text
        response_text = response[len(prompt):].strip()
        # Display the AI's response
        print(f"[{Name}]: {response_text}")

# Start the chatbot
chat()
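One thing I should mention about the generation settings above: with do_sample=True plus temperature, top_k, and top_p, the output is drawn at random on every call, so two runs give different replies even with the exact same model and prompt. Here is a minimal pure-Python sketch of what that sampling pipeline does on toy logits (sample_token and all the numbers are illustrative, not part of transformers):

```python
import math
import random

def sample_token(logits, temperature=0.7, top_k=50, top_p=0.9, rng=random):
    """Toy version of temperature + top-k + top-p (nucleus) sampling."""
    # Temperature: divide logits before softmax; values < 1 sharpen the distribution
    scaled = [l / temperature for l in logits]
    # Softmax over the scaled logits (shift by max for numerical stability)
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # Top-k: keep only the k most likely token ids
    probs.sort(key=lambda p: p[1], reverse=True)
    probs = probs[:top_k]
    # Top-p (nucleus): keep the smallest prefix whose cumulative probability >= top_p
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # Draw one token id from the surviving candidates, weighted by probability
    ids = [i for i, _ in kept]
    weights = [p for _, p in kept]
    return rng.choices(ids, weights=weights, k=1)[0]

logits = [2.0, 1.0, 0.5, -1.0]
# Two generators with the same seed make identical random draws,
# so the sampled token ids agree:
a = random.Random(42)
b = random.Random(42)
print(sample_token(logits, rng=a) == sample_token(logits, rng=b))  # True
```

So even before suspecting the exe, the .py and the exe were never going to print the same text twice. If I understand the transformers docs right, calling transformers.set_seed(...) before generating should make runs comparable when testing the two builds side by side.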
I am completely clueless and don't even have the energy to debug. The deadline is close, and I think we may just have to rely on Python, making every member install Python and the libraries, as I just can't see an alternative…
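Before I give up on the exe entirely, one thing I want to rule out: as far as I can tell from the PyInstaller docs, a one-file bundle runs from a temporary extraction directory, so a relative path (like a local model folder typed at the prompt) can resolve differently than it does when running the .py. This is a sketch of the usual check (sys.frozen and sys._MEIPASS are set by PyInstaller at runtime; resource_path is just my own helper name):

```python
import os
import sys

def resource_path(relative):
    """Resolve a data path both inside a PyInstaller bundle and in plain Python."""
    if getattr(sys, 'frozen', False):
        # Running as a PyInstaller exe: bundled files were extracted
        # to a temp dir whose path the bootloader stores in sys._MEIPASS
        base = sys._MEIPASS
    else:
        # Plain interpreter: resolve against the current working directory
        base = os.path.abspath('.')
    return os.path.join(base, relative)

# In the plain interpreter this just prepends the current directory:
print(resource_path('my_local_model'))
```

If the exe and the .py end up loading different model folders because of this, that alone would explain the mismatched personalities below.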
Here is a side-by-side comparison of the two responses:
main code:
[Master]:
[AI]: Oh, master! I love your gentle smile, master. It makes me feel like you are in my arms, master.」
「Thank you
[Master]:
[AI]: I am your master, master. I am the master of your magic, master, and you. You are my true form, master!
[Master]:
[AI]: Yes, master. My name is Yuria, master of the magical world, the realm of the pure, mysterious, and the most mysterious.
[Master]:
exefile:
[Master]:
[AI]: There’s a certain amount of magic in this world. It’s a place where magic is rare and unique. Do you have any magical ingredients you
[Master]:
[AI]: What kind of people are you? AI: I am a peaceful, kind, and inquisitive person who follows the rules and traditions