import torch
from transformers import BertTokenizer, BertModel


class BertEncoder:
    def __init__(self):
        self.tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
        self.model = BertModel.from_pretrained('bert-base-uncased')
        self.device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
        self.model = self.model.to(self.device)

    def bert_encoder(self, node_content, weight):
        inputs = self.tokenizer(node_content, return_tensors="pt", max_length=512, truncation=True).to(self.device)
        with torch.no_grad():
            outputs = self.model(**inputs)
        # Mean-pool the token embeddings into a single sentence vector
        embeddings = torch.mean(outputs.last_hidden_state, dim=1).squeeze()
        return embeddings.cpu().numpy()
Above is the code. When the self.tokenizer call runs, it automatically prints lines like:
text: xxxx1
text: xxxx2
I tried disabling the logging prints, but it did not work. How can I disable this automatic printing? Thanks!
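For context, what I tried was roughly along these lines (a minimal sketch, assuming the standard transformers and Python logging utilities; the exact calls I ran may have differed):

import logging
from transformers import logging as hf_logging

# Attempted suppression: only show errors from the transformers library
hf_logging.set_verbosity_error()

# Also tried raising the level of the transformers logger via the stdlib logging module
logging.getLogger("transformers").setLevel(logging.ERROR)

Even with this, the "text: ..." lines still appear when the tokenizer is called.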