[TCT][Tool][Non-ACR] minor bug fix in trainedmodel 64/301664/1
author    Utkarsh Tiwari <utk.tiwari@samsung.com>
          Tue, 21 Nov 2023 08:48:06 +0000 (14:18 +0530)
committer Utkarsh Tiwari <utk.tiwari@samsung.com>
          Tue, 21 Nov 2023 08:48:06 +0000 (14:18 +0530)
Change-Id: I0e07dd500617fcc474d0a31763c6f79ff86992c5
Signed-off-by: Utkarsh Tiwari <utk.tiwari@samsung.com>
tool/TC_Assistant_Tool/modelweight.pth [changed mode: 0644->0755]
tool/TC_Assistant_Tool/newdata.csv [changed mode: 0644->0755]
tool/TC_Assistant_Tool/trainedmodel.py [changed mode: 0644->0755]

diff --git a/tool/TC_Assistant_Tool/modelweight.pth b/tool/TC_Assistant_Tool/modelweight.pth
old mode 100644 (file)
new mode 100755 (executable)
index c4dea1c..23f4b5e
Binary files a/tool/TC_Assistant_Tool/modelweight.pth and b/tool/TC_Assistant_Tool/modelweight.pth differ
diff --git a/tool/TC_Assistant_Tool/newdata.csv b/tool/TC_Assistant_Tool/newdata.csv
old mode 100644 (file)
new mode 100755 (executable)
index 85ef6a2..b778526
 102,Do you want to add negative TCs?,negativetc
 103,Add a negative TC,negativetc
 104,negative TC,negativetc
-105,Header File:dns-sd.h,headerinfo
+105,Header File:dns-sd.h,headerinfo 
 106,Back,goback
 107,I want to go back,goback
 108,Feature name:,featureinfo
diff --git a/tool/TC_Assistant_Tool/trainedmodel.py b/tool/TC_Assistant_Tool/trainedmodel.py
old mode 100644 (file)
new mode 100755 (executable)
index 910164a..c2cddf8
@@ -74,6 +74,7 @@ except:
 text = ["this is a distil bert model.", "data is oil"]
 # Encode the text
 encoded_input = tokenizer(text, padding=True, truncation=True, return_tensors='pt')
+# print(encoded_input)
 
 
 seq_len = [len(i.split()) for i in train_text]
@@ -93,12 +94,11 @@ tokens_train = tokenizer(
 train_seq = torch.tensor(tokens_train['input_ids'])
 train_mask = torch.tensor(tokens_train['attention_mask'])
 train_y = torch.tensor(train_labels.tolist())
-train_mask
+train_mask 
 batch_size= 6
 train_data = TensorDataset(train_seq, train_mask, train_y)
 train_sampler = RandomSampler(train_data)
-train_dataloader = DataLoader(
-    train_data, sampler=train_sampler, batch_size=batch_size)
+train_dataloader = DataLoader(train_data, sampler=train_sampler, batch_size=batch_size)
 
 try:
     class BERT_Arch(nn.Module):
@@ -236,4 +236,4 @@ except:
 
 # print(data)
 torch.save(model,'modelweight.pth')
-print("model saved")
\ No newline at end of file
+print("model saved")
\ No newline at end of file
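
Note on the commented-out debug line added above (# print(encoded_input)): a minimal standalone sketch of what inspecting the tokenizer output would show, assuming the script uses a Hugging Face tokenizer; the 'distilbert-base-uncased' checkpoint name below is an assumption, not taken from the commit.

# Sketch only -- 'distilbert-base-uncased' is a placeholder checkpoint name.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('distilbert-base-uncased')
text = ["this is a distil bert model.", "data is oil"]
encoded_input = tokenizer(text, padding=True, truncation=True, return_tensors='pt')

# The encoding holds two tensors of equal shape:
#   input_ids      - token ids, padded to the longest sentence in the batch
#   attention_mask - 1 for real tokens, 0 for the padding positions
print(encoded_input['input_ids'])
print(encoded_input['attention_mask'])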
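
The one-line DataLoader reflow above is behaviour-preserving; a small self-contained sketch of how that batching setup is consumed, assuming batch_size = 6 as in the script (the dummy tensor shapes are placeholders).

# Sketch only -- dummy tensors stand in for train_seq / train_mask / train_y.
import torch
from torch.utils.data import TensorDataset, DataLoader, RandomSampler

train_seq = torch.randint(0, 30000, (24, 16))      # fake token ids
train_mask = torch.ones(24, 16, dtype=torch.long)  # fake attention masks
train_y = torch.randint(0, 2, (24,))                # fake labels

batch_size = 6
train_data = TensorDataset(train_seq, train_mask, train_y)
train_sampler = RandomSampler(train_data)
train_dataloader = DataLoader(train_data, sampler=train_sampler, batch_size=batch_size)

# Each iteration yields one shuffled batch of (ids, mask, labels).
for sent_id, mask, labels in train_dataloader:
    print(sent_id.shape, mask.shape, labels.shape)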
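
torch.save(model, 'modelweight.pth') above pickles the full model object (which is why modelweight.pth changes as a binary file in this commit); a hedged sketch of loading it back for inference, assuming the BERT_Arch class and its dependencies are importable at load time.

# Sketch only -- loading a model saved with torch.save(model, 'modelweight.pth').
# The full object was pickled, so the BERT_Arch definition must be importable here.
import torch

# Recent PyTorch releases default torch.load to weights_only=True; a full-model
# pickle may need weights_only=False on those versions.
model = torch.load('modelweight.pth', map_location='cpu')
model.eval()

# Inference then mirrors the training inputs: tokenized ids plus attention mask,
# e.g. with torch.no_grad(): preds = model(sent_id, mask)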