I am trying to use this approach to fine-tune a pretrained model with a separate learning rate for each group of layers. I applied the same method, but I get this error:

TypeError: 'Adam' object is not callable

Here is the code for the training loop:
import torch
from typing import Dict, List, Tuple

def train_step(model: torch.nn.Module,
               dataloader: torch.utils.data.DataLoader,
               loss_fn: torch.nn.Module,
               optimizer: torch.optim.Optimizer,
               device: torch.device) -> Tuple[float, float]:
    # Put model in train mode
    model.train()

    # Setup train loss and train accuracy values
    train_loss, train_acc = 0, 0

    # Loop through data loader data batches
    for batch, (X, y) in enumerate(dataloader):
        # Send data to target device
        X, y = X.to(device), y.to(device)

        # 1. Forward pass
        y_pred = model(X)

        # 2. Calculate and accumulate loss
        loss = loss_fn(y_pred, y)
        train_loss += loss.item()

        # 3. Optimizer zero grad
        optimizer.zero_grad()

        # 4. Loss backward
        loss.backward()

        # 5. Optimizer step (must come after loss.backward(), so the
        #    freshly computed gradients are applied)
        optimizer.step()

        # Calculate and accumulate accuracy metric across all batches
        y_pred_class = torch.argmax(torch.softmax(y_pred, dim=1), dim=1)
        train_acc += (y_pred_class == y).sum().item() / len(y_pred)

    # Adjust metrics to get average loss and accuracy per batch
    train_loss = train_loss / len(dataloader)
    train_acc = train_acc / len(dataloader)
    return train_loss, train_acc

NUM_EPOCHS = 100
# Recreate an instance of TinyVGG
model_0 = model
device = device
# Setup loss function and optimizer
loss_fn = nn.CrossEntropyLoss(weight=class_weights)
# params= model_0.parameters()
optimizer = torch.optim.Adam(parameters_1)
# optimizer = torch.optim.SGD(model_0.parameters(), lr=0.01, momentum=0.9)
# optimizer = torch.optim.SGD(model_0.parameters(), lr=0.001, momentum=0.9, weight_decay=1e-6)
# scheduler = StepLR(optimizer, step_size=20, gamma=0.5)
save_best_model = SaveBestModel()  # keep a reference; calling the constructor alone discards the instance
# Start the timer
from timeit import default_timer as timer
start_time = timer()
# Train model_0
model_0_results = train(model=model_0,
                        train_dataloader=train_dataloader,
                        test_dataloader=test_dataloader,
                        optimizer=optimizer,
                        loss_fn=loss_fn,
                        epochs=NUM_EPOCHS,
                        device=device)

I tried passing the model's parameters and the same learning rate as with plain Adam, but that does not work either.
Posted on 2022-10-29 13:11:50
I assume parameters_1 is a collection of arguments. Try unpacking it: optimizer = torch.optim.Adam(**parameters_1)
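A minimal sketch of what that unpacking assumes: parameters_1 would be a dict of keyword arguments for the constructor (its actual contents are never shown in the post, so this shape is a guess):

import torch
from torch import nn

model_0 = nn.Linear(10, 2)  # stand-in for the post's model

# Guessed shape: constructor keyword arguments rather than parameter groups.
parameters_1 = {"params": model_0.parameters(), "lr": 1e-3}

optimizer = torch.optim.Adam(**parameters_1)  # expands to Adam(params=..., lr=1e-3)

Separately, since 'Adam' object is not callable in Python means an Adam instance was invoked like a function, it is also worth checking that the optimizer is not being called anywhere, e.g. accidentally passed as loss_fn and then called as loss_fn(y_pred, y).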
https://stackoverflow.com/questions/74238235