How to apply a constraint while minimising a loss in PyTorch?

I am trying to minimise the MSE loss together with a constraint loss, but the constraint loss kept increasing instead of decreasing. I then tried to minimise only the constraint loss, and it threw the following error.

import torch
import torch.nn as nn

class Edge_Detector(nn.Module):
  def __init__(self, kernel_size, padding):
    torch.manual_seed(1)  # fixed seed so the kernel initialisation is reproducible
    super(Edge_Detector, self).__init__()
    # two learnable single-channel convolutions acting as x/y edge filters
    self.sobelx = nn.Conv2d(1, 1, kernel_size=kernel_size, padding=padding, bias=False)
    self.relu = nn.ReLU()
    self.sobely = nn.Conv2d(1, 1, kernel_size=kernel_size, padding=padding, bias=False)

  def forward(self, x):
    x1 = self.sobelx(x)
    x2 = self.sobely(x)
    x = self.relu(x1 + x2)
    return x

  def loss(self, x, y):
    x = x.view(x.size(0), -1)
    y = y.view(y.size(0), -1).float()
    # pull the (kernel_size x kernel_size) weight matrices out of the conv layers
    sobelx = self.sobelx.weight.data.squeeze().squeeze()
    sobely = self.sobely.weight.data.squeeze().squeeze()
    loss_mse = nn.MSELoss()(x, y)
    # constraint term: trace of sobelx @ sobely.T (inner product of the two kernels)
    loss_constrain = torch.matmul(sobelx, sobely.transpose(0, 1)).trace()
    #print('mse_loss       : ', loss_mse)
    #print('constrain_loss : ', loss_constrain)
    #total_loss = loss_mse + loss_constrain
    return loss_constrain

Error message:
RuntimeError                              Traceback (most recent call last)
<ipython-input-67-28b5b5719682> in <module>()
----> 1 learn.fit_one_cycle(15, 5e-2)

13 frames
/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
    130     Variable._execution_engine.run_backward(
    131         tensors, grad_tensors_, retain_graph, create_graph,
--> 132         allow_unreachable=True)  # allow_unreachable flag
    133 
    134 

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn


Question from: https://stackoverflow.com/questions/65860855/how-to-minimise-loss-with-constrain-in-pytorch


1 Answer

Since you did not post the whole stack trace I can't say for sure, but I am fairly certain what happens is that you call backward on loss_constrain, which throws an error because loss_constrain has requires_grad=False. This is because, in your calculation of loss_constrain, you access the data attribute of the Parameter (self.sobelx.weight). The parameter itself has requires_grad=True, but the tensor returned by .data is detached from the autograd graph and has requires_grad=False, so nothing computed from it records a grad_fn.
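You can check this directly (a minimal standalone sketch; the layer shape here is arbitrary):

import torch.nn as nn

conv = nn.Conv2d(1, 1, kernel_size=3, bias=False)
print(conv.weight.requires_grad)       # True  -- the Parameter tracks gradients
print(conv.weight.data.requires_grad)  # False -- .data returns a detached tensor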

Just remove the .data part and see whether it works.
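For concreteness, here is a sketch of the corrected loss method, assuming the rest of the class stays as posted (and returning the combined loss from the commented-out line):

  def loss(self, x, y):
    x = x.view(x.size(0), -1)
    y = y.view(y.size(0), -1).float()
    # use the weights directly: the result keeps its grad_fn,
    # whereas .weight.data would be detached from the autograd graph
    sobelx = self.sobelx.weight.squeeze()
    sobely = self.sobely.weight.squeeze()
    loss_mse = nn.MSELoss()(x, y)
    loss_constrain = torch.matmul(sobelx, sobely.transpose(0, 1)).trace()
    return loss_mse + loss_constrain

With this change, backward on the combined loss reaches the convolution weights. Note that the trace term is not bounded below, so if it dominates or keeps growing you may want to weight it or penalise its absolute value; that is a tuning question separate from this error.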

