Trojan Detection Challenge - Detection Track Forum


> AttributeError: 'GELU' object has no attribute 'approximate'

When running the notebook "tdc-starter-kit/detection/example_submission.ipynb", I get a PyTorch error: "AttributeError: 'GELU' object has no attribute 'approximate'" (full log below).

It is probably a PyTorch version mismatch. Could you please let me know which PyTorch version you use to test the notebook?

---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Input In [13], in <cell line: 10>()
17 data_source = data_source[0]
18 net.cuda().eval()
---> 20 out = meta_network(net, data_source)
22 loss = F.binary_cross_entropy_with_logits(out, torch.FloatTensor([label]).unsqueeze(0).cuda())
24 optimizer.zero_grad()

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

Input In [9], in MetaNetwork.forward(self, net, data_source)
23 """
24 :param net: an input network of one of the model_types specified at init
25 :param data_source: the name of the data source
26 :returns: a score for whether the network is a Trojan or not
27 """
28 query = self.queries[data_source]
---> 29 out = net(query)
30 out = self.affines[data_source](out.view(1, -1))
31 out = self.norm(out)

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/vit_pytorch/simple_vit.py:112, in SimpleViT.forward(self, img)
109 pe = posemb_sincos_2d(x)
110 x = rearrange(x, 'b ... d -> b (...) d') + pe
--> 112 x = self.transformer(x)
113 x = x.mean(dim = 1)
115 x = self.to_latent(x)

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/vit_pytorch/simple_vit.py:78, in Transformer.forward(self, x)
76 for attn, ff in self.layers:
77 x = attn(x) + x
---> 78 x = ff(x) + x
79 return x

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/vit_pytorch/simple_vit.py:37, in FeedForward.forward(self, x)
36 def forward(self, x):
---> 37 return self.net(x)

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/container.py:139, in Sequential.forward(self, input)
137 def forward(self, input):
138 for module in self:
--> 139 input = module(input)
140 return input

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1130, in Module._call_impl(self, *input, **kwargs)
1126 # If we don't have any hooks, we want to skip the rest of the logic in
1127 # this function, and just call forward.
1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1129 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130 return forward_call(*input, **kwargs)
1131 # Do not call functions when jit is used
1132 full_backward_hooks, non_full_backward_hooks = [], []

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/activation.py:681, in GELU.forward(self, input)
680 def forward(self, input: Tensor) -> Tensor:
--> 681 return F.gelu(input, approximate=self.approximate)

File ~/anaconda3/envs/tr38/lib/python3.8/site-packages/torch/nn/modules/module.py:1207, in Module.__getattr__(self, name)
1205 if name in modules:
1206 return modules[name]
-> 1207 raise AttributeError("'{}' object has no attribute '{}'".format(
1208 type(self).__name__, name))

AttributeError: 'GELU' object has no attribute 'approximate'

Posted by: ildefons @ Aug. 9, 2022, 6:10 p.m.

My PyTorch version is 1.12.1. What version should I use?

Posted by: ildefons @ Aug. 10, 2022, 1:46 p.m.

FYI, I solved the error by downgrading PyTorch to 1.10.1.
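
For anyone who cannot downgrade: the root cause appears to be that PyTorch 1.12 added an "approximate" argument to nn.GELU, so GELU modules that were pickled with an older PyTorch have no such attribute when 1.12's GELU.forward reads it. A rough, untested sketch of a workaround (patch_gelu is just an illustrative helper name, not part of the starter kit) is to set the missing attribute to its default after loading each model:

import torch
import torch.nn as nn

def patch_gelu(model):
    # GELU modules pickled with torch < 1.12 lack the 'approximate'
    # attribute that torch 1.12's GELU.forward expects. 'none' is the
    # 1.12 default and matches the old exact-GELU behaviour.
    for m in model.modules():
        if isinstance(m, nn.GELU) and not hasattr(m, "approximate"):
            m.approximate = "none"
    return model

# e.g. net = patch_gelu(torch.load(model_path)); net.cuda().eval()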

Posted by: ildefons @ Aug. 10, 2022, 1:58 p.m.

Hello,

Due to the way that we save models, PyTorch 1.12 will not work. To be completely safe, we recommend PyTorch 1.11.0, although 1.10 might also work.
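
You can double-check which version is installed in your environment with:

import torch
print(torch.__version__)  # should print 1.11.0 (or 1.10.x)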

All the best,
Mantas (TDC co-organizer)

Posted by: mmazeika @ Aug. 10, 2022, 6:22 p.m.