How to draw attention weight in ROIGather #145

Open
GreatZeZuo opened this issue Mar 11, 2024 · 4 comments

Comments

@GreatZeZuo commented Mar 11, 2024

Thanks for your work. Could you please tell me how to draw Figure 5 in the paper, i.e. how to overlay the attention weights on the original image? I would appreciate a timely response.

@Napier7 commented Mar 13, 2024

I tried to implement an attention visualization function myself, but it didn't seem to work well. Did you solve it?

@Napier7 commented Mar 18, 2024

> I tried to implement an attention visualization function myself, but it didn't seem to work well. Did you solve it?

Then I tried adjusting the code again, and now it seems to work better. You can add the following code starting at line 221 of /libs/models/layers/attentions.py (please ignore my code quality):

# add a 'layer_index' parameter on line 148 so this only runs for the last layer
if layer_index == 2:
    import numpy as np
    from torchvision import transforms
    import cv2
    B, Np = sim_map.shape[:2]
    # also disable the line 'x = F.interpolate(x, self.size)' on line 262
    # average the attention over the Np priors: one (1, 40, 100) map per image
    att_map = sim_map.reshape(B, Np, 40, 100).mean(dim=1, keepdim=True)
    # min-max normalize each map to [0, 1]; keepdim so the values broadcast
    max_val = att_map.amax(dim=(-2, -1), keepdim=True)
    min_val = att_map.amin(dim=(-2, -1), keepdim=True)
    att_map = (att_map - min_val) / (max_val - min_val + 1e-8)
    # upsample to the input resolution and convert to a uint8 heatmap
    att_map = transforms.Resize([540, 960])(att_map)
    att_map = att_map.squeeze(0).permute(1, 2, 0) * 255  # assumes batch size 1
    att_map = np.uint8(att_map.detach().cpu().numpy())
    att_map = cv2.applyColorMap(att_map, cv2.COLORMAP_JET)
    cv2.imwrite('attention_map.png', att_map)
    import pdb; pdb.set_trace()  # pause here to inspect each map
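
To overlay the saved heatmap on the original image, as in Figure 5, you can alpha-blend the colormap with the input frame. A minimal sketch with OpenCV (the image paths here are placeholders; I'm assuming the frame matches the 960x540 heatmap, otherwise the resize below aligns them):

import cv2

# read the original frame and the heatmap saved above
img = cv2.imread('original_frame.png')   # placeholder path to the input image
heat = cv2.imread('attention_map.png')
# resize the heatmap to the frame size; cv2.resize takes (width, height)
heat = cv2.resize(heat, (img.shape[1], img.shape[0]))
# alpha-blend: 60% original image, 40% attention heatmap
overlay = cv2.addWeighted(img, 0.6, heat, 0.4, 0)
cv2.imwrite('attention_overlay.png', overlay)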

@zoro-dong

> [Napier7's Mar 18 comment and the visualization code, quoted in full above]

Can you give me more details? I would be very grateful; this is extremely urgent. My contact information is [email protected].

@Napier7 commented Jun 5, 2024

> [zoro-dong's comment quoted in full above, ending with:]
> Can you give me more details? I would be very grateful; this is extremely urgent. My contact information is [email protected].

That is all the details.
