TypeError in dinov2/layers/attention.py with Python 3.8 #11

@Tomoya2203

Description

Hello,

I encountered a TypeError when trying to run the script place_rec_SAM_DINO.py.

The error is caused by the type hint syntax float | None in dinov2/layers/attention.py. This union syntax (PEP 604) is only supported in Python 3.10 and newer. However, the documentation mentions compatibility with Python 3.8, and the traceback confirms my environment is using Python 3.8 (/root/miniconda3/envs/segvlad/lib/python3.8/).

Could you please clarify whether this part of the code requires a newer Python version, or whether there is a configuration step I am missing?

To Reproduce

I ran the following command in the provided environment:

python place_rec_SAM_DINO.py --dataset 17places --method DINO

Full Error Log

(segvlad) root@61760e2e8213:/share# python place_rec_SAM_DINO.py --dataset 17places --method DINO
Library path /share already in PYTHONPATH
[INFO]: Configs is modifying path
Library path /share already in PYTHONPATH
Seed set to: 42 (type: <class 'int'>)
Library path /share already in PYTHONPATH
{'masks_h5_filename_r': '17places_r_masks_320.h5', 'masks_h5_filename_q': '17places_q_masks_320.h5', 'dino_h5_filename_r': '17places_r_dino_640.h5', 'dino_h5_filename_q': '17places_q_dino_640.h5', 'dinoNV_h5_filename_r': '17places_r_dinoNV_640.h5', 'dinoNV_h5_filename_q': '17places_q_dinoNV_640.h5', 'dinoSALAD_h5_filename_r': '17places_r_dinoSALAD_640.h5', 'dinoSALAD_h5_filename_q': '17places_q_dinoSALAD_640.h5', 'data_subpath1_r': 'ref', 'data_subpath2_q': 'query', 'cfg': {'rmin': 0, 'desired_width': 640, 'desired_height': 480}, 'map_vlad_cluster': '17places', 'domain_vlad_cluster': 'indoor'}
Note: The dimensions being used for SAM extraction are 320x240 pixels and for DINO extraction are 640x480 pixels.
DINO extraction started...
Using cache found in ./hub/facebookresearch_dinov2_main
/share/./hub/facebookresearch_dinov2_main/dinov2/layers/swiglu_ffn.py:51: UserWarning: xFormers is not available (SwiGLU)
  warnings.warn("xFormers is not available (SwiGLU)")
/share/./hub/facebookresearch_dinov2_main/dinov2/layers/attention.py:33: UserWarning: xFormers is not available (Attention)
  warnings.warn("xFormers is not available (Attention)")
Traceback (most recent call last):
  File "place_rec_SAM_DINO.py", line 111, in <module>
    dino = func_vpr.loadDINO(cfg_dino, device="cuda")
  File "/share/func_vpr.py", line 532, in loadDINO
    dino = DinoV2ExtractFeatures("dinov2_vitg14", 31, 'value', device='cuda',norm_descs=False)
  File "/share/utilities.py", line 239, in __init__
    self.dino_model: nn.Module = torch.hub.load(
  File "/root/miniconda3/envs/segvlad/lib/python3.8/site-packages/torch/hub.py", line 404, in load
    model = _load_local(repo_or_dir, model, *args, **kwargs)
  File "/root/miniconda3/envs/segvlad/lib/python3.8/site-packages/torch/hub.py", line 433, in _load_local
    model = entry(*args, **kwargs)
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/hub/backbones.py", line 89, in dinov2_vitg14
    return _make_dinov2_model(
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/hub/backbones.py", line 33, in _make_dinov2_model
    from ..models import vision_transformer as vits
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/models/__init__.py", line 8, in <module>
    from . import vision_transformer as vits
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/models/vision_transformer.py", line 21, in <module>
    from dinov2.layers import Mlp, PatchEmbed, SwiGLUFFNFused, MemEffAttention, NestedTensorBlock as Block
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/layers/__init__.py", line 11, in <module>
    from .block import NestedTensorBlock, CausalAttentionBlock
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/layers/block.py", line 18, in <module>
    from .attention import Attention, MemEffAttention
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/layers/attention.py", line 36, in <module>
    class Attention(nn.Module):
  File "/share/./hub/facebookresearch_dinov2_main/dinov2/layers/attention.py", line 58, in Attention
    self, init_attn_std: float | None = None, init_proj_std: float | None = None, factor: float = 1.0
TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
Exception ignored in: <function DinoV2ExtractFeatures.__del__ at 0x70604bee4790>
Traceback (most recent call last):
  File "/share/utilities.py", line 288, in __del__
    self.fh_handle.remove()
AttributeError: 'DinoV2ExtractFeatures' object has no attribute 'fh_handle'
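Possible Workaround

Not part of the original report, but a minimal sketch of a fix for anyone hitting this on Python 3.8: rewrite the PEP 604 unions with typing.Optional, which has been available since Python 3.5. The class and method names below are illustrative stand-ins based on the signature shown in the traceback, not the exact DINOv2 source.

```python
# Hypothetical sketch: the signature from the traceback, rewritten so it
# parses on Python 3.8. Optional[float] is equivalent to float | None.
from typing import Optional


class Attention:  # stand-in for the nn.Module subclass in attention.py
    def init_weights(
        self,
        init_attn_std: Optional[float] = None,
        init_proj_std: Optional[float] = None,
        factor: float = 1.0,
    ) -> None:
        # The real initialization logic lives in the DINOv2 code; omitted here.
        pass
```

Alternatively, adding "from __future__ import annotations" as the first import in attention.py defers annotation evaluation (PEP 563), so the file would need no other changes; either approach avoids evaluating the | operator on types at class-definition time.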
