/workspace/networks/managers/evaluator.py:469 in evaluating
/workspace/networks/engines/aotv3_engine.py:645 in add_reference_frame
/workspace/networks/engines/aotv3_engine.py:238 in add_reference_frame
/workspace/networks/models/aotv3.py:189 in LSTT_forward
/opt/conda/lib/python3.11/site-packages/torch/nn/modules/module.py:1552 in _wrapped_call_impl
/opt/conda/lib/python3.11/site-packages/torch/nn/modules/module.py:1561 in _call_impl
/opt/conda/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py:432 in _fn
/workspace/networks/layers/transformer.py:582 in forward
/opt/conda/lib/python3.11/site-packages/torch/nn/modules/module.py:1552 in _wrapped_call_impl
/opt/conda/lib/python3.11/site-packages/torch/nn/modules/module.py:1561 in _call_impl
/opt/conda/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py:1115 in __call__
Compile Time (seconds)
Entire Frame: 3.5319623947143555
Backend: 2.1555721759796143
Inductor: 1.5079469680786133
Code Gen Time: 0.8002784252166748
Dynamo Time Before Restart: 0.8873672485351562
Restarts and Failures
No failures!
Restart Reasons:
Graph break due to unsupported builtin spatial_correlation_sampler_backend.PyCapsule.forward. This function is either a Python builtin (e.g. _warnings.warn) or a third-party C/C++ Python extension (perhaps created with pybind). If it is a Python builtin, please file an issue on GitHub so the PyTorch team can add support for it and see the next case for a workaround. If it is a third-party C/C++ Python extension, please either wrap it into a PyTorch-understood custom operator (see https://pytorch.org/docs/main/notes/custom_operators.html for more details) or, if it is traceable, use torch.compiler.allow_in_graph.
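The workaround the message points at can be sketched as follows. This is a minimal, hypothetical example rather than code from the repository: it assumes the graph break comes from a call into the spatial-correlation-sampler package's `spatial_correlation_sample` function inside `networks/layers/transformer.py`, and wraps that call in a PyTorch custom operator (`torch.library.custom_op`, PyTorch 2.4+) so Dynamo keeps it in the graph as one opaque node instead of breaking. The operator name `aot_vos::correlation`, the argument list, and the assumed output layout are placeholders.

```python
import torch
from spatial_correlation_sampler import spatial_correlation_sample


@torch.library.custom_op("aot_vos::correlation", mutates_args=())
def correlation(q: torch.Tensor, k: torch.Tensor, patch_size: int) -> torch.Tensor:
    # Eager path: calls straight into the pybind kernel. torch.compile treats
    # this op as a single opaque call instead of graph-breaking on it.
    return spatial_correlation_sample(
        q.contiguous(), k.contiguous(), kernel_size=1, patch_size=patch_size
    )


@correlation.register_fake
def _(q: torch.Tensor, k: torch.Tensor, patch_size: int) -> torch.Tensor:
    # Shape/dtype-only implementation so Dynamo/Inductor can trace through
    # with fake tensors. The (B, patch, patch, H, W) output layout is an
    # assumption about the sampler; adjust it to match the real kernel.
    b, _, h, w = q.shape
    return q.new_empty(b, patch_size, patch_size, h, w)
```

The attention layer would then call `correlation(...)` instead of invoking the sampler directly. If the sampler happened to run under fake tensors, the lighter-weight option from the message, `torch.compiler.allow_in_graph(spatial_correlation_sample)`, would avoid the wrapper entirely, but a C/C++ pybind kernel usually does not trace, so the custom-operator route is the one more likely to remove this graph break.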