[Model] Remove transformers attention porting in VITs #480

Triggered via pull request November 18, 2024 08:00
@Isotr0py edited #10414
Status: Success
Total duration: 15s
Artifacts

cleanup_pr_body.yml

on: pull_request_target
update-description (6s)
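For context, the run above comes from a small pull_request_target workflow with a single update-description job. The snippet below is a minimal sketch of what such a cleanup_pr_body.yml could look like; only the trigger event and the job name are taken from the run summary, while the trigger types, runner image, permissions, and script path are assumptions for illustration rather than the repository's actual configuration.

```yaml
# Minimal sketch of a cleanup_pr_body.yml workflow, assuming it rewrites the PR
# description with a repo-local script. Only the pull_request_target trigger and
# the update-description job name come from the run summary above; everything
# else here is an illustrative assumption.
name: Cleanup PR Body

on:
  pull_request_target:
    types: [opened, reopened, edited]

permissions:
  pull-requests: write  # needed to edit the PR description

jobs:
  update-description:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Update PR description
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        # Hypothetical script location; the real cleanup logic lives in the repository.
        run: .github/scripts/cleanup_pr_body.sh "${{ github.event.number }}"
```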