[BUG-REPORT] vaex causes a segmentation fault on windows #2442

Open
iisakkirotko opened this issue Oct 9, 2024 · 6 comments

iisakkirotko commented Oct 9, 2024

Description
We're seeing a Windows fatal exception: access violation when running vaex-core 4.18.1 on Windows with Python 3.9; we're not sure whether the issue also affects other Python versions. I mentioned this in #2439 (comment), believing it to be related, but the issue persists with vaex-core 4.18.1, as can be seen in this CI run.

Software information

  • Vaex version (import vaex; vaex.__version__): vaex-core 4.18.1
  • Vaex was installed via: pip
  • OS: Microsoft Windows Server 2022 10.0.20348 from github runners windows-latest.

Additional information
The full stack trace is

Stack Trace
Thread 0x00001d58 (most recent call first):
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\hash.py", line 171 in add
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\cpu.py", line 344 in process
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\execution.py", line 564 in process_tasks
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\execution.py", line 500 in process_part
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\multithreading.py", line 80 in wrapped
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\concurrent\futures\thread.py", line 58 in run
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\concurrent\futures\thread.py", line 83 in _worker
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 917 in run
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 980 in _bootstrap_inner
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 937 in _bootstrap
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 306 in _WidgetContextAwareThread__bootstrap
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 284 in WidgetContextAwareThread__bootstrap

Current thread 0x00001cf4 (most recent call first):
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\hash.py", line 171 in add
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\cpu.py", line 3[44](https://github.com/widgetti/solara/actions/runs/11249248897/job/31275859331#step:9:45) in process
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\execution.py", line 564 in process_tasks
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\execution.py", line 500 in process_part
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\vaex\multithreading.py", line 80 in wrapped
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\concurrent\futures\thread.py", line 58 in run
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\concurrent\futures\thread.py", line 83 in _worker
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 917 in run
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 980 in _bootstrap_inner
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 937 in _bootstrap
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 306 in _WidgetContextAwareThread__bootstrapWindows fatal exception: 
access violation

  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 284 in WidgetContextAwareThread__bootstrap

Thread 0x000019ec (most recent call first):
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 316 in wait
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 581 in wait
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 1304 in run
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 980 in _bootstrap_inner
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 937 in _bootstrap
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 306 in _WidgetContextAwareThread__bootstrap
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\site-packages\solara\server\patch.py", line 284 in WidgetContextAwareThread__bootstrap

Thread 0x000019fc (most recent call first):
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\threading.py", line 312 in waitD:\a\_temp\73d90dee-0906-4bd0-95c2-c4[46](https://github.com/widgetti/solara/actions/runs/11249248897/job/31275859331#step:9:47)22c2060b.sh: line 3:   311 Segmentation fault      pytest tests/unit --doctest-modules --timeout=60
ddelange (Contributor) commented Oct 9, 2024

maartenbreddels (Member) commented:

I wonder if we should use the same wheels we build for release when testing.

ddelange (Contributor) commented:

cibuildwheel has built-in support for running pytest against each wheel right after it is built: https://cibuildwheel.pypa.io/en/stable/options/#testing
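
For reference, a minimal sketch of what that could look like in pyproject.toml (the option names come from the cibuildwheel docs linked above; the test path and test dependency are placeholders):

```toml
[tool.cibuildwheel]
# Install pytest into the isolated test environment and run the suite
# against each wheel immediately after it is built.
test-requires = "pytest"
test-command = "pytest {project}/tests"
```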

setu4993 commented:

@maartenbreddels @ddelange: Sorry to bother you, but I'm curious when a fix for this might roll out. Would a release that temporarily skips Windows be easier?

setu4993 commented:

Bump... any idea when this might be resolved? It has been over a month since the last version was yanked.

Maybe it makes sense to release without Windows support in the meantime?

ddelange (Contributor) commented:

From my side, it's not clear what the fix would be.
