/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found #5754
It looks like you're pulling a build from a third party which has been built against a newer version of glibc than is shipped with Ubuntu 20.04. This seems to be "working as designed": that build is not a Temurin build and is not compatible with that distribution, so I'm not sure what you are expecting us to do here.
Hmm, I didn't realize it's working as designed. Closing.
It seems the Temurin build has the same issue - a jdk from https://github.com/adoptium/temurin24-binaries/releases/tag/jdk-24%2B14-ea-beta https://ci.adoptium.net/view/Test_grinder/job/Grinder/11142/console Does this mean that for jdk24+ GLIBC_2.14 is a must? @andrew-m-leonard
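As a quick diagnostic when this error shows up, the glibc version a test machine actually ships can be read from the C library itself; a minimal sketch (not from the thread):

```shell
# Print the glibc version the machine provides; either command works on glibc-based systems.
getconf GNU_LIBC_VERSION   # e.g. "glibc 2.31" on Ubuntu 20.04
ldd --version | head -n 1  # first line names the glibc release
```

Comparing that number against the `GLIBC_2.34` in the error message shows immediately whether the machine can run the binary at all.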
Temurin 21+ on Linux/x64 is built on CentOS/RHEL 7 and will not run on machines with a glibc version earlier than the one on those distributions. This is why the pipelines for those versions set an additional test label.
adoptium/ci-jenkins-pipelines#1005 was the PR which introduced this (which you approved ;-) )
It's actually higher - see footnote 1 in the table on the supported platforms page, which tells you that it requires 2.17 - the version in RHEL/CentOS 7.
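That 2.17 minimum can be checked mechanically with a version-sort comparison; a hedged sketch of such a pre-flight check (the required version is taken from the comment above, everything else is illustrative):

```shell
#!/bin/sh
# Sketch: fail fast when this machine's glibc is older than the 2.17 minimum
# that Temurin 21+ Linux/x64 builds require (per the supported platforms page).
required="2.17"
have=$(getconf GNU_LIBC_VERSION | awk '{print $2}')
# sort -V orders version strings numerically; if the required version sorts
# first (or equal), the machine meets the minimum.
lowest=$(printf '%s\n%s\n' "$required" "$have" | sort -V | head -n 1)
if [ "$lowest" = "$required" ]; then
  echo "glibc $have >= $required: OK to run this JDK"
else
  echo "glibc $have < $required: this JDK will not start here" >&2
  exit 1
fi
```

Running something like this at the start of a Grinder job would turn the confusing loader error into an explicit, early failure.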
Yeah, ADDITIONAL_TEST_LABEL (adoptium/ci-jenkins-pipelines#1005) does solve the issue if test builds are triggered by the build pipeline. But for a Grinder, users have to figure this out themselves, and could easily forget...
Re-opening this one to ensure we apply an appropriate test label (following an agreed-upon labelling schema so it interoperates across all Jenkins servers) and modify the test pipeline code (rather than the build pipeline code), to avoid continually hitting this when running Grinders. https://ci.adoptium.net/view/Test_grinder/job/Grinder/11386/console
Can we go back a step and clarify the goal here, as I wasn't aware we were "continually" hitting this. Are we trying to allow arbitrary binaries from elsewhere, which may have arbitrary requirements, to execute on our machines? Without knowing that, the number of labels required to support it and the complexity of the expressions could get a bit unwieldy.
No. We are trying to write test pipeline code that can run on any vendor's own Jenkins server without forking or modification. As such, we have agreed upon a label schema, so that when we need new labels for things, it's predictable and remains the same across all the disparate servers.
Gotcha - I think it would be good to have a working session on a call to thrash this one out, since there are likely a lot of subtleties in coming up with something practical and not overly onerous, avoiding "tag sprawl" complexities - it sounds like we are looking for a generic form of our
That public discussion happened a long time ago, in 2018 I believe, see this related issue adoptium/infrastructure#93, from which sprung forth the current labeling schema https://github.com/adoptium/aqa-tests/blob/master/doc/pages/LabelSchema.md#jenkins-machine-labelling-schema we use for test pipelines, exactly to avoid duplication and sprawl. One of the reasons we chose to use 'long name labels' is to help categorize the types of labels and give guidance to anyone who wanted to extend the schema. |
Yeah, absolutely - I have no issues with that, but I think we need to think carefully about how many different ones we want to have for this particular use case. Maybe a topic for Tuesday's call :-)
This was discussed in the community call this week and I'll try to minute it here. The summary was that each Jenkins instance should have labels for the capabilities that the jobs are expected to require. The jobs can then use a label expression to select a suitable machine.

Doing this would not require every single machine to carry a label with the specific glibc version it has (although there is a small chance of confusion in having a 2.17 label on a machine with, e.g., 2.31).

For reference, this infrastructure wiki page lists the glibc versions on the Linux distributions we have an interest in. Note that the current way of blocking things from running on machines with a pre-2.17 version is to tag them with a specific label.

@smlambert Have I captured this correctly as per your understanding?
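The capability-label idea above could be sketched as a small agent-side script that emits a label from the machine's actual glibc. The `sw.tool.glibc.<version>` name below is an illustrative assumption in the dotted style of the aqa-tests label schema, not the agreed label:

```shell
#!/bin/sh
# Hypothetical sketch: derive a glibc capability label for a Jenkins agent.
# "sw.tool.glibc" is an assumed prefix following the dotted label-schema
# style; the real schema may choose a different category or name.
ver=$(getconf GNU_LIBC_VERSION | awk '{print $2}')
echo "sw.tool.glibc.${ver}"
```

A Grinder could then request a suitable machine with a Jenkins label expression such as `ci.role.test && sw.tool.glibc.2.17` (again, hypothetical names), rather than each user remembering the glibc constraint by hand.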
Pretty much, as per the original discussion and per how some of the real-world labels look; dotted version numbers appear directly in the labels:
OK so I'll look at implementing
Yep - |
Currently the test information mainly lives in two places in the build pipelines:
That configuration can be moved to https://github.com/adoptium/aqa-tests/tree/master/buildenv/jenkins/config/temurin and https://github.com/adoptium/aqa-tests/tree/master/buildenv/jenkins/config/openj9
/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found on https://ci.adoptium.net/computer/test%2Ddocker%2Dubuntu2004%2Dx64%2D4/ (https://ci.adoptium.net/view/Test_grinder/job/Grinder_Simple/391/).
The same job runs with no issues on https://ci.adoptium.net/computer/test-docker-ubuntu2204-x64-6/ (https://ci.adoptium.net/view/Test_grinder/job/Grinder_Simple/392/).