
/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found #5754

Open
sophia-guo opened this issue Sep 23, 2024 · 16 comments

@sophia-guo
Contributor

/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found on https://ci.adoptium.net/computer/test%2Ddocker%2Dubuntu2004%2Dx64%2D4/

https://ci.adoptium.net/view/Test_grinder/job/Grinder_Simple/391/
The same job has no issues on https://ci.adoptium.net/computer/test-docker-ubuntu2204-x64-6/
https://ci.adoptium.net/view/Test_grinder/job/Grinder_Simple/392/

@sxa
Member

sxa commented Sep 23, 2024

It looks like you're pulling a build from a third party which has been built targeting a newer version of glibc than is shipped with Ubuntu 20.04.

This seems to be "working as designed": that build is not a Temurin build and is not compatible with that distribution, so I'm not sure what you are expecting us to do here.
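A quick way to confirm which glibc versions a third-party build actually requires is to inspect its dynamic symbol table. This is a sketch, not anything from the Adoptium tooling; the JDK path in the comment is hypothetical:

```shell
# Sketch: list the glibc symbol versions a binary requires.
# objdump -T prints the dynamic symbol table; each entry names the
# GLIBC_x.y version it was linked against. The highest one is the
# minimum glibc the binary needs at runtime.
list_glibc_reqs() {
  objdump -T "$1" | grep -o 'GLIBC_[0-9.]*' | sort -u -V
}

# e.g. list_glibc_reqs ./jdk/bin/java | tail -1   (hypothetical path)

# The sort logic itself, demonstrated on sample symbol versions:
printf 'GLIBC_2.2.5\nGLIBC_2.34\nGLIBC_2.17\n' | sort -u -V | tail -1
# -> GLIBC_2.34
```

`sort -V` (GNU version sort) is what makes `2.2.5` order before `2.34`; a plain lexical sort would get this wrong.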

@sophia-guo
Contributor Author

Hmm, I didn't realize it was working as designed. Closing.

@sophia-guo
Contributor Author

It seems the Temurin build has the same issue: the JDK from https://github.com/adoptium/temurin24-binaries/releases/tag/jdk-24%2B14-ea-beta

https://ci.adoptium.net/view/Test_grinder/job/Grinder/11142/console

Does this mean for jdk24+ GLIBC_2.14 is a must? @andrew-m-leonard

@sxa
Member

sxa commented Oct 9, 2024

Temurin 21+ on Linux/x64 is built on CentOS/RHEL7 and will not run on machines with a glibc version earlier than the ones on those distributions. This is why the pipelines for those versions have

"ADDITIONAL_TEST_LABEL": "!(centos6||rhel6)",

adoptium/ci-jenkins-pipelines#1005 was the PR which introduced this (which you approved ;-) )

@sxa sxa closed this as completed Oct 9, 2024
@sxa
Member

sxa commented Oct 9, 2024

Does this mean for jdk24+ GLIBC_2.14 is a must? @andrew-m-leonard

It's actually higher - see footnote 1 in the table on the supported platforms page, which tells you that it requires 2.17, the version in RHEL/CentOS7.
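That 2.17 floor can be checked locally with an ordinary version comparison. A minimal sketch (the `glibc_at_least` helper is hypothetical, not part of any Adoptium repo):

```shell
# Sketch: check whether a glibc version meets the 2.17 minimum that
# Temurin 21+ on Linux/x64 requires (it is built on CentOS/RHEL7).
glibc_at_least() {  # usage: glibc_at_least <installed> <required>
  # With an ascending version sort, the first line is the smaller version;
  # if that equals the required version, the installed one is >= it.
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

glibc_at_least 2.31 2.17 && echo "2.31 is new enough"
glibc_at_least 2.12 2.17 || echo "2.12 is too old"
```

On a glibc system the installed version can be read from the first line of `ldd --version` and fed into the same check.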

@sophia-guo
Contributor Author

Yeah, the ADDITIONAL_TEST_LABEL from adoptium/ci-jenkins-pipelines#1005 does solve the issue when test jobs are triggered by the build pipeline. But it means that for a Grinder, users have to figure this out themselves, and could easily forget...

@smlambert smlambert reopened this Nov 15, 2024
@smlambert
Contributor

smlambert commented Nov 15, 2024

Re-opening this one to ensure we apply an appropriate test label (following an agreed upon labelling schema so it interoperates across all Jenkins servers) and modify the test pipeline code (rather than the build pipeline code), to avoid continuously hitting this when running Grinders.

https://ci.adoptium.net/view/Test_grinder/job/Grinder/11386/console

@smlambert smlambert transferred this issue from adoptium/infrastructure Nov 15, 2024
@sxa
Member

sxa commented Nov 15, 2024

Re-opening this one to ensure we apply an appropriate test label (following an agreed upon labelling schema so it interoperates across all Jenkins servers) and modify the test pipeline code (rather than the build pipeline code), to avoid continuously hitting this when running Grinders.

https://ci.adoptium.net/view/Test_grinder/job/Grinder/11386/console

Can we go back a step and clarify the goal here? I wasn't aware we were "continually" hitting this, and the rhel6 label should cover all the Temurin use-cases.

Are we trying to allow arbitrary binaries from elsewhere, which may have arbitrary requirements, to execute on our machines? Without knowing that, the number of labels required to support it and the complexity of the expressions could get a bit unwieldy.

@smlambert
Contributor

Are we trying to allow arbitrary binaries from elsewhere, which may have arbitrary requirements, to execute on our machines? Without knowing that, the number of labels required to support it and the complexity of the expressions could get a bit unwieldy.

No.

We are trying to write test pipeline code that can run on any vendor's own Jenkins server without forking or modifying it. As such, we have agreed upon a label schema, so that when we need new labels for things, it's predictable and remains the same across all the disparate servers.

@sxa
Member

sxa commented Nov 15, 2024

Are we trying to allow arbitrary binaries from elsewhere, which may have arbitrary requirements, to execute on our machines? Without knowing that, the number of labels required to support it and the complexity of the expressions could get a bit unwieldy.

No.

We are trying to write test pipeline code that can run on any vendor's own Jenkins server without forking or modifying it. As such, we have agreed upon a label schema, so that when we need new labels for things, it's predictable and remains the same across all the disparate servers.

Gotcha - I think it would be good to have a working session on a call to thrash this one out, since there are likely a lot of subtleties in coming up with something practical that avoids "tag sprawl" complexities. It sounds like we are looking for a generic form of our rhel6 label, on the basis that other people might have a similar situation for their testing, where they're running a Jenkins server on which not all agents are capable of running the JVM they have. Building targeting a specific distribution would be a good example.

@smlambert
Contributor

smlambert commented Nov 19, 2024

I think it would be good to have a working session on a call to thrash this one

That public discussion happened a long time ago, in 2018 I believe; see the related issue adoptium/infrastructure#93, from which sprang the current labelling schema https://github.com/adoptium/aqa-tests/blob/master/doc/pages/LabelSchema.md#jenkins-machine-labelling-schema that we use for test pipelines, exactly to avoid duplication and sprawl.

One of the reasons we chose to use 'long name labels' is to help categorize the types of labels and give guidance to anyone who wanted to extend the schema.

@sxa
Member

sxa commented Nov 19, 2024

One of the reasons we chose to use 'long name labels' is to help categorize the types of labels and give guidance to anyone who wanted to extend the schema.

Yeah, absolutely - I have no issues with that, but I think we need to think carefully about how many different ones we want to have for this particular use case. Maybe a topic for Tuesday's call :-)

@sxa
Member

sxa commented Nov 28, 2024

This was discussed in the community call this week and I'll try to minute it here. The summary was that each Jenkins instance should have labels for the capabilities it is expected to require, e.g. sw.tool.glibc2.12, sw.tool.glibc2.17, or any later version that may be required by anyone's Jenkins instance. (Should it be 2.12 or 212 or 2-12, given that . is used as a separator at that level?)

The jobs can then use an ADDITIONAL_TEST_LABEL field which matches this capability, so any machine that has something higher than 2.17 could still claim sw.tool.glibc2.17. (For discussion: should it also have sw.tool.glibc2.12, since it can run older ones too? It depends on whether we use sw.tool.glibc2.17 or !sw.tool.glibc2.12 for the ADDITIONAL_NODE_LABEL expression.)

Doing this would not require every single machine to have a label with the specific glibc version it has (although there is a small chance of confusion in having 2.17 on a machine with e.g. 2.31). For reference, this infrastructure wiki page lists the glibc versions on the Linux distributions we have an interest in.

Note that the current way of blocking things from running on machines with a pre-2.17 version is to tag them with rhel6 instead of sw.tool.glibc2.17, and have the jobs use an ADDITIONAL_TEST_LABEL of !rhel6.

@smlambert Have I captured this correctly as per your understanding?

@smlambert
Contributor

Have I captured this correctly as per your understanding?

Pretty much. As per the original discussion, and per how some of the real-world labels look, dotted version numbers look like:

  • sw.os.zos.1_13 (where the dots in version numbers are represented by _)
  • the name of the parameter is LABEL_ADDITION (which appends && and its value to the LABEL parameter, so our approach has been to use !someValue when that makes more sense, as in this case)

@sxa
Member

sxa commented Nov 29, 2024

sw.os.zos.1_13 (where the dots in version numbers are represented by _)

OK so I'll look at implementing sw.tool.glibc.2_12 for us initially as a direct replacement for the current rhel6 and centos6 labels :-)

the name of the parameter is LABEL_ADDITION (which appends && and its value to the LABEL parameter, so our approach has been to use !someValue when that makes more sense, as in this case)

Yep - ADDITIONAL_TEST_LABEL is the equivalent parameter in the json of the platform-specific pipelines which should get passed down to LABEL_ADDITION in the test jobs. e.g. at https://github.com/adoptium/ci-jenkins-pipelines/blob/782ebb0ffc5fbf9947dfb6c9065648e47bab2c7f/pipelines/jobs/configurations/jdk21u_pipeline_config.groovy#L30
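The dots-to-underscores convention above is mechanical enough to script. A sketch (the `glibc_label` helper is hypothetical, purely for illustration):

```shell
# Sketch: derive a schema-style label such as sw.tool.glibc.2_12 from a
# glibc version string, following the convention that dots inside
# version numbers become underscores.
glibc_label() {
  printf 'sw.tool.glibc.%s\n' "$(printf '%s' "$1" | tr '.' '_')"
}

glibc_label 2.12   # -> sw.tool.glibc.2_12
glibc_label 2.17   # -> sw.tool.glibc.2_17
```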

@sophia-guo
Contributor Author

modify the test pipeline code (rather than the build pipeline code)

Currently the test information is mainly located in two places in the build pipelines:

Those configurations could be moved to https://github.com/adoptium/aqa-tests/tree/master/buildenv/jenkins/config/temurin and https://github.com/adoptium/aqa-tests/tree/master/buildenv/jenkins/config/openj9
