
Periodic ENOENT running in CI #86

Open
nathancahill opened this issue Jun 12, 2020 · 20 comments
Labels
bug, Help Wanted (Issues where the community can get involved)

Comments

@nathancahill

This is an error that happens >50% of the time on CircleCI. No changes are made between one run and the next: one run might pass, and the next will fail with this error. All dependency versions are pinned in yarn.lock.

12 06 2020 08:29:13.851:ERROR [SaucelabsLauncher]: Error: spawn /home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:268:19)
    at onErrorNT (internal/child_process.js:468:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)

{
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  path: '/home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  spawnargs: [ '--version' ],
  killed: false,
  stdout: '',
  stderr: '',
  failed: true,
  signal: null,
  cmd: '/home/circleci/repo/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc --version',
  timedOut: false
}
@christian-bromann
Contributor

@nathancahill can you try to run the pipeline without the cache? I am afraid that the cache, which includes the Sauce Connect binary, is being reused across different environments.

@christian-bromann
Contributor

ping @nathancahill

@nathancahill
Author

Thanks, I think removing the cache solved the issue. I'll reopen in the future if it reappears.

@nathancahill
Author

nathancahill commented Jun 29, 2020

Even with the cache disabled we're still getting periodic errors.

Previously, with the cache, the error rate was probably 50%. Without the cache, it's around 10%.

@nathancahill reopened this Jun 29, 2020
@Sam55555

Sam55555 commented Jul 3, 2020

The error also occurred on every run once the GitLab runner had cached the workspace, on GitLab CI with the Docker executor. There was no error on the first run with a clean cache.

@enriquegh
Contributor

I haven't been able to reproduce this on a private GitLab instance with a Docker runner so far.

I was able to reproduce it, however, when using the node Alpine image:

 $ NODE_OPTIONS=--trace-warnings ts-node index.ts
Error: spawn /builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:268:19)
    at onErrorNT (internal/child_process.js:468:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  path: '/builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc',
  spawnargs: [ '--version' ],
  killed: false,
  stdout: '',
  stderr: '',
  failed: true,
  signal: null,
  cmd: '/builds/enrique/sauce-sc-example/node_modules/saucelabs/build/.sc-v4.5.4/bin/sc --version',
  timedOut: false
}

This gave an error 100% of the time, though, not intermittently.
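
One possible explanation for the Alpine-only, 100%-reproducible case (an assumption, not something confirmed in this thread): spawn can report ENOENT even when the file exists, for example when a glibc-linked binary is run on a musl-based image such as node:alpine and its ELF interpreter is missing. A small diagnostic sketch in TypeScript to tell the two cases apart, using the sc path from the logs above made relative to the project root:

import { existsSync, accessSync, constants } from 'fs';
import { execFileSync } from 'child_process';

// Path based on the error above; adjust the version segment for your install.
const sc = 'node_modules/saucelabs/build/.sc-v4.5.4/bin/sc';

if (!existsSync(sc)) {
  console.error('sc is genuinely missing -> a download/cache problem');
} else {
  accessSync(sc, constants.X_OK); // throws if the file is not executable
  try {
    console.log(execFileSync(sc, ['--version'], { encoding: 'utf8' }));
  } catch (err) {
    // The file exists but cannot be run; on node:alpine this can happen when the
    // binary is linked against glibc and its ELF interpreter is absent.
    console.error('sc exists but cannot be executed:', err);
  }
}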

@christian-bromann added the Hacktoberfest label (Curated issues which are well scoped and ready to be worked on as part of Hacktoberfest) Oct 6, 2020
@enriquegh
Contributor

From @joventuraz:

It looks like the node module is not available. If there happened to be more than one concurrent build, and one is finishing and cleaning up, it might affect the other build if they are using the same workspace, which it seems like they might be, judging from the path in the error.

Adding to this, I think the issue lies somewhere in bin-wrapper and how it does its checks for the file/folder.

It seems like bin-wrapper is a collection of small modules, so it was a bit hard to pinpoint exactly what is missing.

Some things we could try:

  • Do more checks on whether the binary is actually there or not, and retry the download if it's not (see the sketch after this list)
  • Allow users to specify a folder where sc should be looked for, so they can download it manually
  • Use something other than bin-wrapper, since it does not seem to be very actively maintained
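
A minimal sketch of the first suggestion in the list above, assuming a hypothetical downloadScBinary() helper (the real download logic lives inside the saucelabs package and is not shown here):

import { existsSync } from 'fs';

// downloadScBinary is a stand-in for whatever actually fetches Sauce Connect.
async function ensureScBinary(
  scPath: string,
  downloadScBinary: () => Promise<void>,
  retries = 3
): Promise<string> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    if (existsSync(scPath)) return scPath;
    console.warn(`sc missing at ${scPath}, re-downloading (attempt ${attempt}/${retries})`);
    await downloadScBinary();
  }
  if (existsSync(scPath)) return scPath;
  throw new Error(`sc still missing at ${scPath} after ${retries} download attempts`);
}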

@christian-bromann
Contributor

I proposed a change in the Sauce Connect Launcher to retry starting Sauce Connect if it fails: karma-runner/karma-sauce-launcher#219

Let's see if we can fix this issue with that.

@christian-bromann
Contributor

@nathancahill have you experienced the same issue after updating the package?

@christian-bromann removed the Hacktoberfest label (Curated issues which are well scoped and ready to be worked on as part of Hacktoberfest) Dec 4, 2020
@Seamoo13

Seamoo13 commented Feb 17, 2021

@christian-bromann Are there any updates on this? I am also having this issue.
(screenshot attached: sauceConnectIssue)

@markcellus

markcellus commented May 3, 2021

I'm having this issue too using Node 14 Alpine (14.16.1). It seemed to no longer be an issue after downgrading Node back to 12, but I've since verified that this is not the case.

@christian-bromann added the Help Wanted label (Issues where the community can get involved) Jun 22, 2021
@Seamoo13

Seamoo13 commented Jul 2, 2021

The issue went away for my project for a few months, then suddenly returned and is occurring roughly 50% of the time (Travis CI).

(screenshot attached)

"sauce-connect-launcher": "1.3.2",
"@wdio/sauce-service": "7.5.7",
"@wdio/cli": "7.5.7",

@LeParadoxHD

Any update on this?

@enriquegh
Contributor

@wswebcreation would 7.1.3 fix this as bin-wrapper is now replaced?

@wswebcreation
Contributor

Hi @enriquegh

I'm not 100% sure; we simplified the download now, so it's worth a try.

@enriquegh
Contributor

I'm going to close this ticket since bin-wrapper is no longer used and no new reports have been made.
If someone still runs into the issue, we can re-open it.

@Wolftousen

Wolftousen commented Jun 14, 2022

Not sure if a new issue should be opened for this, but we are experiencing this issue too. Here is the wdio/saucelabs info from our package.json:

"@wdio/cli": "^7.19.7",
"@wdio/config": "^7.16.13",
"@wdio/cucumber-framework": "^7.19.7",
"@wdio/local-runner": "^7.19.7",
"@wdio/sauce-service": "^7.20.2",
"@wdio/selenium-standalone-service": "^7.16.13",
"@wdio/spec-reporter": "^7.19.7",
"@wdio/static-server-service": "^7.16.13",
"@wdio/types": "^7.16.13",

And here is the output we are getting with CircleCI:

Error: spawn /project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:274:19)
    at onErrorNT (internal/child_process.js:469:16)
    at processTicksAndRejections (internal/process/task_queues.js:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  path: '/project_path/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  spawnargs: [
    '--no-autodetect',
    '--tunnel-name=team_tunnel_name',
    '--user=saucelabs_user',
    '--api-key=saucelabs_key',
    '--region=aws-region'
  ]
}

I added some logging in the pipeline after looking at my local node_modules, and found that in the pipeline node_modules/saucelabs/build/sc-loader does not exist at all. However, node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin does contain sc. So I'm not sure what the purpose of downloading it twice is.

I was able to add these commands into my test run job to trick wdio/saucelabs into using the sc that exists in saucelabs/sc-loader instead of saucelabs/build/sc-loader:

mkdir -p node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin
ln -s node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin/sc node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc

That caused the ENOENT error to go away, but now I get the error referenced in this issue: webdriverio/webdriverio#5900
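
For reference, the same workaround as a small Node/TypeScript script (paths taken from the comment above; the absolute link target avoids the pitfall that a relative symlink target is resolved against the link's own directory rather than the project root):

import { existsSync, mkdirSync, symlinkSync } from 'fs';
import { dirname, resolve } from 'path';

// Paths from the comment above; adjust the .sc-v4.8.0 segment to match your install.
const realBinary = resolve('node_modules/saucelabs/sc-loader/.sc-v4.8.0/bin/sc');
const expectedPath = resolve('node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc');

if (existsSync(realBinary) && !existsSync(expectedPath)) {
  mkdirSync(dirname(expectedPath), { recursive: true });
  // Link at the path the launcher expects, pointing at the binary that actually exists.
  symlinkSync(realBinary, expectedPath);
}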

@enriquegh reopened this Jun 16, 2022
@enriquegh
Contributor

I've re-opened this as it's essentially the same issue as before.
Looks like we replaced bin-wrapper with the download module (link).

We have something that in theory checks that the binary is there, so I'm not sure why this is happening.

@kleinbs

kleinbs commented Jan 20, 2023

We are running into the same issue here, running in an Alpine Docker container as part of a GitLab CI job. Is this still being worked on, or is there a good workaround?

Execution of 2 workers started at 2023-01-20T17:59:03.464Z
2023-01-20T17:59:03.526Z DEBUG @wdio/utils:initialiseServices: initialise service "sauce" as NPM package
2023-01-20T17:59:04.939Z INFO @wdio/cli:launcher: Run onPrepare hook
2023-01-20T17:59:04.942Z INFO @wdio/sauce-service: Starting Sauce Connect Tunnel
Error: spawn /builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc ENOENT
    at Process.ChildProcess._handle.onexit (node:internal/child_process:283:19)
    at onErrorNT (node:internal/child_process:478:16)
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  path: '/builds/consumer-identity/OneID/infra-eng/infra-source-bundles/my-account/e2e/node_modules/saucelabs/build/sc-loader/.sc-v4.8.0/bin/sc',
  spawnargs: [
    '--verbose',
    '--logfile=-',
    '--no-autodetect',
    '--tunnel-name=SC-tunnel-40602318395085324',
    '--no-ssl-bump-domains=127.0.0.1,localhost,172.17.0.3',
    '--user=MyAccountTestRunner',
    '--api-key=7ee95499-d176-4eed-ace0-0c83a4ec6e74',
    '--region=us-west-1'
  ]
}

@naruaway
Contributor

I think this is probably one of the root causes: #241
