Hi folks! Is there a way you could reduce the size and/or the number of layers of the docker image at uwgac/topmed-master:2.6.0? It takes a bit of time to download the image even though it's not that large, and I wonder if the number of layers contributes to that. Thank you! -Kaushik
Hi Kaushik,

The number of layers doesn't necessarily make the download slower. If anything, more layers can make it faster, for two reasons: (1) layers are downloaded in parallel (see
https://stackoverflow.com/questions/43479614/docker-parallel-operations-limit), and (2) when the image is updated, a docker pull only downloads the layers that have changed.

That said, we could probably shave off about 500 MB: we currently build two versions of R and the R packages, one against the sequential version of MKL (Intel's Math Kernel Library) and one against the parallel version. I don't think that would improve the download time by much, though.
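As an aside, if the parallel layer downloads are the bottleneck on your end, you could try raising the Docker daemon's concurrent-download limit (the default is 3). A minimal sketch, assuming a stock Docker Engine on Linux and that you don't already have a /etc/docker/daemon.json you'd be overwriting; the value 6 is just illustrative:

```sh
# Raise the number of layers pulled in parallel (default is 3).
# NOTE: this overwrites any existing /etc/docker/daemon.json.
cat <<'EOF' | sudo tee /etc/docker/daemon.json
{
  "max-concurrent-downloads": 6
}
EOF

# Restart the daemon so the setting takes effect, then re-pull the image.
sudo systemctl restart docker
docker pull uwgac/topmed-master:2.6.0
```

Whether this helps depends mostly on your network bandwidth and the registry, so it may not change much in practice.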
cheers, roy