
could not load: lapack.dll #133

Open
sdmcallister opened this issue Oct 29, 2021 · 5 comments

@sdmcallister

Running Windows 10, Nim 1.6.

Successfully installed ggplotnim:

 Installing ggplotnim@0.4.8
   Success: ggplotnim installed successfully.

Tried to run:

import ggplotnim
import ggplotnim/ggplot_vega
let mpg = toDf(readCsv("data/mpg.csv"))
ggplot(mpg, aes(x = "displ", y = "cty", color = "class")) +
  geom_point() +
  ggtitle("ggplotnim in Vega-Lite!") +
  ggvega("rSimpleVegaLite.html") # w/o arg creates a `/tmp/vega_lite_plot.html`

got:

C:\Users\mcallistst\Desktop\test>nim r plotnim.nim
Hint: used config file 'C:\Users\mcallistst\nim\nim-1.6.0\config\nim.cfg' [Conf]
Hint: used config file 'C:\Users\mcallistst\nim\nim-1.6.0\config\config.nims' [Conf]
Hint: used config file 'C:\Users\mcallistst\Desktop\test\config.nims' [Conf]
.........................................................................................................................................
C:\Users\mcallistst\.nimble\pkgs\nimblas-0.2.2\nimblas\private\common.nim(50, 7) Hint: Using BLAS library with name: blas.dll [User]
................................................................................................................................
C:\Users\mcallistst\.nimble\pkgs\nimlapack-0.2.0\nimlapack.nim(19, 7) Hint: Using LAPACK library with name: lapack.dll [User]
....................................................................
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\dataframe.nim(1496, 29) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkIdent in node false [User]
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\dataframe.nim(1496, 29) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[localCol][idx] [User]
FormulaNode(name: "(== isNull(df[localCol][idx]).toBool false)",
            colName: "(== isNull(df[localCol][idx]).toBool false)",
            kind: fkVector, resType: toColKind(type(bool)), fnV: proc (
    df: DataFrame): Column =
  let localColT = df[localCol, Value]
  var res = newTensor[bool](df.len)
  for idx in 0 ..< df.len:
    res[idx] = isNull(localColT[idx]).toBool == false
  result = toColumn res)
..............
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(86, 7) template/generic instantiation of `closureScope` from here
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(91, 19) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[col.toStr][idx] [User]
FormulaNode(name: "(~ colStr (ms.trans df[col.toStr][idx]))", colName: colStr,
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let coltoStrT = df[col.toStr, float]
  var res = newTensor[float](df.len)
  forEach r in res,coltoStrIdx in coltoStrT:
    r = ms.trans(coltoStrIdx)
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(236, 39) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkAccQuoted in node `keys` [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(236, 39) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkIdent in node existKeys [User]
FormulaNode(name: "(notin keys existKeys)", colName: "(notin keys existKeys)",
            kind: fkVector, resType: toColKind(type(bool)), fnV: proc (
    df: DataFrame): Column =
  let keys = df["keys", string]
  var res = newTensor[bool](df.len)
  for idx in 0 ..< df.len:
    res[idx] = keys[idx] notin existKeys
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(237, 28) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(1106, 14) Warning: Formula (~ countCol 0) has a mismatch between given formula kind:
        `~` (mapping)
and automatically determined formula kind:
        << (reducing)
Please adjust the given kind to `<<`. [User]
FormulaNode(name: "(~ countCol 0)", colName: countCol, kind: fkVector,
            resType: toColKind(type(int)), fnV: proc (df: DataFrame): Column =
  var res = newTensor[int](df.len)
  forEach r in res:
    r = 0
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(276, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[maxName][idx] [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(276, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[minName][idx] [User]
FormulaNode(name: "(- (~ height df[maxName][idx]) df[minName][idx])",
            colName: "height", kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let
    maxNameT = df[maxName, float]
    minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,maxNameIdx in maxNameT,minNameIdx in minNameT:
    r = maxNameIdx - minNameIdx
  result = toColumn res)
FormulaNode(name: "(~ yColName df[minName][idx])", colName: yColName,
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,minNameIdx in minNameT:
    r = minNameIdx
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(294, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[maxName][idx] [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(294, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[minName][idx] [User]
FormulaNode(name: "(- (~ width df[maxName][idx]) df[minName][idx])",
            colName: "width", kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let
    maxNameT = df[maxName, float]
    minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,maxNameIdx in maxNameT,minNameIdx in minNameT:
    r = maxNameIdx - minNameIdx
  result = toColumn res)
FormulaNode(name: "(~ xColName df[minName][idx])", colName: xColName,
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,minNameIdx in minNameT:
    r = minNameIdx
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(326, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[maxName][idx] [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(326, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[minName][idx] [User]
FormulaNode(name: "(- (~ height df[maxName][idx]) df[minName][idx])",
            colName: "height", kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let
    maxNameT = df[maxName, float]
    minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,maxNameIdx in maxNameT,minNameIdx in minNameT:
    r = maxNameIdx - minNameIdx
  result = toColumn res)
FormulaNode(name: "(~ yColName df[minName][idx])", colName: yColName,
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,minNameIdx in minNameT:
    r = minNameIdx
  result = toColumn res)
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(346, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[maxName][idx] [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\postprocess_scales.nim(346, 23) template/generic instantiation of `{}` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkBracketExpr in node df[minName][idx] [User]
FormulaNode(name: "(- (~ width df[maxName][idx]) df[minName][idx])",
            colName: "width", kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let
    maxNameT = df[maxName, float]
    minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,maxNameIdx in maxNameT,minNameIdx in minNameT:
    r = maxNameIdx - minNameIdx
  result = toColumn res)
FormulaNode(name: "(~ xColName df[minName][idx])", colName: xColName,
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let minNameT = df[minName, float]
  var res = newTensor[float](df.len)
  forEach r in res,minNameIdx in minNameT:
    r = minNameIdx
  result = toColumn res)
.....
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\ggplot_vega.nim(214, 22) template/generic instantiation of `fn` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkAccQuoted in node `displ` [User]
C:\Users\mcallistst\.nimble\pkgs\ggplotnim-0.4.8\ggplotnim\ggplot_vega.nim(214, 22) template/generic instantiation of `fn` from here
C:\Users\mcallistst\.nimble\pkgs\datamancer-0.1.8\datamancer\formula.nim(818, 12) Warning: What kind? nnkAccQuoted in node `binWidths` [User]
FormulaNode(name: "(+ (~ binEnd displ) binWidths)", colName: "binEnd",
            kind: fkVector, resType: toColKind(type(float)), fnV: proc (
    df: DataFrame): Column =
  let
    displ = df["displ", float]
    binWidths = df["binWidths", float]
  var res = newTensor[float](df.len)
  forEach r in res,displIdx in displ,binWidthsIdx in binWidths:
    r = displIdx + binWidthsIdx
  result = toColumn res)
C:\Users\mcallistst\Desktop\test\plotnim.nim(3, 11) Warning: `toDf` is not required anymore, because `readCsv` already returns an actual `DataFrame` nowadays. Feel free to remove the `toDf` call.; toDf is deprecated [Deprecated]
CC: adler32
CC: compress
CC: crc32
CC: deflate
CC: gzclose
CC: gzlib
CC: gzread
CC: gzwrite
CC: infback
CC: inffast
CC: inflate
CC: inftrees
CC: trees
CC: uncompr
CC: zutil
CC: read
CC: write
CC: stdlib_digitsutils.nim
CC: stdlib_assertions.nim
CC: stdlib_formatfloat.nim
CC: stdlib_dollars.nim
CC: stdlib_widestrs.nim
CC: stdlib_io.nim
CC: stdlib_system.nim
CC: stdlib_hashes.nim
CC: stdlib_math.nim
CC: stdlib_parseutils.nim
CC: stdlib_unicode.nim
CC: stdlib_strutils.nim
CC: stdlib_streams.nim
CC: stdlib_lexbase.nim
CC: stdlib_options.nim
CC: stdlib_dynlib.nim
CC: stdlib_winlean.nim
CC: stdlib_times.nim
CC: stdlib_pathnorm.nim
CC: stdlib_win_setenv.nim
CC: stdlib_os.nim
CC: ../../.nimble/pkgs/chroma-0.2.5/chroma/names.nim
CC: ../../.nimble/pkgs/chroma-0.2.5/chroma/colortypes.nim
CC: ../../.nimble/pkgs/chroma-0.2.5/chroma/transformations.nim
CC: ../../.nimble/pkgs/chroma-0.2.5/chroma.nim
CC: stdlib_strformat.nim
CC: ../../.nimble/pkgs/seqmath-0.1.12/seqmath/sutil.nim
CC: ../../.nimble/pkgs/seqmath-0.1.12/seqmath/smath.nim
CC: ../../.nimble/pkgs/ginger-0.3.6/ginger/types.nim
CC: ../../.nimble/pkgs/ginger-0.3.6/ginger/backendCairo.nim
CC: ../../.nimble/pkgs/ginger-0.3.6/ginger.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/laser/tensor/datatypes.nim
CC: stdlib_monotimes.nim
CC: stdlib_random.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/laser/private/nested_containers.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/laser/tensor/initialization.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/init_cpu.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/accessors_macros_syntax.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/private/p_accessors_macros_desugar.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/operators_comparison.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/private/p_display.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/laser/cpuinfo_x86.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/operators_logical.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/math_functions.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/optim_ops_fusion.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/ml/clustering/dbscan.nim
CC: ../../.nimble/pkgs/nimlapack-0.2.0/nimlapack.nim
CC: stdlib_nativesockets.nim
CC: stdlib_base64.nim
CC: stdlib_httpcore.nim
CC: stdlib_asyncfutures.nim
CC: stdlib_asyncdispatch.nim
CC: stdlib_httpclient.nim
CC: stdlib_logging.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/datasets/imdb.nim
CC: stdlib_memfiles.nim
CC: ../../.nimble/pkgs/stb_image-2.5/stb_image/write.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/nlp/tokenizers.nim
CC: ../../.nimble/pkgs/arraymancer-0.7.8/arraymancer/tensor/einsum.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/value.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/column.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/df_types.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/formulaExp.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/formula.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/dataframe.nim
CC: ../../.nimble/pkgs/datamancer-0.1.8/datamancer/io.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_types.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_scales.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_theme.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_ticks.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/colormaps/viridisRaw.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_styles.nim
CC: ../../.nimble/pkgs/polynumeric-0.2.0/polynumeric.nim
CC: ../../.nimble/pkgs/scinim-0.2.2/scinim/signals/filters.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/postprocess_scales.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/collect_and_fill.nim
CC: stdlib_parsejson.nim
CC: stdlib_json.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim.nim
CC: ../../.nimble/pkgs/webview-0.1.0/webview.nim
CC: ../../.nimble/pkgs/ggplotnim-0.4.8/ggplotnim/ggplot_vega.nim
CC: plotnim.nim
Hint:  [Link]
Hint: gc: refc; opt: none (DEBUG BUILD, `-d:release` generates faster code)
216849 lines; 24.461s; 537.789MiB peakmem; proj: C:\Users\mcallistst\Desktop\test\plotnim.nim; out: C:\Users\mcallistst\nimcache\plotnim_d\plotnim_DF6D6F2080FD85F1BEF57F4234EF6E60111CB904.exe [SuccessX]
Hint: C:\Users\mcallistst\nimcache\plotnim_d\plotnim_DF6D6F2080FD85F1BEF57F4234EF6E60111CB904.exe  [Exec]
could not load: lapack.dll
Error: execution of an external program failed: 
@Vindaar (Owner) commented Oct 29, 2021

Ouch, this is the downside of adding geom_smooth (#127). geom_smooth internally uses BLAS to solve a linear least squares problem for the Savitzky-Golay filters.

I didn't quite realize that dead code elimination wouldn't actually eliminate the dependency as long as geom_smooth is not used. :/ Especially on Windows this is a bit of a pain.
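
For context on the mechanism, a hedged illustration (the proc name and parameter list below are simplified placeholders, not nimlapack's actual declaration): nimlapack binds LAPACK routines via Nim's dynlib pragma, so the compiled program loads the shared library during startup and fails with exactly the "could not load: lapack.dll" message above, even if the routine is never invoked.

# Hedged sketch: a dynlib-imported proc makes the binary load lapack.dll at
# startup via nimLoadLibrary, whether or not the proc is ever called.
# The name and signature are placeholders, not nimlapack's real binding.
proc dgels(m, n, nrhs: ptr cint) {.importc: "dgels_", dynlib: "lapack.dll".}

when isMainModule:
  echo "dgels is never called, but lapack.dll must still be loadable"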

The "solution" for you would be to get BLAS & LAPACK running on Windows... But that's not really a good solution, as the dependency really shouldn't be there unless the relevant functionality is used. The offending code is here:

https://github.com/Vindaar/ggplotnim/blob/master/src/ggplotnim/postprocess_scales.nim#L505-L533

(and of course the filledSmoothGeom below that, which calls it).

I will try to come up with a solution somehow.

edit:
In the CI I use msys2 to install the dependencies, see here:

https://github.com/Vindaar/ggplotnim/blob/master/.github/workflows/ci.yml#L65-L71

I'm not sure if using msys2 could be a (temporary?) solution for you.

edit 2:
Curiously though, compiling this example on my Linux machine does not lead to a BLAS or LAPACK dependency. Not sure if this is Windows' fault?

@sdmcallister (Author) commented

@Vindaar

Just flagging this mainly to help target issues that may impact new users, and I appreciate the suggestions. Cairo is a pain on Windows, as you warn in the README, so I thought I might try it with the Vega-Lite backend.

Is using Pixie instead of Cairo possible/desirable?

@Vindaar (Owner) commented Oct 30, 2021

Yes, that was indeed a good idea. In the CI I also use msys2 to install cairo, but there we at least have a few known options so far.

At the moment the Vega-Lite backend is not quite intended as an independent backend to avoid the Cairo dependency. While that would be possible, I simply didn't have that in mind when writing the code. So currently you'll still end up with a libcairo dependency even if you only call the Vega-Lite backend (at least I think so).
However, you can already achieve Cairo-less compilation if you compile with -d:noCairo, which replaces the actual ginger Cairo backend with a dummy backend. Since Cairo isn't needed for the Vega-Lite backend, that should just work.
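
For reference, a hedged sketch of setting that define in the project's config.nims (the compile log above already shows a config.nims in the test directory being picked up), equivalent to passing -d:noCairo on the command line:

# config.nims next to plotnim.nim: always compile with the noCairo define,
# so ginger falls back to its dummy backend instead of Cairo.
switch("define", "noCairo")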

I just checked whether compilation with -d:noCairo does indeed get rid of the cairo dependency, but confusingly it still shows up via objdump / ldd and I have no clue why right now. I'm a bit stumped. (see edit below)

On another note, I just realized (from looking at the Nimble file) that for the Windows CI I also compile with -d:lapack=liblapack because apparently the naming is different (at least for the msys2 lapack).
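
A similarly hedged config.nims sketch for that define, assuming the msys2-provided LAPACK is installed and exposed under the liblapack name:

# config.nims sketch: tell nimlapack to load "liblapack" instead of the
# default "lapack" name, mirroring the -d:lapack=liblapack flag used in CI.
switch("define", "lapack=liblapack")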

And regarding a Pixie backend:
this is in the works by @zetashift. An example plot using that backend can be found here:
Vindaar/ginger#28 (comment)

edit: Just dug into the spurious cairo appearance. Yeah, the reason for still depending on cairo even when compiling with -d:noCairo is simply that we use the webview library, which uses GTK on Linux, and GTK depends on cairo.
It would be very simple to offer a pure JSON / HTML based output version of the Vega-Lite backend, if so desired (ggvegaCreate, which returns a JsonNode, already exists, but not in a way that avoids the webview dependency).
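
For anyone wanting the JSON route today, a hedged sketch of what that could look like; this assumes ggvegaCreate can be called directly on the assembled GgPlot object (its exact signature isn't shown in this thread), and note that importing ggplot_vega still pulls in webview as described above, this just skips opening the viewer window:

import json
import ggplotnim
import ggplotnim/ggplot_vega

# Sketch: build the plot as before, then dump the Vega-Lite spec to a plain
# JSON file instead of rendering it in a webview window.
let mpg = readCsv("data/mpg.csv")
let plt = ggplot(mpg, aes(x = "displ", y = "cty", color = "class")) +
  geom_point() +
  ggtitle("ggplotnim in Vega-Lite!")
writeFile("vega_spec.json", pretty(ggvegaCreate(plt)))  # assumed call shape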

@Vindaar (Owner) commented Feb 18, 2022

PR #143 adds a -d:nolapack option to disable geom_smooth and with it the LAPACK dependency. This isn't a perfect solution, but it's at least a workaround.

@Vindaar (Owner) commented Feb 21, 2022

PR #143 is now merged and v0.5.0 tagged. Hopefully compiling with -d:nolapack should yield no lapack dependency anymore.
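
For completeness, the same define-in-config.nims pattern sketched above also applies here:

# config.nims sketch: disable geom_smooth and with it the LAPACK dependency,
# equivalent to compiling with -d:nolapack (ggplotnim >= 0.5.0).
switch("define", "nolapack")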
