Compiling the Scala source files contained within the mleap R package failed #26
Open
@lgongmsft

Description


I synced to the latest master and tried to compile the Scala source files contained within the mleap R package using the configure.R script, but got the following error:

lgong@ubuntu:~/mleapr$ Rscript configure.R
Warning message:
In normalizePath("internal/mleap-spark") :
  path[1]="internal/mleap-spark": No such file or directory
==> using scalac 2.11.8
==> building against Spark 2.0.0
==> building 'mleap-2.0-2.11.jar' ...
==> '/home/lgong/scala/scala-2.11.8/bin/scalac' -optimise -deprecation '/home/lgong/mleapr/java/spark-2.0.0/main.scala'
/home/lgong/mleapr/java/spark-2.0.0/main.scala:3: error: object mleap is not a member of package org.apache.spark.ml
import org.apache.spark.ml.mleap.SparkUtil
                           ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:4: error: not found: object ml
import ml.combust.mleap.spark.SparkSupport._
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:5: error: not found: object resource
import resource._
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:6: error: not found: object ml
import ml.combust.bundle.BundleFile
       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:7: error: object bundle is not a member of package org.apache.spark.ml
import org.apache.spark.ml.bundle.SparkBundleContext
                           ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:16: error: not found: value SparkUtil
    val pipeline = SparkUtil.createPipelineModel(transformers.toArray)
                   ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:17: error: not found: value SparkBundleContext
    implicit val sbc = SparkBundleContext().withDataset(dataset)
                       ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value managed
    for(bf <- managed(BundleFile("jar:" + path))) {
              ^
/home/lgong/mleapr/java/spark-2.0.0/main.scala:18: error: not found: value BundleFile
    for(bf <- managed(BundleFile("jar:" + path))) {
                      ^
9 errors found
Error in spark_compile(jar_name = jar_name, spark_home = spark_home, filter = filter,  :
  ==> failed to compile Scala source files
Calls: <Anonymous> -> spark_compile
Execution halted
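For what it's worth, every one of the nine errors is an unresolved import from the MLeap side (ml.combust.mleap.spark, ml.combust.bundle, org.apache.spark.ml.mleap, org.apache.spark.ml.bundle) plus scala-arm's resource package, and the scalac command echoed above carries no -classpath, so the compiler can only see the Scala standard library. A rough sketch of the kind of invocation the compile step would need; the jar names and locations below are hypothetical placeholders, not the package's actual layout:

# hedged sketch: compile main.scala by hand with the Spark 2.0.0 jars and the
# MLeap / scala-arm jars on the scalac classpath (all paths are assumptions)
/home/lgong/scala/scala-2.11.8/bin/scalac -deprecation \
  -classpath "$SPARK_HOME/jars/*:/path/to/mleap-spark_2.11.jar:/path/to/mleap-runtime_2.11.jar:/path/to/bundle-ml_2.11.jar:/path/to/scala-arm_2.11.jar" \
  /home/lgong/mleapr/java/spark-2.0.0/main.scala

If that compiles cleanly, the problem is likely that configure.R is not putting those dependency jars on the classpath (the earlier warning that internal/mleap-spark does not exist may be related), rather than anything wrong with main.scala itself.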
