Note: this page is a translation of a popular Stack Overflow question and its answers, provided under the CC BY-SA 4.0 license. If you reuse it, you must follow the same license and attribute the original authors (not me). Original question: http://stackoverflow.com/questions/18838944/

How to add "provided" dependencies back to run/test tasks' classpath?

Tags: scala, sbt, sbt-assembly

Asked by user2785627

Here's an example build.sbt:

import AssemblyKeys._

assemblySettings

buildInfoSettings

net.virtualvoid.sbt.graph.Plugin.graphSettings

name := "scala-app-template"

version := "0.1"

scalaVersion := "2.9.3"

val FunnyRuntime = config("funnyruntime") extend(Compile)

libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"

sourceGenerators in Compile <+= buildInfo

buildInfoPackage := "com.psnively"

buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)

assembleArtifact in packageScala := false

val root = project.in(file(".")).
  configs(FunnyRuntime).
  settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
  )): _*)

The goal is to have spark-core "provided" so that it and its dependencies are not included in the assembly artifact, but to re-include them on the runtime classpath for the run- and test-related tasks.

It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and, hopefully, override the default. I've tried things including:

(run in Global) := (run in FunnyRuntime)

and the like, to no avail.

To summarize: this feels like essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.

Answered by douglaz

For a similar case, here is what I used in assembly.sbt:

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)) 

and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.

Update:

@rob's solution seems to be the only one that works on the latest sbt version; just add the following to the settings in build.sbt:

run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
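
For sbt 1.1+, the same two settings can also be written in the unified slash syntax; this is just a syntactic rewrite of the lines above, not a different mechanism:

Compile / run := Defaults.runTask(Compile / fullClasspath, Compile / run / mainClass, Compile / run / runner).evaluated,
Compile / runMain := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated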

Answered by tgpfeiffer

Adding to @douglaz' answer,

runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))

is the corresponding fix for the runMain task.

Answered by VasiliNovikov

If you use the sbt-revolver plugin, here is a solution for its "reStart" task:

fullClasspath in Revolver.reStart <<= fullClasspath in Compile

UPD: for sbt 1.0 you may use the new assignment form:

fullClasspath in Revolver.reStart := (fullClasspath in Compile).value
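
With the unified slash syntax of newer sbt versions, the same setting should read as follows (an untested sketch of the equivalent form):

Revolver.reStart / fullClasspath := (Compile / fullClasspath).value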

Answered by Ryan

Another option is to create separate sbt projects for assembly vs run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to, e.g., specify the spark master in code when running with sbt.

// sparkVersion is assumed to be defined elsewhere in the build,
// e.g. val sparkVersion = "2.4.0" (a hypothetical version; substitute your own)
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion

val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)

// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )

// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If separate main file needed, e.g. for specifying spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )