java Spark - Divide int with column?

Disclaimer: This page is a Chinese-English parallel translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same CC BY-SA license, cite the original URL, and attribute it to the original authors (not me). Original StackOverflow URL: http://stackoverflow.com/questions/38855235/

Posted: 2020-11-03 03:50:40  Source: igfitidea

Spark - Divide int with column?

java · apache-spark · dataframe · apache-spark-sql

Asked by lte__

I'm trying to divide a constant by a column. I know I can do


df.col("col1").divide(90)

but how can I do the reverse, something like (90).divide(df.col("col1")) (obviously this is incorrect)? Thank you!


Answered by zero323

Use o.a.s.sql.functions.lit:


lit(90).divide(df.col("col1"))

or o.a.s.sql.functions.expr:


expr("90 / col1")