How to Join Multiple Columns in Spark SQL using Java for filtering in a DataFrame
Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverflow

Original question: http://stackoverflow.com/questions/35211993/
Asked by Gokul
DataFrame a = contains column x, y, z, k
DataFrame b = contains column x, y, a

a.join(b, <condition to use in java to use x, y>) ???
I tried using
a.join(b,a.col("x").equalTo(b.col("x")) && a.col("y").equalTo(b.col("y"),"inner")
But Java throws an error saying && is not allowed.
Answered by zero323
Spark SQL provides a group of methods on Column, marked as java_expr_ops, which are designed for Java interoperability. It includes the and method (see also or), which can be used here:
a.col("x").equalTo(b.col("x")).and(a.col("y").equalTo(b.col("y"))