How to Join Multiple Columns in Spark SQL using Java for filtering in DataFrame

Note: This is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): http://stackoverflow.com/questions/35211993/



Tags: java, apache-spark, dataframe, apache-spark-sql

Asked by Gokul

  • DataFrame a contains columns x, y, z, k
  • DataFrame b contains columns x, y, a

    a.join(b, <condition in Java that joins on x and y>) ???
    

I tried using


a.join(b,a.col("x").equalTo(b.col("x")) && a.col("y").equalTo(b.col("y"),"inner")

But Java throws an error saying that && is not allowed: Java's && operator applies only to boolean operands, while equalTo returns a Column object.


Answered by zero323

Spark SQL provides a group of methods on Column, marked as java_expr_ops, which are designed for Java interoperability. These include the and method (see also or), which can be used here:


a.col("x").equalTo(b.col("x")).and(a.col("y").equalTo(b.col("y"))