Scala Spark SQL filter multiple fields

Disclaimer: this page is a Chinese–English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license, link to the original, and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/36893571/

Date: 2020-10-22 08:13:58  Source: igfitidea

Spark SQL filter multiple fields

scala apache-spark apache-spark-sql

Asked by gstvolvr

What is the correct syntax for filtering on multiple columns in the Scala API? If I want to do something like this:


dataFrame.filter($"col01" === "something" && $"col02" === "something else")

or


dataFrame.filter($"col01" === "something" || $"col02" === "something else") 
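Both forms above are in fact valid `Column` expressions, provided the `$"..."` syntax is in scope via `spark.implicits._`. A minimal runnable sketch (the column names come from the question; the local session setup and toy data are my own assumptions):

```scala
import org.apache.spark.sql.SparkSession

object MultiColumnFilter {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only; real configuration will differ.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("multi-column-filter")
      .getOrCreate()
    import spark.implicits._ // brings the $"col" interpolator into scope

    val df = Seq(
      ("something", "something else"),
      ("something", "other"),
      ("foo", "something else")
    ).toDF("col01", "col02")

    // AND: both predicates must hold -> keeps only the first row
    df.filter($"col01" === "something" && $"col02" === "something else").show()

    // OR: either predicate may hold -> keeps every row whose col01 OR col02 matches
    df.filter($"col01" === "something" || $"col02" === "something else").show()

    spark.stop()
  }
}
```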

EDIT:


This is what my original code looks like. Everything comes in as a string.


df.select($"userID" as "user", $"itemID" as "item", $"quantity" cast("int"), $"price" cast("float"), $"discount" cast("float"), sqlf.substring($"datetime", 0, 10) as "date", $"group")
  .filter($"item" !== "" && $"group" !== "-1")

Answered by dheee

I think I see what the issue is. For some reason, Spark does not allow two `!=`'s in the same filter. Need to look at how `filter` is defined in the Spark source code.


Now, for your code to work, you can use this to do the filtering:


df.filter(col("item").notEqual("") && col("group").notEqual("-1"))
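As a sanity check, `notEqual` returns a `Column`, so it composes with `&&` the same way `===` does. A hedged sketch, assuming a DataFrame with string columns `item` and `group` as in the question (the sample rows and session setup are invented):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object NotEqualFilter {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("not-equal-filter")
      .getOrCreate()
    import spark.implicits._

    val df = Seq(("apple", "1"), ("", "2"), ("pear", "-1")).toDF("item", "group")

    // notEqual returns a Column, so it chains with && just like ===
    val kept = df.filter(col("item").notEqual("") && col("group").notEqual("-1"))
    kept.show() // only ("apple", "1") satisfies both predicates

    spark.stop()
  }
}
```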

or use two filters in the same statement:


df.filter($"item" !== "").filter($"group" !== "-1").select(....)
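Worth noting: in Spark 2.0+ the `!==` operator on `Column` is deprecated in favor of `=!=`, because `!==` does not have the same operator precedence as `===`. On a 2.x-or-later version, the original single-filter form works as written with the new operator; a sketch under that assumption (the sample data is invented):

```scala
import org.apache.spark.sql.SparkSession

object NotEqualOperator {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("neq-operator")
      .getOrCreate()
    import spark.implicits._

    val df = Seq(("apple", "1"), ("", "2"), ("pear", "-1")).toDF("item", "group")

    // =!= (Spark 2.0+) has the precedence the deprecated !== lacked,
    // so two of them combine in one filter without surprises.
    df.filter($"item" =!= "" && $"group" =!= "-1").show()

    spark.stop()
  }
}
```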

This link here can help with different Spark methods.
