How do I use multiple conditions with pyspark.sql.functions.when()?
Warning: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow.
Original URL: http://stackoverflow.com/questions/33151861/
Asked by jho
I have a dataframe with a few columns. Now I want to derive a new column from 2 other columns:
from pyspark.sql import functions as F
new_df = df.withColumn("new_col", F.when(df["col-1"] > 0.0 & df["col-2"] > 0.0, 1).otherwise(0))
With this I only get an exception:
py4j.Py4JException: Method and([class java.lang.Double]) does not exist
It works with just one condition like this:
new_df = df.withColumn("new_col", F.when(df["col-1"] > 0.0, 1).otherwise(0))
Does anyone know how to use multiple conditions?
I'm using Spark 1.4.
Accepted answer by Ashalynd
Use parentheses to enforce the desired operator precedence. In Python, & binds more tightly than comparison operators such as >, so the unparenthesized condition is parsed as df["col-1"] > (0.0 & df["col-2"]) > 0.0, which produces the py4j error above:
F.when((df["col-1"] > 0.0) & (df["col-2"] > 0.0), 1).otherwise(0)
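For completeness, here is a minimal runnable sketch of the fix. The DataFrame and its values are made up for illustration, and the SparkSession API shown is the modern (Spark 2+) entry point; on Spark 1.4 you would build the DataFrame through a SQLContext instead:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Toy data with the column names from the question
df = spark.createDataFrame([(1.0, 2.0), (1.0, -1.0), (-1.0, 2.0)],
                           ["col-1", "col-2"])

# Each comparison is parenthesized so & combines two boolean Columns
new_df = df.withColumn(
    "new_col",
    F.when((df["col-1"] > 0.0) & (df["col-2"] > 0.0), 1).otherwise(0))
new_df.show()  # new_col is 1 only in the first row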
Answered by Cyanny
You can also use col:
from pyspark.sql.functions import col
F.when((col("col-1") > 0.0) & (col("col-2") > 0.0), 1).otherwise(0)
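Referencing columns with col("...") instead of df["..."] keeps the expression independent of any particular DataFrame, which can be convenient when the same condition is reused across several DataFrames; within a single withColumn call the two forms behave the same.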
Answered by vj sreenivasan
With when in PySpark, multiple conditions can be built using & (for and) and | (for or). It is important to enclose every expression that combines to form the condition within parentheses.
%pyspark
from pyspark.sql.functions import when, col

dataDF = spark.createDataFrame([(66, "a", "4"),
                                (67, "a", "0"),
                                (70, "b", "4"),
                                (71, "d", "4")],
                               ("id", "code", "amt"))

# "A" when code is a or d, "B" when code is b and amt is 4, "A1" otherwise
dataDF.withColumn("new_column",
                  when((col("code") == "a") | (col("code") == "d"), "A")
                  .when((col("code") == "b") & (col("amt") == "4"), "B")
                  .otherwise("A1")).show()
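Running this PySpark snippet yields the same table shown under Output: after the Scala version below.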
With when in Spark Scala, the && and || operators can be used to build multiple conditions:
//Scala
import org.apache.spark.sql.functions.{when, col}

val dataDF = Seq(
  (66, "a", "4"), (67, "a", "0"), (70, "b", "4"), (71, "d", "4")
).toDF("id", "code", "amt")

dataDF.withColumn("new_column",
    when(col("code") === "a" || col("code") === "d", "A")
      .when(col("code") === "b" && col("amt") === "4", "B")
      .otherwise("A1"))
  .show()
Output:
+---+----+---+----------+
| id|code|amt|new_column|
+---+----+---+----------+
| 66| a| 4| A|
| 67| a| 0| A|
| 70| b| 4| B|
| 71| d| 4| A|
+---+----+---+----------+