
Disclaimer: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/47903905/

Date: 2020-08-19 18:26:08  Source: igfitidea

AssertionError: col should be Column

python, apache-spark, pyspark, apache-spark-sql

Asked by Markus

How to create a new column in PySpark and fill this column with the date of today?

This is what I tried:

import datetime
now = datetime.datetime.now()
df = df.withColumn("date", str(now)[:10])

I get this error:

AssertionError: col should be Column

Answered by Alper t. Turker

How to create a new column in PySpark and fill this column with the date of today?

There is already a function for that:

from pyspark.sql.functions import current_date

df.withColumn("date", current_date().cast("string"))

AssertionError: col should be Column

Use a literal:

from pyspark.sql.functions import lit

df.withColumn("date", lit(str(now)[:10]))