pandas: Converting NaN in a DataFrame to zero

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute the original authors (not me). Original StackOverflow question: http://stackoverflow.com/questions/48956789/

Date: 2020-09-14 05:13:47  Source: igfitidea

Converting NaN in dataframe to zero

python, pandas, dataframe, replace, nan

Asked by Josef

I have a dictionary and created a Pandas DataFrame from it using cars = pd.DataFrame.from_dict(cars_dict, orient='index'), then sorted the columns in alphabetical order with cars = cars.sort_index(axis=1). After sorting I noticed the DataFrame has NaN, and I wasn't sure if they were really np.nan values. print(cars.isnull().any()) shows False for every column.


I have tried different methods to convert those "NaN" values to zero, which is what I want to do, but none of them works. I have tried the replace and fillna methods and nothing works. Below is a sample of my dataframe:


            speedtest          size 
toyota       65                NaN 
honda        77                800 
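A minimal sketch of the situation described above, assuming (as the accepted answer suspects) that the missing entries in the dictionary were filled with the *string* 'NaN' rather than a real np.nan; the dictionary contents here are hypothetical, built to mirror the sample rows:

```python
import pandas as pd

# Hypothetical reconstruction of cars_dict, with the string 'NaN'
# standing in for the missing value in the sample data above.
cars_dict = {
    'toyota': {'speedtest': 65, 'size': 'NaN'},
    'honda': {'speedtest': 77, 'size': 800},
}
cars = pd.DataFrame.from_dict(cars_dict, orient='index')
cars = cars.sort_index(axis=1)

# isnull() only detects real missing values (np.nan / None), so the
# string 'NaN' goes unnoticed and every column reports False.
print(cars.isnull().any())
```

This would explain the asker's observation: the frame visibly shows "NaN", yet isnull() reports no missing values.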

Answered by cs95

Either use replace or np.where on the values if they are strings:


df = df.replace('NaN', 0)

Or,


df[:] = np.where(df.eq('NaN'), 0, df)
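Both string-based options above can be sketched on a small frame mirroring the question's sample data (column names and values are taken from that sample):

```python
import pandas as pd
import numpy as np

# Sample data assuming the offending values are the string 'NaN'.
df = pd.DataFrame({'speedtest': [65, 77], 'size': ['NaN', 800]},
                  index=['toyota', 'honda'])

# Option 1: replace returns a new frame with the string swapped for 0.
cleaned = df.replace('NaN', 0)

# Option 2: np.where builds an array with 0 wherever the string matched,
# and df[:] writes it back into the original frame in place.
df[:] = np.where(df.eq('NaN'), 0, df)
```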

Or, if they're actually NaNs (which, it seems, is unlikely), then use fillna:


df.fillna(0, inplace=True)

Or, to handle both situations at the same time, use apply + pd.to_numeric (slightly slower but guaranteed to work in any case):


df = df.apply(pd.to_numeric, errors='coerce').fillna(0, downcast='infer')
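A minimal sketch of this combined approach on the question's sample data (the downcast argument is omitted here, as it is optional and deprecated in recent pandas versions):

```python
import pandas as pd

# Sample data mirroring the question: 'size' holds the string 'NaN'.
df = pd.DataFrame({'speedtest': [65, 77], 'size': ['NaN', 800]},
                  index=['toyota', 'honda'])

# to_numeric coerces anything unparseable to a real NaN, so this works
# whether the column holds the string 'NaN' or an actual np.nan;
# fillna then zeroes the missing values in one pass.
df = df.apply(pd.to_numeric, errors='coerce').fillna(0)
```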

Thanks to piRSquared for this one!
