Python: how to delete the last column of data of a pandas dataframe

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/20517650/

How to delete the last column of data of a pandas dataframe

Tags: python, pandas, dataframe

Asked by Artturi Björk

I have some csv data that has an empty column at the end of each row. I would like to leave it out of the import, or alternatively delete it after import. My csv files have a varying number of columns. I've tried using df.tail(), but haven't managed to select the last column with it.

employment=pd.read_csv('./data/spanish/employment1976-1987thousands.csv',index_col=0,header=[7,8],encoding='latin-1')

Data:

4.- Resultados provinciales
Encuesta de Población Activa. Principales Resultados

Activos por provincia y grupo de edad (4).
Unidades:miles de personas


,álava,,,,Albacete,,,,Alicante,,,,Almería,,,,Asturias,,,,ávila,,,,Badajoz,,,,Balears (Illes),,,,Barcelona,,,,Burgos,,,,Cáceres,,,,Cádiz,,,,Cantabria,,,,Castellón de la Plana,,,,Ciudad Real,,,,Córdoba,,,,Coru?a (A),,,,Cuenca,,,,Girona,,,,Granada,,,,Guadalajara,,,,Guipúzcoa,,,,Huelva,,,,Huesca,,,,Jaén,,,,León,,,,Lleida,,,,Lugo,,,,Madrid,,,,Málaga,,,,Murcia,,,,Navarra,,,,Orense,,,,Palencia,,,,Palmas (Las),,,,Pontevedra,,,,Rioja (La),,,,Salamanca,,,,Santa Cruz de Tenerife,,,,Segovia,,,,Sevilla,,,,Soria,,,,Tarragona,,,,Teruel,,,,Toledo,,,,Valencia,,,,Valladolid,,,,Vizcaya,,,,Zamora,,,,Zaragoza,,,,Ceuta y Melilla,,,,
,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,de 16 a 19 a?os,de 20 a 24 a?os,de 25 a 54 a?os,de 55 y más a?os,
1976TIII,"8.9","11.6","60.4","11.8","16.4","14.4","65.2","14.9","47.9","49.9","246.0","60.1","20.5","14.3","88.9","11.2","34.5","42.5","278.0","91.3","6.6","7.2","41.5","13.3","25.3","22.8","135.3","37.5","19.8","24.4","153.0","43.0","166.8","203.7","1079.0","230.7","14.1","16.4","86.0","23.8","17.0","18.3","86.6","28.6","31.0","38.7","180.4","29.8","15.3","19.2","120.6","30.4","19.9","15.3","104.2","23.4","19.7","19.5","97.5","29.7","28.0","23.9","140.5","30.1","29.1","46.1","263.8","70.0","8.9","6.2","45.7","14.6","19.7","19.7","123.0","35.3","26.8","22.5","141.0","36.2","4.8","6.0","33.1","13.4","23.1","31.6","174.5","33.8","11.9","14.3","83.8","18.8","7.0","9.3","50.3","20.0","22.4","23.4","125.8","28.6","22.7","21.6","143.1","50.9","12.5","13.7","89.5","33.2","14.3","14.7","134.0","54.7","136.6","207.5","1067.6","218.6","34.7","41.1","196.4","38.4","37.2","35.0","200.5","46.1","15.6","23.8","111.6","30.7","14.0","16.8","120.2","74.9","5.7","6.4","39.2","8.0","24.5","25.6","135.3","27.1","36.4","39.4","246.1","74.0","10.2","11.3","63.9","13.4","10.5","11.0","74.1","19.6","19.3","23.9","140.3","31.7","5.5","6.0","35.6","11.3","55.2","55.6","262.5","68.1","3.1","3.2","24.4","5.4","21.8","18.4","116.7","37.1","4.6","3.4","37.3","12.0","20.3","16.7","102.2","23.1","73.5","85.5","454.6","101.5","19.2","23.4","90.7","20.5","41.3","54.7","272.2","57.0","6.0","7.1","56.5","28.9","29.2","32.1","192.7","49.8","0.0","0.0","0.0","0.0",
1976TIV,"8.7","11.7","60.8","11.4","14.4","13.6","63.3","14.5","49.1","50.6","244.9","54.2","19.0","16.9","86.8","11.4","33.2","42.3","271.8","86.0","5.8","7.5","40.3","13.9","25.1","24.7","132.7","38.4","18.8","23.4","151.8","43.9","172.2","201.7","1070.7","228.1","11.1","15.7","82.5","21.1","16.4","18.0","89.2","26.6","32.6","40.0","176.5","30.5","15.8","18.1","121.3","30.2","19.0","17.3","106.3","24.1","19.9","19.0","101.7","26.9","25.3","22.3","142.7","28.9","30.0","42.4","267.6","70.1","7.3","7.0","44.4","13.0","17.8","21.4","122.8","34.0","28.4","21.6","140.5","36.8","4.7","6.6","32.6","10.8","24.8","32.7","177.2","32.3","11.9","12.5","85.4","20.5","6.9","8.5","48.8","19.9","22.4","22.1","127.6","25.1","18.5","21.1","137.8","48.7","12.4","11.1","84.9","31.5","13.6","15.6","132.7","52.0","144.0","202.3","1054.0","222.5","35.6","40.1","194.1","37.5","36.7","34.7","203.8","47.1","15.6","23.6","114.3","31.3","14.0","15.9","118.3","76.7","5.5","7.3","36.9","9.3","25.5","25.1","138.7","26.8","34.8","42.9","250.3","74.9","9.9","11.8","62.8","14.0","10.0","13.2","74.5","19.2","19.5","24.2","142.7","31.0","4.0","5.9","35.5","12.0","55.0","56.7","264.7","63.3","2.8","3.5","23.9","5.1","20.0","21.6","116.4","34.9","4.5","3.7","36.5","12.1","21.1","17.6","100.6","25.7","74.6","87.5","455.5","102.1","18.9","22.9","90.0","21.6","40.2","57.1","273.9","58.5","5.6","8.3","57.6","23.9","28.3","31.4","192.2","46.4","0.0","0.0","0.0","0.0",
1977TI,"9.2","11.8","59.9","11.2","14.2","13.2","65.9","14.7","48.2","50.4","251.1","50.8","17.8","15.4","86.5","11.8","30.6","42.9","272.6","84.1","5.8","7.4","37.2","12.8","24.1","22.8","131.3","38.2","17.8","23.5","151.1","42.5","168.1","200.4","1077.2","223.3","11.6","12.8","80.9","17.6","14.4","16.4","88.2","23.9","34.5","37.5","176.3","30.8","15.2","19.7","121.3","31.6","18.4","19.4","107.4","24.7","20.0","18.1","98.3","26.6","24.9","23.6","150.7","27.5","29.5","40.3","267.4","70.5","5.6","7.5","44.2","12.8","17.1","21.1","122.8","33.6","29.6","23.3","142.1","37.9","4.6","5.5","33.7","11.2","23.5","30.4","175.2","32.8","12.0","12.7","84.8","21.3","7.3","9.3","46.6","17.8","30.2","26.0","147.1","25.2","15.9","22.7","133.2","45.1","12.8","12.1","84.3","28.0","12.4","16.5","131.2","55.6","150.9","202.9","1065.4","223.7","36.6","44.0","194.3","39.9","36.7","31.5","196.7","45.7","14.8","22.5","115.1","29.4","11.7","17.2","114.2","75.8","5.0","7.7","38.0","9.4","24.0","26.8","143.5","27.0","35.3","43.0","247.4","73.5","9.7","12.1","61.6","13.3","9.5","11.9","73.9","18.9","20.4","26.7","143.0","31.6","4.0","5.0","35.5","12.3","52.3","58.0","266.0","62.5","2.6","2.7","24.2","6.0","17.3","21.0","113.0","33.3","4.5","5.2","33.8","10.6","18.7","18.8","98.3","24.8","77.4","87.6","446.6","100.3","20.5","23.4","90.2","20.4","38.7","50.7","277.6","57.3","6.4","8.7","60.1","21.5","28.6","31.0","194.8","45.7","0.0","0.0","0.0","0.0",

Accepted answer by EdChum

You can specify which columns to import using the usecols parameter of read_csv.

So either create a list of column names or integer values:

cols_to_use = ['col1', 'col2']  # or [0, 1, 2, 3]
df = pd.read_csv('mycsv.csv', usecols=cols_to_use)

or drop the column after importing. I prefer the former method (why import data you are not interested in?).

df = df.drop(labels='column_to_delete', axis=1) # axis 1 drops columns, 0 will drop rows that match index value in labels

Note also that you misunderstand what tail does: it returns the last n rows (default 5) of a dataframe.

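A quick sketch on a small made-up frame, contrasting tail (which slices rows) with selecting the last column positionally:

import pandas as pd

df = pd.DataFrame({'a': [1, 2, 3], 'b': [4, 5, 6], 'c': [7, 8, 9]})

print(df.tail(2))        # last two rows, every column
print(df.iloc[:, -1])    # the last column, as a Series
print(df.iloc[:, :-1])   # every column except the last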

Additional

If the number of columns varies, you can read just the header to get the column names, then read the csv again properly and drop the last column:

def df_from_csv(path):
    df = pd.read_csv(path, nrows=1)   # read just the first line to get the columns
    columns = df.columns.tolist()     # get the columns
    cols_to_use = columns[:-1]        # drop the last one
    df = pd.read_csv(path, usecols=cols_to_use)
    return df
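
Note that this helper does not forward the index_col, header and encoding arguments from the question's read_csv call. An alternative sketch (not tested against the actual file) is to import normally and then drop the trailing column positionally, as the answers below also suggest:

employment = pd.read_csv('./data/spanish/employment1976-1987thousands.csv',
                         index_col=0, header=[7, 8], encoding='latin-1')
employment = employment.iloc[:, :-1]  # drop the trailing empty column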

Answered by conner.xyz

Here's a one-liner that does not require specifying the column name:

df.drop(df.columns[len(df.columns)-1], axis=1, inplace=True)

Answered by Gusev Slava

Another method to delete the last column of a DataFrame df:

df = df.iloc[:, :-1]

Answered by Nelson Dinh

An improvement on @conner.xyz's answer above:

df.drop(df.columns[[-1,]], axis=1, inplace=True)

If you want to delete the last two columns, replace [-1,] with [-1, -2].

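For instance, a quick sketch on a small made-up frame:

import pandas as pd

df = pd.DataFrame({'a': [1, 2], 'b': [3, 4], 'c': [5, 6], 'd': [7, 8]})
df.drop(df.columns[[-1, -2]], axis=1, inplace=True)
print(df.columns.tolist())  # ['a', 'b']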

Answered by Diego BV

After importing the data, you can drop the last column, whatever it is, with:

employment = employment.drop(columns=[employment.columns[-1]])

Answered by aysa

Another way to remove the last column:

df = df[df.columns[:-1]]

Answered by Bhushan

As with all index-based operations in Python, you can use -1 to count from the end.

df.drop(df.columns[-1], axis=1, inplace=True)
