How do I fill NaN values in a DataFrame column using values from the same column where another column's value matches? e.g. a WHERE clause

Posted 2024-06-09 05:41:50


I am a beginner in Python and need your help with my problem. I have a dataset on coronavirus death rates. There are two related columns: the neighbourhood name column (column name: Neighbourhood Name) can be filled in based on the postal code column (column name: FSA), and the postal code column can be filled in based on the neighbourhood name column.

I am trying to fill the NaN values in both columns.

This is what I did:

1 - Load the data into Jupyter
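A minimal sketch of this loading step, assuming the dataset lives in a CSV file; the filename covid_data.csv is only a placeholder:

import pandas as pd

# read the dataset into a DataFrame (placeholder filename)
covid_df = pd.read_csv('covid_data.csv')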

 covid_df.head(5)

Output is

covid_df.isnull().sum().to_frame()

Null Values

covid_sub_df = covid_df.loc[:, ['Neighbourhood Name', 'FSA']]
covid_sub_df

covid_sub_df_2 = covid_sub_df.drop_duplicates()
covid_sub_df_2

Then I tried this:

val = ""
for i, j in covid_df['Neighbourhood Name'], covid_df['FSA']:
    for k,l in covid_sub_df_2['Neighbourhood Name'], covid_sub_df_2['FSA']:
        if k == val and j == l:
            covid_df['Neighbourhood Name'] = covid_sub_df['Neighbourhood Name']
        if j == val and k == i:
            covid_df['FSA'] = covid_sub_df['FSA']

I get this error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
in
      1 val = ""
----> 2 for i, j in covid_df['Neighbourhood Name'], covid_df['FSA']:
      3     for k,l in covid_sub_df_2['Neighbourhood Name'], covid_sub_df_2['FSA']:
      4         if k == val and j == l:
      5             covid_df['Neighbourhood Name'] = covid_sub_df['Neighbourhood Name']

ValueError: too many values to unpack (expected 2)

Thank you all.


2 Answers

So, what you want to do is get rid of the following error:

ValueError: too many values to unpack (expected 2)

That is not really what the question asks, since the title is about how to fill the NaN values. Also, if possible, you should try to provide some dummy data.

However, assuming you want to get rid of the error, you probably want to loop over both variables at the same time. There is a function called zip() that does exactly this, so the following modification should hopefully work:

val = ""
for i, j in zip(covid_df['Neighbourhood Name'], covid_df['FSA']):
    for k,l in zip(covid_sub_df_2['Neighbourhood Name'], covid_sub_df_2['FSA']):
        if k == val and j == l:
            covid_df['Neighbourhood Name'] = covid_sub_df['Neighbourhood Name']
        if j == val and k == i:
            covid_df['FSA'] = covid_sub_df['FSA']
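For reference, zip() pairs the two Series element by element, which is why the unpacking now yields exactly two values per iteration. A toy illustration with made-up values:

names = ['A', 'B', 'C']        # made-up neighbourhood names
codes = ['X1', 'Y2', 'Z3']     # made-up FSA codes
for name, code in zip(names, codes):
    print(name, code)          # prints A X1, B Y2, C Z3 on separate lines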

It is not clear which values you want to fill the NaN values with. One option is to use the DataFrame replace method:

covid_df.replace({np.nan : new_value})

which replaces every NaN value with new_value. This works because pandas is built on top of the well-known Python library numpy and stores every NaN value as np.nan. You should import numpy beforehand for this to work:

import numpy as np
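Put together, a minimal sketch of that option; the fill value 'Unknown' is only a placeholder:

import numpy as np

# every NaN in the frame becomes the placeholder string 'Unknown'
covid_df = covid_df.replace({np.nan: 'Unknown'})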

Note that every NaN value will be replaced with the same exact value stored in the new_value variable.
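If instead each NaN should be filled from the value that matches it in the other column (as the title asks), one option is to build a lookup from the de-duplicated pairs. A minimal sketch, assuming the column names used above and that every neighbourhood maps to a single FSA and vice versa:

# lookups built from rows where both columns are present
name_to_fsa = covid_sub_df_2.dropna().set_index('Neighbourhood Name')['FSA'].to_dict()
fsa_to_name = covid_sub_df_2.dropna().set_index('FSA')['Neighbourhood Name'].to_dict()

# fill each column's NaN from the matching value in the other column
covid_df['FSA'] = covid_df['FSA'].fillna(covid_df['Neighbourhood Name'].map(name_to_fsa))
covid_df['Neighbourhood Name'] = covid_df['Neighbourhood Name'].fillna(covid_df['FSA'].map(fsa_to_name))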
