PySpark: check whether any value in a row is greater than zero

I want to filter out the rows in which all of the columns in a given list are zero.

Suppose we have the following df:

df = spark.createDataFrame([(0, 1, 1, 2,1), (0, 0, 1, 0, 1), (1, 0, 1, 1 ,1)], ['a', 'b', 'c', 'd', 'e'])
+---+---+---+---+---+                                                           
|  a|  b|  c|  d|  e|
+---+---+---+---+---+
|  0|  1|  1|  2|  1|
|  0|  0|  1|  0|  1|
|  1|  0|  1|  1|  1|
+---+---+---+---+---+

The list of columns is ['a', 'b', 'd'], so the filtered dataframe should be:

+---+---+---+---+---+                                                           
|  a|  b|  c|  d|  e|
+---+---+---+---+---+
|  0|  1|  1|  2|  1|
|  1|  0|  1|  1|  1|
+---+---+---+---+---+

This is what I tried:

df = df.withColumn('total', sum(df[col] for col in ['a', 'b', 'd']))
df = df.filter(df.total > 0).drop('total')

This works fine for a small list of columns, but when the list is long it fails with the following error:

java.lang.StackOverflowError at org.apache.spark.sql.catalyst.analysis.ResolveLambdaVariables.org$apache$spark$sql$catalyst$analysis$ResolveLambdaVariables$$resolve(higher...

I can think of a pandas UDF solution, but my df is very large, so that could become a bottleneck.
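For reference, a minimal sketch of the pandas UDF idea mentioned above (it needs pyarrow; the older explicit PandasUDFType style is used for broad compatibility, and the column names are taken from the example). As noted, every batch is still shipped to a Python worker, which is why it can become a bottleneck on a very large df:

import pandas as pd
from pyspark.sql.functions import pandas_udf, PandasUDFType

cols = ['a', 'b', 'd']

# Scalar pandas UDF: receives one pd.Series per listed column and returns a
# boolean Series that is True when at least one value in the row is non-zero.
@pandas_udf('boolean', PandasUDFType.SCALAR)
def any_nonzero(*series):
    out = pd.Series(False, index=series[0].index)
    for s in series:
        out |= (s != 0)
    return out

df.filter(any_nonzero(*cols)).show()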

Edit:

When using @Psidom's answer I get the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o2508.filter. : java.lang.StackOverflowError
    at org.apache.spark.sql.catalyst.expressions.Expression.references(Expression.scala:88)
    at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$references$1.apply(Expression.scala:88)
    at org.apache.spark.sql.catalyst.expressions.Expression$$anonfun$references$1.apply(Expression.scala:88)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.immutable.List.foreach(List.scala:392)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.immutable.List.flatMap(List.scala:355)


3 Answers

Here is a different solution. I haven't tried it with a large set of columns, so let me know whether it works:

from pyspark.sql import functions as F

list_of_cols = ['a', 'b', 'd']  # columns to check for zeros

df = spark.createDataFrame([(0, 1, 1, 2, 1), (0, 0, 1, 0, 1), (1, 0, 1, 1, 1)], ['a', 'b', 'c', 'd', 'e'])
df.show()

+---+---+---+---+---+
|  a|  b|  c|  d|  e|
+---+---+---+---+---+
|  0|  1|  1|  2|  1|
|  0|  0|  1|  0|  1|
|  1|  0|  1|  1|  1|
+---+---+---+---+---+

df = df.withColumn("Concat_cols" , F.concat(*list_of_cols)) # concat the list of columns 
df.show()

+---+---+---+---+---+-----------+
|  a|  b|  c|  d|  e|Concat_cols|
+---+---+---+---+---+-----------+
|  0|  1|  1|  2|  1|        012|
|  0|  0|  1|  0|  1|        000|
|  1|  0|  1|  1|  1|        101|
+---+---+---+---+---+-----------+

pattern = '0' * len(list_of_cols)  # a string of zeros, one per column in the list

df1 = df.where(df['Concat_cols'] != pattern) # keep rows whose concatenation is not all zeros
df1.show()

+---+---+---+---+---+-----------+
|  a|  b|  c|  d|  e|Concat_cols|
+---+---+---+---+---+-----------+
|  0|  1|  1|  2|  1|        012|
|  1|  0|  1|  1|  1|        101|
+---+---+---+---+---+-----------+
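One caveat, assuming the listed columns can contain nulls (the question does not say they do): Spark's concat returns null as soon as any input is null, so the != pattern comparison is also null and where() drops such rows even when their other columns are non-zero. A minimal sketch that treats nulls as zeros before concatenating:

# Coalesce nulls to 0 before concatenating, so rows containing nulls are not
# silently dropped by the comparison above (hypothetical handling; adjust as needed).
df = df.withColumn(
    "Concat_cols",
    F.concat(*[F.coalesce(F.col(c), F.lit(0)) for c in list_of_cols])
)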

functools.reduce may be useful here:

df = spark.createDataFrame([(0, 1, 1, 2,1), (0, 0, 1, 0, 1), (1, 0, 1, 1 ,1)], 
     ['a', 'b', 'c', 'd', 'e'])
cols = ['a', 'b', 'd']

Use reduce to build the filter expression:

from functools import reduce
predicate = reduce(lambda a, b: a | b, [df[x] != 0 for x in cols])

print(predicate)
# Column<b'(((NOT (a = 0)) OR (NOT (b = 0))) OR (NOT (d = 0)))'>

Then filter with the predicate:

df.where(predicate).show()
+---+---+---+---+---+
|  a|  b|  c|  d|  e|
+---+---+---+---+---+
|  0|  1|  1|  2|  1|
|  1|  0|  1|  1|  1|
+---+---+---+---+---+
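If even this OR-reduced predicate overflows the analyzer stack for a very wide column list (as the edit in the question reports), one way to keep the expression tree flat is to pack the columns into a single array and test it with a higher-order function. A minimal sketch, assuming Spark 3.1+ where pyspark.sql.functions.exists is available:

from pyspark.sql import functions as F

cols = ['a', 'b', 'd']

# One array column plus a single exists() call, so the generated expression
# stays shallow no matter how many columns are in the list.
df.where(F.exists(F.array(*cols), lambda x: x != 0)).show()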

You can pass the columns as an array to a UDF, check whether all the values are zero, and then apply the filter:

from pyspark.sql.types import BooleanType
from pyspark.sql.functions import udf, array, col

all_zeros_udf = udf(lambda arr: arr.count(0) == len(arr), BooleanType())  # True when every element of the array is 0

df = spark.createDataFrame([(0, 1, 1, 2,1), (0, 0, 1, 0, 1), (1, 0, 1, 1 ,1)], ['a', 'b', 'c', 'd', 'e'])

(df
    .withColumn('all_zeros', all_zeros_udf(array('a', 'b', 'd')))  # pass the listed columns as an array
    .filter(~col('all_zeros'))                                     # keep rows where NOT all values are zero
    .drop('all_zeros')                                             # drop the helper column
    .show())

Result:

+---+---+---+---+---+
|  a|  b|  c|  d|  e|
+---+---+---+---+---+
|  0|  1|  1|  2|  1|
|  1|  0|  1|  1|  1|
+---+---+---+---+---+
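Note that a plain Python UDF like this ships every row to a Python worker for evaluation, so on a very large dataframe it will generally be slower than the native column expressions in the other answers; it does, however, avoid the deep nested expression that caused the StackOverflowError.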
