Welcome to ShenZhenJia Knowledge Sharing Community for programmer and developer-Open, Learning and Share
I have a Spark DataFrame which is grouped by a column and aggregated with a count:

df.groupBy('a).agg(count("a")).show

+---------+----------------+
|a        |count(a)        |
+---------+----------------+
|     null|               0|
|      -90|           45684|
+---------+----------------+


df.select('a).filter('a.isNull).count

returns

warning: there was one feature warning; re-run with -feature for details
res9: Long = 26834

which clearly shows that the null values were not counted initially.

What is the reason for this behaviour? If null appears in the grouping result at all, I would have expected to see its count reported properly.

1 Answer

Yes, count applied to a specific column does not count null values. If you want to include the null values, use:

df.groupBy('a).agg(count("*")).show
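A few equivalent ways to make the nulls visible side by side (a spark-shell sketch, not from the original answer; it assumes a `SparkSession` named `spark` and the question's DataFrame `df` with column `a`):

```scala
import org.apache.spark.sql.functions._
import spark.implicits._

// count("*") counts every row in each group, including rows where a is null;
// count($"a") follows SQL COUNT semantics and skips nulls.
df.groupBy($"a").agg(
  count("*").as("rows"),                         // all rows in the group
  count($"a").as("non_null"),                    // nulls skipped
  sum($"a".isNull.cast("int")).as("null_count")  // explicit null tally
).show()
```

For the null group, `rows` and `null_count` will agree while `non_null` is 0, which is exactly the discrepancy observed in the question.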
