The 'news' was based on 'a survey of 2,000 men.' But we can't tell from this how good that data is, nor do we tend to think about how the way a question is asked can influence the result.
In this particular case, I have seen the original questionnaire. Participants were asked to pick from a list of 50 'skills' by ticking a box (online) alongside each skill. I would be very surprised if most participants did not pick out the handful of skills they thought represented them best, producing a 'men are bad at dad skills' result. No one really wants to tick 50 boxes. I suspect the result would have been very different if they had started with all the boxes ticked and asked participants to untick the ones they were bad at.
Now, this was just a fun questionnaire - though you do have to ask why it has ended up in the news so much. But the same concern applies to any such data. Whenever we are presented with data that supposedly represents people's opinions, we should be able to drill down to see exactly how the participants were asked the questions, as the phrasing can have a huge impact on the outcome.