The differences between what Twitter users say they do and what they actually do

I always enjoy studies comparing what people say they do with what they actually do. Most of us are pretty uninsightful, often systematically so, about what we did, or at least about what we’re prepared to tell someone else we did. Probably both.

Of course, since we decided that everything should be as online as possible, measuring this effect has become a lot easier in some domains.

In a recent post, Pew Research looked at users of the service formerly known as Twitter.

59% of respondents were vaguely accurate when they answered the researchers’ question about how many accounts they followed. “Vaguely” being the key word here: it was a multiple-choice question, with the categories being less than 20, 20-99, 100-499, 500-999, 1000-4999 or greater than 5000.
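
To make concrete what “vaguely accurate” means here: a respondent only had to pick the category their true count falls into, not report the exact number. Something like the little sketch below, where the bucket boundaries are the survey’s categories and the functions themselves are just my illustration, not anything from the Pew write-up.

```python
def follow_count_category(count: int) -> str:
    """Map an exact follow count to the survey's multiple-choice category."""
    if count < 20:
        return "less than 20"
    elif count < 100:
        return "20-99"
    elif count < 500:
        return "100-499"
    elif count < 1000:
        return "500-999"
    elif count <= 4999:
        return "1000-4999"
    else:
        return "greater than 5000"


def vaguely_accurate(chosen_category: str, true_count: int) -> bool:
    """A respondent is 'vaguely accurate' if the category they picked matches reality."""
    return chosen_category == follow_count_category(true_count)
```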

A few more, 67%, reported roughly accurate figures for how many people followed them. I suppose that tracks with the idea that some folk might be a little more interested in how many people are listening to them than in how many people they listen to, although I’m actually surprised the difference isn’t bigger.

There was nothing to stop the users looking up their real values in order to get it exactly right in 100% of cases. After all, plenty of people have (had?) Twitter a mere finger-press away at almost all times. But the question didn’t specifically ask them to look, and in fact was presented in a way that the researchers thought would make them less likely to do so.

They then asked the respondents whether they’d tweeted about a political or social issue in the last year. 45% of them said that yes, they had done so.

Here’s where it gets a bit messy. There are countless possible definitions of what is or is not a political or social issue, which makes the question rather harder both to answer and to validate than one whose accuracy can be checked by looking at a big number on a screen.

Obviously a tweet about a politician or the act of voting would generally count. But how about a factual tweet about the weather or the progression of Covid through the population? Is the personal political? Is everything political? I think you’d get extremely different responses from different people.

But the researchers had to use some benchmark. Here they used a statistical model that classified tweets as political or not, trained on a dataset of previous tweets that actual humans had already labelled one way or the other.
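
The post doesn’t spell out the details of Pew’s model, but the general shape of that kind of approach looks something like the sketch below: fit a text classifier on the human-labelled tweets, then apply it to each respondent’s tweets. It uses scikit-learn with TF-IDF features and logistic regression purely as an illustrative baseline, and the training tweets are toy stand-ins for the real hand-labelled corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for the human-labelled corpus: tweet text plus 1 = political, 0 = not.
train_texts = [
    "Remember to vote in the election tomorrow",
    "The senator's new healthcare bill is a disgrace",
    "Protest outside city hall over the housing policy",
    "We need action on climate legislation now",
    "Just had the best pizza of my life",
    "Can't wait for the new season of my favourite show",
    "Lovely sunset on the beach this evening",
    "My cat knocked my coffee over again",
]
train_labels = [1, 1, 1, 1, 0, 0, 0, 0]

# TF-IDF features feeding a logistic regression: a common, simple baseline
# for binary text classification (not necessarily what Pew used).
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)


def user_tweeted_politically(user_tweets: list[str]) -> bool:
    """Flag a user if any of their tweets in the window is classified as political."""
    if not user_tweets:
        return False
    return bool((model.predict(user_tweets) == 1).any())
```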

Of the 45% of respondents who said yes, most, 78%, had indeed tweeted something that the model classified as political. That left 22% of them claiming they’d tweeted something political whilst the model classified their tweetings as wholly non-political.

However, of the folk who said they had not tweeted anything political, the model disagreed in almost half the cases: 45% of those participants had posted something the model classified as political. Apparently similar results were found over a shorter time frame, asking people to remember their behaviour over the last 30 days rather than the last year.
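
For what it’s worth, the comparison itself is simple once you have a yes/no self-report and a yes/no model verdict for each respondent: within each self-report group, what share did the model call political? A tiny illustrative helper, with the function name and inputs entirely mine:

```python
def model_agreement_by_self_report(said_yes: list[bool], model_says_yes: list[bool]) -> dict:
    """For each self-report group, the share of respondents the model says tweeted politically."""
    groups: dict[str, list[bool]] = {"said yes": [], "said no": []}
    for reported, modelled in zip(said_yes, model_says_yes):
        groups["said yes" if reported else "said no"].append(modelled)
    return {
        group: sum(verdicts) / len(verdicts) if verdicts else float("nan")
        for group, verdicts in groups.items()
    }


# In Pew's results the "said yes" share was roughly 0.78 (hence the 22% the model
# disagreed with) and the "said no" share was roughly 0.45.
```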

These numbers can’t really offer evidence as to whether the disparities between reported and actual behaviour are down to people’s memories, differing definitions of “political” or an inclination to deliberately lie in the survey. I wish they’d dug into that a bit more. Certainly it would have been challenging to truly get to the bottom of it, but I’d have been interested in people’s responses as to whether they’d tweeted at all, or tweeted about a much more precisely defined subject that could have been assessed reasonably well without recourse to machine learning.

Nonetheless, the results certainly show the limitations of asking people about their behaviour and treating their answers as though they accurately correspond to their actual actions – something which is often done with minimal acknowledgement in countless other studies.

If your research question is about people’s actual behaviour, rather than what they want to tell you about their recollections, and you can realistically measure the behaviour itself rather than ask about it after the fact, then do so! And if you cannot, then at least be suitably modest about what it actually means when a certain proportion of people are willing to tell you that they did something in the past that happens to match their personal interpretation of your question. We don’t tend to believe it to be an objective fact when everyone tells us they’re a better than average driver.
