Finding Knowledge - young people's use of search engines
Some thoughts on my practical experiences of field work (and preliminary outcomes!)
I’ve now done three runs of my field work and started analysing the data. So today I thought I’d discuss some of the preliminary findings, and their implications. The findings are interesting in themselves, but I also want to highlight some of the benefits of the methods I used – recording talk, and ‘on-screen’ behaviours of groups of children completing search engine tasks.
Reasons to be hopeful
I have a small sample – 3 groups, totalling 8 (very able) eleven-year-olds. However, to that we should add the additional class members (to whom I paid broad attention, and whose worksheets I collected) and the students in the previous two classes I went into (and I have no overwhelming reason to think the 3 classes were particularly different). So there’s a caveat there, but nonetheless this isn’t a tiny or hugely decontextualised group.
In any case, it was interesting to see how the pupils’ behaviours related to the issues the literature suggests they encounter. Specifically, it looked like:
- Use of 'answer' sites was a) often derided, and b) often accompanied by fact checking on other websites. That is, those sites were a “first port of call” for a particular fact, which was then confirmed elsewhere.
- Pupils were adept at using 'suggested search', spelling correction and so on, contrary to some prior research (e.g. Druin et al., 2009)
- Pupils did not always just use the question phrase as a search query, although this was certainly a prominent strategy
- That said, ‘phrase transference’ was a pretty common strategy, with pupils typing assignment questions directly into the search box to find an answer. This was the case even with multi-part questions, where finding the final answer was likely to depend on some ‘unknown’ piece of information which needed to be found first
- Pupils were generally not willing to engage with longer texts, tending to want immediate answers. Where longer texts were scanned, even quite prominent occurrences of the answer – for example, in enlarged quotations – were not always spotted. In addition, pupils generally seemed unaware of the ‘find’ function present in most browsers (Ctrl+F) – a feature I’d struggle without.
- Furthermore, although pupils often questioned Yahoo Answers and similar sites, the reverse scrutiny was absent – that is, pupils couldn’t give any particularly deep reasons why a website might be trustworthy, beyond it looking like an authoritative site. That’s probably true of adults too, but it’s still concerning (particularly given the glossy edges on some awful websites). It might also suggest that part of the questioning of ‘answers’-type sites comes from it being drilled into pupils that such sites are unreliable. I can think of HCI-style experiments to test that hypothesis, but from my data it’s only a vague hunch.
Suggested search…suggestive search?
I hope the above has partly highlighted why I think my particular method – including, as it did, the recording of ‘on screen’ data – is particularly important. Another example that I think is particularly salient comes from ‘suggested search’. This function starts offering “suggestions” for query completion as soon as you type some root words into Google (or whichever search engine). It often provides a kind of ‘signposting’ function, prompting correct spelling or giving the “lay of the land” on a particular topic.
However, in this case – and I would never have known this from log data alone; I needed the screen recordings – it also provided some a) useless and b) more or less inappropriate suggestions. To give a (definitely safe-for-work) example, if you type “does he” into Google and look at the suggestions, you’ll get the idea. I get 4 results (the pupils got 8…I’m not sure why the discrepancy): does he…
- “like me”,
- “love me”,
- “[he]ather die in eastenders”,
- “fancy me”
But in this context my objection isn’t just that it was clearly inappropriate and distracting, instigating a lot of off-task talk, but that I have no idea what purpose these suggestions would serve anyone – unless, rather cynically, Google is happy to leave in distracting links which aren’t what the user is targeting, because those clickthroughs generate advertising revenue too.
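For the curious, the basic mechanic behind these suggestions can be sketched as ranking previously logged queries that share the typed prefix. This is a toy illustration, not Google’s actual algorithm, and the query log below is entirely invented (though modelled on the “does he” example):

```python
from collections import Counter

# Hypothetical query log: query -> frequency, standing in for the
# aggregate search trends a real engine would draw on.
QUERY_LOG = Counter({
    "does he like me": 120,
    "does he love me": 95,
    "does heather die in eastenders": 80,
    "does he fancy me": 60,
    "dinosaur extinction theories": 40,
})

def suggest(prefix, log=QUERY_LOG, k=4):
    """Return up to k logged queries starting with `prefix`,
    most frequent first - the core idea of suggested search."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix)]
    matches.sort(key=lambda pair: -pair[1])
    return [q for q, _ in matches[:k]]

print(suggest("does he"))
```

Note that frequency-based ranking is exactly why the odd suggestions appear: whatever many people type, appropriate or not, surfaces for everyone – including an eleven-year-old doing homework.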
Google Trends - a useful tool, including to reduce flu rates...
Now these suggested searches – arising from trends in searches – are interesting from a sociological perspective: they tell us something about political trends (including the impact of race in the Obama campaign), about flu trends, and no doubt various other things (including which celebrities people are currently searching for).
The key thing from my perspective is that although it’s interesting to have raw log data on what people search for – whether small-scale (as in my study) or large-scale (as in Google Trends) – it’s also interesting to know how they search: do they correct their spelling? Do older, younger, disabled or specialist users know how to use whatever search function they’re facing? What impact do ‘aids’ such as suggested search have? For the interested, there’s a nice paper (pdf) on ‘unobtrusive’ methods of internet research, which mentions the use of trends. What’s interesting in my context is the combined use of ‘trends’ with exploration of how such information is presented to, used by (ignored, sidetracked onto, or utilised by), and evaluated by students.
Where we’re at
So, that’s part of where I’m at – children, even very able pupils, experience some issues in accessing information on the web, despite being aware of some problem-solving and fact-checking techniques. Furthermore, although the systems in place to assist them are sometimes useful, more could be done. Specifically, this preliminary stage highlights that:
- Some of the prompts are not only unhelpful, but particularly distracting and sometimes inappropriate
- While Google will suggest completed phrases from short fragments, it has no way to do the reverse – suggest a shorter search from a long one (e.g. a whole question). So either this sort of system should be explored, and/or children need to get better at thinking about how to phrase their search queries – an issue they raised themselves.
- More needs to be done to understand how to support effective reading of documents for information. This is partly technological (e.g. Ctrl+F, intelligent highlighting) and partly about children's skills – and there is relevant literature on both.
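The “reverse suggestion” idea in the second point above – shrinking a whole assignment question into a plausible query – can be sketched very crudely as a stopword filter. This is my own illustrative sketch, not an existing search-engine feature, and the stopword list is just a small hand-picked sample:

```python
# Minimal hypothetical stopword list; a real system would use a much
# larger list (or term weighting) rather than this hand-picked set.
STOPWORDS = {
    "a", "an", "the", "is", "are", "was", "were", "do", "does", "did",
    "what", "which", "who", "why", "how", "when", "where", "of", "in",
    "on", "to", "and", "or", "for", "by", "with", "it", "its",
}

def shorten_query(question):
    """Keep only the content words of a question, in their
    original order - a naive 'question -> query' reduction."""
    words = question.lower().strip("?!. ").split()
    return " ".join(w for w in words if w not in STOPWORDS)

print(shorten_query("Why did the dinosaurs become extinct?"))
```

Even something this naive turns a pasted-in question into the kind of keyword query the pupils struggled to formulate, which suggests the design space is worth exploring.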
In addition to that, I'm starting to get some useful data from my analysis of the dialogue...but more on that another time! As always, comments would be great.