The development of thinking: still good.

When it comes to groups working together, I’m a fan of Malcolm Knowles. Knowles recognised the value of time invested in the development of thinking.

While teachers promote the importance of ‘digital literacy’, they often fail to acknowledge how little autonomy they give children in their learning – and the almost unlimited ability children have out of school to develop their thinking and independence – not least through games and multiplayer worlds.


The challenge seems to be that many institutions spend vast amounts of time being authoritarian and trying to control people. Ironically, responsibility for the results of this no longer rests with the organisation (or its managers), but is passed to the individual (teacher and student). It’s a weird waterfall effect. At the same time, more liberal views of how schools should work and how teachers should teach have been widely disseminated to the public: more open, more collaborative, fewer rules, more freedom. The worst incarnations of this suggest technology is the mechanism for achieving it.

Schools are not places for corporate refugees to try Toyota management tactics, nor a place to experiment with ideas which can’t be scrutinised and verified. However, we’ve had almost 70 years of academics promoting the value of autonomy, collaboration and developmental thinking. None of them said wait for Microsoft or Google to make an app for that.


What do kids use technology for?


There are several factors which influence what children use technology for. While it is easy to be critical of what they are doing right now, it’s important to understand that each technology has a particular significance in their lives and is subject to ongoing efforts by parents, teachers and others with a vested interest to sway them in particular directions or to adopt particular behaviours. A decade ago, people were arguing Web2.0 would reform how children learn, despite those technologies having no social history to speak of – aside from video games. A few kids did make videos or write a blog in the modality that got early-adopter teachers excited, but most kids didn’t – and they still don’t.

There are four uses for technology in the lives of children, each of which contributes to the over-used and under-developed concept of screen time: passive consumption (binging on Netflix, swiping through miles of Insta posts from streamers and pop-culture influencers, etc.); interactive consumption (video games, mostly); communication (maintaining ‘your story’ on social media, tagging friends and belonging); and content creation (streaming, recording, sharing and managing audience channels – Twitch, YouTube, etc.).

In school, I’d argue that very few children would conceptualise their use of technology in the classroom in any of these four ways; instead they tend to describe themselves as ‘doing work on the laptop’ or ‘going on Google Docs’, meaning they still don’t connect the activities they are directed (required) to do at school with any of the things they would choose to do if left to their own devices. Of course, I accept there are exceptions. However, my point is that ‘screen time’ is a term used to demonise children’s use of technology by a cadre of adults, including parents and teachers, who for their own reasons would prefer children simply did what children are ‘supposed to do’ with technology. As they appear unable to define what this is, we are left with headlines about ‘too much screen time’ and how personalisation will, in the future, make learning with technology more fun, engaging and effective.

From my stance – as a teacher – I’m willing to accept that the four things which make up just about all of their screen time stand well outside the activities I’m asking them to do. I am simply fortunate enough to be leeching off some digital skills and literacies that they have picked up so far, but this is far from uniform or consistent. I therefore have to accept that I am teaching them how to do things with technology that remain alien to them in their personal lives.

Most importantly, when they can’t do something (a school tech task vs a personal tech task) it is always MY problem, and it’s a cop-out to hear people blame screen time. School has never been in sync with children’s media preferences and technological practices in society – not before Web2.0, not during and certainly not now. As new tools arrive in schools, each teacher has to both learn the tool and figure out how to teach students to use it within any given learning episode. It’s rather ignorant to believe teachers are creating ‘digital citizens’ and that what happens in school should/could be transferred to ‘real life’. What good teachers do is show students how to use technology well inside systems that increasingly measure their ‘digital studentship’, while recognising children have a right to their own digital life and to use technology in a modality they find interesting or valuable – even if I don’t for a second understand the humour in a meme or want to play a game that has little appeal to me.

A decade of living in the gift shop

In the last few years, I have been far more interested in the cultures which have emerged from “educational technology” than the phenomenon itself. In fact, when I think about it, I’ve been around this topic for over a decade and, rather than seeing any improvement, I’m left feeling that very few are actually interested in a possible decline.

To set this up, let me start by saying that for every single online #EduChat – of which there is no shortage, in either topics or competition – the concluding tweet should be “please exit via the gift shop”. From this stance, I want to talk about personalisation and what that might actually mean.


Should we keep the lights on or blow up the store?

The level of commercial bias, leveraged personalities (EduCelebs and influencers) and manufactured social-ideological in-breeding is a never-ending aisle of novelty and fetishes built on marketing promises, upgrade culture and individual ambition/competition. These are key socio-cultural factors which were omitted from the visions of the future, but which have emerged from decades of EdTech culture.

Recently, I watched “The Last Jedi”. The thrust of the story is not to improve the present or reclaim the past, but to kill it all and start over.

Educational technology is being ‘framed’ as offering students a new future of unprecedented personalisation. We are told this is going to be best achieved through EdTech’s ongoing blundering into ‘gaming’, where learning will be more fun, more personalised and shaped by the learner as a neo-avatar. The learner will be in more control of what they learn, how they learn it, and their own credentials.

A point to note here is that personalisation is being conceptualised as ‘data’ and ‘skills’ – which is repeated and amplified to the masses who latch onto it as being true.

Within these new gamified experiences (the titles of which are not obvious to me – but are already in play, apparently), students will embark on a new era of personalised learning: interest- and passion-driven, where they are offered choices which the machine decodes into various modular ‘fun’ activities under the trending veneer of gamification.

At various points, the machine determines their ‘skill’ based on the data it has collected. Rather than a future trend, I’d argue this is a throwback to behaviorist teaching machines – now re-birthed as ‘gamification’. The worst offender here is Microsoft’s Minecraft Education Edition, whereby students are routinely presented to us as learning and creating in wonderful new ways – despite no tangible data, in almost a decade of the game, to support the ‘new skills’ they are supposedly acquiring. Outside this creepy treehouse, the culture of Minecraft via streamers is far from PG or positive – and clearly few teachers spend any time at all thinking about the media culture they introduce kids to as they build some crap model of a Roman fort or ‘learn to code’.

On that point, academics are increasingly questioning the value of ‘learning to code’ as a short encounter within the mysterious realms of ‘design thinking’, and the validity of claims that these are ‘skills of the future’. Meanwhile, media-technology education in K12 remains stuck on ‘stranger danger’ and handing out “hour of code” certificates in response to curriculum demands, with individual schools using such things to differentiate themselves as better than other schools. So much for the last decade’s relentless progress towards personalised learning and empowering the global classroom.

Audrey Watters posted a great piece on the vagueness of ‘personalised learning’ and the thin research upon which so many people appear to be making some BIG decisions. It is well worth reading as you exit through the gift shop.

To me, this is the dangerous culture of EdTech. The culture of online discussions – especially those directed by individuals who call themselves ‘founders’ – is to repeat the most popular ‘trend’ statements rather than make any real effort to evaluate claims. As these online events (which are a form of media entertainment) are also socially driven, few seem to question the validity of claims, and they ignore their own behaviour. For example, in Australia, I’d argue teachers will demand kids use technology (the Internet and a device) for around 900 hours this year. There are no hashtag chats about the ramifications and effects of this, nor about the lack of responsibility and recourse if it turns out kids are not getting the ‘bright new personalised future learning thingy’ that presumably emerges from hashtagged answers to moot questions on a Sunday night.

Is learning personal? Yes. Can teachers bend their course outlines to allow students to follow personal interest pathways? Yes – sometimes, and within boundaries they do not control, such as timetables, resources, directives – and the technologies they use.

As presented by EdTech, Personalised Learning is not a concept but a product. We can take a generic product and add a monogram or greeting, but it’s no more personalised than that. Technology is a mass-media product, which is both mass-produced and has mass-produced meanings for each individual. It’s impossible to show that ‘skills and data’ are somehow going to change this.

The narrative of EdTech remains the same. Personalised learning has not (so far) liberated students from narrow-minded teachers (who probably can’t use the Internet), yet this is again being touted in support of personalised learning. It’s reinforced by the Twitteristi, who have proved effective at repeating this headline and are seemingly here to “save students” once again from crap experiences – aka teachers.

Meanwhile, what little research there is continues to show families struggling to cope with media and its effects, and educational research shows EdTech makes no significant difference to student attainment.

To my mind, I’ve been in the EdTech gift shop for a decade. I’m still not buying.

I’ve seen a shift from an institutional and academic focus on what “quality education” looks like (and optimism for that to change to include media education) to one where global brands and select individuals (often endorsed/sponsored) use the same technology to de-focus attention away from research and scholarship.

EdTech has succeeded in enlisting students (and families) inside well maintained ‘product cycles’ – which are apparently better because they are being increasingly ‘personalised’ – but as yet, can’t show any data to support claims of improvement. So we are back in the damn gift-shop.

New for 2018 will be blockchain. Now that your Uncle Alan has managed to get $200 of Bitcoin using an app, it’s going to be easy to sell the idea that this will change education.

Now that we believe personalised learning is real, we are likely also to believe it will be made robust and safe by blockchain technology. Forget Mozilla Badges, they were rubbish. We now have ‘the blockchain’.

Not surprisingly, the online pundits have been busy writing about the Edu-Blockchain – as of course, education has to have its own personalised edition.

They suggest a blockchain transcript could evolve to provide a thorough record of achievement – one that might include writing samples, images of completed projects, reflections or recommendations from faculty, or links to resources that chronicle a student’s progress. Such a ledger could provide evidence of a lifetime of experience, growth and learning. Access to such a record might play a role in getting a job of the future, of course … but the question is: how well have governments and academic institutions fared against the corporate brands so far? Who is more likely to own, share and decide on this irreversible, unbreakable transcript of individual skills and their data – your government, or Facebook? When you walk into a job interview, whose data will be used in face recognition to thin the lines of opportunity and decide who will be successful – and who won’t?

Having a personalised monogram when you sign into this dystopia won’t make it any less of a dystopia for those who are already marginalised or at risk. But gamification is a great way of handing out winner and loser badges.

So yeah, after a couple of quiet years, I think 2018 will be a very interesting year – and there are plenty of dumb novelty items that I’m going to throw rocks at in the EdTech gift shop.

That moment when I almost quit the PhD

I knew doing a PhD was going to be hard. If I had the luxury of living in a bubble, well away from life, a PhD would be a doddle. In reality, life is complicated and spare time always has a list of competing pressures: home, kids, work, fun and all the other little things that make us all say “wow, where did today go”.

So this year I have actually done quite a lot of work on the PhD. I’ve done a lot of work focusing it, and re-written the literature review several times in response. But then life keeps piling up the pressure and I feel guilty. I am not doing enough. I am not good enough … and all the mind loops that drop out of that kind of thinking.

I almost gave up. I missed a big deadline and tried to justify that by believing my other priorities got in the way. But they didn’t. I just let them pile up and became distracted by things that I do genuinely care about – and want to rail against – but ultimately, when those things (and people) don’t prioritise me, I have myself to blame if I get sucked in. In short, I’ve learned that I don’t like randomness at all. I accept it exists, but there are people and places to avoid – because I don’t have time to chart a path through randomness and need to find clear markers, be that people who deliver/care/support or events which help me feel a little better about what I’m doing.

Most of all, I didn’t quit because I don’t like to quit. I like to think that if my horse is dead, I’ll get off it. But it isn’t dead. I do want to ride it to the finish line. So I’ve got my list of priorities, and it’s far from singular. I won’t be packing up just yet, as my 14-year-old managed to give me back that sense of purpose I’d lost somewhere in the fog.

I need to avoid investing time in people/things which don’t appear to provide any reciprocal benefit to the things I care most about. Seems obvious.

Are teachers confident enough to teach new knowledge?

There is no real choice between basic and 21st-century skills. Both are essential learning outcomes for students. In schools, this must be applied to outcomes (standards) and curriculum. Neither can exist independently as the history of human affairs cannot annex or isolate one from the other. At the same time, schools cannot continue to make outlandish claims about being ‘the new technology school’ and be taken seriously. We are well over a decade into cheap, reliable and plentiful laptops, tablets and software connected to fast fixed and mobile communication platforms.

Even more importantly, delivering better learning hinges on preparing and supporting quality teachers who can deliver this “must have” combination of basic and advanced learning to all students, using models which reflect the broader media experiences and skills children have acquired in their own generational timeline. The digital tools teachers have selectively chosen reveal their media-view of the world – whether they like it or not.

The best have shown us how games can transform how kids feel about school; the worst replicate the same dull Skinner-box experience of clicking boxes in shovelware services in return for petty icons and badges.

The Internet is awash with lists of what skills kids need, backed up by vague warnings that schools are not preparing kids for the world, or for jobs not yet known. These things are useful to those selling solutions – which also exploit the difficulties associated with attempting to measure a wide range of open-ended and performance-based assessments of 21st-century skills. Time and again, the research raises concerns about the reliability of results – which, according to the salesmen and EdTech pundits, will be resolved by buying newer technologies.

We are immersed in media we can’t control (but think we can), which feeds our biases and creates unrealistic perceptions about the world – past, present and future.

This long-term immersion in fake, imagined and self-representing digital worlds leads educators to grapple with another reliability question: whether 21st-century skills can be coached or “faked” – on a test or in a more open-ended project. A student, for example, might answer in ways that suggest she is a computational thinker when in fact she is merely demonstrating that she has learned what types of answers make her seem that way, from her experience of using some software in a certain modality.

The research tells me there are several areas teachers need to be confident in when attempting to teach 21st-century skills: information literacy, collaboration, communication, innovation and creativity, problem solving, and responsible citizenship. But this is hardly a new insight … and conversations about these things don’t go deep enough to produce any useful frames for assessment.

There are three types of knowledge necessary for the 21st century: foundational, meta, and humanistic. These are being provided 24/7 by games such as Minecraft, Overwatch, Rocket League etc., and anyone who’s talked to kids about their game-life will see them declare this constantly. On the other hand, adults don’t know what the meta is, harp on about foundational skills (when they mean morality) and bemoan the decay of society and lack of connectedness IRL as they swipe and tap at their virtual-reality creating phonecuff on Facebook.

Although 21st-century frameworks are thought to advocate new types of knowledge, little has actually changed in the new century with respect to the overall goals of education. Until teachers are confident in delivering new types of knowledge through outcomes (standards) and assessment, 21st-century skills will remain vague and, worse, students are more likely to be given operant-conditioning software than allowed the kind of freedom and experience they enjoy in games and multi-user worlds.

Change: The currency of EdTech

Change is a word often aimed at schools. Whether it’s a shift in method, ideology, direction or technology, the last decade has created a perpetual twilight for change. Tomorrow, things will be better – if only we can overcome some barriers.

The barriers to change are at best vague and at worst driven by individual or group dislike, distrust or disagreement with other individuals or groups. The last decade has been one in which false binaries, myths and downright lies have been created, shared and re-tweeted in pursuit of this change. But do people really want change? I tend to think they like the idea of it, in the same way we might like a better car, or to feel more at ease with the huge societal and cultural changes going on. Fake news, radical pedagogy – whatever the buzz-of-the-day, we can rely on someone talking about change.

Is education really as bad as we’re told? Are the opportunities as amazing as some claim – and why, if you don’t agree, are you immediately labelled negative or a non-team-player? I was brought up to ask questions and not to blindly agree. I don’t agree that education is rubbish or that schools kill creativity. I also don’t agree with brands provoking change as their incarnation of ‘better futures’ without a scrap of real evidence – because we all have rights and, unlike children’s, our digital rights are deeply wrapped up in corporate spreadsheets and dubious tracking of our every move.

The worst thing about change is that people who are a) not in the classroom, b) not actually teachers and c) have no academic relationships with students are driving the ‘change bus’ – and almost always present what teachers do from a deficit position. We know all adults know what schools are like – they all went to one. We also know parents want the best for their kids and are bombarded with ‘death of childhood’ and ‘decay of youth’ messages in the media. But what exactly do they want to change? Do they want brands to change education, or do they want education to change brands?

I like to think the latter. The idea of living in a technologically deterministic society where machine learning and A.I. drive what children learn is very scary to me. I don’t think ‘most’ people want this either – but it enables the perpetual twilight where we flirt with technologies, identities and digital cultures which we IMAGINE are ‘good’ for us (and kids) when in fact the change never happens. Just as a gambler loses track of time in a casino because of its design, so we find ourselves in EdTech. Change is the vital element needed to sell products, but it is yet to demonstrate that it improves learning or the lives of students. But if you want to be a thought-leader or get yourself a happy-clappy fanbase on social media – you gotta push the change-cart and toe the line.

Or not.

What do we really know about teens?

The iPhone has turned ten. There’s a useful summation of the so-called iGen by Jean Twenge, Professor of Psychology at San Diego State University, on The Conversation this week, which is also supporting the release of her book (on the to-read list).

This is the new normal: Instead of calling someone, you text them. Instead of getting together for dinner with friends to tell them about your recent vacation, you post the pictures to Facebook. It’s convenient, but it cuts out some of the face-to-face interactions that, as social animals, we crave. – Twenge, 2017

Essentially, Twenge reviews the issues and overlap with generational labels and why the current generation of teenagers behave very differently to previous generations. She argues that the common 1:1 ratio of teen to phone has resulted in isolation, distraction and a broad dissatisfaction with non-preferred interactions with others. It’s this which I’ll pick up – is iGen making deliberate choices about avoiding/shutting down non-preferred interactions with teachers (who insist they need an education) for a world teens see as irrelevant?

Research continues to show that screen time needs regulation, and that parental practices towards that goal are almost unknown. The limited research that has looked at screen time is more often from psychology than education or media. It broadly aligns screen addiction with television addiction and gambling, which has been the ‘media effects’ line run about young people at peril for over thirty years.

The impact for teachers is similarly unknown. While teachers might learn how to use technology in pursuit of their goals – which are aligned to the modernist roots of mass education – most teachers I speak to are increasingly finding iGen difficult to engage when they shut down. iGen is therefore physically familiar with technology, and skilled at swiping and tapping, but involved in a cultural reproduction which alienates them from adults – be they parents or teachers. Schools have had varying success in ‘banning phones’ or attempting to get students to use them in ‘school mode’, with a goal of annexing this culture. Few have policies towards ‘screen time’ in terms of digital nutrition, nor do they account for the individual usage patterns of children – from low users to habitual ones. Twenge hints in her article at the ‘mood’ of teens who have grown up with phones, using a range of studies, mostly from the USA.

Some Australian teens do appear to struggle to socialise and to recognise the role of teachers in their daily lives. While the mantra of ‘pedagogy over technology’ is a well-worn phrase, the underpinning cultural reproduction of teens themselves cannot be isolated or ignored. In addition, the social tether between iGen and their parents – which some researchers call ‘tethering’ – is more elastic than ever. The teen who doesn’t make much effort in school will not suddenly become more attentive if they are given rich-media courses instead of listening to their teacher. If they don’t like the class, they are quick to reject their teacher’s attempts to engage them. Dealing with iGen is therefore different, and further points to how difficult (silly) the idea of preparing kids for jobs of the future is, when teachers and parents are struggling to understand the iGen of today.

This is a wicked problem that cannot be solved with behaviorist rules, or with the liberalism and democracy of self-determination. These are decaying ‘adult’ ideas. The digital culture iGen CREATES has its own rules and motivations, which we know almost nothing about – so the robot-teacher is a myth, the digital native is a myth, and we keep trying to find tech solutions to what are actually social problems. Their super-connected parents are just as quick to hear about how an over-reaching teacher is giving them a hard time – aka, please learn, we are trying to help you – and yet at home, teens vanish into their social media worlds behind closed bedroom doors.

If iGen doesn’t like today’s class, they are quick to reject teacher attempts to engage them – enabled by an experienced digital culture of doing your own thing when not interested in what’s happening in immediate reality. This ‘escape’ is to a world of consumerism and marketing which also targets teens with messages about identity and self-worth. Teens don’t see any problem with this remediation of lived experience – a culture enabled by a decade of 1:1 digital access. It is no wonder parents and teachers often feel drained. Dealing with iGen is different and creates problems which I think we’re struggling to understand, after a decade of perpetual disruption and reshaping of culture. Twenge leads me to think how difficult (silly) the idea of preparing kids for jobs of the future is (the current mantra of educelebs) when teachers and parents are struggling to understand how to connect iGen to other generations.