The search for iPad apps

I must admit to being ‘new’ to using iPads. I’ve owned one since day one, but while I was working in a university, iPads were something people brought in by choice. Now I’m on the hunt for apps! I can’t believe I even said that, but since my middle schoolers all have iPad Minis, I have to come up with whole new workflows, and to be honest, they ain’t computers.

I’m tapping this out on my new (second-hand) MacBook Air, and it’s great! My Surface sits on the floor, right next to my iPad 2 and Mr 14’s kitten. I’m awake because my stupid brain is trying to resolve the conundrum of using iPads to teach some basic programming. It’s a week or so from Easter and I’m already obsessing about June’s project. Welcome to my head.

For example: I want to get them coding and making games, so I want to use something amazing such as Code Combat, because I want them to start some hands-on action in Python. Roadblock one is, of course, the generic iOS problem: cost. Roadblock two is that an iPad isn’t really a computer, so there has to be an ‘app’ between the user and the code. Despite there being plenty of useful tutorials for learning to program, such as Learn Python the Hard Way, there’s no access to a Terminal, so the hunt begins for a workaround. I liked the idea of the Udemy Python for Beginners app, but it’s nowhere near as appealing as Code Combat when it comes to the all-important motivation of 12/13 year olds. Programming looks hard to many kids, and it’s not made any easier by beginning on an iPad. When I tried the web with a couple of kids keen on the Hour of Code, screen lock-ups were common.
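To give a sense of scale, the sort of first exercise I have in mind is tiny – something like the guess-the-number game below, which kids could type into any browser-based Python interpreter as a workaround for the missing Terminal. This is a sketch of the idea, not a tested iPad workflow:

```python
# A possible first 'hands-on Python' exercise: small enough to type into a
# browser-based interpreter, since the iPad offers no Terminal access.
import random

secret = random.randint(1, 20)   # the number the computer is 'thinking of'
guess = None

while guess != secret:
    guess = int(input("Guess my number (1-20): "))
    if guess < secret:
        print("Too low!")
    elif guess > secret:
        print("Too high!")

print("You got it!")
```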

This puts me back in the hunt for apps. Perhaps try Hopscotch for the Hour of Code – Hopscotch is intended to familiarise kids with the world of programming, but they don’t write actual code. They will learn how to make a simple play button, or have a tap on the screen lead to an actual action, so in that way it’s ‘like’ Scratch, but without quite the depth. The more I look into the world of iPad apps, the more problems emerge.

  • They are not as good as desktop computers at interpreting code
  • They don’t have a real keyboard
  • They don’t have ‘Terminal’-like access
  • Programming resources that are free on computers cost money as apps.

This leads me to wonder what the advantages of iPads are over cheap PC notebooks these days. The price point seems similar, yet there is this niggling, ongoing ‘fee’ problem, and along with it, iPad editions of ‘web’ things seem convoluted, cut down or ‘crashy’. For example, I do like Sploder for getting kids thinking about and making games. It’s been around since at least 2007 and takes a sensible approach. It’s free online, yet costs $1.99 for the iPad. I understand the commercial realities of developing software, however for those using iPads there appears to be a persistent cost which is more easily circumvented using a computer.

As a computing teacher, I get this feeling that I’m going to be walking on eggshells. Nothing turns kids off programming and making as fast as an awkward or wonky UI, and that seems to be what happens with ‘apps’. For example: given Google’s brilliance and wealth, why can’t kids insert a freaking image into a Google Doc? Why is that SO hard? This is basic word-processing, available since the 1980s. Just use the desktop version? Yeah, as long as you have tiny and accurate fingers to navigate Google’s minuscule ‘dismiss’ notices and other hit-and-miss navigation. It’s frustrating when it should be easy.

I’m beginning to think iPads are adept at getting consumers to pay a premium for software which is actually inferior to what is often free and more useful on a computer.

I don’t think these things are cheaper in the long run either. They do have long battery life and they are highly portable, but they are more useful for receiving (and consuming) information than for creating it. Just about everything needs a workaround and often a compromise. Then there is the ‘open in another app’ lottery. Sending data from one app to another requires chanting, and there is often no obvious reason it fails to arrive as expected. The idea of ‘open in’, or of being able to locate a file in a folder, is apparently just too un-cool?

I don’t doubt that iPads can do many things that a few years ago were not possible – and for many teaching purposes they are great. However, I don’t live in ‘a few years ago’, and right now I’m yearning for a 30-seat computer lab, because I’m unconvinced that iPads are a better option in STEM or Visual Arts. I’m also dirty on cheap, low-end laptops. I bought a $400 Acer for my youngest child last year. It can barely open a window without calling time-out and sits abandoned somewhere on a bedroom floor. I’m not saying iPads are crap and PCs are better – just that entry-level devices have been wrongly and frequently described as liberating learning, when in fact they bring new challenges in schools which didn’t occur in the era of computer labs. In BYOD classrooms, the problems appear to get worse, as the teacher has to do twice the preparation, twice the research and effectively know twice as many workarounds, hotfixes and so on, just to keep a learning session running.

I’ll never moan about a computer lab again … the ability to image 30 machines over a network, run the LMS, configure the desktop, share data easily … those were the days. And don’t get me started on how bad Turnitin is on the iPad, or how Edmodo lags out or screen-loops. Some days things that should be simple are elusive, yet at the same time, watching a hundred iPads hit the internet in a lesson is still a marvellous thing. I just wonder if BYOD is more about returning to the world of computing …

Me and Conferences

Conferences come in two main forms for me. The first are events organised around, and orientated towards, educational institutions. These are not simply events which universities or educational authorities sponsor; they are events where the scholarship of teaching, or of another field, is discussed in the context of evidence-based research.

The other form is commercial. These events are driven by revenue and as such resort to marketing routines and hired talent (keynotes get paid $10k–$100k at some of these in Australia). They are glitzy events which are marketed to teachers using consumer messages and popular culture. The organisers are driven by profit and the speakers are motivated by their fees. I am sure that many scholars who have earned their place in the ‘edtech’ royal family got where they are through scholarship and institutions. Good luck to them; they have every right to charge a fee and try to motivate the crowd. I am not sure the fees being demanded are warranted, but in the market, if event companies are willing to pay and people can afford to listen, then that’s just the neoliberal free market in action.

But most teachers can’t afford to go to glitzy events – time and money being the key barriers. This is perhaps not a bad thing, as the also-ran, self-made experts who often fill out the speaker list are neither scholars nor remotely qualified in adult education. Perhaps that’s why they lecture their audiences, safe in the knowledge that any push-back or comment is likely to be made via Twitter (an echo chamber), and most people in the room will simply say nothing should they disagree.

There is disagreement at academic conferences – plenty of it – but the work people are doing is based on a deep and well-understood process of research, data-sharing and re-evaluation, which is probably why academics don’t take too much notice of 140-character systems. I really don’t believe the money spent on glitzy events is warranted, given the reduction in funding for innovation, research and so on. We are buying into dogmatic, PowerPoint-driven information transmission, which, ironically, is what many of these ‘presos’ claim is the problem. Let me put this in perspective: a lecture is worth about $200 in Australia at best, and a day’s casual teaching is worth $350.

Anyone paying some fly-in, fly-out ‘preso’ dropper thousands of dollars for what is little more than public knowledge is removing money from schools. This conference market is not real; it’s being invented by software, hardware and events-management companies and consultants, who claim these things are filling some gap, or offering some new insight, that organic meetups and academic events are too stupid to know about.

Do yourself a favour: just visit your online peers in real life – it’s cheaper and more fun. Pay for a Masters (it will improve your teaching and career) or take a course online for a hundred bucks. Even better, go and attend something outside your field and learn something new.

So that’s me and conferences. I don’t agree that ‘edtech’ events should be beyond criticism, and I accept that they make money and attract people who are motivated by it. I like meeting folks, I like sharing ideas, but I am not about to pay $1000 for a ticket when I can organise my own peer event for $50 a head and still have a muffin. Boom.

Minecraft and Education 2015

For a long time now, education has discussed and experimented with immersion in virtual worlds, often with far more success and innovation than counterparts who have moved from office automation to cloud automation.

In virtual worlds, the potential for learning and teaching lies in the unique archetypes they continue to offer. Despite some mentions in popular reports such as Horizon, few educators to date have really found the time for, or interest in, online virtual communities (beyond Twitter).

Before Minecraft emerged as the ‘new’ way to learn in classrooms, games were wrapped up in cultural controversies about addiction, violence and other ‘media effects’, and so numerous projects in virtual worlds were largely ignored – not because they didn’t work, but because the media-market which has grown out of Twitter focused on (and sold) easier solutions to teachers. While some of us were busy in virtual worlds, others were making pithy YouTube videos, drawing long bows about ‘digital literacies’, creating endless Nings and writing blogs listing ‘apps’.

Minecraft isn’t in education. It is ‘in’ the public discourse about games and society. Its aesthetic qualities are easily recognised by adults as ‘legos’, and its naturalistic biome seems pleasant in comparison with more dystopian games. Minecraft is popular in education for two cultural, consumer reasons: 1 – it doesn’t scare parents significantly, and 2 – it’s different enough for some teacher-users to attract attention and money. It is not in education because it has passed through the critical eye of virtual-world research, or because sufficient K-12 research has been conducted to make any determination as to its value in school.

To be realistic, schools are not particularly open to games or reform. Minecraft is a game made by adults for adults, and as such it has no particular value as a ‘serious game’. What children may or may not learn from it is subjective, and that judgement is also shaped by broader media interest in the game, its creator and the billion-dollar business that Microsoft paid for – and bought into. Microsoft will use its claimed educational value, as the relative cost outlay is tiny and inconsequential. Snagging the ‘experts’ in a room for the corporate social capital has been a routine tradition for valorising claims, rather than getting into any academic research.

Microsoft isn’t overly interested in game-based learning, or at least it isn’t in Australia. I wasted my time with Microsoft once around Project Spark; they didn’t even return emails about a request they had made to me. Aside from trying to show how ‘cutting edge’ it is, there’s nothing to suggest Microsoft is interested enough in games and game-based learning to fund it, but it is happy to use them in marketing messages. This isn’t news to people who have been working on games and virtual worlds for the last decade or so, but let’s not assume that everyone agrees to go along with the corporate marketing messages. I know Microsoft doesn’t care what I think, and I’m fine with that. At the same time, I’m not about to give them (or others) a free ride into academia either. I would not be doing my job if I did.

Now don’t get me wrong here, I do think Minecraft has educational value to children. I just don’t believe converting the game into an educational narrative has merit yet. People make money out of this and it is part of the wider efforts of corporations to position their products as ‘part of childhood’ — where childhood is itself a story we tell ourselves. It’s not real. If your child is playing Minecraft, then the good news is that the archetypes in the game do show positive signs for learning, imagination and creativity. There’s nothing to suggest that any other media-computer time would be better or worse.

If they are playing it in school, then the question is: who’s really benefiting? Is it the child, or some company executives and stockholders? What are they not learning? And why do teachers and schools feel the need to make specific mention of their Minecraft use?

What I didn’t see coming out of the recent Minecraft summit in Los Angeles was new research or new funding for that research. To be fair, aside from a Facebook photo, I don’t know what it was about at all. And maybe that’s the point of marketing these days – to tell us nothing memorable enough to get us to ask questions. I am sure everyone had a great time talking about Minecraft … but what was made in Minecraft and where were the legions of kids who play it?

This seems to be the tragedy of ed-tech. It consumes technology and media as though they might be educational (and as though that is reason enough to use them), but fails to address longstanding research questions about whether or not this investment has improved, or is improving, schools and the society beyond them. Do schools need as much reform as entertainment media has faced from games, and if so, what media education is needed?

Beyond educational research, consumer research is telling us of massive shifts in societal interest in game-like media experiences – that people like to buy DLC, and that they self-identify with certain game titles and cultures. The question is not ‘what digital literacies does Minecraft offer?’, but rather what Minecraft culture says about how dubious the popular trope called “digital literacy” is. Minecraft has become central to the ‘talk-fest’ about games and game-based learning, based on almost no evidence or research.

For parents and teachers, the questions to be asked remain around whether or not schools offer children a well-rounded, de-branded media education, and on what basis we bother to listen to people who don’t back up their claims with any real evidence. I wonder if playing “The Escapist” would be just as useful.

Malone’s theory of gaming

Malone (1981) presented a theoretical framework for intrinsic motivation in the context of designing computer games for instruction. Malone argues that intrinsic motivation is created by three qualities: challenge, fantasy, and curiosity. Challenge depends upon activities that involve uncertain outcomes due to variable levels, hidden information or randomness. Fantasy should depend upon skills required for the instruction. Curiosity can be aroused when learners believe their knowledge structures are incomplete, inconsistent, or unparsimonious. According to Malone, intrinsically motivating activities provide learners with a broad range of challenge, concrete feedback, and clear-cut criteria for performance.

This theory pre-dates today’s technology, connectivity and digital culture. It emerged alongside the invention and domestication of the home video player. At the time, people were just beginning to expand their consumer biosphere from radio and TV, where information was created for them and about them, to devices such as home microcomputers, video players and the seminal Sony Walkman. Culturally, games offered these three qualities electronically for the first time, though anyone playing RPGs or any number of other tabletop games would have found those blocky early titles simplistic and unimaginative.

These three things are counter-intuitive to schooling. In school, challenge is associated with ‘being a better person and intrinsically feeling good about the self’, rather than with accepting that failure, and repeated failure, is motivating. Fantasy is rarely tolerated in the school curriculum, where facts prevail, and so curiosity is tethered to how far a teacher feels a student can move from those facts in the time made available.

I do like much of the emergent theory about instructional design, technology and games from the ’70s and ’80s, but while it is interesting, much has changed technologically and culturally in game design and experience. Some schools have made the leap from drill-and-skill; others are still talking about it. The key insight in Malone’s theory, for me, is clear-cut criteria and concrete feedback, which once again is best served by human interaction and empathy.

Malone, T. W. (1981). Toward a theory of intrinsically motivating instruction. Cognitive Science, 5(4), 333–369.

Google Feud

This game is interesting. It’s called Google Feud. It’s super simple, and could be really amazing if you could add your own terms for a class – oh, how the conceptual frame and the subjective frame might be more compelling. As it is, it’s interesting to play, and then to discover what people are typing into Google. Warning: you are going to lose time to this.
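Just to sketch what a ‘your own terms’ version might look like (purely hypothetical, and nothing to do with the real site), the teacher could supply each prompt and a ranked list of completions, and students guess Family Feud-style:

```python
# Rough classroom take on the Google Feud idea: the teacher supplies prompts
# and ranked answers, students guess the most common completions.
# (Hypothetical sketch only -- not connected to the real Google Feud site.)

ROUNDS = {
    "why do cats": ["purr", "knead", "meow", "sleep so much"],
    "how do i make": ["money", "slime", "pancakes", "a website"],
}

def play_round(prompt, answers, guesses=3):
    """Let the player guess the ranked completions for one prompt."""
    score = 0
    print(f"\nPrompt: '{prompt} ...'")
    for _ in range(guesses):
        guess = input("Your guess: ").strip().lower()
        if guess in answers:
            rank = answers.index(guess)           # 0 = most common completion
            points = (len(answers) - rank) * 1000
            score += points
            print(f"Rank {rank + 1}! +{points} points")
        else:
            print("Not on the board.")
    print(f"The board was: {', '.join(answers)}")
    return score

if __name__ == "__main__":
    total = sum(play_round(p, a) for p, a in ROUNDS.items())
    print(f"\nTotal score: {total}")
```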

i-Solated

Technology is isolating, I’ve decided. Not immediately isolating, but it grows over time, so that by the time you (I) start to really recognise how terrible it is, it has become a dubious way of life. Now don’t get me wrong, I do love to use technology. I’m not sure I could unplug without catastrophic events following that choice, and I imagine how utterly frustrating daily life would be without it. My day job runs on digital rails, and so does my PhD life (currently being neglected), so much so that I spend vast amounts of time managing, creating and receiving information. Inbox 42,000 and I don’t care.

I learned a long time ago that making more information is a terrible idea for learning and teaching. So far, the research suggests it makes very little difference to students. I wince at the “flipped classroom” tropes that some people have hung their hat on. I once heard a motoring journalist say, “A Jag impresses the neighbours, if they know nothing about cars”, and if you know how to use media well (and why media education has power), then the idea of ‘flipping’ becomes a moot point. It does impress ‘other’ people, if their own information fluency is limited to Word, email and giving the odd PowerPoint. And no, writing and publishing on the topic does not somehow validate neo-fauxisms.

Flipping is isolating. It places glass between the teacher and the learner. What do people mean when they say “what is normally homework becomes classwork”? There is no agreement that ‘homework’ benefits students, or that it is anything more than a cultural construction connected to what parents are told schooling is. What they mean is that they are putting some of their information in digital form and making it available to students. This doesn’t seem earth-shatteringly innovative. Having someone talk via video, or assembling a set of videos, is simply media delivery, and potentially isolating. There is hardly any real research into this, so your guess is as good as anyone’s.

Flipping comes with the same assumed, teacher-owned authority associated with the cognitive apprenticeship. It isolates teachers from students, in preference to them being part of information assemblages. But this post isn’t primarily about ‘flipped classrooms’; it’s about what they represent. I think that over the last decade or more, working with technology has become more isolating from the real and immediate world around me. I wonder what is happening out there, and find myself looking at Facebook and Twitter to find out. I am clearly insane. But I can’t ignore it, because at any second someone important will send me a message and I will have to do something in the real world about it. Now call me a neo-evolutionary objector, but what if I’m playing tug-o-war with the dog, or I want to go ride my bike in the rain because I like the pinging noise the hot engine makes? I can’t, because at any second something wicked this way comes … or occasionally, something sweet and uplifting (that isn’t a cat video or another Yo Momma video).

In my classroom, I am more and more interested in how I can use technology to connect with the current (real-time and live) questions and learning dilemmas my middle-school students are facing. I am frustrated that most tools are built for the ‘flipped classroom’ mentality. Take Edmodo, for example. It does not have an RSS-out function. I can pour other people’s content into my students’ online space, but I can’t push their content out. If I could, then I could pick up every kid’s post and know exactly which kid, of the 85 in the room, I need to talk with next. I don’t want to sit on a desk and stare at my Edmodo app (which doesn’t work very well on iPad, by the way). I want Edmodo to talk to IFTTT, and IFTTT to get my Android phone to push a “ding” notice to the home screen. That way I’m always en route to a real-time learning drama or celebration.
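Edmodo doesn’t offer this, so the following is only a sketch of the pipeline I’m imagining: the feed URL is invented, and it assumes an IFTTT Webhooks applet (here called ‘student_post’) wired up to send a notification to my phone.

```python
# Sketch of the Edmodo -> IFTTT -> phone 'ding' idea. Edmodo has no RSS-out,
# so FEED_URL is an invented placeholder; the IFTTT Webhooks applet name and
# key are likewise assumptions.
import time

import feedparser   # pip install feedparser
import requests     # pip install requests

FEED_URL = "https://example.com/my-class/posts.rss"   # hypothetical class feed
IFTTT_KEY = "MY_WEBHOOKS_KEY"                         # from the IFTTT Webhooks service
IFTTT_URL = f"https://maker.ifttt.com/trigger/student_post/with/key/{IFTTT_KEY}"

seen = set()

def check_feed():
    """Poll the class feed and ping my phone about any posts not yet seen."""
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        uid = entry.get("id") or entry.get("link")
        if uid in seen:
            continue
        seen.add(uid)
        # value1/value2 become part of the notification text in the applet
        requests.post(IFTTT_URL, json={
            "value1": entry.get("author", "someone"),
            "value2": entry.get("title", "new post"),
        })

if __name__ == "__main__":
    while True:          # run for the length of the lesson
        check_feed()
        time.sleep(60)   # check once a minute
```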

But technology isn’t doing this. It is increasingly isolating ‘us’ from the real world. Online, I’m monitoring dozens of blogs, forums and feeds. I am spending many, many unseen hours knitting the feeds together into things I can collect and analyse, from Google Sheets to Evernote notebooks. It all takes more and more time, not less.

Once the digital dashboard is set up, I can start to flick information around, but it’s still at a distance and at a cost. The need to create more and more information workflows feels relentless, and few institutions seem willing to label classes as needing ‘high’ levels of fluency. It’s a little like the motorbike test. Riders have to pull a bike up from 25 km/h to a stop (without falling off) within a certain distance. The problem is that if it’s wet, the distance stays the same.

I feel a lot like that with technology these days. I look at every new tool and wonder how it will connect me more to real life. Institutions never liked people much, and have created systems to isolate workers from the workplace: fill out this online form, get emailed immediately with “we got it”, and then, down the line, no human ever emails or calls to let you know what is happening or how you are. Technology seems increasingly interested in only the next THREE months, because organisations seem to have decided that’s how long they either need to commit to, or how long you need to have “it” before they want you to have the new “it”. And I’m a chap who actually likes technology!

So when I escape the isolation and tune into my favourite game worlds, I feel like I am in charge of time. It’s like when I put my glasses on in the morning: I choose when the world comes into focus. I don’t doubt that the judgemental world will think “get a real life”, but what is real life in an era where machines ping and ding endlessly? This is why I drive pre-ping cars and look at my phone only at certain times. It isn’t good for me (or my students) if I am self-isolating via a screen. I am nowhere near the dose-response behaviour that I see out in ‘the real world’ – there are people who clearly start to choke if they don’t tap, swipe and stare into their phones every two minutes – but this isn’t something I want to look back on and say “I was teaching”.

If the technology isn’t connecting us to the real world, and it isn’t making real-world conversations faster and more effective, what is ‘communication’ in the 21st century becoming?