Category: IT Education

Digital Technologies in the New Zealand Curriculum

This is a personal blog, so I should probably begin this post by stating that the opinions expressed here are entirely mine and do not necessarily reflect the views of my employer.

The New Zealand Ministry of Education recently began a consultation process on its proposals for a new digital curriculum in schools. Two parts of this curriculum in particular are up for discussion, and some details have been provided about each: Computational Thinking for Digital Technologies and Designing and Developing Digital Outcomes. Reading the descriptions of the proposed progress outcomes and outcome statements for students at different age levels under Computational Thinking for Digital Technologies, I couldn’t help feeling that I was in something of a time warp. It reminded me of recently being sent a copy of a Cambridge computing exam for the UK GCSE. I first taught this topic in London in 1988, when GCSE was new, and looking at the exam questions the only significant differences I could find were some references to the World Wide Web, which of course did not exist in 1988, and one rather grudging mention of mobile phones.

A similar sense of déjà vu came over me looking at the proposed progress outcomes for computational thinking in the curriculum. Yes, there were algorithms and binary digits and input and output and software engineering methodologies, and of course sequence, selection, iteration and so on; all those components that were so familiar in the 1980s. What I did not see so much of was components, patterns, frameworks, productivity tools and code automation, models, teams, customer interaction, iterative agile and lean processes; anything, in fact, from the last quarter century or so of software development. True, there is a sprinkling of contemporary computing terms such as big data, visual computing and artificial intelligence (not that the latter is particularly new), but underneath it all is some pretty old thinking.

Of course there is also the curriculum for Designing and Developing Digital Outcomes, and you might say that I am looking in the wrong place for the content above. To be fair, this part of the curriculum proposal starts well: “Students understand that digital applications and systems are created for humans by humans”, but it very soon sinks into topics such as patents, file types and network protocols. These are, of course, useful and important things, but I do feel that the broad understanding of digital systems that supports digital fluency for all students (not just the budding computer scientists) is rather lost in the mix.

I also wonder who is going to be teaching all of this in schools. How many teachers do we have in New Zealand who are well versed in 1980s-style computer science? Some years ago I ran a session in an Auckland school for a group of digital technology secondary teachers from around the area. It became clear very quickly that these teachers had very different skill sets, each focused on particular tools and approaches. This diversity is not necessarily a bad thing, but it may be a challenge to implement such a curriculum when the available staff bring such a broad palette of experience and delivery.

I would like to make it clear that I applaud the Ministry of Education for tackling the issue of the digital curriculum, and I wish them well in the journey towards making our students more digitally and computationally aware and capable. I hope that the current process of consultation and investment leads to great outcomes for everyone. As well as expressing my personal opinion in this blog, I have also shared my views through the online survey tool for the Digital Technologies | Hangarau Matihiko curriculum content consultation process, and I hope that other experienced educators, researchers and IT professionals will do the same. It is only by engaging in informed debate that we can get the best outcomes for future students.

JavaScript as a first programming language? Some pros and cons

Recently I was working with a group of JavaScript developers in Australia. One of them observed, from maintaining code written by other developers, that people seemed to code in JavaScript in various styles depending on their programming background. This stylistic mashup, he suggested, was perhaps a consequence of the fact that no one used JavaScript as their first programming language, so they brought their stylistic baggage with them from a range of other ‘first’ languages.

Now I don’t know if there is actually no one out there who first learned to program with JavaScript, but I suspect there would be very few, and for good reason. Historically, JavaScript has been hard to write (no decent development environments), hard to debug and hard to test, and the browser runtimes were slow and flaky. That is no longer the case. IDEs like WebStorm make it easy to develop code and, used with the plugin for Chrome, provide a full debugging environment. There is now a range of test tools available, including QUnit, and the quality of JavaScript engines in browsers has improved hugely.

So, would JavaScript be a good first programming language? It has some nice features that make it seem attractive. It supports higher-order functions that can be passed around in variables, it supports variadic functions with variable-length parameter lists, and it supports closures. You could teach people to use functions without the object-oriented baggage of something like Java. Once you do want to use objects, its type system does not support all the features of a classical inheritance-based language, but on the other hand it is dynamic, so complex data types can be constructed and modified at run time, which is a really cool feature.
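A quick sketch of those features might help (the function names here are my own, just for illustration, and I have stuck to pre-ES6 idioms such as ‘var’ and the ‘arguments’ object, since the newer syntax is not yet universally supported):

```javascript
// Higher-order function returning a closure: the inner function
// captures the 'count' variable from its enclosing scope.
function makeCounter(start) {
  var count = start;
  return function () {
    count += 1;
    return count;
  };
}

// Variadic function: the implicit 'arguments' object holds however
// many parameters the caller passes.
function sum() {
  var total = 0;
  for (var i = 0; i < arguments.length; i++) {
    total += arguments[i];
  }
  return total;
}

var next = makeCounter(10);
console.log(next());       // 11
console.log(sum(1, 2, 3)); // 6

// Dynamic typing: properties can be added to an object at run time,
// with no class definition required.
var point = { x: 1, y: 2 };
point.z = 3;
```

Note that none of this requires any object-oriented scaffolding, which is exactly what makes it appealing for a first course.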

What about the downsides? Well, there are still a few. The loose type system is a trap for the unwary, as is the lack of block scoping in favour of function scoping (though the ‘let’ keyword will address this once the major browsers all support it), and another danger is the ease with which global variables can be created, either deliberately or accidentally. Wacky features like hoisting (where you can use a variable before you declare it) might also confuse the beginner, and running your code in a browser might distract attention from the basics of the language towards the UI.

Some of these issues might be addressed with tools like Microsoft’s TypeScript language, which brings type checking to JavaScript, and tedious browser document navigation and UI issues are simplified by libraries such as jQuery.

So, would I want to try teaching JavaScript as a first language? It probably depends on the type of class. For students at the ‘softer’/applied end of computing, learning the pre-eminent language of the Web, with quick and easy routes to seeing something useful happening in a browser, might not be such a bad place to start.

Global Day of Coderetreat

Last weekend I participated in the 2013 Global Day of Coderetreat, joining the session running at the Xero offices in Wellington, New Zealand. Along with 2,000+ other software developers across 165 locations on all continents, I spent the day honing my software craftsmanship, pair programming with other developers, using test driven development (TDD), and working within a range of changing and challenging design constraints. A coderetreat is an opportunity to look at the same programming problem (typically Conway’s Game of Life) from multiple perspectives, without the pressure to create a finished product, using the opportunity to reflect on how we build software. I would recommend that all software developers, at whatever level of skill and experience, take a look at the coderetreat.org web site and keep a lookout for upcoming coderetreats in their local area. If you can’t find one, why not run one yourself? All the information you need is on the web site. The only people who won’t gain anything from it are those who just want to show other people how great they (think they) are. Fortunately, these people are a tiny minority of the software development community. Most of us will embrace the opportunity to challenge ourselves and learn from others, in a day of coding that is surprisingly enjoyable.
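For anyone unfamiliar with the kata, the heart of Conway’s Game of Life is a single rule deciding each cell’s fate, which is exactly why it suits TDD so well. A minimal sketch (the function name is my own, not from any particular session):

```javascript
// Conway's Game of Life: decide whether a cell is alive in the next
// generation, given its current state and its count of live neighbours.
function nextState(alive, liveNeighbours) {
  if (alive) {
    // Survival: a live cell with 2 or 3 live neighbours stays alive;
    // otherwise it dies of under- or over-population.
    return liveNeighbours === 2 || liveNeighbours === 3;
  }
  // Reproduction: a dead cell with exactly 3 live neighbours comes alive.
  return liveNeighbours === 3;
}
```

A coderetreat then adds constraints on top of this, such as no conditionals, tiny methods, or swapping keyboards mid-session, and that is where the real learning happens.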

The ICT skills shortage and why bleating is not enough

The shortage of ICT professionals is not a new phenomenon in New Zealand, nor in many other developed economies. Professional organisations such as NZTech and the NZ Institute of IT Professionals (IITP) are well aware of this issue. New Zealand government ministers have recently expressed their own concerns about this… but hang on, what are they doing about it? The IITP has recently pointed out to the government that its funding for ICT research over the last few years has been pitiful. ICT is treated as a poor relation to other areas of research and development, leading to low investment and, of course, low returns. Government policy influences university policy, so that my own university’s research goals sideline ICT, indeed most aspects of technology. The underlying thinking seems to be that ICT is just an infrastructure service, not a ‘real’ discipline. Some might say that the link between ICT research and ICT skills in the marketplace is tenuous at best. However, perhaps some investment in, for example, ICT scholarships at both undergraduate and postgraduate levels might be a sign to potential students that the government does actually care about the ICT skills shortage enough to do something about it rather than just bleating.

Too much money chasing not enough IT grads

According to the government’s careers web site, drawing on data from TradeMe, 6 of the top 10 paying jobs in New Zealand in 2013 were in Information Technology. The top paying job was IT project manager at $225,000. Paradoxically, this is a bad thing, because it shows the effect of demand outstripping supply. We just don’t have enough students taking IT and other computing degrees to meet the needs of the NZ IT industry. If we can’t find the skills, the potential benefits to the economy of IT enterprises will be lost and we will be back to cows and timber. Everyone, from the MoE to the schools to the universities, is trying to do something about this, but at the end of the day, an IT career somehow needs to become a more popular option for those still in education.

Information Systems versus Information Technology – Complementary to a ‘T’?

Recently, a colleague from the College of Business and I (in the College of Sciences) have been attempting to play football in the no-man’s-land between the Information Technology and Information Systems disciplines, in an attempt to repair the damage of past turf wars played out within our institution. Central to our discussions are the ACM/IEEE curricula. The most recent ACM IS curriculum dates from 2012 and has 7 authors, while the IT curriculum has 9 authors (none in common with IS) and dates from 2008. The IS curriculum talks about careers and broad courses. The IT curriculum talks about foundations, pillars and capstones. Beneath the differing approaches, however, a common theme emerges. Where the IS curriculum talks about data and information management, the IT curriculum talks about databases. Where the IS curriculum talks about IT infrastructure, the IT curriculum talks about networking. This pattern is repeated in pretty much every area of the two curricula. Together these approaches form a kind of ‘T’ shape in each of the core topic areas: the IS curriculum provides a broad horizontal view of Systems, while the IT curriculum provides a vertical dive into supporting Technology. By merging these curricula into a comprehensive programme of IT and IS majors, we can hopefully provide both the breadth and depth of systems knowledge and skills essential to our future graduates.

Outstanding exam answers

Exam marking time again. Always a joy to record some of the more esoteric exam answers from students. Here are some of my past favourites from our first year paper:

Question: Briefly explain the term ‘wiki’.
Answer: A wiki is a pedia.

Question: Is exporting e-waste ethical?
Answer 1: E-waste is not ethical because e-waste affects web page speed.
Answer 2: Yes. As the renewing speed is growing quickly.

Question: Describe some advantages and disadvantages of using computers
Answer 1: A disadvantage of using computers is that lightning might strike your computer in bad weather.
Answer 2: If the computer crush, it will damage people, for example, if the airport crush by computer problem, the airship may accidented.

Actually, that last answer is correct, just expressed in rather idiosyncratic language. Looking forward to this year’s crop!