Forensic Literature Reviews

A question that students raise from time to time is why we discourage secondary references in literature reviews. It struck me that this is not a one-dimensional question with a simple answer; it actually leads to a rather interesting process which I have chosen to call literature review forensics. Forensics is defined as ‘relating to or denoting the application of scientific methods and techniques to the investigation of crime.’ Oh wait, was this a primary reference? Possibly not – it was the first definition that came up in a Google search. Best try a little harder. I checked the Oxford Dictionary. Same definition (phew!). Seems a bit strong, you might think, to associate crimes with literature reviews. Perhaps you will allow the working definition of ‘relating to or denoting the application of scientific methods and techniques to the investigation of secondary referencing’.

The problem with secondary referencing is that sources are apparently so easy to find these days (so much easier than when I started my PhD, trying to find printed journals in the library, waiting weeks for photocopied articles to arrive in the post through interlibrary loan), yet their real origin is sometimes far less evident. An example of this in teaching and learning is the poster-like quotes that litter the Web from various supposed sources. A common example is “education is not the filling of a bucket (or pail) but the lighting of a fire”. Do an image search for that on Google and you will get an avalanche of hits. There must be thousands of these on classroom walls. The quote is variously attributed to W.B. Yeats, Socrates, Churchill etc., but in fact appears to come from an original quote by Plutarch, as stated by David Boles. Oh wait, I have fallen into my own trap by quoting an unrefereed blog post. Well, more of that later.

Here’s another one you might have come across, supposedly from John Dewey: “If we teach today’s students as we taught yesterday’s, we rob them of tomorrow”. Awesome quote, but I don’t think Dewey ever said it, or perhaps he said it but didn’t write it down. I take some corroboration from this blog post by Tryggvi Thayer. You’re probably thinking that once again I have fallen into the trap of citing another secondary reference, but in this case I did my very best to find the quote, since Dewey’s books are now out of copyright and available on the Web. I did an extensive search (up to the point where I lost the will to live, anyway) and was unable to find anything resembling that quote, so I’m fairly confident it doesn’t really exist. The fact that one of Dewey’s books was called ‘Schools of To-morrow’ (with a hyphen) provides an example of what I mean by forensics: looking a little deeper into the evidence before making assumptions about sources. One might assume from the title alone that the book contains the quote. Rookie mistake.

The first issue with secondary references is, of course, that your primary reference may be interpreting the other writer in a way that you do not think is reasonable. In fact, they may just be making it up. You have no way of knowing unless you check the original reference, and this is often where you disappear into a rabbit hole, because the primary reference has itself taken the quote at second hand from another source, which was in turn quoting yet another, and so on ad infinitum, until either you find the original reference or, as in some cases, it turns out to be impossible.

A couple of examples I have come across are the following two diagrams, both about enquiry learning and both with supposed sources. You might think it’s unreasonable to treat diagrams as references, but they still need to have their sources properly acknowledged, and if they claim to represent someone else’s ideas, do they really?
[Images: ‘Murdoch’s Inquiry Cycle’ and ‘Brunner’s Inquiry Process Model’]

In the first example, ‘Murdoch’s Inquiry Cycle’, the academic source probably exists – it seems to be an amalgam of work from Murdoch, Branch, Stripling and Oburg – but it seems unlikely that this specific image came from any of those articles (please correct me if I’m wrong). There are many variations on the Web, none properly referenced as far as I can see. The second example, ‘Brunner’s Inquiry Process Model’, supposedly comes from a 2002 source, but I can’t find any evidence of the original version whatsoever. You could try starting at the University of Cambridge and see if you have more success than me. Of course these images (or their precursors) come from someone, and it may be that there is a proper reference hiding out there somewhere. My point is that it is sometimes really hard to find. That’s why my two example links come from blog posts by authors who clearly have no idea of the source of their material.

What these examples show, I think, is that developing a forensic approach to the literature is a worthwhile and quite fascinating skill for students. Ideally, of course, this would all be supported by a tool. The nearest one I can think of is Turnitin. Unfortunately, because Turnitin is oriented towards detecting plagiarism rather than tracing original sources, it is much keener on showing you that your work duplicates the work of some other student than on finding which mutual source was originally used, and it takes some manual delving around to extract any of this information from the Turnitin interface. It would be great if someone could develop a tool that could do literature review forensics and cut through the repetitions of misquotes, inventions and distortions that multiply endlessly and mislead the unwary researcher.
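As a minimal sketch of what the matching inside such a tool might look like, here is a toy ‘quote tracer’ that slides a window over a full text and scores each window by word overlap with a suspect quote. Everything here (the class name, the scoring, the input format) is my own invention for illustration, not any existing tool:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// A toy 'quote forensics' check: slide a window over a source text and
// score each window by word overlap with the quote we are trying to trace.
public class QuoteForensics {

    // Lower-case and strip punctuation so 'To-morrow' and 'to morrow' both survive comparison.
    static String[] tokens(String text) {
        return text.toLowerCase().replaceAll("[^a-z\\s]", " ").trim().split("\\s+");
    }

    // Fraction of the quote's distinct words that appear in a window of the source.
    static double overlap(Set<String> quote, String[] window) {
        Set<String> shared = new HashSet<>(Arrays.asList(window));
        shared.retainAll(quote);
        return (double) shared.size() / quote.size();
    }

    public static void main(String[] args) throws Exception {
        String quote = "education is not the filling of a pail but the lighting of a fire";
        String source = Files.readString(Path.of(args[0])); // e.g. a Project Gutenberg text file

        String[] quoteTokens = tokens(quote);
        Set<String> quoteWords = new HashSet<>(Arrays.asList(quoteTokens));
        int window = quoteTokens.length;

        double best = 0;
        int bestAt = 0;
        for (int i = 0; i + window <= sourceLength(source); i++) {
            double score = overlap(quoteWords,
                    Arrays.copyOfRange(tokens(source), i, i + window));
            if (score > best) { best = score; bestAt = i; }
        }
        System.out.printf("Best match %.0f%% at word %d%n", best * 100, bestAt);
    }

    static int sourceLength(String source) {
        return tokens(source).length;
    }
}
```

Run against a plain-text copy of, say, Plutarch’s ‘Morals’, a best match well below 100% is at least suggestive evidence that the quote, as worded, is not in that text.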

So, anyway, what about that Plutarch quote? Well, the blog I mentioned earlier links to a Google Books page suggesting that the source appears in Plutarch’s ‘Essays’, but unfortunately I was unable to find it there, nor in Plutarch’s ‘Morals’. I’m not saying it isn’t there, just that a search of the text file for anything resembling that quote failed miserably. Perhaps W.B. Yeats wrote it after all?

As a final note, one supposed quote that was widely circulated following the death of Stephen Hawking was the following: “The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.” Once again, this does not appear to be an authentic Hawking quote, but it nevertheless beautifully sums up the dictum that we should always know where our supposed knowledge comes from.

Digital Technologies in the New Zealand Curriculum

This is a personal blog, so I should probably begin this post by stating that the opinions expressed here are entirely mine and do not necessarily reflect the views of my employer.

The New Zealand Ministry of Education recently began a consultation process on their proposals for a new digital curriculum in schools. Two parts of this curriculum in particular are up for discussion, and some details have been provided about these: Computational Thinking for Digital Technologies, and Designing and Developing Digital Outcomes. Reading the descriptions of the proposed progress outcomes and outcome statements for students at different age levels under Computational Thinking for Digital Technologies, I couldn’t help feeling that I was in something of a time warp. It reminded me of recently being sent a copy of a Cambridge computing exam for the UK GCSE. I first taught this topic in London in 1988, when GCSE was new, and looking at the exam questions the only significant differences I could find were some references to the World Wide Web, which of course did not exist in 1988, and one rather grudging mention of mobile phones.

A similar sense of déjà vu came to me looking at the proposed progress outcomes for computational thinking in the curriculum. Yes, there were algorithms and binary digits and input and output and software engineering methodologies, oh and of course sequence, selection, iteration etc., all those components that were so familiar in the 1980s. What I did not see so much of was components, patterns, frameworks, productivity tools and code automation, models, teams, customer interaction, iterative agile and lean processes – anything, in fact, from the last quarter century or so of software development. True, there is a sprinkling of contemporary computing terms such as big data, visual computing and artificial intelligence (not that this is particularly new), but underneath it all is some pretty old thinking.
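For anyone who didn’t meet them in the 1980s, sequence, selection and iteration are simply the three classic control constructs. A trivial sketch in Java (my own example, nothing from the curriculum document itself):

```java
public class ClassicConstructs {
    public static void main(String[] args) {
        int total = 0;                      // sequence: statements run one after another
        for (int i = 1; i <= 10; i++) {     // iteration: repeat a block of statements
            if (i % 2 == 0) {               // selection: choose between branches
                total += i;
            }
        }
        System.out.println("Sum of even numbers up to 10: " + total); // prints 30
    }
}
```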

Of course there is also the curriculum for Designing and Developing Digital Outcomes, and you might say that I am looking in the wrong place for the content above. To be fair, this part of the curriculum proposal starts well: “Students understand that digital applications and systems are created for humans by humans”, but it very soon sinks into topics such as patents, file types and network protocols. These are, of course, useful and important things, but I do feel that the broad understanding of digital systems that supports digital fluency for all students (not just the budding computer scientists) is rather lost in the mix.

I also wonder who is going to be teaching all of this in schools. How many teachers do we have in New Zealand who are well versed in 1980s-style computer science? Some years ago I ran a session in an Auckland school for a group of digital technology secondary teachers from around the area. It became clear very quickly that these teachers had very different skill sets, focused on particular and very diverse tools and approaches. This is not necessarily a bad thing, but it may be a challenge to implement such a curriculum given such a broad spread of experience and delivery among the available staff.

I would like to make it clear that I applaud the Ministry of Education for tackling the issue of the digital curriculum, and I wish them well in the journey towards making our students more digitally and computationally aware and capable. I hope that the current process of consultation and investment leads to great outcomes for everyone. As well as expressing my personal opinion in this blog, I have also shared my views through the online survey tool for the Digital Technologies | Hangarau Matihiko curriculum content consultation process, and I hope that other experienced educators, researchers and IT professionals will do the same. It is only by engaging in informed debate that we can get the best outcomes for future students.

Time Capsule of the Future

Technology has transformed how we live our lives. Instead of calling distant family and friends on the phone, you can now video message them, for free, on your touch-screen mobile device. A phone call from an airplane no longer costs the price of a flight, and you can surf the web, stream movies and chat digitally from your seat, even in Economy Class. No more snail mail and difficult collaboration on documents that just fill up your email inbox with countless edits: you can now co-create in The Cloud with people from all over the world (even if you don’t really understand what ‘The Cloud’ actually is). The list goes on and on.

In 2009, “Generation X” author Douglas Coupland asked the question, “What if we were to receive a time capsule not from the past, but from the future – would we pay more attention?” He suggested that, instead of thinking about what we might want to give to some future generation, we might consider what we could send back 20 years into the past, to illustrate what astonishing changes we have experienced over that period. Think about all that has changed in the world since 1997, particularly in terms of digital technologies and all the useful (and probably not so useful) things that we now take for granted.

Now, consider how education has changed since 1997. Would a 1997 recipient of your box of wonders be amazed? Probably not. Much of our educational infrastructure and administration still follows the 19th century industrial model of subjects and periods and fixed term days and the delivery and examination of content. Many a secondary school classroom still has rows of chairs and a whiteboard at the front. Digital devices are to be put away in schoolbags and only used surreptitiously. Teachers talk for hours to bored and disengaged students about content that can be found delivered much more engagingly on YouTube.

We increasingly have to ask, what’s the point? Why are we preparing students for 1997 who will leave school in the 2020s? How will these graduating students help us to solve the chronic problems the world is facing if they haven’t already experienced real world learning?

This educational time capsule is, however, beginning to change fundamentally. Schools and teachers are starting to embrace the opportunities that digital tools – commonly used by the working population but so often denied to students in the classroom – offer to teaching and learning. An increasing number of schools have embraced change and redesigned their physical environments, their curricula, and their attitudes to digital tools to provide a skills-based vision of how 21st century students can apply their learning to the real world.

Part of this quiet revolution is The Mind Lab by Unitec’s part-time, in-service postgraduate programme, through which more than 2,000 New Zealand teachers have already passed. The purpose of this programme, generously supported with scholarships by the NEXT Foundation, is to transform New Zealand education from the inside, several hundred teachers at a time.

Teachers cannot be expected to integrate digital technologies into their teaching and learning without help from experts, peers and even their own students and whānau. It is not just about adding computers to classrooms; simply doing that achieves nothing. Students need to be given opportunities to develop relevant 21st century skill sets, and 21st century teachers need to know how to foster and leverage these skills with the support of contemporary tools.

Perhaps the most important thing teachers can gain from the programme is to be engaged in a community of practice that goes beyond the subject, the school, the decile, even the country, and enables them to plug into knowledge, skills and ideas from the best educational minds, freely accessible over the Internet. In a few years perhaps an educational time capsule really will be worth sending back into the past to amaze and impress the previous generation.

A version of this article was published in the print edition of The New Zealand Herald, Thursday May 30th 2017, p. A20.

What Is Agile Architecture?

I was recently asked to deliver a training course on agile architecture. Although the materials for the course were provided by a third party, it soon became clear to me that they did not meet the client’s specific needs around understanding exactly what agile architecture involves. Indeed, there seems to be a broader issue, judging by the lack of explicit material published in this area. There wasn’t even a Wikipedia page until I began to create one as part of the course (please contribute to it!).

Why is agile architecture a difficult topic to teach? It seems that there are parts of the Agile Manifesto that are problematic for the architect. The manifesto values individuals and interactions, working software, customer collaboration and responding to change over processes and tools, comprehensive documentation and following a plan; yet those traditional components – processes, tools, documentation and planning – are at the heart of architecture, particularly in large-scale systems. Further complicating the question are the different layers or levels of architecture, from the Enterprise Architect, with a broad view across organisational systems, to the System Architect, working on specific domains and services with software development teams.

The general consensus appears to be that the job of an architect is to do ‘just enough’ architecture at each stage of an agile project, but the difficulty is in defining what is ‘just enough’. Of course there is always an implicit architecture in a system, embedded in the code, and to some extent this can be supported by techniques such as applying coding standards and using annotations. Another way of looking at architecture in an agile context is to see it as a collection of fragments, recorded as part of user stories. Stories that need to take account of architectural elements would include architectural fragments as part of the conversation around the story card. A more centralised view of architecture can be captured in the C4 model (Context, Container, Component and Class), which provides a progressively decomposed, high-level view of the most important aspects of a system.
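As a small illustration of the ‘annotations’ idea, here is a sketch of a home-made annotation that records architectural intent directly in the code. The @Layer annotation and the class names are my own invention for this example, not a standard library, though tools such as ArchUnit support similar dependency checks:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// A hypothetical annotation for recording architectural intent in the code itself.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface Layer {
    String value(); // e.g. "web", "domain", "persistence"
}

// The architecture now lives with the code: a build-time test could reflectively
// assert, for instance, that no @Layer("domain") class depends on a @Layer("web") class.
@Layer("domain")
class OrderService {
    // domain logic here
}
```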

Martin Fowler talks about architecture as the things that are hard to change, and also discusses the idea that one job of an architect is to identify ways in which things that are hard to change can be made easier to change, so that the whole process of developing an architecture can be more agile. One example would be to incrementally evolve a database schema, rather than assuming the full schema must be specified early in the project.
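A minimal sketch of the idea behind incremental schema migration, in the spirit of tools like Flyway or Liquibase but written from scratch here: each change is a small, versioned step applied in order, so the schema evolves alongside the code (this assumes the H2 in-memory database is on the classpath, purely to keep the sketch self-contained):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.List;

// A toy migration runner: the schema evolves one small, ordered step at a time,
// rather than being fully specified up front.
public class Migrations {
    static final List<String> STEPS = List.of(
        "CREATE TABLE IF NOT EXISTS schema_version (version INT)",
        "CREATE TABLE IF NOT EXISTS customer (id INT PRIMARY KEY, name VARCHAR(100))",
        // Sprint 3: a new story needs email, so we add it as a migration, not a redesign.
        "ALTER TABLE customer ADD COLUMN email VARCHAR(255)"
    );

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement stmt = conn.createStatement()) {
            for (String step : STEPS) {
                stmt.execute(step); // real tools also record which steps have already run
            }
        }
    }
}
```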

One distinction that seems to be important is between architecture as a noun, meaning the elements of architecture that are explicitly documented outside of the code, and architecture as a verb, meaning the role of the architect, which is much more about communication and supporting the development team. The role of the architect might be seen as a combination of transformational leadership – providing a vision and being a role model – and the more traditional agile style of servant leadership, enabling teams to perform.

So, what is the key job of an agile architect? Primarily, it is to ensure that all stakeholders share the same vision. That vision might be captured in a dynamic set of requirements and a set of models that are only recorded informally, continually redrawn on whiteboards in discussions with stakeholders and teams. It continues to evolve, with a constant eye on pushing critical decisions back as late as possible: for example, by exploring multiple architectures in parallel, and by using tools and approaches that allow decisions to be reversed – in effect, ‘embracing change’, the common mantra of agile development.

It seems that there is still a conversation to be had about exactly what it means to be an agile software architect, and perhaps the metaphor itself is the problem. A number of practitioners have questioned whether the concept of architecture relates closely to what we do in software development. One suggestion is that city planner is a better metaphor than architect. This certainly resonates with the idea that city planning is a wicked problem, and that the challenges of designing large-scale software systems share the same wicked characteristics.

The Lean Educator

A couple of weeks ago I was on a road trip of 1,000 Ks or so, to teach at some of our remoter locations. Along the way I listened to an audiobook of James Womack and Daniel Jones’ “Lean Thinking: Banish Waste and Create Wealth in Your Corporation”. It’s been a while since I read their (and Daniel Roos’) excellent book on the Japanese car industry, “The Machine That Changed the World”. Back then I was interested in the ways that many of these ideas had been brought into agile and lean software development. These days I’m more interested in how ideas like lean thinking might apply to education. One thing is for sure: it’s not an easy road. Womack and Jones’ book is largely a series of case studies about the long, slow and endless process of lean transformation in various industries. Several of their stories make it clear that if you’re looking for a simple solution that doesn’t involve a lot of work, lean isn’t it. Also, their examples are overwhelmingly from product, rather than service, industries. What, then, can educators learn from a book that focuses so much on reducing inventory and shortening the supply chain, when our ‘product’ is such stuff as intellectual property and graduating students? The challenge is to look at our systems through new eyes, to truly understand what we mean by the value of education. As Womack and Jones put it:

Why is it so hard to start at the right place, to correctly define value? Partly because most producers want to make what they are already making and partly because many customers only know how to ask for some variant of what they are already getting.

(I should probably point out here that Womack himself had a go at Lean Thinking for Education back in 2006, but his take is a little industrial for my taste.)

Perhaps the core idea that we need to take away from lean thinking is ensuring that all the steps in our educational supply chain deliver value to the customer. In other words, look hard at our value stream. We need to consistently ask: does this part of the curriculum deliver value to the customer? Does this form of assessment deliver value to the customer? Does this step in the enrolment process deliver value to the customer? Note that we don’t ask whether any of these things deliver value to the educational institution, though this often seems to be the major priority for some providers. Not that lean thinking ignores the institution: the point is that by focusing on delivering value, you reap the benefits from your customers, who will reward you for delivering a quality product at low cost.

What reduces value for the customer? Primarily muda (waste). Importantly, there are two types of muda. Type 1 muda is waste required by fixed components in the way the current system operates (e.g. a student management system) and can’t be removed just yet, while type 2 muda can be eliminated immediately. What kinds of muda do educators encounter? Quite a lot of type 1, probably: arcane administrative systems supported by poorly integrated software, organisational structures based on siloed interests and internal competition, turf wars and endless purposeless restructurings, buildings and classrooms from random historical periods and educational fashions, snail-like curriculum change processes, remote and conservative accreditation bodies, politically manipulated funding processes, and an industrial-era dominant ideology that endlessly harks back to an illusory golden age of education.

One of the obvious characteristics of education, as it is delivered by most institutions, is that it operates in batch-and-queue mode. Every semester a set of classes is delivered, over a fixed period of weeks that may have no relationship to how long it takes someone to learn something. Every year a batch of graduates is produced, who may or may not be prepared to take advantage of the opportunities currently offered by the world around them. Lean thinking would suggest this batch mode should be replaced by flow, where learning flows seamlessly by right-sizing what is offered to the learner. Instead of broadcasting batches of content in mass production lecture halls, the lean educator would be engaged in the whole learning value stream, working closely with their colleagues across the total process, not just a batch-block of material.

Another concept related to flow is pull. How might we move from a model that pushes educational content towards the learner, over a timescale dictated by the institution, to one that allows the learner to pull from the system what they need, when they need it?

Further complicating the search for lean thinking is the concept of the lean enterprise. You may be able to squeeze the muda out of your school, department, faculty etc., but the lean enterprise spans the whole value chain, and may involve many organisations. How can you convince all the stakeholders in the value chain to cooperate in becoming a single lean enterprise, setting their own agendas aside?

So what might the lean educator deliver if there were no muda in our systems, and education was a lean enterprise based on value, flow and pull? Maybe something like the following:

  1. A curriculum delivered by multiple organisations, tailored to suit the learner
  2. No restrictions on the hours per week, or the total time span of a learning journey, or which learning components may be combined with others
  3. Walk-in / lifelong enrolment. Come in the door, start a class. Come back in 10 years, take another one
  4. Any combination of blended learning delivery modes
  5. Instant assessment feedback
  6. Add your own…

Is that even possible? Well, there are two ways of improving: kaikaku (radical improvement) and kaizen (continuous incremental improvement). Both have a role, and neither has an end point, but the starting point can be today. We need to be constantly on the lookout for muda that we can remove from the value stream. Womack and Jones distil lean thinking into five principles. Surely it wouldn’t do us any harm to apply these to education?

  1. Precisely specify value by specific product
  2. Identify the value stream for each product
  3. Make value flow without interruptions
  4. Let the customer pull value from the producer
  5. Pursue perfection.


OECD Study Validates The Mind Lab by Unitec’s Postgrad Programme

Recently the OECD published a study called Students, Computers and Learning: Making the Connection. Unfortunately the media did the usual thing the media does and made a superficial (mis)reading of the document to come up with the headline Computers ‘do not improve’ pupil results, says OECD – and that was the BBC, for heaven’s sake! Of course the report is narrow in scope, in that it focuses on PISA results and concerns itself with only a handful of external references. It also only grudgingly acknowledges the success of digital teaching and learning in Australia, preferring to focus on its apparent mission to talk up the deadly drill-and-practice traditional schooling of the Asian territories that play the PISA game so well and make their kids so miserable in the process. No-one, surely, wants to see anti-suicide fences put up round our examination halls?

Nevertheless, a deeper reading of the document gives more interesting insights that validate our postgraduate programme in digital and collaborative learning at The Mind Lab by Unitec. As the OECD report says, ‘technology can support new pedagogies that focus on learners as active participants with tools for inquiry-based pedagogies and collaborative workspaces,’ a philosophy very much in tune with our own.

Of most interest, however, is Chapter 8, Implications of Digital Technology for Education Policy and Practice. In a previous study, Pathways to a Better World: Assessing Mobile Learning Policy against UNESCO Guidelines in a New Zealand Case Study, I looked at New Zealand policy in the context of the UNESCO Policy Guidelines for Mobile Learning. One of the conclusions from that piece of research was that it reaffirmed the importance of some core policy recommendations, such as the need to introduce the use of mobile devices into teacher education. The OECD’s much broader study also acknowledges the critical importance of teacher education in making the most of technology in schools: ‘Technology can amplify great teaching but great technology cannot replace poor teaching.’ It also acknowledges that there are many benefits that PISA cannot measure, including the way that ‘technology provides great platforms for collaboration among teachers and for their participation in continued professional development, thus empowering them as knowledge professionals and change leaders.’

These three themes of digital tools, collaboration and leadership lie at the heart of our programme. We would wholeheartedly echo the final words of the OECD report: ‘The key elements for success are the teachers, school leaders and other decision makers who have the vision, and the ability, to make the connection between students, computers and learning.’ We share that vision, and are busy giving teachers the same vision, and the ability, to transform education for the better.

(dis)connectivism: a learning theory for the ghost in the machine

One of the most recent attempts at a learning theory is connectivism, which attempts to address the relationship between knowledge and technology. At the same time, there is an increasing disconnect between our physical bodies and our digital souls. In a somewhat baffling and opaque paper from 2010 called ‘Academetron, automaton, phantom: uncanny digital pedagogies’, Siân Bayne of the University of Edinburgh addressed the concept of the ‘uncanny’ in online learning. Once the layers are peeled aside, there are some useful ideas to consider. Bayne refers to ‘the multiple synchronicities available to us when we work online…[the] blurring of being and not-being, presence and absence online.’

Our online lives are schizophrenically littered across multiple contexts, each one demanding a slightly different type of e-presence; an avatar, a profile, a photograph. We spread ourselves thin over the personal, the professional, the store, the auction, the review; constructing at one moment a Facebook life of “success so huge and wholly farcical”, the next, a LinkedIn profile designed to get that elusive new job to make that success less fictional. We lose the distinction between past and present. Chronology blurs. It is indeed uncanny when my dead mother’s Facebook account sends me a message, or a Google search tells me that we will have nuclear fusion by…oh… 2011? Alarming news stories of teenage suicide cults, seemingly driven by a desire to achieve digital immortality through physical death, seem to take the disconnect between our real and virtual lives to extremes. Perhaps, notwithstanding Ryle’s critique of mind-body dualism, we are all becoming ghosts in the machine. Can we ever call them back from heaven?

This disconnectivism between a life lived and a fragmented digital artifact should perhaps raise some disquiet as to the role of pedagogy in the age of ghosts. Perhaps one question for educators is how we temper the tendency to make learning a process of digital publication. It sometimes feels as if the default assignment task these days is to ‘broadcast yourself’. Perhaps a better mantra would be ‘reflect on yourself, protect yourself’, “for the vision of one man lends not its wings to another man.” Some things are better left to the imagination, rather than the app.

Ragile software development – a longitudinal reflection on post-agile methodology

Ok, so there is no such thing as a ‘ragile’ software development method. Nevertheless, for a number of converging reasons, I have recently been given cause to reflect on the history of rapid, lightweight and agile approaches to software development, and the current dominant ideology of methods. I use the label ‘ragile’ as an indicator of where we might have been, or where we might still go, in developing software in the post-agile era.

There’s a scene in Woody Allen’s movie ‘Sleeper’, where two doctors from 200 years in the future discuss beliefs about diet. It goes like this:

“You mean there was no deep fat? No steak or cream pies or… hot fudge?”

“Those were thought to be unhealthy… precisely the opposite of what we now know to be true.”

Scientists tend to realize that this week’s theory is just that, and it may be replaced by a new theory at any time, based on our empirical observations. Diet promoters tend to take the opposite view. Everything in the past was wrong, but now we know the truth. I hope that software developers are more like scientists than fad dietitians, and will embrace change in their thinking.

Received wisdom has it, perhaps, that the problems of the waterfall approach to software development have been overcome by a legion of certified Scrum Masters leading their agile organisations to the continuous delivery of quality applications. If only we were so enlightened. Royce himself, in his famous ‘waterfall’ paper, stated “I believe in this concept, but the implementation described above is risky and invites failure.” He was talking about Figure 2 in his paper, the oft-copied waterfall diagram. It seems that few bothered to read the rest of the paper and the rather more challenging figures within it, consigning the software industry to decades of misunderstanding. Assuming that software development was in a chronic pre-agile crisis is also a misreading of history. Participants at the 1968 NATO conference, the apocryphal source of terms like ‘software engineering’ and ‘software crisis’, acknowledged that many large data processing systems were working perfectly well, thank you. DeGrace and Stahl told us that software was a ‘wicked problem’ in 1990, but ‘wicked’ does not mean ‘insoluble’, though it does mean that we should not expect there to be one right answer.

The software industry has seen a series of new ideas about software development over the decades, many of which have been based on leveraging improvements in the hardware and software tools available to us. Kent Beck, in the first XP book, referred to fully utilising these new tools, turning up the dial on all the best practices at once, possibly to 11 (or was that just Spinal Tap?). Almost a decade earlier, James Martin had published his ‘Rapid Application Development’ book, stressing the value of, among other things, prototyping, code generation, metrics, visualisation tools, process support tools and shared, reusable domain models. Later, in 1996, Steve McConnell’s ‘Rapid Development’ emphasised many of the same ideas, stressing productivity tools, risk management and best practices. Both authors prefigured many practices of the lightweight (later agile) methods that were emerging in the late 1990s: iterative, timeboxed development, customer engagement, small teams, adapting to changing requirements and quality assurance.

An underlying theme in rapid development is the concept of domain modelling and automated tools, including for code generation. Similar themes appear in Agile Modelling, Model Driven Development and Domain Driven Design. Tools like the Eclipse Modeling Framework, and those based on the naked objects pattern, such as Apache Isis, put modelling at the heart of systems development.
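A minimal sketch of the naked-objects idea, in plain Java rather than any particular framework’s annotations (the Claim class and its behaviour are invented for illustration): the developer writes only a behaviourally complete domain class like the one below, and the framework derives the user interface, persistence and service endpoints from it by reflection.

```java
// Under the naked objects pattern, the domain class IS the application:
// frameworks such as Apache Isis derive the UI, persistence and REST
// endpoints from classes like this by reflection.
public class Claim {
    private String description;
    private boolean approved;

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public boolean isApproved() { return approved; }

    // Domain behaviour becomes a user-visible action (e.g. an 'Approve' button).
    public void approve() {
        if (approved) throw new IllegalStateException("Claim already approved");
        approved = true;
    }
}
```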

The agile methods movement is at a point of change (hmmm, isn’t that the definition of a crisis?). Recent efforts have revisited the lessons of the Japanese car industry, with Lean and Kanban, in a search for ever ‘lighter’ processes, while at the same time a vision of agile as a traditional fixed methodology has become established (endemic, even). This has recently caused two signatories of the original agile manifesto to refer to ‘the failure of agile’ (Andrew Hunt) and to state that ‘agile is dead’ (Dave Thomas). From another perspective, Vaughn Vernon lamented in his 2013 book on domain driven design that in Scrum “a product backlog is thrust at developers as if it serves as a set of designs”.

So, coming back to ‘ragile’, what is it? Well, no more than an acknowledgement that there may be a collection of practices from both rapid and agile (and model/domain driven) development that remain relevant to the future of software development. Such an approach would emphasise tools, leverage prototypes, include shared domain models, embrace code generation, automate as much as possible (including estimation and project management), and deliver continuously. Such an approach might be considered radical, in the particular sense of going back to the roots of a phenomenon. Some of the ideas of Martin and McConnell were much harder to apply in the 1990s than they are now. Can there be any software developers who do not use code generators of one type or another? They generate object property methods, test stubs, service end points and a host of other components; they refactor code and create user interfaces and database schemas. Rails and Grails developers work with domain models as a matter of course, allowing frameworks to build whole architectures automatically. It’s time to rethink how these strands might become a method for the 2020s, one that can cope with the distributed and parallel domain models of the cloud-based, in-database, Internet-of-things, plugin-driven, Web 3.0 applications of the future.
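As a toy illustration of the kind of generation meant here, the sketch below emits the property-accessor boilerplate that nobody should be typing by hand. The input format and class names are invented for the example; real generators are built into IDEs, frameworks and build tools:

```java
import java.util.Map;

// A toy source-code generator: emits getters and setters for a field list,
// the sort of boilerplate that tools have generated for decades.
public class AccessorGenerator {
    public static void main(String[] args) {
        Map<String, String> fields = Map.of("name", "String", "age", "int");
        StringBuilder src = new StringBuilder("public class Person {\n");
        fields.forEach((name, type) -> {
            String cap = Character.toUpperCase(name.charAt(0)) + name.substring(1);
            src.append("    private ").append(type).append(' ').append(name).append(";\n")
               .append("    public ").append(type).append(" get").append(cap)
               .append("() { return ").append(name).append("; }\n")
               .append("    public void set").append(cap).append('(').append(type)
               .append(" v) { this.").append(name).append(" = v; }\n");
        });
        src.append("}\n");
        System.out.println(src); // a real generator writes this to disk or emits bytecode
    }
}
```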

ICANN.sucks

There’s been quite a bit of debate in the press about the new .sucks top-level Internet domain, including this article in the New Zealand Herald. It does have its proponents, of course. The nic.sucks website claims that it can be used to ‘foster debate’ and ‘share opinions’, and suggests that it is valuable for cause marketing, consumer advocacy, anti-bullying and so on. I can’t help wondering why other, less infantile domains can’t be used for these worthy causes. In fact, of course, it’s just a free-for-all that forces individuals and organisations to run around paying stupid prices for these domains just to protect themselves from Internet trolls. Obviously no-one could have seen that coming, right?

The body responsible for allowing new domain names is ICANN, the Internet Corporation for Assigned Names and Numbers, which claims to be a ‘not-for-profit public-benefit corporation’. I do wonder about the public benefit aspects. The socially aware and compassionate people who suggested the SUCKS domain name were Top Level Spectrum, Inc., Vox Populi Registry Inc. and Dog Bloom, LLC. All concerned charities with our welfare at heart, I’m sure. Vox Populi also won the auction to have the right to extort money from everyone wanting to defend themselves from this domain name. The three SUCKS entries were some of the 1,930 suggestions received by ICANN for new top level domain names in 2012. You can see the full list at http://money.cnn.com/infographic/technology/new-gtld-list/. Most of them were reasonably sensible, if largely self-serving, with lots of corporations wanting their own domains. There were, however, several stupid and destructive suggestions that were clearly rejected out of hand. These included SEX, SEXY, WTF and SUCKS… oh, wait…

I suppose if you make more than half a million dollars from the faceless corporations who suggested a domain like SUCKS (that’s just for making the suggestions – at $185,000 each, the three SUCKS applications come to $555,000) you owe them something back, however much collateral damage you cause in the process. Not to mention the millions of dollars to be made from selling the rights to the domain itself, as this list of domain auctions shows. ICANN are now running around trying to close the stable door after the horse has bolted. Too little, too late.

It will be interesting to see who ends up as the owner of http://www.icann.sucks

Kinross Flat and the Amazon Jungle – An Indie Publishing Experience with CreateSpace and Kindle

I recently self-published my first novel, Kinross Flat, via Amazon CreateSpace and Kindle. This post is about my experience of the whole process, which was quite complex but well supported by Amazon’s various self-publishing tools. Amazon is not the only independent publishing platform, and I can’t speak for the relative merits of the alternative channels, so I’m not necessarily claiming that Amazon is the best. However, as the owner of a Kindle, it was the one that came to mind when I started thinking about indie publishing. I’d welcome others’ views on the alternatives.

CreateSpace is basically for print-on-demand, so if you only want to publish an eBook on Kindle then you don’t need it. However, the advantage of CreateSpace is that once you’ve set up your print-on-demand copy, the addition of a Kindle version is practically automatic, so you get both options at no cost. Yes, no cost – the whole thing is basically free (well, up to a point – I’ll get back to that later!)

So, what do you have to do? There are a number of ways of preparing your book for publication, but the best approach, I think, is to use the tools that are provided for you. You will need to register on the CreateSpace website, after which you will have access to an author dashboard that leads you through all the steps required to publish your book. The easiest way to make sure your book is in the correct format is to download the Word template, which contains all the required styles and layouts. You can choose your book size, but I went with the recommended 6″ x 9″. For cover design, there is a free-to-use Cover Creator tool, which provides a relatively small number of basic layouts, but all of these can be customised in terms of font and background colour. You can also upload your own cover image and, of course, write all the cover text. The system will generate an ISBN for you and add it to the cover, with a bar code, automatically.

Once you submit your interior file (i.e. the book text), the system automatically checks it for compatibility, then generates a preview file, which you can check online or download as a PDF. You can also order a preview hard copy, which is probably the best way to proofread it, but you may have to wait several weeks to get it, and you have to pay for it.

Once you approve the preview, and after a few more system checks (including a spell check), your book gets released to the Amazon sales channels, but only after a number of other things have been done. You have to fill in a U.S. tax declaration, which determines how much U.S. withholding tax you will pay on any royalties, based on the country where you are a tax resident; for New Zealand that was 10%. On the subject of royalties, you also have to choose your royalty rate (35% or 70%) and the retail price of your book (there’s a handy calculator for this, which shows you the royalties you would receive through each distribution channel based on a U.S. dollar price). Incidentally, there are several different distribution channels you can choose from, but since it costs you nothing to choose them all, it seems a bit pointless to exclude any.
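The arithmetic, roughly, for anyone weighing up prices. This is a sketch with illustrative numbers only: Amazon’s real calculator also deducts per-channel printing and delivery costs, which I’ve reduced to a single invented figure here.

```java
// Back-of-envelope royalty arithmetic; the price and cost figures are illustrative only.
public class RoyaltyEstimate {
    public static void main(String[] args) {
        double listPrice = 9.99;        // chosen US dollar retail price
        double royaltyRate = 0.70;      // the 70% option (conditions apply)
        double deliveryCost = 0.50;     // illustrative per-unit cost deducted by Amazon
        double withholding = 0.10;      // US withholding tax for a New Zealand tax resident

        double gross = listPrice * royaltyRate - deliveryCost;   // 9.99 * 0.70 - 0.50 = ~6.49
        double net = gross * (1 - withholding);                  // ~5.84 per copy sold
        System.out.printf("Gross royalty: $%.2f, after withholding: $%.2f%n", gross, net);
    }
}
```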

After your CreateSpace book is published, it’s a simple step to choose to also distribute on Kindle. You get another chance to check the preview on a Kindle emulator, choose whether you want digital rights management, select a retail price, and away you go. The Kindle version appears a day or two later on the Amazon site, and eventually the print and Kindle versions get linked together. Not all Amazon sites will support the print-on-demand version – just Amazon.com, Amazon Europe and the CreateSpace store. The Australian site, for example, will only offer the Kindle version.

So, is it really free? Well, basically, yes. There are all kinds of options on the CreateSpace site to get help with design, formatting, marketing etc., and these can be quite expensive, but as long as you are reasonably computer literate, the tools don’t require much expertise to do everything yourself. You can, if you like, pay for a review on Kirkus, which may or may not be favourable and costs hundreds of dollars. It’s possible that might pay off in sales, but it’s an unknown quantity. You will, of course, have to pay for any preview copies, or any hard copies of the final book, but these are more or less at cost.

Overall, I found the whole process quite fascinating and well supported. I did occasionally get lost in the Amazon jungle, and ended up, for example, filling in the tax form twice, for reasons I still don’t understand. Nevertheless, I’d recommend it to anyone else who, like me, regards themselves as an amateur author who just wants to share their work. If you’re a ‘real’ writer, I suspect that the more traditional publishing channels are still the best way to go, since ‘indie publishing’, although it sounds cooler, is still just what used to be called ‘vanity publishing’, which doesn’t sound so cool!