Wednesday, 20 June 2012

Part 3: Data, analytics and learning intelligence

I've been using the learning cycle as a framework for a strategic approach to technology in schools. This is the third post of the series, the previous two having focused on access (mobile) and action (cloud). The next stage is that of reflection. The manifestation of this aspect in my proposed strategy is analytics.

In the basic learning cycle, reflection is the all-important point in the process when we widen our awareness, take a breath and open our senses to some objective evidence of the efficacy of our efforts. Reflections may be fluid and continuous (usually resulting in micro adjustments) or periodic (usually resulting in more macro or strategic reflections). We may self-reflect (internal validation) or we may seek out reflection in the observations of others or in data (external validation). In our journey to becoming more effective learners, an important part of the process is calibrating our self-reflections to more closely match external validation. This is a lifelong process in which external validation continues to be important, but we learn to learn more effectively because our internal validations prove increasingly accurate.

The calibration of internal and external validation is essential to the teaching and learning process. Without it, it's quite possible for individuals to entirely miscalculate their progress and consequently focus on the wrong things to generate improvement. I'm reminded of the contestants in singing contests on TV who are convinced they are superstars in the making but who can barely sing. This is an extreme (perhaps delusional) example; however, the underlying issue is a lack of calibration between internal and external validation of effective learning.

Of course, this is (in part) precisely the purpose of the teacher. The challenge is that, being human, we're not only capable of a little self-delusion at times but we can also project our delusions. In other words, the teacher as an instrument of reflection for learners also needs to be calibrated. Teacher calibration might come through the formative assessment process, summative assessment, experience and professional development. The challenge is to effectively and objectively benchmark our internal assessments.

This is the point at which I introduce the concept of data, analytics and learning intelligence (the educational equivalent of business intelligence). Before you start telling me about the shortcomings of data in the learning and teaching process, hear me out. I know that human relationships underpin learning. What I also know is that human nature is such that we are simply not objective in our evaluations, nor are we calculating machines. It is possible for us to miss patterns, to be 'mis-calibrated' or simply to be overwhelmed by too much data. We're fallible.

‘Big Data’ and analytics are 21st Century phenomena emerging from the already enormous, and still rapidly increasing, speed and scale that technology affords us in capturing, aggregating, storing and analysing data. There is more data available about human behaviour than ever before and a great deal of value is locked up in that data. The promise of analytics is that new insights can be gained from analysis of the data trails left by individuals in their interactions with each other and the world, most particularly when they’re using technology.

The rapid evolution of big data methodologies and tools has, to date, been driven by the business world which recognises in them the potential for unlocking value for their customers and shareholders. In this context the term ‘business intelligence’ is often used to describe the intersection of data and insight. When applied to education, analytics may be sub-divided into two categories: learning and academic. The following table describes that categorisation:

  Category             Focus                                                         Beneficiaries
  Learning analytics   Learners and the contexts in which learning occurs           Learners, teachers
  Academic analytics   Organisational processes, workflows and resource allocation  Administrators, funders, policy makers
Academic analytics is the use of learner, academic and organisational data to improve organisational processes, workflows, resource allocation and measurement. Academic analytics, akin to business analytics, is concerned with improving organisational effectiveness.

We can define learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts for the purposes of understanding and optimising learning and the environments in which it occurs. In the same way that ‘business intelligence’ informs business decisions in order to drive success, so learning analytics is the basis of ‘learning intelligence’ that is focused on improving learner success.

Learning analytics are not the goal in themselves. Learning intelligence is the goal. Learning intelligence is the actionable information arising from learning analytics that has the potential to deliver improved learner success. The evidence from analytics in business is that there is deep value to be mined in the data. The objectivity and rigour that is represented by learning analytics provides an empirical basis for everything from learner-level interventions to national policy making.

The Society for Learning Analytics Research (SoLAR) is an inter-disciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training and development. Their mission as an organisation is to:
  1. Pursue research opportunities in learning analytics and educational data mining,
  2. Increase the profile of learning analytics in educational contexts, and
  3. Serve as an advocate for learning analytics to policy makers
Significant potential exists for analytics to guide learners, educators, administrators, and funders in making learning-related decisions. Learning analytics represents the application of “big data” and analytics in education. SoLAR is an organisation that is focused on building a planned and integrated approach to developing insightful and easy-to-use learning analytics tools. Three key beliefs underpin their proposal:
  1. Openness of process, algorithms, and technologies is important for innovation and meeting the varying contexts of implementation.
  2. Modularised integration: core analytic tools (or engines) include adaptation, learning, interventions and dashboards. The learning analytics platform is an open architecture, enabling researchers to develop their own tools and methods to be integrated with the platform.
  3. Reduction of inevitable fragmentation by providing an integrated, expandable, open technology that researchers and content producers can use in data mining, analytics, and adaptive content development.
From my experience talking to educators, it's clear they usually know that there is data available, and they know how to act on learning intelligence when they have it, but they're much less sure about the analytics phase. Whilst working on a national procurement for a learning management system last year, I realised we really knew very little about the utilisation of key technology assets in the schools we were trying to build systems for. As it turned out, this data was sitting, untouched, in log files on servers within these schools. I approached three of the schools and asked their permission to copy this data for the purposes of analysis. They knew it existed and were happy for me to analyse the anonymised data.

I was able to analyse the utilisation of technology assets (software and hardware) across these schools over a period of months in order to understand exactly how technology was used. This enabled me to show where the investment in technology was being dramatically underused and how it could be re-shaped to maximise utilisation of the investment in order to improve the chances of learning gains. I didn't have time, but I could have mapped this data against timetable and assessment data to explore how technology use related to attainment. This would have allowed me to correlate technology utilisation by different teachers, departments and schools against the performance of their pupils.
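To make the idea concrete, here is a minimal sketch of the kind of log analysis described above. The log format (one CSV line per session with a timestamp, device and application) is an assumption for illustration; real school server logs will differ.

```python
import csv
from collections import Counter
from io import StringIO

def utilisation_by_app(log_csv):
    """Count sessions per application to reveal under-used assets."""
    counts = Counter()
    for row in csv.DictReader(StringIO(log_csv)):
        counts[row["application"]] += 1
    return counts

# Hypothetical extract from a school server log
sample = """timestamp,device_id,application
2012-03-01T09:00,lab1-pc04,interactive_whiteboard
2012-03-01T09:05,lab1-pc04,word_processor
2012-03-01T10:00,lab2-pc01,word_processor
"""

print(utilisation_by_app(sample).most_common())
```

The same counting approach, grouped instead by device, department or time of day, is what surfaces the under-utilised investments mentioned above.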

This example is the tip of the iceberg in terms of analytics and big data in education. In terms of my technology strategy, identifying and analysing key data in your school to produce learning intelligence will maximise the learning bang for your technology buck in an objective manner. It is a critical part of your strategy because without the analysis, you may well be making unnecessary or ineffective investments in technology. Don't be driven by technology; be driven by learning outcomes.

Wednesday, 21 March 2012

Personal Data Protection in the Cloud

A few weeks ago I was contacted by a student asking me to complete a questionnaire on cloud security issues as part of a dissertation for her degree. At the time I thought I should probably post my answers here but I was overtaken by events (or in plain speak, I plain forgot).

However, I was reminded this morning by an article published yesterday on the very same topic. The article is built around a joint statement issued by European Commission Vice-President Viviane Reding and US Secretary of Commerce John Bryson on 19th March. The statement frames a high-level conference on Privacy and Protection of Personal Data, held simultaneously in Washington and Brussels and, in their words, "represents an important opportunity to deepen our transatlantic dialogue on commercial data privacy issues." This is an excerpt from the statement:
"The European Union is following new privacy developments in the United States closely. Both parties are committed to working together and with other international partners to create mutual recognition frameworks that protect privacy. Both parties consider that standards in the area of personal data protection should facilitate the free flow of information, goods and services across borders. Both parties recognize that while regulatory regimes may differ between the U.S. and Europe, the common principles at the heart of both systems, now re-affirmed by the developments in the U.S., provide a basis for advancing their dialog to resolve shared privacy challenges. This mutual interest shows there is added value for the enhanced E.U.-U.S. dialogue launched with today's data protection conference."
The thrust of the student's questioning was that the uptake of cloud technology was being slowed by businesses' concerns about data security and privacy. I'm not so sure that's at the heart of the issue as you can probably tell from my answers:


Question: Despite its promises, very few businesses have actually moved their operations to the Cloud. Why has the real application of Cloud computing not yet gained momentum among businesses?

Answer: I think the premise of the question is wrong, i.e. that very few businesses have moved operations to the cloud. To explain what I mean, we need to agree terms first. Cloud just means stuff hosted off premises. Web is cloud. Virtualisation is cloud. Streaming is cloud.

If cloud means stuff hosted off premises, then a critical limiting factor is the pipe between the client and the host. Even with diversely routed connectivity, this is a business risk in terms of resilience and performance. Business risks need to be balanced against costs and benefits. The second issue for cloud services is that it is more difficult to integrate disparate systems - potentially from different vendors - to meet business-specific requirements. There are not yet standards that facilitate this type of integration between cloud vendors (although discussions are in progress).

The combination of these issues means that cloud services are not suitable for all business functions, business types and business sizes. For example, some businesses may be willing to sacrifice performance and resilience to achieve a lower price or greater agility. A business whose main channel is the Web may already have the internal processes and culture to embrace more cloud services.

When I said the premise of the question was wrong, I meant that I think most companies do take cloud services, albeit in a limited way. It's true that most businesses haven't embraced cloud for the full scope of their technology requirement, but I'm not sure this is possible for most businesses given the present limitations of the technology. So really what we're talking about is a hybrid scenario, with a progressive shift to cloud services as bandwidth costs reduce, standards for integration emerge and the business case, taking account of the risks, gradually shifts in favour of cloud.

This is part of the picture. There are also cultural and practical issues in terms of change management. On-premises IT departments have traditionally kept tight control over their networks and data. Releasing control is difficult for them. It's only when competition becomes extreme that the old paradigms become unsettled and eventually unseated. I've deliberately left the wider data security issue out of this response because there are lots more questions about it later!

Question: A study by LSE has revealed that the top two issues on the way to adopting the Cloud are fears about data security and privacy, and about data being offshored. In your opinion, have these two issues been the main concerns for your users/clients?

Answer: I have some sympathy with this view although when issues are complex, respondents often migrate to shrink-wrapped answers. My view is that the issues of data security and privacy are the go-to issues for cloud ditherers. They’re a form of displacement behaviour. In my experience, it’s rare that data security and privacy are truly critical factors in the decision to use (or not) a cloud service. They are of course critically important issues, but as a technology, ‘cloud’ usually has reasonable answers, at least relative to the security and privacy challenges that already exist due to human and system frailty. My experience is that the objection regarding data security and privacy is often the first provided objection but that a little digging usually reveals a more complex set of concerns, some technical, some practical and some cultural.

Question: Steve Ballmer, CEO of Microsoft, believes that security is a personal responsibility of everyone in the chain (employees, managers, end users). How important is the human factor in ensuring security at all levels?

Answer: Steve Ballmer’s comment highlights the absurdity of the data protection and privacy issue in the context of most businesses. That is to say, people are most commonly the weakest link in the security chain, closely followed by the systems and processes they devise. For example, in schools across the land you’ll still find passwords and user names written on post-it notes attached to the monitors of administrators with access to sensitive data about pupils. In the next breath, they will resist a cloud technology solution because they’re not sure where the data is located. There’s a significant lack of perspective about the relative significance of the human factor in most security breaches.

Question: Do you believe security is a two way responsibility for both users and providers?

Answer: In order to create a secure technology chain, people, processes and technology need to work together in a seamless way. This means reciprocal responsibilities between users and providers.

Question: Cloud providers are increasingly trying to convince users that, because of their heavy investments in hardware, software and staff, security in the Cloud may be better. Would you say that security, on average, is better in the Cloud compared to in-house security?

Answer: For small and medium-sized businesses in particular, I'd say this is true as long as you believe the cloud provider has robust and resilient systems itself. The reality for most SMEs is that pressure to compete and grow creates budgetary pressure, and privacy and security are easy victims of this pressure. We still see many businesses which do not store and control data effectively and where staff are inadequately trained in security systems. Aggregating demand through cloud removes part of this problem from the premises and frees up resources to focus on the 'edge' issues, i.e. people (and their systems).

Question: What legislation are you currently guided by in the Cloud industry? Do you believe it is sufficient for users' security?

Answer: The UK’s Data Protection Act 1998, the US Patriot Act and the European Union's Data Privacy Directive all have something to say on this issue. In truth they’re all out of date in the context of cloud and there are various reviews of the legislation happening at present in order to stimulate the cloud industry. One of the issues is at what point permission is required from the data subject. At the moment, the legal view is that the data subject may need to provide permission even if a non-EU company stores data temporarily on an EU device, e.g. through a cookie as part of a social networking service. Moving personal data outside the EU therefore presents potential issues. Currently some cloud companies have circumvented this problem by basing data centres in the EU, e.g. Microsoft. Others have resisted making absolute statements about data location (such as Google) because their data is so widely replicated (data sharding) around their system for the very purposes of resilience, redundancy and security. So the legal landscape is somewhat at odds with the technical landscape.

Question: Some scholars have suggested we create an auditing board/authority to monitor activities of the providers. Do you think it is a good idea?

Answer: Data security and privacy are very important issues. It may not seem so until something goes wrong and you are directly affected. Luckily, most of us never experience the effects of a meaningful breach of our personal data. We may be irritated by it, for example if our credit card information is hijacked. However, there is a system of restitution in place, so it's usually an irritation rather than a catastrophe. Identity theft, on the other hand, is potentially a very significant issue and one that is growing. So, in order to build confidence in the cloud, there inevitably needs to be some regulation and control. In the same way as integration standards between cloud providers will enhance take-up of cloud technologies, so regulation and legal harmonisation will enhance confidence and take-up.

Question: What are your predictions for Cloud computing security in the future?

Answer: As I said earlier, I think the shift to cloud is underway for most businesses. Whether it is as simple as web-based email or a web store front, or as complex as an entire company built on cloud computing, businesses are on the journey. To paraphrase Anaïs Nin, cloud adoption progresses when the risk it takes to remain tight in the bud is more painful than the risk it takes to blossom. Cloud leverages scale to deliver more for less. If it really does this well, then the business ecosystem will naturally select it. In my view, security and privacy are real issues that need to be tackled. The cloud providers are the guardians of valuable personal assets: our personal data. They are the data 'banks'. Data is a valuable asset and therefore as vulnerable to abuse as the banking and financial systems. I would argue, therefore, that we need consistent and robust regulation and legislation in order to protect our interests. It is clear from the banking crisis that trust and best intentions rarely work out well for the individual. My prediction would be that 'big data' and the 'cloud' will be a very important trend over the coming decades and that a robust legal and regulatory framework will emerge, along with standards for multi-vendor cloud integration.

So that's my take. What would your answers have been?

Thursday, 1 March 2012

Airbase: web made easy

Airbase is a resource I've put together to help learners and educators by giving them a place to record and find exciting Web stuff for learning and teaching. The simple premise of this project is that the Web is useful for learning and teaching. It’s useful because stuff on the Web is dynamic, accessible and very often free.

The Web is no longer just about information. For every piece of software on your computer, there’s probably an equivalent free Web app. Not only that, but Web apps don’t need installing, supporting or updating. The strength of the Web is also its weakness. There’s a lot of stuff out there and most of it is not easy to find, even with a good search engine. Airbase is a crowd-sourced solution to this challenge.

Educators and learners all around the world are finding really useful stuff on the Web all the time. The challenge is collecting this information together in one place and making it easy to find what you need, when you need it. Crowdsourcing uses a little bit of a lot of people's time. If 10,000 people spend 10 minutes entering their top 10 web resources or apps into Airbase, we'll have 100,000 great items for you to search! The Web makes this possible. If you like the resource, recommend it to a friend. If you don't, tell us how to improve it.

I’ve built Airbase with two free Web apps: Google Forms and Google Spreadsheets. Google Forms is used to collect the information. Google Spreadsheets is used to store and filter the information. The information is freely shared for anyone to use. If this approach proves successful, I will move the resources across to a database so they can be searched more easily. This will also be free!

In order to contribute to Airbase, all you have to do is complete this form for any great Web sites, apps, resources or files you find on your Web travels. The more information you put in, the better it is for everyone. I estimate it takes less than one minute to enter all the data for one item.

The benefits of Airbase for you are that:

  • Educational categorisation means you’ll quickly find the stuff you need
  • You’ll only take time to submit good stuff so you’ll only find good stuff
  • Ratings and reviews will help you to identify the best of the best

All the entries are automatically stored in a spreadsheet which you can visit. The spreadsheet has filters which mean you can quickly find useful stuff which meets your particular needs. Don't worry about duplicating information. Although you can search to find if a site already exists, each duplicate will carry its own rating and possibly other unique information. I will worry about rationalising data if the concept works.
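For anyone who wants to go beyond the built-in spreadsheet filters, the same filtering can be done programmatically against a CSV export of the sheet. This is only a sketch: the column names ("Title", "Category", "Rating") are assumptions and the real Airbase sheet may use different headers.

```python
import csv
from io import StringIO

def find_resources(sheet_csv, category, min_rating=3):
    """Return titles in a category whose rating meets the threshold."""
    rows = csv.DictReader(StringIO(sheet_csv))
    return [r["Title"] for r in rows
            if r["Category"] == category and int(r["Rating"]) >= min_rating]

# Hypothetical extract from a CSV export of the spreadsheet
sample = """Title,Category,Rating
Fraction Frenzy,Maths,4
Verb Wheel,Languages,5
Times Table Drill,Maths,2
"""

print(find_resources(sample, "Maths"))  # only the highly rated Maths entry
```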

I welcome your comments and suggestions. Please comment here or include them in the suggestion box at the end of the submission form. Yes, that means you have to submit an item in order to make a suggestion!

Wednesday, 22 February 2012

It's all about the (validated) learning

This week I'm listening to The Lean Startup by Eric Ries. If you prefer visual consumption, check out Eric's presentation to Google. If you like listening to books (like me), check out the Audible version. Whatever your preferred medium for consumption, and whether or not you're interested in starting a business, I recommend that you engage with his thinking a little if you're interested in learning in the context of change.

I should also state my personal interest. I'm currently a one-third partner in a startup that's sailing into uncharted waters (Airhead Education). We have a great idea (built around the concept of a cloud desktop for schools), great people (check out our technical guru Jason Dixon's blog) and, I think, the zeitgeist is in our favour. But we're trying to bring a new technology paradigm to schools, and that means change. We know about 10% of what we need to know to even build a meaningful business plan! There's 90% (probably a lot more) to learn.

Eric begins his book by defining a startup as “a human institution designed to create a new product or service under conditions of extreme uncertainty.” This emphasis on 'uncertainty' is important. If I was to start a traditional grocery shop, I'd be walking a well-trodden path. There's loads of explicit learning which I can access in order to understand how to make it successful. Most management theory is focused on this type of business where the idea and the customers are well understood. The keys to success are effective planning and efficient execution. But what if your idea is disruptive and visionary and you have no idea how your ideas and products will be received by customers, or even who your customers are? Traditional management theory falls down.

This is where Eric steps in. His thrust is that a new type of management theory is required under conditions of extreme uncertainty (sounds like Quantum Management Theory to me). He's also keen to point out that, although this is the sea where entrepreneurs swim, it's also vital for there to be a similar management theory for intrapreneurs (those who behave like entrepreneurs but in the context of a mature business). In fact, mature businesses are often very poor at innovating because the negative impact of failure is magnified. Even minor failures can reflect badly on the brand. Mature businesses are usually conservative for this reason.

I'm not going to provide a complete synopsis of Eric's book but I wanted to pull out a couple of key points. In conditions of extreme uncertainty, one thing is for sure: you need to learn and fast. But is all learning equal? Most entrepreneurs fail a lot before they find success. You'll hear them say things like, "Well, it was tough but I learned a lot." What was it that they learned and was it worth learning? Perhaps at a personal level it was, but at the level of business, Eric argues that what would've been much more useful and timely to their venture was validated learning.

Validated learning is achieved using the scientific method. That is to say, you build a minimum viable product (MVP) or even just a mock up, set yourself a hypothesis to test, and then get out there and start testing it with customers immediately. Don't wait until you have a great product. Don't guess who your customers are and what they need. Build it. Measure it. Learn from it. Refine it. Go back around the loop. But faster this time (time and money are running out, remember?).
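The build, measure, learn loop can be sketched as a simple decision procedure. Everything here is invented for illustration: the metric (sign-up conversion), the target threshold and the experiment names are hypothetical, not from Eric's book.

```python
def build_measure_learn(experiments, target_conversion=0.05):
    """For each MVP experiment, measure conversion and decide:
    persevere if the metric clears the target, otherwise pivot."""
    decisions = []
    for name, signups, visitors in experiments:
        conversion = signups / visitors                      # measure
        decision = "persevere" if conversion >= target_conversion else "pivot"
        decisions.append((name, round(conversion, 3), decision))  # learn
    return decisions

# Two hypothetical MVP landing-page tests
results = build_measure_learn([
    ("landing_page_v1", 12, 400),   # 3% conversion
    ("landing_page_v2", 30, 420),   # ~7.1% conversion
])
print(results)
```

The point is not the arithmetic but the discipline: each loop iteration turns a hypothesis into a measured result and an explicit pivot-or-persevere decision, rather than a gut feeling.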

The problem with most entrepreneurs is that they're passionately attached to their vision and find it hard to pivot (pivot = a strategic change of direction with one foot firmly planted in validated learning). The problem with engineers and designers is that they're perfectionists and feel that they will be judged by the quality of their output. A minimum viable product is a scary idea for them. The problem with investors is that while you have zero revenue and a great idea, you're exciting. As soon as you make a penny in revenue, the questions start coming: why so little? The clock is ticking.

So as sensible as validated learning is, it's quite a tough management philosophy for participants in the startup to embrace. There's actually quite a lot of momentum in a startup. The potential for agility, yes, but the appetite for it? Not so much. You may have to accept a potential pivot (major strategic change) for each cycle of the process. You will certainly be constantly tuning (tactical change) based on new data. You will also be asking your customers to accept (and pay for) something less than perfect. But how else can you systematically and meaningfully evolve your product unless it's by validated learning? OK, you may be lucky and come up with the perfect formula first time. Unlikely. More than likely the market is changing almost as fast as your product. It's a race!

This management theory is particularly challenging for mature companies who are good at planning and execution but for whom innovation has become an aspiration rather than a reality. The idea of putting an MVP in the hands of their valued customers is very scary. The idea of pivoting every other day (read 'acknowledge failure and learn from it') is even more scary. But this is what learning looks like. Hard graft and lots of mistakes. Why would it be different for a big company than a startup?

Personally I think there are important lessons in here for organisational change as well as entrepreneurs and intrapreneurs. As was pointed out to me the other day, I talk a good game in terms of advocacy for educational change, but what about the 'how'? The problem is, I think, that many education leaders (like entrepreneurs) become victims of a grand plan when in fact what they need is constant evolutionary change based on validated learning. I call this the paradox of incremental transformation. What is called for in schools is not one grand plan. In fact the grand plan creates an unhelpful momentum of its own. It is not about unleashing massive transformation but rather a constant series of micro-experiments to test hypotheses that form the granularity of the plan and could change the plan. The key is to become agile at validated learning. Perhaps it's important to point out at this point that learners need to be participants in their learning. This is the reason why change imposed from above (or externally) is often met with resistance.

To achieve evolutionary growth as an organisation, leaders need to build a culture of support for experimentation, failure and, in particular, advocacy for measurement and reliance on data to validate results. They need to be willing to react to validated learning quickly and implement change when it is proven to make a difference, even if the results are contrary to their expectations or wishes. The cycle of build, measure and learn is every bit as important to a school as to a startup.

Wednesday, 1 February 2012

Teachers make mistakes


When you've watched this TED video by Brian Goldman, I suspect you'll find yourself quite emotionally charged in response to his plea for a culture change in medicine. It hits close to home for many of us. He articulates a theme common to many professions, but particularly prominent in professions where 'esteem' and 'authority' are valued. His theme is the cultural denial of failure in the medical profession and the conspiracy of silence that accompanies it. Doctors don't make mistakes.

But of course they do! And Brian eloquently and passionately describes why it's essential to change the culture of medicine to one in which mistakes are openly acknowledged and embraced as learning opportunities.

I remember when I first embraced mistakes in learning (and it wasn't at school). I was in my mid twenties and a keen climber. As a relative beginner, I still tended to cling to the wall rather than dance with it. The nervous tension in my muscles precluded fluid movement! My more experienced climbing partner told me that I would relax when I began to trust him, myself and the equipment more deeply. However, the only way to learn that trust, and to move beyond my self-imposed limitations, was to try new moves, fail and come off the rock face - a lot. Rather than defining success as staying on the rock face, he re-defined success for me as coming off it. If I wasn't falling, I wasn't learning. If I wasn't falling (and surviving), I wasn't learning to trust him, myself and the equipment.

As a consequence of this learning, I'm guilty of tweeting the following aphorism on a regular basis: "Learning is inversely proportional to the intolerance of failure." It takes a few seconds to untwist the words but that's deliberate. I could've said, "We learn from our mistakes" and no doubt you'd nod sagely and move quickly on. But the phrase "the intolerance of failure" is important. In principle, we understand that we learn from our mistakes yet in so many situations we are intolerant of failure, both in ourselves and in others, and therefore we limit the potential for improvement.

Can you think of another profession in which this culture is rife? John Hattie can. John is a well known education researcher and author of the book Visible Learning. If you're a teacher I would thoroughly recommend you explore his research. There's a very challenging, two-part video of him speaking on YouTube (here and here). In his book and in this video, he's very clear that teaching is one of those professions that's intolerant of failure. Mutual respect for colleagues is code for, "When I go into my classroom and close my door I'm going to teach any way I like so leave me alone." John's evidence indicates that most teachers spend less than a minute a month discussing teaching with each other. This is indicative of a culture of silence around performance.

When I attend my 14 year old daughter's academic review meetings, I've never heard a teacher say, "I'm failing your daughter and I need to work out how I can better meet her needs." On the other hand, I regularly hear, "Your daughter could do better if she..." But who is failing whom here? The focus on under-performing teachers tends to organise itself around the ability of head teachers to sack teachers who don't meet certain standards. In my opinion, this is a minor symptom of a much wider malaise facing the teaching profession. The bigger issue is that the profession's definition of 'under-performing' is hopelessly skewed towards extreme failure. I'm more concerned with the large number of average teachers who are chronically complacent about their own personal development than I am about the very small number of acutely failing teachers.

There's no doubt that many teachers are beginning to build PLNs (Personal Learning Networks) and that learning events such as Teachmeets are becoming more popular. Nevertheless, a culture is not something which changes overnight. It takes time, data and strong leadership. There is a deeply ingrained bias to label children as failing as opposed to teachers. This is the wrong way around and John Hattie's experimental evidence demonstrates it clearly.

I'd like to see a teaching profession that accepts it is making mistakes, and that actively invites data-led, teacher performance evaluation as a way of learning from those mistakes. I don't want this data to be used as a stick to beat teachers. I want every individual teacher to seek out this data as a means of steering their personal development within a supportive and vibrant culture of learning.

If mistakes aren't acknowledged then personal learning isn't happening. If personal learning isn't happening then organisational learning isn't happening. If organisational learning isn't happening then the teaching profession is not only failing students, but it is failing to learn from its mistakes. This is the unacceptable face of failure. Failure to learn from our mistakes.


UPDATE : I tweeted this today (5th Feb 2012): 'Failure week' at top girls' school to build resilience http://bbc.in/yEHKe1 #education #edchat

Thursday, 26 January 2012

Education fails technology?


As I’ve been blogging about the development of a School Technology Strategy, I’ve also been reading a recently published book called The Learning Edge by Bain and Weston. It’s a stimulating read in this context because it positions education as failing technology rather than the traditional reverse. That might not immediately chime with readers but bear with me. A few days ago I also read an interesting blog post by Wes Miller in which he explored the concept of ‘Premature Innovation’ in the context of Microsoft. The combination of these two sources has got me thinking...

Bain & Weston take the reader back to the work of Benjamin Bloom, the famous Educational Psychologist who in 1984 published ‘The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring’. In short, Bloom argued that one-to-one tutoring was the most effective paradigm for learning but that, at scale, it is not practical or economical. He went on to say that optimising a relatively small number of significant variables may in fact allow group instruction to approach the effectiveness of one-to-one tutoring. In this context, of particular interest is whether technology might simulate one-to-one tutoring effects such as reinforcement, the feedback-corrective loop and collaborative learning.

The promise of technology in education to date has almost always exceeded delivery and the blame has usually been attributed to technology. But is it really all the fault of technology? Well, Bain & Weston make a very interesting point in the context of Bloom’s research: although Bloom gave us a very useful framework for educational reform, there has been little systematic change in classroom practice for decades. The didactic model is still the beating heart of most schools. The practical implementation of research-based enhancements to pedagogy and curricula in schools has been painfully slow. In a very real sense, technology is the gifted student, sitting at the front with a straight back and bright eyes, full of enthusiasm, and being studiously ignored by educators. Education is failing technology.

Is this the whole story? Well, I certainly think it’s impossible to divorce a school technology strategy from an educational strategy with associated pedagogical and curricular implications. They go hand in hand. For example, a 1:1 ratio of devices to students is not going to make much of a dent in learning in a school if the underlying pedagogy is predominantly teacher-led. Technology will only ever leverage the benefits of a sound educational strategy and its practical manifestation. The biggest challenge for school leaders is therefore to construct a rigorous educational strategy and drive the change required to manifest it using research and data to drive continuous improvement. I see limited evidence of this in most schools.

If I’ve convincingly shifted the blame away from technology, perhaps it’s time to balance the scales a little. When reading Bain & Weston’s book, I was struck by the fact that a lot of the research focused on technology that I think fundamentally fails education, regardless of the education strategy. I think bright-eyed, bushy-tailed technologists sometimes suffer from premature innovation. This is where a seemingly great idea isn’t adopted or fails to fulfil its promise. A startling example from Wes Miller’s blog is the tablet. Tablets had been around for quite a while with very limited adoption before Apple stepped into the market. They launched the iPad and now tablet numbers are burgeoning and 1:1 iPad models for schools seem to fill every other blog post I read. Why?

As Steve Jobs was well aware, technology does not get used unless it does what it is designed to do really well and certainly better than a manual option. In a classroom, technology needs to work at the pace of the learner and/or the teacher. Even a 5 second delay can interrupt the pace and rhythm of a lesson. It also needs to be intuitive. It is just not fair to expect every teacher to be a technology expert and there isn’t time for endless training. Taking the iPad as an example, it’s hugely popular because a two year old can use it, it’s personal and mobile, wireless technology and the Internet have matured sufficiently to fill it up with engaging content, and it is reliable. It’s a turbo-charged book. The time is right.

Another example of a significant product failure in education due to premature innovation is the Virtual Learning Environment (or Managed Learning Environment or Learning Platform or Learning Management System). In the UK a Government agency called Becta was responsible for creating a functional specification for this product category. They then used this specification to put in place a framework from which schools might procure. The problem was that Becta tried to create an all-singing, all-dancing specification and it was just far too detailed. The resulting software created by the market to meet the requirement was therefore horribly over-engineered. The outcome? A very significant number of VLE products languishing in schools, not being used because they’re too difficult. A very big waste of money.

Again, in the VLE space we’re beginning to see disaggregation of the functional components into bite-size and usable chunks rather than a monolith with all the agility of a super tanker. Platforms are beginning to emerge which re-aggregate these simple elements into a manageable whole, retaining and enhancing usability in the process. The result? I’m beginning to see some interesting products in the VLE space.

Let’s not ever lose sight of the fact that technology is a tool and that my School Technology Strategy blog posts are implicitly (and now hopefully explicitly) intended to sit within the context of an educational strategy that attacks the 2 Sigma challenge with energy and evidence. Without educational change, the impact of technology on learning will be a placebo effect [placebo in the sense that there's nothing fundamentally changing but leaders feel better for ticking the technology box]. It is also the case that, even with a sound educational strategy, technology will only make a difference if it adheres to some very basic principles of usability and usefulness, a test that most technology in schools still fails.

Tuesday, 24 January 2012

School Technology Strategy Part 2: Cloud


In my previous post of this series I looked at the first and most important part of technology provision in schools: access. Without access, none of the potential learning associated with technology will take place. My conclusion was that although not all technology is or can be mobile, when constructing their technology strategy, school leaders should assume personally owned, mobile technology is the answer unless it clearly isn’t. They should assume this because the data demonstrates that mobile technology adoption is rapidly increasing because it offers a consistent, personal experience and availability at the point of need. Ask any mobile 'phone owner. This leads to high utilisation and more opportunities for learning. These are important attributes in successfully embedding technology within new pedagogies and curricula as well as for extending learning beyond the school into informal and social environments.

The second part of my blog on school technology strategy is the ‘doing’ or ‘action’ part. By this I mean schools’ core mission. This could be something like: “To provide the opportunity for all students to learn and strive for excellence.” [taken from Washington Elementary School's web site]. The emphasis here is on learning, not technology. Technology is simply a tool and/or the subject of learning. For the purposes of this blog post, I will not explore technology as a subject, but rather I will focus on it as a tool. In this context, technology might:
  1. Qualitatively enrich or enhance learning and/or teaching
  2. Improve efficiency thereby releasing more time for learning and/or teaching
In developing a technology strategy for schools, school leaders must explore a range of options in both these categories in order to decide how they invest their budget. This means addressing two challenges:
  1. Prioritise investment of the technology budget to optimise learning
  2. Achieve best value through procurement efficiency and technical effectiveness
These two very simple steps hide a great deal of complexity but in separating them out, we begin to see where school leaders should lead and where they should follow. That is to say, school leaders are expected to have an opinion about how they wish to prioritise their resources to optimise learning outcomes in their organisations. That’s their job. It’s not (necessarily) their job to work out how technology can or will do this and then to procure and implement appropriate solutions. That’s probably best left to educational technology experts. There is a twofold and thorny problem here which may be characterised as “the blind leading the blind” or “the one-eyed man is king in the land of the blind.” One face of the problem is school leaders who were not raised as digital natives and for whom technology is at best opaque and at worst an issue rather than an opportunity. The other face of the problem is the so-called ‘experts’, who tend to be either well-meaning amateurs or individuals with vested interests. This is an unholy alliance in which neither party has much of an incentive to challenge the other.

So to whom should school leaders be listening when it comes to translating their organisational learning aspirations into learning outcomes through technology? Out of 28,000 teachers who qualified in 2010, just three individuals had a computer-related degree. Teachers are experts in learning and teaching, not technology strategy. Network Managers and ICT Technicians have a very clear vested interest in maintaining or expanding their roles rather than seeking out the most effective technology solutions. If it’s not them, then perhaps it’s a trusted partner organisation or a technically minded Governor. In my experience of the former, companies will sell what they have and a lack of competition leads to complacency. With regard to the latter, it’s rare to find Governors who understand and are sympathetic to technology in the context of education as their experience is usually derived from the corporate space. I’m not trying to discredit the positive motivation of any of these individuals. I know their hearts are generally in the right place. Nevertheless, in an average secondary school an annual technology budget is in excess of a quarter of a million pounds and good value means more learning. It is not something to treat lightly.

So my contention is that there’s very often a gaping hole where one would hope and expect to see an experienced education technology strategist without a vested interest. Becta tried to fill this space for schools in the UK and certainly they provided much needed advice, guidance and some procurement efficiency whilst they existed. However, they also fell into the trap of technology for technology’s sake. For example, some of their procurement frameworks for school products and services were so detailed that they drove unnecessary product and service complexity in the market. Complex products don’t get used unless they add real value. This is exactly the situation for many MLE and VLE products which languish in schools, receiving minimal usage and simply ticking the ICT box. Steve Jobs understood this well. Technology is only good if it’s used. Of course Local Authorities and other organisations such as NAACE and BESA have tried to plug the hole in various ways and there's no doubt they do good work. The issue I see is that their impact is inconsistent because many schools are islands and, as such, they are isolated.

This sounds like it’s become a pitch for employing an educational technology strategist, but that’s not the point of this post. I’m attempting to paint a generalised picture of schools’ ICT in the UK as over-complex, significantly behind the curve in terms of technology innovation and woefully inauthentic in terms of the experience it provides of the 21st Century digital world we’re trying to prepare our young people to thrive in. I’m suggesting that there should be less ‘complex and expensive’ technology and more ‘simple, fast and exciting’ technology. Schools cannot afford to do everything.

I have already blogged about leading technology and the cloud technology paradigm in education. Both these posts are broadly built around the concept of the 80/20 rule. That is, 80% of the results of any endeavour take 20% of the time and 20% of the cost. The majority of time and money is spent in trying to achieve the last 20%. In practice, most schools and users are utilising their software and hardware at substantially less than 80% by any measure. This means a high level of investment and a low level of utilisation; the worst possible solution.

The reason that the cloud paradigm is rapidly gaining traction in businesses is that businesses are very sensitive to utilisation and efficiency as these directly impact their profitability. The same argument should apply in schools. By increasing utilisation and aiming for solutions that target no more than around 80% of requirements, schools will dramatically increase the value they deliver. I worked with a very large number of schools during the UK’s Building Schools for the Future (BSF) programme and one regrettable feature of the project was the waste of money that arose from attempting to reinvent 100% personalised solutions for every project. Schools tend to believe they are all different from each other. In my experience they are 20% different and 80% the same. Recognising this fact drives a different approach to the provision of technology. I respect and enjoy the vocational passion of educators but I do not think that this passion necessarily helps them to make wise investment decisions.

If one aims to meet 80% of a school’s technology requirements for 80% of the time, the optimal technology paradigm will almost certainly shift from the traditional on-premise, client/server model to an out-sourced, centrally hosted (or cloud) model. To date, the evidence seems to support the principle that cloud technology delivers more for less in schools by reducing the on-premise investment in technology (both hardware and people). The rapid advancement in web technology is such that even a pure web model may deliver 80% of a school's technology requirements without considering other cloud technologies such as thin client and virtualisation. What's more, a Web 2.0 model is almost certainly going to be a more authentic experience for most young people and it is in this sense that cloud technology delivers more for less.

The benefits of moving to a predominantly cloud technology paradigm are outlined below and summarised in the diagram:

  • Less day to day management 
  • Less local infrastructure, resources and energy required 
  • Quick and easy to deploy, update and scale 
  • Available on many devices and operating systems 
  • Available any time, anywhere 
  • Consistent experience from any learning location 
  • Stronger links between home and school 
  • More budget for content, analytics and training

There will of course still be a requirement for investment in on-premise, school technology but only for the delivery of specialist requirements such as CAD or high-end video editing. As with my blog post on mobile, the message for school leaders developing their technology strategy is not: “Everything in the cloud”. It is: “Think cloud first.” The actual answer is almost certainly a hybrid solution but a hybrid that favours a significant proportion of delivery via cloud technology.

Thursday, 12 January 2012

Gove on ICT

I've just read a great blog post from Josie Fraser called 'Computer Science is not Digital Literacy'. I completely agree with her sentiments when she says: "I'm a huge fan of the current wave of enthusiasm and political will to transform the way that ICT is delivered in schools." She also name checks Code Academy and Coding for Kids, suggesting you check out the #codingforkids hash tag on Twitter for related links, discussion and resources. I repeat them here because I agree! However, I also agree with her that Gove's speech at BETT 2012 crashed together some terminology and ideas that are best differentiated.

In the comments on Josie's blog I can see some disagreement on how to define Digital Literacy and Computer Science, particularly whether having a grasp of Computer Science is necessary to be proficiently digitally literate. Personally I see the distinction as quite clear. Computer Science is a subject area and Digital Literacy is a skill-set that could be deployed across all subjects. Naturally there may be some Computer Science in Digital Literacy (and vice versa) just as there is Maths in Computer Science. The key point is that Computer Science is a discrete subject area in which skills such as logic and coding may be learned. The thrust of Gove's argument is that the IT industry needs the skills and knowledge represented by the qualification in Computer Science.

In Gove's speech, the current ICT curriculum is targeted as the root cause of the lack of relevant computer skills in young people and for being "dull". I would remind everyone that a curriculum is a framework and that the ICT curriculum isn't dull, the teaching of it is (or too often is).

Digital literacy is a set of competencies and knowledge that all young people should be taught for application across all subjects, much as literacy and numeracy. See Josie's blog entry for a more in depth exploration. However my key point is that the current ICT curriculum does nothing to inhibit the teaching of Digital Literacy (nor indeed does it specifically encourage it). Neither does the ICT curriculum prohibit the teaching of Computer Science material.

In my opinion, if Digital Literacy and/or Computer Science material are not taught, either as discrete subjects or part of another (Maths or ICT, for example), this is a function of education leaders, teachers and exams, not the ICT curriculum per se. Unfortunately teachers are not, in general, well prepared. Out of 28,000 teachers who qualified in 2010, just 3 had a computer-related degree. In my opinion, the deeper issue here is threefold:

1. Leaders who accept a "lock and block attitude" to the digital age
2. An exam system that tests knowledge and skills that lack relevance in the current digital age
3. Teachers that lack the mandate and the skills to fully embrace the digital age

Whilst I welcome the initiatives that Gove has outlined in his speech in the spirit in which they're intended, I think that they risk casting adrift, in a large ocean, many leaders, teachers and schools who were already adrift and lost in a pond.

The reason that so many schools turn out pupils inadequately prepared for the IT industry and the digital age is that they use the ICT curriculum as a lifeboat, clinging to it for dear life and keeping the water of the digital age as far away as possible. Taking away the lifeboat and 'asking' them to sink or swim... Well, I can almost feel the sharks circling.

Tuesday, 10 January 2012

School Technology Strategy Part 1: Mobile

The diagram I've created for this blog entry is designed to summarise a strategy for technology evolution in schools. The outer ring represents the strategic technology focus to deliver the requirement (the inner ring). Thus my proposed strategic focus for delivering 'access' is 'mobile'. I will split this discussion over four blog entries and in this one I'm going to focus on the 'access' quadrant.

If the diagram looks vaguely familiar to the educators out there, that's because it's organised around the general principle of a learning cycle. Of course, that shouldn't come as a huge surprise, given that organisational learning and evolution is broadly the sum - over time - of the learning and evolution of its constituent parts (or people, as we affectionately call them).

Access is usually implicit rather than explicit in the learning cycle. Clearly, however, access to technology is a prerequisite for learning with or through technology. Access in this sense is certainly not assumed, even in developed countries such as the UK. So let's start by exploring some of the facts.

One credible and recent report on this subject is from Ofcom on UK children’s media literacy (published in April 2011 but based on survey data from 5 to 15 year olds taken in Q2-Q4 2010). Here are some of the relevant findings:

  • Home Internet use is 67% for 5-7s, 82% for 8-11s and 90% for 12-15s 
  • School Internet use is 10% of 5-7s, 9% of 8-11s and 4% of 12-15s 
  • No Internet use is 8% of all 5-15s 
  • Smartphone ownership is 3% of 5-7s, 13% of 8-11s, and 35% of 12-15s 
  • 89% 5-15s from AB homes use Internet compared to 69% from DE homes

I've also come across a recently published Childwise Report based on a survey of 2,770 5-16 year olds in Q4 2011:

  • Among age 7-16s 61% have a mobile phone with Internet access
  • Children use their mobiles for an average of 1.6 hours a day
  • Growth in Internet use through mobile phones is biggest trend
  • More than 75% of secondary age pupils now using mobiles to get online
  • Before school children are now more likely to use mobiles than watch television

Finally a report published in December 2011 by Cloudlearn entitled 'An End to Locking and Blocking' provides a compelling insight into the practical value of mobile devices and social media in learning: "The headline is that teachers, departments, schools and individuals have arrived at similar sets of common sense, professionally evolved, cautiously applied, effective and tested policy guidelines for using social media and portable devices safely, effectively and engagingly."

The combination of these reports tells me:

1. There is still a real digital divide and the eLearning Foundation's estimate that around 1 million school age young people in the UK do not have access to the Internet is still about right.
2. The data very clearly shows that mobile phones are rapidly becoming the most commonly used technology device through which young people access digital tools.
3. It is quite possible to embrace mobile technology and social media in schools and what's more, it enhances learning and teaching and engages young people and their parents/carers.

The current landscape of technology and Internet access in schools is less about a digital divide than about a digital literacy divide between the young digital natives and staff who tend to be digitally naive (and therefore fearful). All schools have Internet access and devices through which to access the Internet. The pertinent question is the efficiency with which that provision translates into access. I have two general observations in this area:

1. School owned technology generally has low utilisation
2. Utilisation of technology is proportional to the digital literacy of the staff

I use the word 'utilisation' in its very literal sense, that is, the proportion of the total lifecycle of a device that is spent being used for the purpose for which it was designed (as opposed to propping up a shelf etc). The utilisation of school-owned technology is usually low because it's not personally owned and therefore there's lots of white space due to timetabling, weekends and holidays. By contrast, the window in which personally owned technology may be used is every day, from the moment a person wakes till the moment they go to sleep. The difference in utilisation between school-owned devices and personally-owned devices is enormous. Utilisation equates to learning, even if it's social and/or informal learning. And don't get me started on the utilisation of schools' investment in software. Most software in schools is used very sporadically and the total percentage of the feature-set used is 10 to 20%. Taken in the round, the total utilisation of school-owned technology investment achieved by most schools is staggeringly low.

The second point above relates to the fact that many schools and educators are still on a journey towards confidence when it comes to technology (I know there are significant exceptions). In practice, technology in schools is not generally supporting a 21st Century digital curriculum, nor indeed catalysing a transformation in pedagogy (both of which it could and should be doing). The didactic approach is still the beating heart of most schools. A common technology provision in a classroom would be an Interactive Whiteboard (IWB) and a share in a laptop trolley (laptops optional). I see the IWB as a digital placebo. It gives education leaders the feeling they're 'doing' technology without actually 'doing' anything (except spending money). Harsh but - taken as the generalisation it is intended to be - true. And don't get me started on the Internet. Most schools filter out every element of the Internet that young people find engaging. Rather than leveraging its potential to arm young people with essential 21st Century digital skills while teaching behavioural self-regulation, it's locked and blocked (see my blog post on eSense).

The current technology delivery model in schools is, in my opinion, broken. It is expensive to provide, install and maintain, and it is used inefficiently and ineffectively. See my previous blog post for more detail on this but, in short, even if school-based computers were used 100% of the school day, every teaching day of the year, this would still only equate to a utilisation of 21% (1235 teaching hours) compared with the potential utilisation of a personally owned device (5840 waking hours). Meanwhile about 1 in 10 young people between the ages of 5 and 15 do not have access to a device and/or the Internet at home.
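The arithmetic behind that comparison is simple enough to sketch. Here's a back-of-envelope version (the school-day and waking-hours figures are my illustrative assumptions, chosen to reproduce the 1235 and 5840 hour totals above):

```python
# Back-of-envelope utilisation comparison. Assumptions (illustrative):
# a 6.5-hour school day over a 190-day teaching year, and 16 waking
# hours a day over a full year for a personally owned device.
school_hours = 6.5 * 190        # 1235 teaching hours per year
personal_hours = 16 * 365       # 5840 potential waking hours per year

# Even at 100% in-school use, this is the ceiling on utilisation.
utilisation_ceiling = school_hours / personal_hours
print(f"{utilisation_ceiling:.0%}")  # prints "21%"
```

In other words, 21% is the best case for a school-owned device; real timetabled utilisation will sit well below that ceiling.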

The very obvious solution to addressing both these issues is to provide each and every learner with a mobile device and a means of accessing the Internet from home through that device. This solution is a win-win-win:

1. It addresses the digital divide
2. It enhances access to technology
3. It reduces the cost to schools

Let's first look at why I think a mobile device is the answer. As we prepare young people for 21st Century jobs, there is no doubt that 21st Century digital skills need to become second nature. The level of confidence we need to instill is gained from embedding technology in their daily lives, both formally and informally. This in turn means using technology in and out of school, with a consistent experience that links the environments. The most efficient and effective way to achieve this objective is to provide a device that is mobile (moving between home, school and any other learning locations) so that it's on hand 24 hours a day, 365 days a year. This achieves a much higher utilisation of the device and delivers a consistent experience between home, school and other learning locations. It is also the reason young people love their phones. (NOTE: although mobile phones are a dominant technology, I'd argue that this is because they're mobile and personally owned. Any mobile device will leverage benefits of consistency and utilisation).

In my opinion, the investment in such a device should be a shared responsibility between parents/carers and school, with financial support for those parents/carers unable to afford a device and/or Internet access at home (look at schemes such as www.GetOnlineatHome.org). The partnership should see the school providing an educational service layer and an affordable device loan/purchase scheme, with parents/carers paying for the device itself (and Internet access from home). The investment in a device by parents/carers on behalf of their child or children engages them in a shared responsibility for the device. At the same time, it divests the school of responsibility for managing a large ecosystem of devices and allows them to focus on the educational service layer, i.e. what is accessed on the device and how this is used to enhance learning. This is where schools have expertise (or should), not in the management of technology which is an expensive diversion from their core mission. (NOTE: I fully appreciate that there will still be school-owned technology provision in schools but it need only be for specialist requirements).

The key and only benefit to schools that embrace a strategy of personally owned, mobile devices for every learner, is that more learning will occur because (for example):

  • All students have technology access, anywhere, anytime
  • Mobile access to digital tools underpins a 21st Century digital curriculum
  • Learners and their parents/carers are more connected and engaged
  • Budget can be refocused from technology management to education

For young people, the benefits are (for example):

  • Addressing the digital divide through equity of provision
  • A device that is mobile and personal, delivering a consistent experience
  • Engaging digital tools that reflect their authentic experience of the Internet
  • Development of technology skills that prepare them for the 21st Century

"Access with mobile" is only the first part of my technology strategy for schools but, in a very practical sense, it is the most important issue for all schools to address. Without access there is no action. It is not however the entirety of the strategy, including in the 'access' category. What I'm proposing is that mobile is the dominant paradigm in the access category and therefore all related policy, planning and decision making in schools should place a premium on 'mobile'. In my next blog entry in this series I will look at the 'action' part of the strategy - the educational layer I was referring to earlier.