BETTposter Syndrome

In today’s hyper-connected world, social media has become a stage where we present our best selves. Scroll through any feed, and you’ll see holidays, achievements, smiling faces, and exciting plans. I noticed a recent post from Mark Anderson about the upcoming BETT conference, referring to all the countdowns, grand claims and other positive posts currently being shared. Rarely do we encounter the mundane, the struggles, or the moments of doubt. This curated reality, while aesthetically pleasing, comes with a hidden cost: the rise of imposter syndrome and other feelings of needing to do more than is possible, or of not having done enough.

Imposter syndrome is that nagging feeling that we don’t measure up, that everyone else is doing better, living fuller lives, and achieving more. One look at social media in the lead-up to the BETT conference could easily leave you feeling like a BETTposter. Ironically, many of us who feel this way also contribute to the cycle by posting positive content ourselves, myself included. I find myself suffering from imposter, or BETTposter, syndrome, yet I also often share positive posts about plans, achievements or ideas, including posting about speaking at BETT. Why? Because that’s what people like, or perhaps more accurately, that’s what the algorithm likes. And in a world where visibility equals opportunity, we often feel compelled to play by those rules. It is through engagement in social media in the past that I have connected with so many amazing educators and technologists, with these social-media-initiated conversations often continuing into the real world. I certainly wouldn’t want to miss out on these opportunities, including possible future ones, through either not posting, or posting things which others won’t like, whether that is due to the algorithms involved or not.

So, what’s the solution? In relation to BETT and social media, it’s about accepting that what is presented on social media is carefully curated and therefore presents a particular view or lens on things, often tilted towards the positive. The reality will, in most cases, fall short of this. In terms of the event itself, it is simply a case of accepting there will be more people there than you can possibly meet, more stands to see than is physically possible and more talks to listen to than can be logistically achieved, short of a bit of time travel or having multiple clones of yourself. So the perfect is not achievable, leaving us to do whatever we “reasonably” can. This means going with a plan in terms of what talks to attend, what stands to visit and who you want to catch up with, leaving time to travel around the event, time for rest and lunch, and time for the accidental or unintended meetings you had never planned but which add so much to your thinking. But with this plan, also accept that no plan survives contact with BETT (or maybe the enemy!) and therefore there will be talks you planned to see but missed, or people you wanted to meet but never got round to, or where you did meet but didn’t capture the all-important selfie.

Working in education though, the above poses an interesting question in terms of how we support our students to thrive within this world, maximising the potential for social media to help them develop networks and opportunities, while minimising the risk of unrealistic expectations and imposter syndrome. My thinking is that this challenge isn’t easy to address; however, what we can do is at least get started. Every discussion of this is likely to represent at least some forward and positive movement. And so I leave you with the above and the ask that you consider how and where these conversations might occur in your school. For those attending BETT, enjoy, and I look forward to hopefully seeing you there.

Cool but is it safe?

As 2025 drew to a close, I noticed a trend on LinkedIn, a trend that I was tempted to engage with but just couldn’t bring myself to. I saw many people turning to the CoAuthor Rewind service to create cool personalised summaries of their achievements, and I saw many people engaging, liking and commenting on others’ summaries. These tools promise something delightful: an elegant, AI-generated reflection of your year, packaged in a way that feels both professional and personal. It’s easy to see the appeal: who wouldn’t want a neat, shareable narrative of their successes? And the engagement stats on people’s posts seemed to back this up.

But beneath the surface of this seemingly harmless activity lies an important question: what are we giving away, and what habits are we forming?

The Allure of Convenience

The popularity of CoAuthor Rewind is no surprise. In a world where time is scarce and digital tools promise efficiency, the ability to summarise a year’s worth of work in minutes feels revolutionary. These tools can even serve as a motivational boost, helping people reflect on progress and celebrate milestones, and they can engage other professionals through LinkedIn, creating or solidifying professional networks.

Yet the very ease that makes these tools attractive is also what makes them risky, especially given that users are providing access to their LinkedIn profile data in order for the summary to be produced. The ease with which they create attractive and useful content can also build habits that erode caution over time.

CoAuthor Rewind’s Privacy Policy

I took a quick look at the privacy policy to see if this might allay my concerns; it didn’t. It included a vague data retention period and an equally vague statement as to whether any user content might be used to train AI models. For those in the UK or EU, the fact the organisation is based in the US and “may process information in the United States and other countries” is a concern, especially given the lack of any mention of model clauses or GDPR. The “may” alone makes me nervous; don’t they know where they process their data? The fact that the “Services do not current respond to do not track signals” was another concern, as was the vague statement as to the third-party service providers and vendors to which the service might disclose your data. As third-party risks and incidents grow, it is increasingly important to be aware of what third parties a given vendor uses.

Now none of the above would necessarily stop me using the service, but they are all risk factors which need to be considered. I wonder how many of those using the service to create their little 2025 rewind looked at the terms and conditions or the privacy policy and actively considered the risk?

The Risk Isn’t Always Immediate

It’s tempting to dismiss these concerns as minor. After all, users are consenting to share their data, and the vendor does have a privacy policy in place. But alongside the data protection issues there are also human issues. As one person shares, does this normalise the behaviour of sharing without checking the data protection risks, and as others join the crowd, do we then see collective normalisation of this behaviour? Is there a danger that ease and convenience, plus a bit of herd mentality, lead people to stop asking critical questions?

In schools, if staff become accustomed to uploading personal data without scrutiny, what happens when the same habits extend to student records? Even anonymised data can carry risks if aggregated or misused.

Balancing Innovation and Responsibility

This isn’t an argument against tools like CoAuthor Rewind. In fact, they represent an exciting evolution in how we interact with technology. Let’s be honest, they are also good fun. But it does highlight the delicate balance between embracing innovation and safeguarding privacy.

In education, this balance is never a zero-sum game. We don’t have to choose between being creative and being secure. Instead, we need to cultivate digital habits that combine curiosity with caution. That means embedding data protection awareness into professional development, encouraging critical thinking, and ensuring that policies keep pace with emerging trends.

A Teachable Moment

For schools, this trend might offer a valuable opportunity to discuss with staff and students:

  • What data is being shared?
  • Why is it valuable?
  • But what are the risks?
  • How can we enjoy the benefits of technology without compromising trust?

By framing these conversations around real-world examples, we can move beyond abstract warnings and help people develop practical strategies for safe, responsible use.

Conclusion

The rise of CoAuthor Rewind is a reminder that technology’s greatest strength, its ability to make life easier, can also be its greatest weakness if it encourages complacency. It also highlights balance, and how nothing is only positive or negative. As we celebrate the creativity these tools enable, we must ask: how do we ensure that what feels fun today doesn’t become a risk tomorrow?

New year, new resolutions

As the calendar turns and we step into a new year, many of us feel the pull to set resolutions. I have gone through this process for a number of years, often posting my plans or pledges here. For some, it’s about ambitious goals while for others, it’s about gentle intentions. For me, my resolutions are now less about rigid targets and more about creating a balance across the different dimensions of my life. After all, life is a journey, one we know the inevitable destination of, so the real challenge is to make the most of the path we walk.

Why Balance Matters
I have waxed lyrical about balance across a number of areas for many years, acknowledging that things that appear simple are seldom so. In considering life and the year ahead, a few years ago I came across a framework that resonated with me. You can see it here. It encouraged looking at life through multiple lenses:

  • Relationships
  • Body, Mind, and Spirit
  • Community and Society
  • Job, Learning and Finances
  • Interests and Entertainment
  • Personal Care

This approach reminds me that success in one area should not come at the expense of another. It’s easy to become hyper-focused on professional achievements or financial growth, but what about health, friendships, or personal joy? It is about thinking more broadly about what it means to be a human being, which, with all the discussion and increasing use of AI, may become all the more important. It’s also worth noting that not all your resolutions need to be new; they can simply be to continue doing what you are doing currently. There is no point seeking to fix that which isn’t broken.

Targets but with a caveat
Targets give us direction, but they shouldn’t become blinders. I have seen various people post as to avoiding setting targets or outcomes, and about looking at different approaches more focussed on habits or thinking or similar. In setting my goals for 2026 I want to avoid the trap of chasing goals so relentlessly that I miss the beauty of the journey or the unexpected opportunities that arise along the way. That said, I still find targets useful. Life rarely unfolds according to plan, and I can certainly speak from a degree of personal, and sometimes painful, experience there. The unexpected will happen, sometimes delightfully, sometimes disruptively, and adaptability is key. Having lived through financial challenges, redundancy, emigrating abroad and divorce, I think I am more accepting of the unexpected and the difficult now than I was when I was younger. One of the most liberating lessons I’ve learned is that change is inevitable. The unexpected is, by definition, unpredictable. When circumstances shift, we need to pivot without guilt or frustration, accepting goals which may remain unfulfilled while accepting that new goals may have been added along the way.

Who are the goals for?
Ultimately, resolutions should serve you. They are deeply personal, and there’s no universal formula. For some, the best resolution might be not to make any at all, and that’s perfectly fine. For me, it’s about maximising the journey, and having some targets to routinely revisit and measure progress against helps me significantly. This year, as with last, I suspect I won’t share my goals here, although I likely will share some thoughts at the end of the year as to how I progressed against them.

We all know the endpoint of life, which makes the journey itself precious. Let’s make 2026 a great year!

Review of 2025

As 2025 speeds to a close I, like many other staff in schools, find myself wondering where the time and the year have gone. Working in schools seems to be a constant sprint from the start of a term until half term, then onwards to the end of term, before a brief pause, and then we go again.

As the year draws to its close it is an important opportunity to stop and reflect so I thought I would share some of my thoughts, the highlights and even some of the low lights from the year with you.

For a number of years now I have been sharing pledges, setting out my plans and targets for the year ahead; however, at the start of 2025 I didn’t do this. I did personally make some notes as to what I would like to achieve and do, but I never packaged this up into the usual blog piece as I would have done in the past. I had been questioning whether having targets led to chasing the targets rather than more broadly experiencing and hopefully enjoying the year, so this may have led me not to put aside the time to create a blog post. I simply deprioritised this particular task. And I think this is something I am struggling with: having lists and targets, which allow easy measuring of progress and planning, versus being more freeform, having the opportunity to respond to things as they arise, and maybe getting my head up more to enjoy life, the world and everything. I suspect, as always, it is about finding the appropriate balance, where recent years have moved me more towards lists, spreadsheets (who doesn’t like a to-do list with colour coding?) and targets, a little more than I am now comfortable with.

Fitness has been a running theme (did you see what I did there?) for me over the last few years. My distance for the year was just over 250km, far less than the 500km and 1000km of previous years, but this year was marked with such inconsistency in my running that even achieving 250km was something to celebrate. I got there in a couple of concerted efforts over a couple of months rather than more evenly across the year. I suspect it was simply a case of other more important things taking priority, and I am happy with that, but it also highlights that I need to see how I can better balance fitness with other demands, as this year the inconsistency caused me some stress and disappointment. I note that at this point I am heavier than I have been for several years, and weight was a big part of why I started running, so this is something I am going to have to reflect on ahead of 2026.

Reading has been another thing I have been trying to do more of. Sadly, finding the time has been the challenge; however, my increasing need to drive around has led me to listen to audiobooks more and more frequently. This has allowed me to get through quite a few books. I don’t think I have engaged with the various titles quite as well as I would have if reading them normally, complete with post-it notes and annotations, but listening to a variety of titles is definitely far better than doing no reading or listening at all.

2025 has definitely seen me having more time for me, my partner, and our kids. This has been absolutely amazing, but it has required me to reduce time spent on other things. Sadly, to have more time for one thing we need to have less time for something else; something I think those leading education at national level could do with remembering. On reflection I wouldn’t have done it any differently. I do however need to get this balance right, such that I challenge myself in terms of being busy, yet don’t have unrealistic expectations which lead to stress and disappointment. On the positive side, of particular note is our nice new world map board, where 2025 has seen us put a number of pins in to indicate places we have been and things we have done. From this point of view 2025 has been an amazing year, with my relationship with my partner going from strength to strength, one amazing memory after another, and the year isn’t even done, with plans for what little remains including a very significant life event, which actually sees me sat here in highland dress, but enough of that for now. We only recently found ourselves flicking through our photo library from events across the year, and we have done so much and had such fun. This all makes me all the more positive and happier about the looming 2026.

2025 has however seen some less positive moments in the form of health issues within my immediate family, causing some worry and stress. As I write this, one of my elderly family members is once again back in hospital, having had tests and then minor surgery. This adds an element of uncertainty, worry and stress to the immediate weeks and the festive period; however, I hope that the tests and resultant care of the NHS will be able to address things and allow all to enjoy the festive period and progress on a firm footing into the year ahead. I do however see 2026 as involving some difficult discussions as to the future. There is also some uncertainty in relation to my son, who, having finished college, is still looking for the first steps on the career ladder. I compare his situation to mine when I was that age, and I think things have only got more difficult for the young. I am sure something will present itself; however, I recognise how difficult it is when receiving rejection emails or simply not hearing back following applications for junior positions. As to apprenticeships, I feel there are just so many young people chasing the limited number of opportunities, which is a shame, as I see apprenticeships as a great way of providing young adults with both additional educational opportunities and the much-needed experience of the working world.

Looking to work and to my involvement in technology and education, it has been a solid year, albeit I don’t think it was as busy as 2024. January saw my first opportunity to speak at the BETT conference, delivering a cyber security session to a packed room, despite a worried period of preparation and fear that my session, last thing on the final day of the conference, would be poorly attended. This year has also seen me awarded the EduFuturists Outstanding IT Professional award. The highlight of this, other than the Uprising event itself and all of the wonderful people there, was the fact the award was given to me by Dave Leonard, an EdTech superstar and my friend and colleague. Adding to this event was the fact I also collected an award on behalf of my Millfield colleague Kirsty Nunn. Then there has been my work with the ISC Digital Advisory Group, leading on the 2025 conference before becoming the new chair of the group and beginning the planning for next year’s conference (save the date: June 19th, University of Roehampton, if you are interested! It promises to be an amazing event). And I can’t fail to mention the Digital Futures Group and a WhatsApp group which I often struggle to keep up with. Such a brilliant group of people, doing so many amazing things in schools across the country, while also being open, friendly and supportive people that I feel blessed to consider as close friends. And there is also the ANME, and several hundred WhatsApp groups (thanks Rick 😉), and being able to lead the various Southwest regional events, with some amazing discussions in relation to technology in schools.

Looking forward to 2026, I do so positively, but I am also conscious that it will likely represent a year of some challenges and change, but when is this not going to be the case? It would be nice for things to be easy and convenient; however, I also note that we need struggle and challenge in order to feel like we have actually achieved something. Some difficulty is actually desirable; it is just about trying to find the right balance. 2025 definitely had some challenges, but looking back it has, in a broad and general view, been successful and positive. I have some amazing memories, particularly personally, and I suspect the days ahead of me will see more amazing memories created before 2025 is finished. As I mentioned back at the end of 2024, maybe the journey is more important than the destination, and the journey through 2025 was filled with memories and a sense of progress. What more can you ask for?

School social media checks in a world of AI

In the UK, it’s increasingly common for schools to carry out social media checks on candidates invited for interview. The intention behind this practice is clear: safeguarding students. These checks are designed to identify concerns that can be discussed openly during interview, rather than to eliminate candidates before they’ve had a chance to explain. But social media checks rely on the AI algorithms inherent in social media platforms, with all their flaws, and the increasing availability of AI tools introduces new risks and challenges.

The Promise and the Pitfalls

On the surface, social media searches feel and look simple and straightforward. You might ask candidates to provide their account details for the social media services they use, or you may simply search to find them. You can then peruse or search the content, what they have posted, replied to, and so on, and from this identify questions or areas of discussion for interviews.

But online posts often lack context, meaning it is all too easy for them to be misinterpreted. It makes me think of VAR in football: things often look worse when slowed down in a video review, just as posts can look worse when examined without the emotions and pace of a specific moment in time, responding to others’ posts and thoughts. Also, do HR professionals truly understand how social media search and display algorithms work? It is AI algorithms which decide what information to present, meaning there is a risk of bias. The algorithms might focus particularly on those posts most likely to spark a reaction, as they seek to keep people on their platform, not understanding the aim of the HR staff member carrying out the searches. A harmless cultural reference or a joke taken out of context could be surfaced simply because the system has been trained to surface such comments, given the greater likelihood that they lead to strong feelings, comments and engagement on the platform. This may make things look worse, or better, than they are. Equally, AI algorithms might surface different types of post based on gender, ethnicity, age or other data points related to a candidate, potentially introducing bias to the data the HR team have available to them.

The Rise of Synthetic Content

Then there’s the growing threat of fake content. Deepfakes and AI-generated images are no longer the stuff of science fiction; they’re here, and they’re convincing. Imagine a candidate being implicated by a fabricated photo circulating online, or even a fake video. Without robust verification processes, schools could make decisions based on lies. How many HR teams are prepared to spot a deepfake? How many even know what to look for? Also, as growing numbers of people use wearable technologies, such as smart glasses, how are HR to react when footage was taken without the applicant’s knowledge before being posted online? How would they even know it was without consent, and therefore illegal? Would it be acceptable to use such a post within an interview process? What if the applicant pointed out it was taken without consent and was therefore being processed illegally, both by the poster and now the school?

Safeguarding vs. Fairness

The tension between safeguarding and fairness is real. While protecting students is paramount, recruitment must remain ethical and transparent. Social media checks should never become covert screening tools. Candidates deserve the chance to explain context, and decisions should be based on facts, not assumptions. Yet when AI enters the equation, the line between fact and fiction can blur alarmingly quickly.

There’s also the question of privacy. GDPR sets clear boundaries, but do all schools adhere to them when using AI-driven tools? Consent is critical, as is clarity about what these checks involve. Without transparency, trust in the recruitment process erodes and that’s a risk no school can afford.

Bridging the Knowledge Gap

The truth is, many HR professionals in education are experts in safeguarding and compliance, but not in data science or AI ethics. This knowledge gap matters. If we don’t understand how these tools work, we can’t challenge their outputs. We can’t ask the right questions about transparency, fairness, or verification. And we certainly can’t protect candidates from the unintended consequences of flawed algorithms. This for me is key: ensuring HR staff understand the tools they are using if undertaking social media checks, including understanding the risks which may arise from the AI-powered search tools inherent in social media platforms. Additionally, they need to understand the risks as they relate to fake content, including audio, images and video; what you hear or see may not be all that it appears to be.

A Call for Reflection

Ultimately, the goal is simple: to keep students safe without compromising the integrity of recruitment. Achieving that balance requires more than technology; it critically requires understanding, vigilance, and a willingness to challenge the systems we rely on, and the content they may present. If schools are carrying out social media checks, the widespread availability of generative AI tools, and our increasing awareness of the risks, particularly around bias, mean maybe we need to revisit this and ensure we have considered all of the implications.

Think fast, act wisely

I recently gave a presentation titled “Think fast, act wisely” at the Berkhamsted IT Conference, looking at the future of education as I see it. Below are my key thoughts as presented at the conference.

The landscape of education has always been shaped by the tools at its disposal. In 1998, when I entered the teaching profession as a fresh-faced, and very young-looking, newly qualified teacher, the classrooms I worked in were dominated by blackboards and chalk. I still remember dusting myself off each day, having leant on the board as I was writing. The late 1990s and early 2000s saw the first technological shifts, but the pace of technological change has only accelerated since then. As we stand on the brink of increasingly frequent breakthroughs in artificial intelligence (AI), quantum computing, AI use in medicine and even bioengineering, it’s clear that technology is transforming our world. Yet education often lags, moving at a much slower, incremental pace.

The Slow March of Educational Change

Despite the rapid evolution of technology, the structure of education remains stubbornly familiar. Promises of transformation have been plentiful, but delivery has been inconsistent. Interactive whiteboards, virtual learning environments (VLEs), flipped learning, MOOCs, and 1:1 device programmes have all brought benefits, but rarely have they redefined learning in a fundamental way. The SAMR model presents a way of examining our tech integration in schools, yet time after time we have been stuck at substitution, never progressing to the potential of augmentation, modification or redefinition. The IWB, for example, simply replaced the whiteboard which had previously replaced the blackboard, while the 1:1 device has often been little more than a replacement for the textbook and workbook.

The barriers to the potential of tech are not just technical but human, including insufficient training, resistance to change, or processes that fail to adapt. Sometimes the technology itself falls short, particularly in relation to the sales promises when compared with pragmatic tech use in real-life classrooms. Where there has been meaningful progress, as I believe there has been in the last 5 years, it has been the result of significant external forces, like the COVID-19 pandemic. I personally saw the potential here, with one student in particular who struggled in the classroom but thrived engaging remotely via Teams. The issue was that this progress was not sustained, and there was a “rubber band effect,” snapping back to old habits once in-person learning resumed.

Artificial Intelligence: Promise and Peril

The arrival of AI in education marks a new chapter, and another external factor impacting on education. Tools like ChatGPT have democratised creativity and output, allowing students of all abilities to produce impressive work. AI can also assist teachers with content creation, marking, and summarising, freeing up time for deeper engagement. However, these advances come with significant risks: ethical concerns, bias, accuracy, and the potential for disinformation. The binary debate of block or allow misses the nuance required to navigate these challenges.

If education continues to move slowly, it risks falling behind. And as this occurs, some users will simply act on their own, using whatever tools they feel are of benefit. Shadow IT, where users, both staff and students, unofficially use technology solutions, may then introduce safeguarding and cybersecurity risks. Also, the divide between those who embrace new tools and those who don’t will widen, exacerbating inequalities. Moving too fast, however, brings its own dangers. I, for example, worry about those chasing “shiny new things” without proper embedding, straining resources, and overlooking critical issues like data protection and AI ethics.

Striking the Right Balance

The key is balance. Every new technology brings a mix of risks and benefits, and schools must develop and understand their own risk appetite. Regulation and compliance are essential, and many look to centralised guidance for support; however, this is slow to arrive, often being published only after technology has already changed or moved on. It is therefore about considering the risks and going much deeper than the block-or-allow narratives that often prevail. It is possible to find a middle ground: blocking some things while allowing others, accepting some risks, accepting the weakness of technical controls but mitigating through education, and using the mistakes people may make as an opportunity for learning rather than as situations that must be eradicated. We must think deeper.

For example, exam specifications warn against copying material and stress the importance of “own work.” But what does that mean in an age where AI tools are embedded in the learning process? If I use a spell checker or grammar checker, or if I get AI to help start my document or to review the content and offer feedback, is it still my work? Thinking about this further, if my teacher teaches me something, and I then use this knowledge to write a piece of coursework, is this my work? And why wouldn’t I want to use AI to review and help me improve my work? Don’t we want students to achieve their full potential, using the technologies which are available and will be available in their lives beyond school? The phrase “must be the student’s own work” seems pretty straightforward, however in a world of AI tools, where these tools are embedded in the productivity tools we want students to use, it may not be that simple. We must think deeper.

Rethinking the Purpose of School

This is a moment to reflect on the fundamental purpose of school. Do we need physical buildings? What do we want our schools to achieve, and how should technology help us get there? Values and progress are crucial, as is accepting that change is inevitable. As such, we need to become more agile and flexible in responding to change. But we are not alone. Through collaboration, discussion, and sharing, we can collectively approach the challenges.

Big Questions for the Future

As we look ahead, a few big questions emerge:

  • What does human flourishing look like, and how do we support students to thrive now and in the future?
  • What is the purpose of education? How do we assess learning in a world shaped by AI, and what are we actually measuring?
  • How can AI best support students, teachers and school leaders, freeing them to think deeply and creatively?

The answers require courage, pragmatism, and a willingness to adapt. The education sector has an opportunity to be brave, but this means learning to move faster while also acting wisely, navigating the balance between these competing requirements. It is for this reason that we need to get better at managing risk, and that includes actually establishing what the risk of harm is and taking crucial decisions about where to allocate resources. We can't address every risk to the same extent.

Above all, we must remember that we are on a shared journey. In a world of AI, synthetic identities and AI assistants, it is collaboration, and human connection, that may be our greatest asset.

Citizenship Education in an AI-Driven World

Having recently met with colleagues looking at digital citizenship education, I was encouraged to scribble together some thoughts, which form the post below. I believe, now more than ever, we need to re-examine our education system, especially as it relates to digital citizenship and preparing our students for the digital world we now live in, and for the digital world of the future, whatever that might look like.

For years, educators have spoken about "digital" citizenship as if it were a distinct concept, something separate from the so-called "real" world and real-world citizenship. But that separation no longer exists. Today, digital systems underpin almost every aspect of daily life: banking, healthcare, travel, shopping, and even social interaction. When these systems fail, life grinds to a halt. You need only look at the CrowdStrike incident and the more recent AWS issues to see this. Whatever separation between the digital and the real worlds may have existed in the past, it exists no longer.

This raises an important question: does the term digital citizenship still add value? I don’t think it does. Instead, we need to think about citizenship in a connected world; a world where artificial intelligence (AI), automation, and global networks shape how we live, work, and relate to one another. The challenge is not simply about teaching children how to behave online; it is about preparing them for a society where technology mediates almost every interaction. It’s also about preparing them for a future world where the technologies of today will have been replaced by new technologies, some we can predict and others that may not currently be as evident.

But this digital or technological change isn't new to discussions of citizenship. Citizenship education has never been static. It has always evolved in response to societal forces. In early modern Europe, it sought to counter superstition and establish rational norms. In the twentieth century, it became a bulwark against fascism and communism, promoting democratic values and civic responsibility. Today, the forces shaping citizenship education are different but no less profound. We face questions about identity and belonging in a globalised world, the ethical implications of AI, and the fragility of truth in an era of misinformation and disinformation. Privacy, too, is a construct now under question, as we face a world where people wander the streets wearing AI-powered smart glasses and other wearables, constantly recording, cataloguing, summarising and recommending our every action. These are not abstract concerns; they affect how societies cohere and how individuals navigate their rights and responsibilities.

As I think about this, I wonder about a fundamental tension: do we teach for a global society, or at a national level, reinforcing national norms? Should education prepare young people to embrace diversity and shared human values, or should it prioritise national identity and social cohesion? This is not a trivial question. It touches on debates about migration, climate change, and international governance. It also exposes the political nature of citizenship education. What we choose to teach, and what we choose to omit, reflects the kind of society we want to build. These decisions are made at national level, albeit with some reference to globalisation.

Layered on top of these questions is the reality of AI and algorithmic decision-making. Increasingly, decisions that affect our lives, including credit approvals, job applications, and even court outcomes, are mediated by algorithms. The news, even on trusted platforms, is influenced by the swathes of data and content generated on social media. AI tools complicate things further by enabling the generation of fake content, making it increasingly difficult to discern whether something presented as true is actually true, or whether something presented as false is actually false. Understanding how these systems work, and their potential biases, is essential for informed citizenship. Without this knowledge, individuals risk becoming passive subjects of technology rather than active participants in shaping its use.

Then there is the problem of information disorder: of misinformation and disinformation. Deepfakes, misinformation, and polarised media ecosystems challenge the very notion of truth. If citizenship education once focused on teaching civic literacy, it must now teach epistemic resilience: the ability to question sources, verify facts, and resist manipulation. In a world where AI can generate convincing falsehoods at scale, this skill is not optional; it is foundational.

So, what should citizenship education look like in this context? It cannot be reduced to a checklist of technical skills. It must cultivate critical thinking: not just the ability to analyse arguments, but to interrogate algorithms, question data, and understand the socio-technical systems shaping our lives. More than ever, we need to question not just what we see and hear, but also the why: why an algorithm has chosen to present this content over other content, and what this might mean. It must emphasise human skills such as empathy, collaboration, and adaptability, qualities that machines cannot replicate but which are vital for social cohesion and ethical decision-making. It must foster ethical literacy, enabling students to grapple with questions of fairness, privacy, and accountability in AI systems. And it must build resilience, preparing young people to cope with uncertainty and change in a world where disruption is the norm.

Citizenship education in the age of AI is not about adding a few lessons on online safety or digital etiquette. It is about rethinking what it means to live responsibly and ethically in a world where technology mediates almost every interaction. Educators, policymakers, and communities must ask hard questions: what values do we want to uphold in a connected world? How do we balance global responsibilities with local identities? How do we ensure that technology serves humanity, rather than the other way around? What does it mean to flourish in a technological world, and how do we support our students to flourish?

The answers will shape not just curricula, but the future of democracy itself. Citizenship education has always been about preparing young people for the world they will inherit. Today, that world is algorithmic, interconnected, and uncertain. Our task is to ensure they enter it not as passive users, but as active, ethical citizens. And this all requires that we start thinking deeper, asking more probing questions, and supporting and encouraging our students to do the same.

Education and Football: Is It All About the Result?

I’m a football fan. I love the game, the drama, the unpredictability (although don’t get me started on VAR). I’ll admit, my team is having a bit of a rough patch right now. We’re not playing well and the performances are shaky, but I still find myself hoping for a win, no matter how it comes. A scrappy goal in the 89th minute? I’ll take it. A lucky deflection? Fine by me. Three points are three points.

That got me thinking about education.

Because in many ways, education today feels a lot like football. It’s all about the result. For students, parents, and even universities, the final grade is what matters. An A* is an A*, and a D is a D, regardless of how you got there. Whether you coasted through with natural talent, worked tirelessly every night, or crammed in the final weeks, the grade on the certificate is the same. Just like in football, where a win is a win, no matter how ugly the game was.

But should it be that way?

If education is truly about learning, about growth, development, and preparing young people for the future, then surely the journey matters just as much, if not more, than the destination.

The Scoreline Obsession

Grades are the currency of education. They open doors to continued education, to universities, apprenticeships, and jobs. They’re used to measure school performance, teacher effectiveness, and student potential. In a system so heavily focused on outcomes, it’s no wonder that the process of learning often takes a back seat.

This results-driven culture can be seen everywhere: revision guides that promise top marks with minimal effort, tutoring services that focus solely on exam technique, and students who ask, “Will this be on the test?” rather than, “Why does this matter?”

It’s understandable. Just like football fans want to see their team win, students and parents want to see results. But when we focus only on the final score, we risk missing the bigger picture.

Learning as the Journey

Imagine a football team that wins every game but never improves. They scrape by with luck, individual brilliance, or opposition mistakes. They don’t train hard, they don’t develop tactics, and they don’t build team chemistry. Eventually, that luck runs out.

The same is true in education. A student might achieve a top grade through memorisation or last-minute cramming, but what happens when they face a university course that demands critical thinking, independent research, or long-term project work? What happens when they enter the workplace and need to collaborate, adapt, and solve real-world problems? Collaboration, critical thinking and other so-called “soft skills” are going to be all the more important in a world of AI, robotics and other tech tools.

Learning is the training ground. It’s where students build the skills, habits, and mindset they’ll need for life beyond the classroom. It’s where they learn to fail, to reflect, to try again. It’s where they discover what they’re passionate about, what they’re good at, and what they need to work on.

If we reduce education to a scoreboard, we risk turning it into a game of short-term wins rather than long-term growth. What we measure, in tests, coursework and other things which are easily measurable, becomes what matters, rather than what really matters: learning.

The Pressure to Perform

There’s another side to this too. When results are all that matter, the pressure on students can be immense. Just like footballers who fear making a mistake in front of a hostile crowd, students can become anxious, disengaged, or even burnt out.

We see this in rising levels of exam stress, in students who feel like failures because they didn’t get the grade they hoped for, and in those who give up entirely because they believe the system isn’t built for them.

But if we shift the focus to learning, to progress, effort, and resilience, we might create a more inclusive, supportive environment. One where students are encouraged to take risks, to ask questions, and to grow at their own pace.

Rethinking Success

So how do we balance the need for results with the value of the journey?

We can start by redefining what success looks like. Yes, grades matter. But so do curiosity, creativity, collaboration, and character. We need to celebrate improvement and engagement as much as achievement. We can value the process of learning, not just the product.

Teachers can design lessons that encourage exploration and reflection. Parents can ask about what their children learned, not just what they scored. Universities and employers can look beyond grades to see the whole person.

And students? They can be reminded that their worth isn’t defined by a single letter on a piece of paper.

Final Whistle

Don’t get me wrong, I still want my football team to win. And I understand why students want top grades. But just like in football, where the best teams are those that grow, adapt, and play with purpose, the best education systems are those that value the journey as much as the result.

Because in the end, it’s not just about what you achieve. It’s about who you become along the way.

Creating original work?

What Does It Mean to Present Original Work?

In an age of abundant information and powerful tools, the idea of “original work” is increasingly complex. I often find it philosophically difficult when exam boards state, in schools and colleges, that work should be the “student’s own”. What does that actually mean in today’s world? If you write or create something based on a lesson, a book, or with the help of AI, is it still your own?

Learning vs. Creating: Where Does Originality Begin?

Originality doesn’t mean creating something in a vacuum. In fact, most original work is built on what we’ve learned from others. If a teacher explains a concept and you write about it in your own words, that is your work. You’ve processed the information, interpreted it, and expressed it through your own understanding. The same applies when you read a book and then write about it. Your voice, your synthesis, and your framing make it original.

The Role of AI and Other Tools

Using AI to help structure your thoughts, refine your language, or even co-write parts of your work doesn’t automatically make it unoriginal. Tools have always been part of the creative process, and we don’t want to remove them. Think of spellcheckers, grammar guides, or tools that make writing easier, such as the word processor, PowerPoint for creating a slide deck, or even just brainstorming with a friend. Even consider the humble pen and paper. Without the tool the output would be different, so the tool shapes the output, yet we still consider the output to be our own. The key question is: are you using the tool to express your own understanding, or are you outsourcing the thinking entirely?

If AI helps you articulate or present what you’ve learned, it’s still your work. But if you rely on it to generate output which you haven’t engaged with or understood, then I think it is fair to say the final product isn’t your own. So engagement and intent are key here. If AI tools are being used for the right reason, to learn, and you engage with the AI in the production, co-production if you will, then it’s your own work.

Originality Is About Ownership of Thought

At its core, originality is about intellectual ownership. It’s not just about where the information came from, but how you’ve made it your own. Did you wrestle with the ideas? Did you connect them to other things you know? Did you form a perspective? If the answer is yes, then your work is original even if it’s inspired by others or supported by tools.

The Product vs. the Process

One of the most important insights is that the final product doesn’t always show the depth of learning behind it. A polished essay might look effortless, but it could be the result of hours of reflection, revision, and growth. Equally, it could simply have come easily to that individual. Conversely, a well-written piece generated mostly by AI might lack the personal journey that makes learning meaningful. That said, it could also represent the output of hours of effort, iteration, exploration and revision with the support of AI tools.

Consider the classroom, and art, for example: who achieves more, the student with strong artistic skills who produces a good graphic with little effort, or the student with poorer skills who also produces a good graphic, but with the support of AI, where they engaged with the process, contributing their ideas and identity but relying on AI tools for the realisation of the work? Is it the product we value, in which case does it matter? Or is it the process, in which case one of the students was clearly more engaged in effortful learning? I suppose it depends on what you are actually assessing. And maybe that’s part of the issue. Have we become too focused on the product, the good graphic, rather than looking at the process, or, more importantly, the effortful learning?

So perhaps, in asking whether work is the student’s own, a better question is: does this work reflect what they’ve learned? If it does, then it’s a valuable and original contribution.

Note: this piece was written with the help of AI. It comes from my ideas and initial prompting, was refined through further prompting, and finished with my final edits to the text. I post it because it reflects my views on originality, while benefiting from AI’s broader vocabulary and structure. Is it still my work? I think it is.

Technology in Schools: Innovating While Staying Safe

The technology in education landscape is evolving at a pace that often feels dizzying. One look at the last few years of Artificial Intelligence (AI) development alone makes this pretty evident, never mind the pair of smart glasses I am currently experimenting with and what they might mean for students with English as a second language or with special educational needs, not to mention the challenges around academic integrity. New tools, both hardware and software, emerge almost daily. For schools these new tools offer such potential, but adoption is complex and takes time, and relies on teachers having space to trial, experiment, and build confidence before these tools become part of everyday practice.

The Challenge of Adoption

Introducing new tools often means rethinking lesson plans, learning new interfaces, and managing technical hiccups, all while maintaining the core responsibility of delivering quality education. This all takes time. With time and workload being such an issue, as successive teacher workload surveys have identified, it is often a case of weighing possible benefit against the cost of exploring and testing new tools, where such exploring and testing may identify tools that aren’t fit for purpose. When this happens the time appears wasted, leading some to be reluctant to invest the time in the first place. For those who do invest the time, it is either at the expense of other things or at the expense of themselves, exploring in their own time. I have seen many a teacher present at webinars and conferences on how they have used one amazing tool after another, which is great at an individual level and built on their personal investment in exploring and experimenting, but the question is how we scale this up to become the norm across all teachers.

Some years ago I presented at a conference in Dubai on the need for teachers to build confidence where they wish to fully embed technology tools in their practice. However, building such confidence is difficult when there isn’t the time, and when platforms and solutions move on so quickly. So, what is the solution?

Democratising Creativity: Let Students Lead

Perhaps the solution lies in shifting the focus. Instead of expecting teachers to master every new tool, why not empower students to experiment? In the words of David Weinberger, “the smartest person in the room is the room”, so what if we count the students in our classrooms? There is only so much experimenting a single teacher can do, but a class full of students can do much more experimenting and sharing, facilitated and directed by the teacher. The teacher doesn’t need to know every app or tool, as long as they understand the possible risks and pitfalls.

Coursework and project-based learning offer ideal opportunities for pupils to explore emerging technologies, whether that’s creating multimedia presentations, using AI to generate ideas, or leveraging coding platforms to build prototypes. AI tools in particular mean that really impressive products, be these images, presentations, videos, or more, can be put together by any student able to type prompts into an AI, and then to review and refine their prompts and the resulting output. We can truly allow students to be expressive and exploratory in their learning and in how they evidence their learning. It is very much the “democratising creativity” which I have heard Dan Fitzpatrick refer to on many occasions.

Safety First: Privacy and Data Protection

Of course, this freedom must come with guardrails. Schools have a duty to protect students’ welfare, their data and their privacy, so need to ensure that experimentation happens within safe, ethical frameworks. This means clear policies on what tools can be used, vetting and risk-based consideration of platforms, and ongoing education about digital responsibility.

Data protection isn’t just a legal requirement; it’s a trust issue. Parents, staff, and students need confidence that personal information is secure and that applications and tools which teachers introduce or recommend have this security at their heart. I note that nothing is without risk, but that doesn’t absolve us of the responsibility to do what is reasonably possible: to do due diligence, put mitigation measures in place and protect our students where possible.

Cybersecurity incidents, such as phishing attempts or data breaches, underscore the importance of vigilance. Even seemingly minor lapses can erode trust and expose vulnerabilities. I myself have seen how data incidents can have an impact years after the initial incident, even where that incident was deemed low risk at the time. We therefore need to ensure we have processes in place to manage and minimise such risk.

But things going wrong, error and failure, are part of the learning experience, and allowing students to experience such things, along with supporting them on the road beyond and building their resilience, is an important part of preparing for the world beyond education. Supporting students to develop the skills to evaluate what they have done, refine, and improve is likely to become more and more important in a world where information and knowledge are so widely available.

Balancing Innovation and Responsibility

The path forward requires balance. Schools must embrace innovation to remain relevant and prepare students for a digital future, but they cannot do so at the expense of privacy, security, safety or equity. In considering new tools and technologies, this gets me thinking about the following:

  • Data Protection: Does the technology introduce data protection risks in terms of data sharing, including with AI tools for training, or with advertisers or other third parties? How long is data held for? Is the use of data limited to a clearly stated purpose? Where is the data geographically stored?
  • Cyber Security: Does the technology vendor put in place basic security measures such as MFA and requirements for breach and vulnerability notification?
  • Ethics and transparency: Is the tool ethical and “right” in its planned use?
  • Age limitations and T&Cs: Does the planned use align with the terms and conditions, including any terms which limit the age of users? Also, is the tool designed for and appropriate to the age of the planned student users?
  • Intellectual Property: Who owns the product of the given technology? Is it the user, be they student or staff, or is it the technology vendor?
  • Sustainability: Is the solution financially viable into the future, if there is a cost, as well as environmentally sustainable?

I note the above aren’t all the possible considerations, but they are a good starting point. I also note that, for older students, it may be appropriate to get the students themselves thinking about and investigating the above before seeking to use or explore a given tool.

Looking Ahead

Technology will continue to reshape education, but its success depends on thoughtful integration. By empowering students to lead experimentation, schools can harness creativity while giving teachers the time and support they need. Teachers could be the tech “guide on the side”, although there may be work required to help them build the confidence to act as such. Coupled with strong privacy safeguards, this approach ensures that innovation enhances learning without compromising safety.