Why it is incumbent on edtech companies to lead the way on data privacy when the Government demonstrates that it won’t do it for you.
As we move into an even more digital age in education, accelerated by the need to use more edtech during the COVID-19 pandemic, we are increasingly finding that there are edtech vendors who don’t adhere to either the legal or moral imperatives to do right by our children.
Historically, we’ve come a long way with data privacy, within my living memory from the Data Protection Act 1984. Designed to protect personal data stored on computers or in organised paper filing systems, it enshrined terms such as “data subject”, “data user”, “processing” and “disclosing”. It also contained a term that can only be described as archaic, “computer bureau”, complete with dated gendered language in its definition: a person carries on a computer bureau “if he provides other persons with services in respect of data”.
It also enshrined eight basic principles, which still underpin much of what we know today in the key principles of the 2018 Data Protection Act and many of the principles of the GDPR.
Taking duty seriously
Whilst some may find it difficult to keep on top of the roles and responsibilities that come with their statutory duties, there’s no denying that when it comes to education, data and young people, we have not only a legal duty but a moral one too. It is not right to use tracking, click or biometric data for commercial gain.
As systems and technologies develop, guidance and advice on the legalities has been produced to help. Earlier this year, Defend Digital Me released its “State of Biometrics 2022: A Review of Policy and Practice in UK Education” report, which covered many of the issues and concerns around biometrics and their use in education. More recently, England’s Department for Education released the guidance document “Protection of children’s biometric information in schools”. Both provide useful information to inform how we work, and the DfE’s document is particularly helpful, with templates for parental consent forms and clear reinforcement of the requirements around DPIAs (Data Protection Impact Assessments).
Weighing all of this up, Tony Sheppard, Information Governance Lead at NetSupport, recently commented on a LinkedIn thread about FRT (Facial Recognition Technology), arguing for a balanced and proportionate approach to its use. The conversation discussed using facial recognition in cashless payment systems to speed up service at school lunchtimes. He shared:
“The benefits are never made clear enough. They rarely get tied in with whole-school strategies. Without this, you are not doing a full impact assessment. Explore the benefits, carefully watch the risks and explore the alternatives. Unless all questions are asked and answered, we don’t get a true picture.”
Education supplier attitudes unmasked
Well, it’s great that many are trying to do the right thing, but it is also clear that several companies truly aren’t. A report in the Washington Post, covering research by the advocacy group Human Rights Watch, which analysed 163 educational apps and websites across 49 countries, found that almost 90% of the tools checked were designed to send the information they collected on to third-party advertising technology companies. Shocked? You should be. At one end of the spectrum, responsible companies undertake painstaking work to support schools and improve the conversation around data privacy; at the other end, we can see how less scrupulous organisations behave.
In its report, “How Dare They Peep into My Private Life? Children’s Rights Violations by Governments that Endorsed Online Learning During the Covid-19 Pandemic” Human Rights Watch shared:
“Human Rights Watch found that children’s educational websites installed as many third-party trackers on personal devices as do the world’s most popular websites aimed at adults. […] Put another way, children are just as likely to be surveilled in their virtual classrooms as adults shopping in the world’s largest virtual malls, if not more so.”
What NetSupport is doing
Most NetSupport products, such as NetSupport DNA, NetSupport Manager and NetSupport Notify, are what are called ‘on-prem’ (on-premises) products, so we never receive any data from you, unless you get in touch and need some support, which we happily provide.
Our online SaaS (software as a service) products, such as classroom.cloud, do require us to hold data in our Azure cloud storage for the solution to work, but only for the purposes of the software working as intended. We are clear about our use of data in our data responsibilities and, via our data privacy notice, you can see what we store, for how long and for what purpose, and that it remains completely within the control of our customers.
Our modus operandi is a ‘beyond best practice’ approach, and we work with some of the best people in the business to provide a check and balance that ensures we go beyond what is expected.
For example, if you run a scan of our classroom.cloud site using ‘Blacklight’, a free real-time website privacy inspector, you’ll see we have just one ad tracker, for Google Analytics. We use this simply for site maintenance and to ensure our tool is fit for purpose based on how visitors arrive at the site.
Does your company do this?
Top takeaways for your organisation to consider
It may well be the case that you’re a vendor reading this and wondering, “What should I do?”. So, here are a few simple actions to help you understand where you stand, what your responsibilities are and what you can do.
- Read the information freely available from the Information Commissioner’s Office. It’s a great place to read up and learn about roles and responsibilities.
- Check out your solution and consider its privacy using a tool such as Blacklight.
- Act based upon what you find from your research.
- Make sure your data processing agreement is clear, transparent and helpful.
- Share your data privacy approaches publicly. It will increase transparency and hopefully advocate for your product and approach too.
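As a crude, self-contained illustration of the second action above, the sketch below scans a page’s HTML for script tags served from third-party hosts, one of the signals a tool such as Blacklight reports in a far more thorough way. The page markup and domains here are entirely hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptSrcParser(HTMLParser):
    """Collects the host of every <script src=...> tag found in a page."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host:  # inline scripts have no src, so no host
                self.hosts.add(host)

def third_party_script_hosts(html, first_party_host):
    """Return script hosts that are neither the first-party domain
    nor one of its subdomains."""
    parser = ScriptSrcParser()
    parser.feed(html)
    return {h for h in parser.hosts
            if h != first_party_host
            and not h.endswith("." + first_party_host)}

# Hypothetical page: one first-party script, one analytics tracker.
sample = """
<html><head>
  <script src="https://example-edtech.com/js/app.js"></script>
  <script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
</head></html>
"""
print(third_party_script_hosts(sample, "example-edtech.com"))
# → {'www.googletagmanager.com'}
```

This only catches externally loaded scripts, not cookies, pixels or fingerprinting, so treat it as a starting point for the conversation rather than a substitute for a proper privacy audit.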
And, if you’re an educator, why not consider making this a benchmark for your interactions and engagements with any educational technology you may be considering?