TRUST IS BUST, DEMOCRACY HANGS ON: JUST, WHO HAS SUSSED, WHO IS FUSSED?

Chris Holmes and guide dog

Thirteen months in the making, today (29th June 2020) we published “Digital Technologies and the Resurrection of Trust”, the report of our House of Lords select committee on democracy and digital technologies, a committee on which I was honoured to serve.

Thousands of pages of written evidence, hundreds of hours of in-person testimony and many more hours of analysis and discussion distilled into 45 recommendations. Each recommendation stands on its own merit but, taken as a whole, they have the potential to change our digital experience and democratic engagement very much for the better.

Chaired brilliantly by David Puttnam, our cross-party committee approved every detail of the report unanimously. Lord Puttnam said, understandably, that it was like making a movie.

It has all the elements of a Hollywood blockbuster: David’s data, purloined by Goliathian tech giants. Our love for democracy tested to its limits, light and dark, duplicity disguised as dream, a man, on an unshakeable mission, ‘to do no harm’.

But this movie is already rolling and has been for well over a decade; picking, pushing, click bait, click hate, the more extreme, the more cash they cream.  The business model is simple, the more extreme, shocking or divisive the content, the longer the user dwells and the more monetizable the whole thing. The ‘attention economy’. Oh yes, at the same time, your data is stored and sold to the highest bidder for whatever they choose – they own it.  The bucks have never stopped here.

And the point, for some, is not to manipulate the democratic process. It’s not about backing a winner or determining the loser; the greater prize is to destroy trust in the democratic process itself.

As Lord Puttnam perfectly puts it:

“It is easy to forget the fragile foundations upon which so many of our freedoms are built – until they become threatened.”

And threatened they most certainly are, right now, under a “pandemic of misinformation and disinformation”. Fake news is nothing new, particularly not in the political arena. What is new is the volume and the velocity with which it distorts and drowns out facts. Yes, those fusty things, facts, thoroughly and fundamentally done in.

Covid could hardly bring a clearer screen for this moving picture. Fake news now ramped up to killer status. In the midst of the pandemic, the anti-vaxxers, the conspiracy theorists and the ‘Covid cures’ taken despite their implausibility. It’s not just the proposed Dettol that stinks.

In just this last week or so, two separate events have epitomised so much of what we speak of in our report. First, the Trump Tulsa rally. Putting aside the decision to hold a mass rally during the worst pandemic in a century, let’s consider the role of social media in this scene. K-Pop-loving Gen Z, via TikTok, got heavily involved, creating fake accounts and snapping up vast numbers of Tulsa tickets they had no intention of using. As a result, just over six thousand Trump fans turned up in a venue fitted for over nineteen thousand. Interestingly, the true success of the TikTok attack was probably not the, undoubtedly powerful and undermining, images of empty seats but the destruction of potentially lucrative data for the Trump campaign team come the November election.

So, objective achieved: Trump, successfully trolled. Successful, in its own terms, certainly, but what of the essence of it? Is trolling Trump any more virtuous than Trump twittering?

And now Facebook faces an advertising boycott despite announcing plans to prohibit hate speech and better protect groups such as immigrants from attacks. Facebook’s plans have not gone far or fast enough to prevent dozens of brands, including Unilever, Verizon and Coca-Cola, cancelling advertising for between a month and six months, causing shares to fall more than ten per cent over the last week.

What about freedom of speech? I completely endorse this right as the bedrock of democracy. But what about freedom of reach? It is significant and brings us back to that issue of volume and velocity. Say what you like; it may well leave a rancid taste, but your right to say it is right. But when such views, for want of the bucks, are promoted and recommended by the platforms, this is a distortion, not a right, and, as we state in the report, the platforms must have a clear responsibility for this.

Our 45 recommendations deal with this principle and also cover, in detail, regulation, regulators, sanctions and more:

Regulation of mis/disinformation

We recommend that the Online Harms Bill (OH Bill) should be introduced within a year of this report’s publication and should make it clear that misinformation and disinformation are in scope.

As part of the OH Bill, Ofcom should produce a code of practice on misinformation – if a piece or pattern of content is identified as misinformation by an accredited fact checker, it should be flagged as misinformation on all platforms. The content should then no longer be recommended to new audiences.
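By way of illustration only (neither the report nor Ofcom prescribes an implementation), here is a minimal sketch of how a platform might honour the ‘flag, then do not recommend’ principle. It assumes a hypothetical shared register of fact-checked claims; every name and structure below is illustrative.

```python
from dataclasses import dataclass

# Hypothetical shared register of claims that accredited fact checkers have
# marked as misinformation (illustrative fingerprints only).
FACT_CHECK_REGISTER = {"claim:5g-causes-covid", "claim:bleach-cures-covid"}

@dataclass
class Post:
    post_id: str
    claim_fingerprint: str   # normalised identifier for the claim the post repeats
    flagged: bool = False

def apply_code_of_practice(post, candidate_audience):
    """Flag fact-checked misinformation and stop recommending it to new audiences."""
    if post.claim_fingerprint in FACT_CHECK_REGISTER:
        post.flagged = True        # label the content as misinformation
        return []                  # do not recommend it to any new users
    return candidate_audience      # otherwise recommendation proceeds as normal

# Usage: a post repeating a fact-checked claim is flagged and not amplified further.
post = Post("p1", "claim:5g-causes-covid")
print(apply_code_of_practice(post, ["new_user_a", "new_user_b"]), post.flagged)  # [] True
```

Note that, in this sketch, the content is not removed: it remains visible where it was posted, but the platform no longer pushes it to fresh audiences.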

Fact checking

Ofcom should work with online platforms to agree a common means of accreditation, initially based on the International Fact-Checking Network (IFCN); a system of funding that keeps fact checkers independent both from Government and from platforms; and an open database of what has been fact checked across platforms and providers.

Content moderation

The Government should establish an independent ombudsman for content moderation to whom the public can appeal should they feel they have been let down by a platform’s decisions. The ombudsman’s decisions should be binding on the platforms and create clear standards to be expected for future decisions for UK users. These standards should be adjudicated by Ofcom. The ombudsman should not prevent platforms removing content which they have due cause to remove.

A joint committee of Parliament would oversee the work of the proposed ombudsman, including setting the budget and having the power of veto over the appointment of the chief executive.

Political advertising

We recommend that relevant experts in the Advertising Standards Authority (ASA), the Electoral Commission, Ofcom and the UK Statistics Authority (UKSA) should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or referendum. This regulatory committee should adjudicate breaches of this code.

Imprints

The Government should legislate immediately to introduce imprints on online political material. This could be done through secondary legislation.

Advert libraries

Ofcom should issue a code of practice for online advertising setting out that in order for platforms to meet their obligations under the ‘duty of care’ they must provide a comprehensive, real time and publicly accessible database of all adverts on their platform. This code of practice should make use of existing work on best practice.

Personal data in political campaigns

The Government should legislate to put the Information Commissioner’s Office (ICO) draft code on political campaigners’ use of personal data onto a statutory footing.

Algorithmic recommendation

For harmful but legal content, Ofcom’s codes of practice should focus on the principle that platforms should be liable for the content they rank, recommend or target to users.

Ofcom should issue a code of practice on algorithmic recommending. This should require platforms to conduct audits on all substantial changes to their algorithmic recommending facilities for their effects on users with characteristics protected under the Equality Act 2010. Ofcom should work with platforms to establish audits on relevant and appropriate characteristics.

Ofcom should be given the powers and be properly resourced in order to undertake periodic audits of the algorithmic recommending systems used by technology platforms, including accessing the training data used to train the systems and comprehensive information from the platforms on what content is being recommended.
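The report does not specify how such audits should be carried out. Purely as a sketch, and assuming a hypothetical log of recommendations together with a mapping from users to a protected characteristic, a first-pass exposure audit might compare how often different groups are served a given class of content:

```python
import random
from collections import defaultdict

def exposure_audit(recommendation_log, user_group):
    """Share of recommendations in a given content class, broken down by user group.

    recommendation_log: list of (user_id, content_category) pairs from the recommender.
    user_group: mapping of user_id to a protected characteristic.
    Large gaps between groups are a prompt for deeper investigation, not proof of harm.
    """
    counts = defaultdict(lambda: {"total": 0, "harmful": 0})
    for user_id, category in recommendation_log:
        group = user_group[user_id]
        counts[group]["total"] += 1
        if category == "harmful_but_legal":
            counts[group]["harmful"] += 1
    return {g: c["harmful"] / c["total"] for g, c in counts.items() if c["total"]}

# Synthetic log for illustration; a real audit would use the platform's own records,
# run before and after any substantial change to the recommending system.
users = {f"u{i}": ("group_a" if i % 2 else "group_b") for i in range(1000)}
log = [(u, random.choice(["news", "sport", "harmful_but_legal"]))
       for u in users for _ in range(5)]
print(exposure_audit(log, users))  # e.g. {'group_a': 0.33, 'group_b': 0.34}
```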

Platforms v publishers

The report uses the term ‘platforms’ but holds them to a duty of care: they are responsible for the content that they promote to large audiences, rather than the content they merely host. Ofcom should have the power to sanction platforms that fail to comply with their duty of care in the OH Bill. These sanctions should include fines of up to 4% of global turnover, and powers to require Internet Service Providers to block serially non-compliant platforms.

Regulatory capacity

The Government should introduce legislation to enact the ICO’s proposal for a committee of regulators that would allow for joint investigations between regulators. This committee should also act as a forum to encourage the sharing of best practice between regulators and support horizon scanning activity.

The Centre for Data Ethics and Innovation should conduct a review of regulatory digital capacity across the Competition and Markets Authority (CMA), ICO, Electoral Commission, ASA and Ofcom to determine their levels of digital expertise. This review should be completed with urgency, to inform the OH Bill before it becomes law.

Freedom of expression

We protect free expression online by focusing on what platforms algorithmically promote rather than what they host. This means that platforms would not be encouraged to remove harmful but legal content; they would be required not to promote it through their algorithms or recommend it to users. This gives people the freedom to express themselves online but stops such content from reaching large audiences.

We also support free expression by improving platforms’ content moderation decisions. We do this by requiring greater transparency about what content they take down, so that the rules that govern online debate are clearer, and by establishing an online ombudsman who will be empowered to act for users online.

Anonymity online

Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users.

Online voting

We received a small amount of evidence that was in favour of online voting. In the round, however, opinion was overwhelmingly against introducing voting online. We heard that online voting might cause people to question the trustworthiness of election results and create fertile ground for conspiracy theories.

Exercising your democratic vote is an important act that should have some ceremony about it; visiting a polling station, for those for whom this is possible, is an important part of this. We should not seek to substitute or undermine this significant and important act with an online process.

Journalism in a digital world

We recommend that the CMA should conduct a full market investigation into online platforms’ control over digital advertising.

The Government should work urgently to implement those recommendations of the Cairncross Review that it accepts, as well as providing support for news organisations in dealing with the impact of COVID-19.

Education/digital literacy

Ofsted, in partnership with the Department for Education, Ofcom, the ICO and subject associations, should commission a large-scale programme of evaluation of digital media literacy initiatives.

The Department for Education should review the school curriculum to ensure that pupils are equipped with all the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum based on the lessons learned from the review of initiatives recommended above. All teachers will need support through Continuous Professional Development to achieve this.

 

Currently, we have a suboptimal online world that is contributing to the erosion of trust in our democratic institutions and processes. We hope that our report and the 45 recommendations within it demonstrate that there is nothing inevitable about this and, velocity aside, not that much that is new.

It’s in our hands, our (well-washed) hands, and as we type, tap and share, we must take more care. What kind of conversations and discussions do we want to be part of, and on what kind of social media? Do we want rigorous, respectful debates that are open, transparent, accountable and trustworthy?

If not, why not pack away the public square, put away the polling stations, and with muffled cry, let our democracy die.

It’s no one else’s democracy, save ours.

AI, A Public Good?

The Covid pandemic is providing ample opportunities to consider the potential of technology for public good; indeed, what area of public good is more important than public health? One example, the NHSX tracing app, promised significant benefits although, unfortunately, these seem far from being realised. It does, however, give us an opportunity to look at where such technology has worked and to consider the risks.

A recent report from Stanford shows how Taiwan managed to avoid the extreme lockdown measures seen here and around the world yet still successfully limited and contained the spread of the virus. How? The report identifies five interconnected factors: pandemic readiness, a national electronic health records database, wide-scale testing, big data analytics and the use of mobile technology to track the movements of individuals who tested positive for Covid-19.

The benefits of a functioning mobile app are clear, but the use of this technology has raised concerns around transparency, trust and data privacy rights. This is an important issue for public discussion. I am passionate about the potential of technology for the public good and believe there is no better part of society than the public sector to lead the charge in the UK’s role as a global leader in responsible AI innovation. Our Civil Service colleagues will be able to do a tremendous amount of social good if they approach the design and implementation of AI systems by making the realisation of ethical purpose and the pursuit of responsible practices of discovery a first priority.

Last year the Government published a guide to using artificial intelligence in the public sector. The guidance consists of three sections: understanding AI; assessing, planning and managing AI; and, most importantly, using AI ethically and safely. The guidance focuses heavily on the need for a human-centric approach to AI systems, which aligns with the positions of other forums, including our work on the Lords AI Select Committee. The guidance also stresses the importance of building a culture of responsible innovation, and recommends that the governance architecture of AI systems should consist of a framework of ethical values; a set of actionable principles; and a process-based governance framework.

I have asked the government what plans they have to put this guidance on a statutory footing.

I hope they will think carefully about the statutory and non-statutory mechanisms needed to ensure the safe and ethical use of AI and data technologies. The Government has also promised that a national data strategy will be published this year. It is absolutely essential that we get this right. If we make sure we are regulating in a way that supports the design and implementation of ethical, fair and safe AI systems, then that really would be ‘world beating’.

 

Chris questions the government about the “potential and pitfalls” of big data

 

 

Today (5th February 2018) in the House of Lords, Chris asked the Government what action they are taking, or plan to take, to ensure that people are aware of their rights and obligations in respect of data protection and privacy.

Writing in Politics Home Chris argues that we must raise awareness and start a public debate to promote a greater understanding of data: the value, use cases, protections, obligations and ethics including asking what a social contract based on data for public good might look like.

“Data is increasingly described as the fuel driving the 4th IR, just like the oil and electricity of previous industrial revolutions, but unlike those earlier examples we, the people, are producing the data. Huge tech giants have users and customers, and as users our data is ‘farmed’ and monetised then sold to customers. Most of us understand and accept this arrangement as beneficial to us and relatively benign: in exchange for information about our shopping habits we receive (sometimes useful) targeted adverts. But the potential of data, particularly as fuel for Artificial Intelligence, extends far beyond these commercial applications. Already, insurance companies, the police and other organisations are watching and using our data in ways that are less well known or understood. Just one statistic helps underline the exponential ‘everythingness’ of this: 90% of all data was created in the last two years.

Data is a precious resource. Of course, as with anything, all that glistens is not gold, but there is gold, diamonds and doubloons aplenty if we get this right. Imagine a health service transformed through insightful, responsible use of data; in law enforcement, big data could become our best detective; and if we no longer fancy horse in the lasagne, data has a role. We can transform our public services and be at the forefront of global standards; it’s down to us. Everything and nothing changes. We are experiencing an extraordinary pace of change, all driven by data, but we still seek truths, reliability and connection. Thus, we must have conversations about data: who owns it, what rights we have and what obligations. What could a social contract based on data for public good look like?

Progress is being made in terms of legislation governing data: from May this year the GDPR will come into force, and the Data Protection Bill is currently making its way through Parliament. Many colleagues have done excellent work on the Bill and I am heartened that the Government has accepted an amendment to ensure a more robust data regime for children, following outstanding cross-party working led by Baroness Kidron. Also in the Lords, Baroness Lane-Fox tabled an excellent debate in September calling for improved digital awareness.

It is incumbent upon us as legislators to determine how we regulate, how we transact, how we thrive in the Fourth Industrial Revolution. In doing this we must protect citizens’ rights and responsibilities and, most of all, ask what society we want to be. I welcome the Government’s work to develop a digital charter and establish a centre for data ethics and innovation. These are excellent initiatives but we must keep up the pressure, particularly in communicating this work to the general public. We cannot leave it to the tech giants to determine what is best for us, as we saw recently with Zuckerberg’s claim that Facebook’s recent changes were made “in order to amplify the good”.

In conclusion, we have everything we need and more to engage in the most marvellous of public debates, like that which Baroness Warnock so beautifully achieved with reproductive advances decades ago. We are born human, we have the ethics, we can plot the innovation, all rooted in transparency and trust, commerciality and care, and an unyielding focus on proportionality and purpose, putting data sharing at the heart of a new 21st-century social contract.”

Chris questions the Government on its response to the fourth industrial revolution

Today (15th November 2017) Chris asked the Government what cross-Whitehall work they are undertaking to maximize opportunities from the fourth industrial revolution; particularly in terms of digital skills, artificial intelligence, machine learning and distributed ledger technology.

The question was intentionally broad and included a range of technologies and priorities as Chris hoped to highlight that the Government must grasp the ‘everythingness’ of this new technology.

As Chris wrote in Politics Home: “the 4IR is already well underway and it will make the first industrial revolution sound a mild murmur by comparison… There is no separate world of digital. It won’t be possible to focus on, for example, health, education or defence and leave others to “do the digital”. Crucially, the 4IR is inevitable, not optional, and whilst I welcome the inclusion of digital in DCMS I seek reassurance that the scale of the challenge and the necessity for a cross-governmental approach is understood and acted upon.

The technology may be complex – who really knows what goes on inside the black box at DeepMind or appreciates the finer details of the cryptographic hash function of Bitcoin? But this is not about the tech per se; it is about the potential, the solutions which can be realised and what will be required from Government, from all of us, for such realisation to become reality.”

 

What should the government do with blockchain?

Today I’ll be asking the government what they have learnt from an ongoing trial in which benefits are paid to people via a system using blockchain (or distributed ledger) technology. A blockchain is an asset database that can be shared across several networks, and the trial – run by fintech company GovCoin and researchers at University College London – pays participants through an app on their smartphone which connects them to various services.
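To make the underlying data structure concrete, here is a minimal, purely illustrative sketch of a hash-chained ledger in Python. It is not the GovCoin system, and every name in it is hypothetical; the point is simply that each block commits to the one before it, so past records cannot be quietly altered.

```python
import hashlib
import json
import time

def hash_block(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class Ledger:
    """A toy append-only ledger: every block commits to the block before it."""

    def __init__(self):
        self.chain = [{"index": 0, "timestamp": time.time(),
                       "payments": [], "previous_hash": "0" * 64}]

    def add_block(self, payments):
        block = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "payments": payments,                          # e.g. benefit payments in a trial
            "previous_hash": hash_block(self.chain[-1]),   # the link that forms the chain
        }
        self.chain.append(block)
        return block

    def is_valid(self):
        """Tampering with any earlier block breaks every later link."""
        return all(self.chain[i]["previous_hash"] == hash_block(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = Ledger()
ledger.add_block([{"recipient": "participant-001", "amount": 57.90}])
print(ledger.is_valid())  # True; alter any past block and this becomes False
```

In a genuinely distributed ledger, copies of this chain are held and checked across several networks rather than by a single party, which is what gives the approach its appeal for trust and transparency.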

I passionately believe in the potential for technology to transform our lives for the better and think it essential that both government and society start from the point of a considered can, rather than a fearful can’t. I hope that learning more about the GovCoin trial will help us all understand, and be part of, the changes that are coming. I also believe the government needs to look wider than the Department for Work and Pensions for applications of this technology – across Whitehall and the whole public sector – and also seriously consider the move from proof of concept to pilot to scale.

Advances in technology can absolutely be about empowering, enabling and creating closer, more effective relationships. Distributed ledger technology, if applied properly by seriously addressing issues of privacy, security, identity and trust, can offer incredible benefits to us all, including, but not limited to, reduced costs for government (and taxpayer) and better services for individuals.

As a member of the Lords Committee on Financial Exclusion, I can point to our report, published this weekend, which found that more than 1.7m people in the UK do not have a bank account; further estimates suggest at least 600,000 older people are financially excluded. A combination of distributed ledger technology and developments such as the Payment Services Directive 2 (PSD2) could lead to greater financial inclusion of people currently on the fringes of the financial system. These are serious and tangible benefits.

Another significant area of potential is the transformation of the relationship between government agencies and citizens. Greater transparency and trust should lead to a more equal, connected and far closer relationship. But this will not happen as a matter of course and there needs to be a principles-based, appropriate framework that is underpinned by an understanding of the philosophical, psychological and legal issues at play.

The best way for the government to move ahead with the work is to adopt clear, honest communication with the public. There must be clarity about what the government is trying to do and how to get there – and crucially how it’s a joint endeavour. This means a further shift towards user-centric service design and co-production that sees people as active parties, rather than passive recipients. People must understand the value of their data and have ownership of it. The Digital Economy Bill, currently making its way through parliament, offers a start in dealing with some of these questions and considerations but lacks ambition and has provoked understandable concerns regarding privacy and data sharing powers.

Digital Skills Select Committee Report Published

Chris on screen, Sky Digital View

The report, entitled “Make or Break: The UK’s Digital Future”, urges the incoming Government to seize the opportunity to secure the UK’s place as a global digital leader and makes some significant recommendations. Key recommendations are: treating the internet as a utility, prioritising and regulating it in the same way as other essential utilities; making digital literacy a core subject, as important as English and Maths; and putting a single “Digital Agenda” at the heart of Government with a Cabinet Minister responsible for it. Chris feels strongly about the need to set more ambitious goals and urges the next Government to adopt the Committee’s recommendations.

Full report

Media reports

Incredible new 3D audio technology

 

Chris sitting on the BBC Breakfast sofa to discuss new technology to help visually impaired people

Chris being interviewed about new technology to help visually impaired people.

Chris took part in a trial of an amazing new device, currently at prototype stage, that equips blind people with extra tools and information when navigating city centres, including accessing public transport.

Speaking on Radio 5 live Chris described the headset as “your navigator, personal guide, transport information, local historian, all the (information about) shopping stuff… It’s absolutely incredible. And it’s the confidence it gives you. It was the first time I had done that route and it just felt so comfortable.”

The technology – developed by Microsoft, Guide Dogs and the UK government’s Future Cities Catapult – takes the form of a smart headset paired with a Windows Phone app, which has been designed for people with sight loss.

The headset is a modified pair of headphones, which hooks over the wearer’s ears and rests on their jaw bone, transmitting sound to their inner ear using vibrations. This means that the wearer can hear sound from the headphones and from their environment simultaneously.

Microsoft has added a small 3D-printed box on the back of the headset containing an accelerometer, a gyroscope and a compass, as well as a GPS chip, so that the user’s position can be tracked.

Click here for further media coverage of the device…