TRUST IS BUST, DEMOCRACY HANGS ON: JUST, WHO HAS SUSSED, WHO IS FUSSED?

Chris Holmes and guide dog

Thirteen months in the making, today (29th June 2020) we published “Digital Technologies and the Resurrection of Trust”, our House of Lords select committee report into democracy and digital technologies, produced by a committee on which I was honoured to serve.

Thousands of pages of written evidence, hundreds of hours of in-person testimony and many more hours of analysis and discussion, distilled into 45 recommendations. Each recommendation stands on its own merits but, taken as a whole, they have the potential to change our digital experience and democratic engagement very much for the better.

Chaired brilliantly by Lord David Puttnam, our cross-party committee approved every detail of the report unanimously. Lord Puttnam said, understandably, that it was like making a movie.

It has all the elements of a Hollywood blockbuster: David’s data, purloined by Goliath-like tech giants. Our love of democracy tested to its limits, light and dark, duplicity disguised as dream, a man, on an unshakeable mission, ‘to do no harm’.

But this movie is already rolling, and has been for well over a decade: picking, pushing, clickbait, click hate; the more extreme, the more cash they cream. The business model is simple: the more extreme, shocking or divisive the content, the longer the user dwells and the more monetisable the whole thing becomes. The ‘attention economy’. Oh yes, and at the same time, your data is stored and sold to the highest bidder to use however they choose – they own it. The bucks have never stopped here.

And the point, for some, is not to manipulate the democratic process. It’s not about backing a winner or determining the loser; the greater prize is to destroy trust in the democratic process itself.

As Lord Puttnam perfectly puts it:

“It is easy to forget the fragile foundations upon which so many of our freedoms are built – until they become threatened.”

And threatened they most certainly are, right now, under a “pandemic of misinformation and disinformation”. Fake news is nothing new, particularly not in the political arena. What is new is the volume and velocity, which distort and drown out facts. Yes, those fusty things, facts, thoroughly and fundamentally done in.

Covid could hardly provide a clearer screen for this moving picture, with fake news now ramped up to killer status. In the midst of the pandemic: the anti-vaxxers, the conspiracy theorists and the ‘Covid cures’ taken despite their implausibility. It’s not just the proposed Dettol that stinks.

Just this last week or so, two separate events have epitomised so much of what we speak of in our report. First, the Trump Tulsa rally. Putting aside the decision to hold a mass rally during the worst pandemic in a century, let’s consider the role of social media in this scene. K-Pop-loving Gen Z, via TikTok, got heavily involved, creating fake accounts and snapping up vast numbers of Tulsa tickets they had no intention of using. As a result, just over six thousand Trump fans turned up in a venue with capacity for over nineteen thousand. Interestingly, the true success of the TikTok attack was probably not the undoubtedly powerful and undermining images of empty seats, but the destruction of potentially lucrative data for the Trump campaign team come the November election.

So, objective achieved: Trump successfully trolled. Successful in its own terms, certainly, but what of the essence of it? Is trolling Trump any more virtuous than Trump tweeting?

And now Facebook faces an advertising boycott, despite announcing plans to prohibit hate speech and better protect groups such as immigrants from attacks. Facebook’s plans have not gone far or fast enough to prevent dozens of brands, including Unilever, Verizon and Coca-Cola, cancelling advertising for between one and six months, causing shares to fall more than ten per cent over the last week.

What about freedom of speech? I completely endorse this right as the bedrock of democracy. But what about freedom of reach? It is significant, and brings us back to that issue of volume and velocity. Say what you like; it may well leave a rancid taste, but your right to say it is right. But when such views, for want of the bucks, are promoted and recommended by the platforms, that is a distortion, not a right, and, as we state in the report, the platforms must bear clear responsibility for it.

Our 45 recommendations deal with this principle and also cover, in detail, regulation, regulators, sanctions and more:

Regulation of mis/disinformation

We recommend that the Online Harms Bill (OH Bill) should be introduced within a year of this report’s publication and should make it clear that mis and disinformation are in scope.

As part of the OH Bill, Ofcom should produce a code of practice on misinformation – if a piece or pattern of content is identified as misinformation by an accredited fact checker, it should be flagged as misinformation on all platforms. The content should then no longer be recommended to new audiences.

Fact checking

Ofcom should work with online platforms to agree a common means of accreditation, initially based on the International Fact-Checking Network (IFCN); a system of funding that keeps fact checkers independent both from Government and from platforms; and an open database of what has been fact checked across platforms and providers.

Content moderation

The Government should establish an independent ombudsman for content moderation to whom the public can appeal should they feel they have been let down by a platform’s decisions. The ombudsman’s decisions should be binding on the platforms and create clear standards to be expected for future decisions for UK users. These standards should be adjudicated by Ofcom. The ombudsman should not prevent platforms removing content which they have due cause to remove.

A joint committee of Parliament would oversee the work of the proposed ombudsman, including setting the budget and having the power of veto over the chief executive’s appointment.

Political advertising

We recommend that relevant experts in the Advertising Standards Authority (ASA), Electoral Commission, Ofcom and UKSA should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or referendum. This regulatory committee should adjudicate breaches of this code.

Imprints

The Government should legislate immediately to introduce imprints on online political material. This could be done through secondary legislation.

Advert libraries

Ofcom should issue a code of practice for online advertising setting out that, in order for platforms to meet their obligations under the ‘duty of care’, they must provide a comprehensive, real-time and publicly accessible database of all adverts on their platform. This code of practice should make use of existing work on best practice.

Personal data in political campaigns

The Government should legislate to put the Information Commissioner’s Office’s (ICO) draft code on political campaigners’ use of personal data onto a statutory footing.

Algorithmic recommendation

For harmful but legal content, Ofcom’s codes of practice should focus on the principle that platforms should be liable for the content they rank, recommend or target to users.

Ofcom should issue a code of practice on algorithmic recommending. This should require platforms to conduct audits on all substantial changes to their algorithmic recommending facilities for their effects on users with characteristics protected under the Equality Act 2010. Ofcom should work with platforms to establish audits on relevant and appropriate characteristics.

Ofcom should be given the powers and be properly resourced in order to undertake periodic audits of the algorithmic recommending systems used by technology platforms, including accessing the training data used to train the systems and comprehensive information from the platforms on what content is being recommended.

Platforms v publishers

The report uses the term ‘platforms’ but holds them to a duty of care: responsible for the content that they promote to large audiences, rather than the content they merely host. Ofcom should have the power to sanction platforms that fail to comply with their duty of care under the OH Bill. These sanctions should include fines of up to 4% of global turnover, and powers to require Internet Service Providers to block serially non-compliant platforms.

Regulatory capacity

The Government should introduce legislation to enact the ICO’s proposal for a committee of regulators that would allow for joint investigations between regulators. This committee should also act as a forum to encourage the sharing of best practice between regulators and support horizon scanning activity.

The Centre for Data Ethics and Innovation should conduct a review of regulatory digital capacity across the Competition and Markets Authority (CMA), ICO, Electoral Commission, ASA and Ofcom to determine their levels of digital expertise. This review should be completed with urgency, to inform the OH Bill before it becomes law.

Freedom of expression

We protect free expression online by focusing on what platforms algorithmically promote rather than what they host. This means that platforms would not be encouraged to remove harmful but legal content; they would instead be required not to promote it through their algorithms or recommend it to users. This gives people the freedom to express themselves online but stops harmful content from reaching large audiences.

We also support free expression by improving platforms’ content moderation decisions. We do this by requiring greater transparency about what content they take down, so that the rules that govern online debate are clearer, and by establishing an online ombudsman who will be empowered to act for users online.

Anonymity online

Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users.

Online voting

We received a small amount of evidence that was in favour of online voting. In the round, however, opinion was overwhelmingly against introducing voting online. We heard that online voting might cause people to question the trustworthiness of election results and create fertile ground for conspiracy theories.

Exercising your democratic vote is an important act that should have some ceremony about it; visiting a polling station, for those for whom this is possible, is an important part of this. We should not seek to substitute or undermine this significant and important act with an online process.

Journalism in a digital world

We recommend that the CMA should conduct a full market investigation into online platforms’ control over digital advertising.

The Government should work urgently to implement those recommendations of the Cairncross Review that it accepts, as well as providing support for news organisations in dealing with the impact of COVID-19.

Education/digital literacy

Ofsted, in partnership with the Department for Education, Ofcom, the ICO and subject associations, should commission a large-scale programme of evaluation of digital media literacy initiatives.

The Department for Education should review the school curriculum to ensure that pupils are equipped with all the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum based on the lessons learned from the review of initiatives recommended above. All teachers will need support through Continuous Professional Development to achieve this.

 

Currently, we have a suboptimal online world that is contributing to the erosion of trust in our democratic institutions and processes. We hope that our report and the 45 recommendations within it demonstrate that there is nothing inevitable about this and, velocity aside, not that much which is new.

It’s in our hands, our (well washed) hands and as we type, tap, and share, we must take more care.  What kind of conversations and discussions do we want to be part of and on what kind of social media? Do we want rigorous respectful debates that are open, transparent, accountable and trustworthy?

If not, why not pack away the public square, put away the polling stations, and with muffled cry, let our democracy die.

It’s no one else’s democracy, save ours.

Lords report asks for ethical AI

Chris has been a member of the House of Lords Select Committee on AI, and on Monday (16th April) the final report and recommendations were published: “AI in the UK: ready, willing and able?” Following nine months of expert witness evidence and extensive consideration, the report’s conclusions and recommendations emphasise that the UK is in a strong position to be a world leader in AI, and that putting ethics at the heart of development and use is the best way to achieve this. AI, handled carefully, could be a great opportunity for the economy. The report makes 74 specific recommendations, but one key recommendation is for a cross-sector ethical code for AI, underpinned by five principles:

1. AI should be developed for the common good and benefit of humanity.

2. AI should operate on principles of intelligibility and fairness.

3. AI should not be used to diminish the data rights or privacy of individuals, families or communities.

4. All citizens have the right to be educated to enable them to flourish mentally, emotionally and economically alongside AI.

5. The autonomous power to hurt, destroy, or deceive human beings should never be vested in AI.

Chris visits Rolls Royce site in Derby

Chris is undertaking a Fellowship Programme organised by the Industry and Parliament Trust (IPT). IPT fellowships are educational, non-lobbying attachments for parliamentarians, providing a unique perspective of the challenges and issues facing UK industry. Chris is particularly interested in the ways in which businesses and organisations are introducing new technologies and exploring different approaches to innovation. With this focus and visits arranged across sectors Chris hopes to get a detailed perspective on the strengths and challenges facing UK industry as the fourth industrial revolution gathers pace.

At the start of his visit Chris was given a tour of the Technology Exhibition, where he was able to learn more about next-generation manufacturing techniques and technologies, including the Trent XWB aero engine, the most efficient widebody aero engine. Rolls-Royce has invested £30m in its production facility in Derby to help deliver its order book, including 1,600 Trent XWB engines. Chris was also delighted to sit down and hear from a group of young people involved in the on-site Apprentice Academy.

IPT Press Release about Rolls Royce Visit