Amendment 103 | Employment Rights Bill – Report (2nd Day) | Lords debates

My Lords, it is a pleasure to move Amendment 103 in my name. As this is the first time I have spoken on the Bill on Report, I declare my relevant interests as set out in the register as a member of the global advisory board of Endeavour plc and of the science and technology advisory committee of the Crown Estate, and I had a speaking engagement with the FCSA earlier this year.

Amendment 103 is incredibly simple and extraordinarily important for all those young people who have the most appalling start to their career through finding themselves on the wrong end of an unpaid internship. This has been going on for decades, and it goes on in some of our smartest industries in the 21st century.

The amendment is a reincarnation of a Private Member’s Bill that I brought forward in 2017. I am delighted to say that when I brought that Bill, which is now Amendment 103 to this Bill, it received full-throated support from the Labour Opposition, whom I thank. It also received full-throated support from the TUC and the noble Baroness, Lady O’Grady, whom I thank.

The amendment simply seeks to give young people the right to have a positive experience—often their first—of entering the labour market. Unpaid internships are already illegal under the National Minimum Wage Regulations, but this amendment further clarifies and specifies what work experience is and, crucially, what it is not. It stops work experience being used as a cover for unpaid internships.

When I drafted the amendment, my first inclination was to have work experience paid from day one. But after wide consultation with businesses and trade unions and across civil society, it was clear that four weeks was the right point to suggest that young people—indeed, any person—could do genuine work experience, observing, learning and replicating tasks. If that person is brought on board and is doing work from day one, they are protected by the National Minimum Wage Regulations and are entitled to pay. Work experience has a vital role to play in our society and, as the results of my consultation underpin, four weeks is the right point at which to set the limit.

When the amendment was debated in Committee, when sadly I could not be present, a number of views were put forward that suggested there were difficulties with it because unscrupulous employers could simply have numerous rounds of four-week or part-of-four-week periods, but that is not accurate. The wording describes it as a

“continuous or non-continuous period which exceeds four weeks”,

so the drafting already caters for employers who might seek to get around it by stringing together non-continuous periods of unpaid work experience.

As one young person put it to me, you cannot pay the rent or pay for food with a glowing CV. Ultimately, it is just a question of talent. Why would we want businesses and organisations not to be able to take from the widest, broadest and most diverse talent pool to go into these roles? Some of these roles are at the classier end of the labour market, but it goes through all strata of the labour market. Surely these positions should be open to all on a fair and equitable basis. That is what this amendment would allow for.

We have the ideal opportunity with this Bill to put this right. It seems more than extraordinary, with so many of the other issues that are covered in this not unsizeable Bill, that there is nothing on unpaid internships, nothing to protect those people who find themselves being exploited at the beginning of their career. I ask the Minister: if not this Bill, what Bill? If not this amendment, will the Government not bring forward some wording to end this pernicious practice, which still prevails in 21st-century Britain—a desperate, dispiriting, Dickensian practice that still goes on across our labour market? Why would the Government, alongside all their other measures, not take this opportunity to close this loophole? It would allow young people, or any person seeking to get their first foothold in the labour market, to have a positive, supportive work experience leading into paid employment. I very much look forward to the Minister’s response. I beg to move.

Amendment 110 | Employment Rights Bill – Report (2nd Day) | Lords debates

I thank the Minister for his response. It is good to hear that the consultation is coming in the autumn, and we can only hope that is the early autumn and that following that, perhaps there can be some more pace, and it will not be put out to 2027. We also hope the Minister will consider what happens in the interim for all those businesses currently doing the right thing that are disadvantaged by being in a market where some others are perhaps not operating to the same standards and codes of practice. But for now, I beg leave to withdraw the amendment.

Amendment 110 withdrawn.

Amendment 161 | Children’s Wellbeing and Schools Bill – Committee (6th Day) | Lords debates

My Lords, it is a pleasure to take part in this group of amendments and to follow all noble Lords and give more than a nod to many of the amendments that have already been debated. I also wait in anticipation for my noble friend Lord Moynihan’s amendment, which I would have signed if I had been quicker with my drafting pen. I shall speak to Amendment 186 in my name and I thank my friend in sport, the noble Baroness, Lady Grey-Thompson, for co-signing that amendment. I am also grateful to all organisations that have been in contact with me on the issues this amendment addresses.

The Government have set out their plans for breakfast clubs, but in many ways those plans are silent when it comes to children with special educational needs and disability. There is a whole series of risks with not being clear in the Bill in relation to the issues that are specific to those groups of children: not least the question of food itself and the attendant issues; transport—how those young people get to school in the first instance—and the specialist support that is often required throughout the school day. Without consideration of those three issues, it is likely that the plans will leave children with special educational needs and disabilities with suboptimal—or potentially no—ability to access the breakfast club provisions.

Current data shows that a third of children with special educational needs are entitled to free school meals but do not access them. That figure would only increase in the specific context of breakfast. The evidence is clear that, as other noble Lords have pointed out, when it comes to good-quality, nutritious food there is an academic benefit and a mental and physical benefit—food for thought, food for sport.

If a third of young people with special educational needs and disabilities are not enabled to take the opportunity of free school meals, it seems clear that the Bill needs to be far more specific when it comes to the nature of provision that can be inclusive for all those who would wish to benefit from such provision. It is a question not just of the nutritious food but of the social network and the relationship element. If SEN and disabled children are unable to access the breakfast clubs, they are cut out of not only the food provision but that important part of the social network—the relationship nature of the whole school day experience. What happens if the transport is structured in such a way that it does not get to the school until the official start of the academic school day? Again, SEN and disabled children are effectively excluded.

For many people, food can be a difficult subject to discuss. There are specific issues when it comes to those with disabilities, particularly those who suffer from ARFID and other such conditions. The relationship to food can be complex. The Bill is largely silent in this respect. If the Bill does not specifically address the issues around transport, the provision of that specialist support and food provision, breakfast clubs will not be inclusive and will not enable and empower those with SEN and disabilities. There are many starting points in life that impact people’s educational career and, subsequently, their work career. They can be positive or otherwise. Breakfast clubs need to be in that positive bracket. Currently they are somewhere short of it.

In short, the Bill needs to be clear that breakfast clubs are inclusive for all. As ever, “inclusive by design” does not just mean making provisions that benefit those with special educational needs or disabilities. It means benefiting the whole school population and the whole school experience. If the Government do not make amendments to this effect, the outcome is far more likely to risk those children with special educational needs and disabilities being disadvantaged before the school day has even begun.

AI and Creative Technologies (Communications and Digital Committee Report) – Motion to Take Note | Lords debates

My Lords, it is a pleasure to take part in this debate. I declare my technology interests as set out in the register. It is also a pleasure to follow my friend, the noble Baroness, Lady Lane-Fox, who has done as much as anybody in this country over the past two decades for all the technology industries in her work both as an entrepreneur and as a visionary for what technologies can do—not least, public good.

I congratulate my noble friend Lady Stowell—and, indeed, all members of the committee—on this excellent report. I can also comment positively on the approach that she took when doing this report in having it not just as an isolated moment in time but as a series of reports, one building on another, during her time chairing the Communications and Digital Select Committee.

For my final introductory thanks, let me say that I am grateful to our two maiden speakers. Both of them, not least my noble friend Lord Massey, gave us excellent perspectives and a number of excellent ideas for not just this current Minister but all HMT Ministers to consider; all were excellent suggestions that would really make a difference.

My comments on the Select Committee’s report could be quite short. I agree with pretty much everything in it and all the recommendations. It goes to the heart of the issue, which we have discussed certainly in my time here, over the past 12 years, and previous to that. We have a stunning start-up scene in the UK, with fabulous universities, great spin-outs and a very good seed funding model, but it all gets tricky when you get to the scale-ups. This report highlights the issues, puts the recommendations and, I hope, gives the Minister much to think on.

As my noble friend Lord Willetts put it, when it comes to Silicon Valley, nobody should be beguiled to believe that this is a great success of the free market. It is birthed very much from state intervention—intervention in the right way, to enable and to empower the crowding-in of private capital. I cannot improve on his perfect construction—Hamiltonian, not Jeffersonian.

This debate, like so many that we have had on artificial intelligence and other technologies, goes to the heart of an issue. Government and wider society often struggle, because it can be seen as “a part of” or as “sector specific”. As has been noted in this debate, AI is a technology. I go beyond that. It is a series, a set, a constellation of technologies, some yet to be brought into being. I gently suggest that part of the problem with the current narrative and with the Government’s approach is that AI is not seen as this constellation of technologies but, reductively, as just gen AI. Although important, that is but one part of this constellation. Time will tell, but I do not believe that gen AI will be either the most interesting element of AI or the one most likely to deliver anything that we would recognise as a return on investment.

How much does it cost to build an AI model—£500 billion, £5 million, or somewhere in between? The answer is complex. However, getting to grips with smart funding in the UK AI ecosystem and context gives us the best chance of solving this scale-up challenge. I suggest there are three elements we may want to consider in the overarching approach to solving this issue. The first is to consider principles rather than prescription. We give ourselves the best opportunity if we take a principles-based, outcomes-focused, input-understood approach to everything that we do in this space, with trust and transparency—the technologies are nothing without those—inclusion and innovation, interoperability of both the technologies and the regulatory frameworks around the world, accountability, accessibility and assurance. These are good principles for all approaches, certainly regarding AI. Perhaps the Government will consider putting such principles on a statutory basis. They are largely set out in the 2023 White Paper. Giving them statutory effect would only be positive in this mission, which we all need to focus so clearly on.

Secondly, we need a right-sized, agile, adaptive and flexible AI authority. Do not think “big, behemothic, do-it-all AI regulator”. It could be just a development of the role of RIO, so ably chaired by my noble friend Lord Willetts. It could be a coming together of RIO and the DRCF. Maybe it could be a new entity in toto. Although it is right to take a domain-specific approach—that is where the domain expertise lies—we need to assure those three core elements that any of us need when we come across AI, or indeed anything: clarity, certainty and consistency. Without a guiding mind or an agile regulator, how can we have that clarity, certainty and consistency of application? Whether we come across AI in health, education or defence, how can we be assured that we will be having a similar experience? The AI authority could be that champion of the principles, the custodian and a centre of experts, giving an efficient and effective solution to the current situation of various regulators competing for a scarce talent pool. How does the nation benefit if Ofgem, for example, gains a particular data scientist and Ofcom does not? We do not benefit as a nation, and nor does our AI ecosystem.

Thirdly, we need to thoroughly and finally smash the myth, the false dichotomy that recurs with tedious inevitability, that you can have either innovation or regulation but never the twain shall meet. All history—not least, recent history—tells me that right-sized regulation is enabling and empowering of innovation. Take the regulatory sandbox in fintech. A measure of its success is that it has been replicated in just under 100 nations around the world—a UK creation by a UK regulator. It was great to see the announcement earlier this week from that regulator, the FCA, in combination bringing forth the supercharged AI sandbox.

We know how to do this, yet we are not doing enough of it. We all know bad regulation. In no sense does that mean that regulation is, of itself, bad. Right-sized regulation is good for investor, for innovator, for citizen, for creative, for consumer and for our country. Right-sized regulation, structured in an agile way, can be our path to the future and those future technologies, however they may be and in whatever form they come into being. The Government rightly talk of growth. These sectors, deploying these technologies, are most likely to bring this growth to bear.

How are we to move from being an incubator economy, an incubator country, when it comes to these technologies, to putting the right support in place, having the right skills, putting the right funding and expertise in place, at start-up and certainly at scale-up, and at the right level to shoot at nothing short of the “unicornification” of the UK economy? That is because of not only the economic benefit that will flow from having unicorns but the role-modelling that having those companies in our country can do, and how that brings forward all levels of the developmental pyramid. We know how to do this. We can do it at pace and we must. We are talking of our data, our decisions and, if we get this right, together, our human-led, AI-enabled, AI-empowered futures.

AI Opportunities Action Plan – Question | Lords debates

My Lords, Matt Clifford delivered an excellent report, with 50 wide-ranging recommendations across our economy and society. Does the Minister agree that the fact that they rightly range widely makes clear the need for the Government to bring forward cross-sector AI regulation to ensure that, wherever we come across AI in our lives, there will be clarity, certainty and consistency on how we have that AI experience, which would surely be good for innovators, investors, creatives, citizens and our country?

Amendment 148 | Employment Rights Bill – Committee (7th Day) | Lords debates

My Lords, it is a pleasure to follow my friend the noble Lord, Lord Clement-Jones. In doing so, I declare my technology interests as set out in the register. It is a pleasure to follow him because this has always been his “WAIRIA” of expertise—bear with me. I will speak to my Amendments 289 to 298 and 314 to 316, but before doing so, I give full-throated support to everything the noble Lord said and his amendments. We are very much on the same page.

There is a strange situation with government at the moment when it comes to AI. That is not specific to employment rights but across the piece. We have been subject to it for the past year. We are told consistently that the Government will not be bringing forward cross-sector AI legislation. That is a defensible position—the Government have decided on a domain-specific approach to AI. But the difficulty with it is that whenever we have had domain-specific legislation coming through your Lordships’ House—be it product regulation, data or any of the Bills that I, my friend the noble Lord, Lord Clement-Jones, and others have worked on—we have been told that those are not the Bills where AI is to be considered. To put it only slightly reductively, we currently have a situation where the Government are saying they are not bringing forward cross-sector AI legislation, and specific Bills are largely—not exclusively—not the place to incorporate AI issues.

The amendments that the noble Lord, Lord Clement-Jones, and I set out in this group are key to one of the most important sectors—it is broader than a sector, and such an important aspect of our lives. It is how we are employed, what that employment looks and feels like, and how it is experienced by all of us. These amendments do not seek to address issues that will occur next year, next month or even tomorrow. AI is impacting workers right now, oftentimes without them even knowing that it is in the mix.

My first amendment proposes that the principles that have variously appeared in White Papers and other reports be put on a statutory basis in the Bill. We give ourselves the best opportunity to optimise with AI if we take a principles-based, outcomes-focused and input-understood approach. Similarly, I set out in Amendment 290 that all employers and organisations that develop, deploy or use AI should have an AI responsible officer. For this, do not think burdensome, bureaucratic or overcompliance. Because of the proportionality principle, it simply means that there is an obligation on those employers to report on their use of AI in the workplace. It can be well understood through reporting obligations such as those set out in the Companies Act, with which employers will be very familiar at this stage.

My amendments then move to questions of use. What happens where IP or copyrighted material is being used in the workplace? There needs to be labelling so that everybody is clear on, and there is transparency about, what is going on. What about the use of workers’ data? This is an incredibly rich resource that should not in any sense be served up or sold off to the highest bidder. The use of AI in the workplace should be clear and transparent, and workers should have an opt-in, not an opt-out, responsibility, as set out in the amendments.

Then, as the noble Lord, Lord Clement-Jones, has touched on, there is the question of automated decisions. It is clear that workers not only have to be aware that ADM is being used—and have the right to opt out—but also need the right to a human explanation of what is happening in those situations. If we are to optimise things with these technologies, concepts such as “human in the loop” and “human over the loop” must be understood. Safeguards need to be in place, not least where ADM is used, and this could form part of the data protection impact assessment that employers have to undertake.

Then there is the question of regulators. Employment and recruitment currently find themselves wide open to the use of AI. An individual may find themselves not getting shortlisted, not getting hired and not even knowing that the reasoning behind that was algorithmic processing rather than human judgment and human reasoning. It is critical to consider the right approach to fill that regulator gap. Would a specific employment and recruitment regulator do the job? My view—and I think there is evidence to support this—would again be that we could have a cross-sector AI authority. Again, do not think of a bureaucratic and burdensome AI regulator; instead, think of a nimble, agile, adaptive and, crucially, horizontally focused AI regulator, not only in the area of employment rights but across the whole of our economy and society. It would deliver that clarity, consistency and certainty that we all need wherever we come across AI in our working, professional and private lives.

The issue is so significant that, in Amendment 315, I propose a commission on AI in the workplace. Mindful of comments from Monday, I am certainly no fan of setting up a commission to delay or kick issues into the long grass. But perhaps by using the technology to solve some of the issues that are created by the technology, we could have a reimagined approach to commissions and consultations.

Finally, I come to Amendment 316 and the algorithmic allocation of work. This is already happening, and it has already been in front of the courts. It is clearly an issue and one that needs to be fully understood. The Government need to state clearly their position on this most significant of matters. I look forward to other speakers and to the Minister’s response.