What is Labour for the Long Term, the mysterious group funding Labour MPs?
Wes Streeting is among Labour frontbenchers backed by a new group with links to so-called ‘Effective Altruism’
Keen politicos may have noticed a new presence at this year’s Labour conference.
Nestled among the more familiar fringe events for the party’s affiliated organisations was an event sporting frontbench MP speakers and prominent journalists hosted by ‘Labour for the Long Term’ (LLT).
LLT popped up on the MPs’ register of interests soon after, donating a £30,000-a-year ‘policy adviser’ to shadow health secretary Wes Streeting. On its website, it boasts of support from other shadow ministers, such as Anneliese Dodds, Alex Sobel and Fleur Anderson.
It has recently advertised a ‘policy and operations manager’ role paying up to £50,000 a year, suggesting a funding pot larger than most of Labour’s affiliated socialist societies. And it is understood that the group hopes to continue growing its roster of advisers and staffers for MPs and other decision makers across the spectrum of the Labour Party.
When asked, LLT would not disclose who funds it, beyond saying that donations have come from private individuals who are “UK citizens who support the Labour Party”.
So what is it?
LLT has a pretty eclectic range of priorities – stopping the use of AI in the military, pandemic preparedness, and allowing neighbours to vote on new local housing developments – all guided by the principle of making ‘long-term’ thinking into a Labour Party priority.
For those looking for a through line between these policies, they bear stark similarities to the priorities of the ‘Effective Altruism’ (EA) movement that has developed over the last two decades – something first pointed out by Morgan Jones and David Klemperer in the journal Renewal.
Initially, believers in Effective Altruism wanted two broad things: for people to increase their charitable giving, and for scientific rigour to be applied to philanthropy so that each donation has the most effect. Leading proponents, for example, oppose donations to cancer charities, arguing that curing cancer would do relatively little to extend human lifespans for the cost, while, say, mosquito nets prevent countless malaria deaths for far less money.
In recent years, EA has developed into a more specific movement, concerned with funnelling money into a distinct set of causes it believes pose an existential threat to humankind. That belief – known as ‘long-termism’ – holds that the survival of the human race should be the number one concern of those seeking to change things through philanthropy, arguing that the suffering and death of hypothetical future lives counts for nearly, or exactly, as much as that of people alive today.
That in turn leads the movement to spend countless millions or billions on research into the risks of AI, or think tanks exploring nuclear disarmament, as they are seen to pose bigger threats to the existential future of humanity than, say, poverty – or even, in some EA circles, than climate change.
More often than not, the movement is also about siphoning money away from those causes deemed less worthy, toward those it says are more “effective”.
To some, this might seem fairly reasonable. But EA’s favourite concerns are extremely capital intensive to research, and often come with little to no tangible results, largely thanks to the speculative nature of predicting – and then working on stopping – things that haven’t come close to happening yet.
And at this point there is not a huge amount of evidence that, at the end of the conveyor belt of think tanks and book tours funded by EA, we are substantively closer to, say, nuclear annihilation being any less likely or less deadly than would be the case otherwise.
LLT does not identify itself as an offshoot of Effective Altruism, and the group sees itself as having broader interests than EA purists. But there are a number of historic links to EA among its board members, while many of EA’s favoured niche topics – like the threat of AI and nuclear annihilation, which together rarely make an appearance outside of EA circles – are prominent in LLT’s policy briefings.
And over the last few months, the Effective Altruism movement has been hard to ignore. Its leading light, Oxford University professor William MacAskill, has been on what must have been one of the biggest media rounds of the decade, at a reported cost of $10m. He has appeared in The New Yorker, The Washington Post, The Times, The New York Times, Time, the Financial Times, The Atlantic, The New Statesman, NPR, Wired, The Guardian, Virgin Radio, The Daily Show and countless others – mostly to promote his new book, ‘What We Owe the Future’.
And while the community itself is small – one internal poll suggested fewer than 10% of US adults had any proper awareness of it and only a few thousand members have signed up to the movement’s pledges – its pockets are deep. One leader of the movement estimated that EA had roughly $46bn at its disposal, an amount that had grown by 37% a year since 2015.
There’s also something more sinister about EA’s growing prominence – what one member described as the “Ponzi-ishness” of the movement: it has become increasingly fixated on growing its numbers. In fact, Open Philanthropy, one of the largest EA funds, has spent some $234m – its fourth largest spend area – on growing the reach of the Effective Altruist community, with everything from building media ecosystems to a network of university EA societies. It has spent less than half as much on pandemic preparedness ($141.5m) and less than one eighteenth of that amount ($12.7m) on immigration reform. In the UK, my own research shows that 20 of the UK’s 24 Russell Group universities have, or have had, an EA student society. For a movement founded in opposition to wasteful philanthropy, it’s an odd approach to take, to say the least.
In the US, Silicon Valley tech moguls are typically its biggest proponents – like Facebook co-founder Dustin Moskovitz, infamous founder of crypto-exchange FTX Sam Bankman-Fried (who has just been charged with fraud in the US) and Elon Musk, who recently claimed ‘long-termism’ is a “close match for my philosophy”.
And if it were restricted just to the fortunes of billionaires, that would be one thing – but given its growing size and focus on political policy, it seems plausible that at least some of its priorities will spill over into frontline policymaking.
Bankman-Fried, as just one example, funnelled tens of millions of dollars in donations to Democratic and Republican politicians in the US whom he thought were sympathetic to the ideas EA focuses on – usually through anonymous, undisclosed “dark” donations.
Then there are groups like LLT. While it stresses that it isn’t an avowedly Effective Altruist group and does not have “any affiliations or funding from groups related to Effective Altruism”, its undisclosed deep pockets and long-termist policy priorities still pose a similar problem for Labour politics.
None of the Labour MPs identified in this piece offered comment on their links to LLT, or whether any LLT-funded staffers would have a role in policy.
Looking out for the long-term can be a good thing – one that helps people and saves government money.
But the risk with a solely long-term view is that you sacrifice the short term along the way – if you think potential future versions of artificial intelligence are a bigger existential threat than, say, poverty, you prioritise acting on the former over the latter.
LLT says it wants politics to return to so-called ‘cathedral thinking’ – the idea that those who designed Notre Dame didn’t live to see the fruits of their own labour.
Notre Dame’s construction, though, didn’t make life any easier for the poor and hungry who lived in the shadow of it.