Will Ireland be Big Tech’s Lapdog Yet Again?

Kris Shrishak / Mar 26, 2025

Main facade of Irish government buildings. (Wikimedia Commons)

There is a certain inevitability in how AI is framed and advertised. As if one has no choice but to accept — as the head of the Irish government has — that AI is the “most profound economic revolution” that needs more data centers and more data.

This narrative serves the few at the cost of the many. The price is paid by the people in Ireland who should have a say, but don’t.

Data centers

In February, Ireland’s Commission for Regulation of Utilities (CRU) published a report proposing new rules for data centers. The report acknowledges that data centers in Ireland consume about 20 percent of the country’s electricity, well above the EU average. Under high-demand scenarios, data centers are projected to consume more electricity than the entire industrial sector within the next five years.

This increase in electricity consumption swallows all the renewable energy Ireland produces. As a result, Ireland continues to rely on fossil fuels. The CRU mandates on-site power generation for new data centers, but does not mandate renewables. This is likely to further increase the use of fossil fuels, and consequently, emissions.

Although many data center operators have pledged to achieve climate neutrality by 2030, it is unlikely that this will be achieved. ‘Scope 2’ CO2 emissions from Meta’s data center in Ireland nearly doubled from 2020 to 2023. We know this because Meta publishes location-based emissions information, which provides important insights.

Disappointingly, the CRU does not require data centers to report their emissions. This does not bode well for a country that the Environmental Protection Agency has warned is not on track to meet its emissions reduction target by 2030.

Ireland cannot rely on nuclear energy like France does. At the AI Action Summit in Paris, French President Macron proudly cited France's 90 terawatt-hour electricity surplus in 2024 as sufficient to power additional data centers for AI.

However, environmental concerns about AI and data centers cannot be reduced to energy use. This one-dimensional approach fails to account for the impact of data centers on the communities living near them. As documented in the Netherlands and Chile, communities are not consulted before data centers are built or expanded — even when their access to water and their health are at risk.

More data

Data is another key component. Ireland’s AI Advisory Council recently published an advice paper titled ‘The Impact of AI on Ireland’s Creative Sector’. The advice includes collective licensing “to enhance access to creative content while ensuring fair remuneration for rightsholders,” allowing AI companies to leverage “an EU Cultural Dataset for AI Training.” It will “simplify compliance [for AI companies such as OpenAI] with the AI Act's transparency obligations.”

This seemingly well-meaning advice benefits a few AI companies, not the creative sector, as has been made abundantly clear in the UK. Access to and exploitation of “culturally relevant materials for training” matter because the training data sources of these AI companies, which had been kept secret, are now out in the open.

The advice might have been different if the Advisory Council, which includes OpenAI’s Associate General Counsel, had consulted members of the creative sector.

In 2023, the author Douglas Preston speculated that "pirates had stolen our books, and then OpenAI stole them from the pirates, I guess." His guess was right.

In documents released in litigation against Meta, a then-research scientist at the company, now a co-founder of Mistral AI, wrote: “this [AI training using pirated books] is what OpenAI does with GPT3, what Google does with PALM, and what Deepmind does with Chinchilla…so we will do it to[o]”.

The Advisory Council recommends expanding Ireland’s basic income pilot to creators affected by AI disruption, arguing that this “could support their adaptation to new economic realities”. These “realities” have been framed and imposed by a few AI companies — the cost will be paid using public funds.

OpenAI has previously claimed to support creativity, which “is like claiming you're supporting the candy store by shoplifting.” The Advisory Council seems to follow OpenAI’s approach.

What can the Irish Government do?

First, relevant stakeholders should be involved in AI policy discussions. The government should set up an AI Advisory Forum separate from the AI Advisory Council. This forum should be established as part of Ireland’s upcoming law implementing the AI Act. At a minimum, the forum's composition should include human rights experts, artists, creators, trade union members, and teachers.

Artists and creators are not the only ones currently at the mercy of a few AI companies and the narrative they wield. These AI companies want to “facilitate the destruction of the teaching profession” and yet the Irish government has shown no interest in involving teachers in the discussions around AI.

Second, support AI deployment only in the public interest. AI, on its own, is not in the public interest; nor is speculative AI or investment in more data centers. Specific applications designed carefully for a specific domain, with the involvement of relevant stakeholders, could be in the public interest.

Third, be a leader, not a follower. Frugal AI can be in the public interest. Ireland should lead the way in supporting academic and industry initiatives that develop frugal, efficient AI: useful, high-quality models that consume less energy and rely on less data. Computer scientists revel in constraints and can design systems that use fewer resources. But they need the right incentives.

Big Tech’s interests are not in the public interest. Big Tech’s policies should not become the Irish government’s policies. Otherwise, the Irish government — like the Irish Data Protection Commission — will become Big Tech’s lapdog.


Authors

Kris Shrishak
Dr. Kris Shrishak is a public interest technologist and an Enforce senior fellow at the Irish Council for Civil Liberties. He advises legislators on global AI governance (including the EU AI Act). His work focuses on privacy tech, anti-surveillance, emerging technologies, and algorithmic decision-making.
