The artificial intelligence (AI) bubble is not popping anytime soon. In fact, it might not be a bubble at all.
It has already embedded itself so deeply across so many sectors that dismissing its explosive growth as a passing fad no longer seems credible, analysts have argued in recent industry outlooks.
“There isn’t an industry [AI] won’t touch – similar to the cloud when it emerged,” Allianz Technology Trust lead manager Mike Seidenberg said in an October 2025 interview.
But this comes with a steep price. Massive data centers are being built globally to meet increasing computational demands, putting pressure on local grids and threatening to deprive surrounding communities of the freshwater they need.
In neighboring Thailand, a single hyperscale data center being built in the industrial hub of Chonburi is projected to extract up to 9,000 cubic meters of water a day to keep its servers cool, Mongabay reported.
Over a year, that adds up to the amount of water nearly 37,000 local residents consume annually, sparking fears of severe drought among farmers.
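The equivalence above can be reproduced as a back-of-envelope check with one assumed figure: a per-capita consumption of roughly 243 liters per person per day (an illustrative assumption; the Mongabay report does not state the figure it used).

```python
# Back-of-envelope check of the Chonburi data center water claim.
DAILY_EXTRACTION_M3 = 9_000   # cubic meters of cooling water per day
LITERS_PER_M3 = 1_000
PER_CAPITA_LPD = 243          # assumed liters per person per day (illustrative)

annual_extraction_l = DAILY_EXTRACTION_M3 * LITERS_PER_M3 * 365
residents_equivalent = annual_extraction_l / (PER_CAPITA_LPD * 365)

print(f"{residents_equivalent:,.0f} residents")  # ~37,000
```

Note that the 365-day factor cancels out: the daily extraction divided by daily per-capita use already gives the resident count.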
Here in the Philippines, PLDT’s data center arm VITRO unveiled in April 2025 the country’s first “AI-ready” hyperscale data center in Laguna. While the facility launched with an initial 50-megawatt (MW) capacity, telco tycoon Manuel V. Pangilinan pledged to expand operations to 500 MW to rival neighboring tech hubs.
Cramming large-scale data facilities in Metro Manila without proper planning could easily overwhelm local substations and exacerbate grid congestion, climate think tanks have warned.
While we cannot simply pull the plug on AI at this point, we can rethink the hardware that powers it.
A growing group of tech giants and startups believes the answer lies in a specialized hardware technology called photonics.
AI models are increasingly seen as major water consumers, a December 2025 study published in the journal Patterns noted.
The study estimated that global AI systems could have reached a water footprint of up to 764.6 billion liters in 2025. That is enough freshwater to supply the daily needs of all of Metro Manila for more than four months, according to demand projections cited in a 2019 water security report by the Philippine Institute for Development Studies.
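The "four months" comparison can be sanity-checked the same way, assuming a projected Metro Manila demand of about 6,000 million liters per day (an illustrative figure; the PIDS report's exact projection may differ).

```python
# Rough check: how long could AI's estimated 2025 water footprint supply Metro Manila?
AI_WATER_FOOTPRINT_L = 764.6e9     # liters, upper estimate from the Patterns study
METRO_MANILA_DEMAND_LPD = 6_000e6  # assumed: ~6,000 million liters per day

days_supplied = AI_WATER_FOOTPRINT_L / METRO_MANILA_DEMAND_LPD
print(f"{days_supplied:.0f} days (~{days_supplied / 30:.1f} months)")  # ~127 days
```

Even with a demand figure assumed on the high side, the footprint covers well over four months of metropolitan supply.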
This footprint refers to the total volume of freshwater required not just to physically cool down overheating servers, but also to generate the massive amounts of electricity that power them.
The carbon toll is equally staggering. AI’s carbon footprint hovered between 32.6 and 79.7 million tons of carbon dioxide last year, an output roughly equivalent to that of New York City.
Power consumption from data centers hit around 415 terawatt-hours (TWh) in 2024, the International Energy Agency said in a recent flagship report. That is about 1.5% of total global electricity consumption across all sectors that year.
These figures are poised to surge. Global electricity consumption for data centers is projected to double, reaching 945 TWh by 2030, the agency revealed.
Energy demand from accelerated servers alone – the specialized hardware primarily driving AI adoption – is expected to grow by a staggering 30% annually over the same period.
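To see how quickly a steady 30% annual rate compounds, consider a short sketch (illustrative only; it assumes six annual steps from 2024 to 2030, and the IEA states the rate, not this arithmetic):

```python
# Compound growth: 30% per year over six annual steps (2024 -> 2030).
GROWTH_RATE = 0.30
YEARS = 6

multiplier = (1 + GROWTH_RATE) ** YEARS
print(f"Demand multiplies by roughly {multiplier:.1f}x over {YEARS} years")  # ~4.8x
```

In other words, demand growing at 30% a year nearly quintuples over the period rather than merely doubling.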
Private sector forecasts echo this steep climb. Projections from Goldman Sachs show that global power demand from data centers will increase by 165% by the year 2030, driven largely by the aggressive expansion of AI capabilities.
The core of this environmental crisis lies in a fundamental physics problem inside the data center.
For decades, the tech industry operated on the promise of Moore’s Law, which predicts that computing power doubles roughly every two years.
But as AI infrastructure scales at an unprecedented rate, this rule is reaching physical limits. While processors have become exponentially faster, the traditional copper wires used to move electrical signals between these supercharged chips are struggling to keep up.
“The central problem is not a lack of processing power but rather the physical limitations of moving vast amounts of data between processors and servers,” AIM Photonics said in an industry report.
Pushing massive amounts of data through copper creates electrical resistance, which in turn generates severe heat. In modern AI server clusters, the simple act of shuttling data back and forth can eat up to half of the total system energy, semiconductor firm POET Technologies said.
Every watt of electricity wasted on data movement and every drop of water used to cool down those hot copper wires is a direct blow to global sustainability goals.
This is where photonics enters the frame.
Photonics is a branch of technology that ditches traditional copper wiring in favor of lasers. Instead of transmitting electrons through metal, it uses pulses of light to transmit data across optical fibers.
Because light does not suffer from the same electrical resistance as copper, it generates significantly less heat. This means data centers require vastly smaller cooling systems and draw far less electricity from the grid.
To maximize these benefits, engineers are developing co-packaged optics (CPO). This involves bringing the light-emitting components directly next to the silicon computer chips.
By shortening the physical distance data has to travel, CPO drastically reduces energy leakage and boosts processing speeds.
Embracing light-based computing could help companies reduce their energy consumption by 90%, an enterprise brief from computing startup LumiAIres said.
Making AI greener has become a competitive edge for companies trying to survive an era of rising electricity costs, the firm added.
The industry is already moving these concepts out of the fab and into the data center.
In March, Silicon Valley-based startup Ayar Labs and cloud infrastructure provider Wiwynn detailed a strategic partnership to deliver optically connected rack-scale AI systems. Their architecture utilizes a specialized optical chiplet alongside a remote light source known as SuperNova, which operates under far less demanding thermal conditions by physically separating the laser from the hot electronic chips.
Boston-based competitor Lightmatter has also made significant strides. Earlier this year, the photonics startup demonstrated the successful sampling of its “Passage” CPO chiplet.
By bringing optical and electrical interfaces into a tightly integrated package, the company aims to deliver unprecedented bandwidth density while halving the energy consumption of conventional interconnects.
Even chip behemoth NVIDIA is leaning into light. The company revealed a collaboration with foundry Tower Semiconductor to produce 1.6-terabit silicon photonics optical modules for future AI data centers, paving the way for its next-generation networking platforms.
Investors are fully aware of this paradigm shift, pouring billions of dollars into the optical computing sector.
In February, semiconductor giant Marvell announced the completion of its acquisition of Celestial AI, a pioneer in optical interconnect technology.
Meanwhile, optical networking firms like Lumentum and Coherent have increasingly aligned their manufacturing efforts with NVIDIA to supply the necessary transceivers for next-generation AI factories.
Lightmatter even saw its valuation triple in 2023 following heavy backing from investors like Google Ventures, Reuters reported.
The transition to photonics is now a necessary climate intervention.
Under the Paris Agreement, nations committed to limiting global warming to 1.5°C above pre-industrial levels. But AI’s current hardware trajectory threatens to blow past this critical threshold.
Traditional data centers continuously generate massive amounts of thermal waste. To prevent them from melting down, grid operators are often forced to burn more fossil fuels just to supply the electricity needed for industrial-grade cooling systems. This pumps millions of tons of greenhouse gases into the atmosphere while exhausting community water supplies.
But upgrading the hardware is only half the battle. Hardware efficiency alone will not save the climate if the power grids running these servers remain dirty.
The transition to light-based computing must go hand in hand with a massive pivot to renewable energy.
The Institute for Climate and Sustainable Cities urged policymakers and tech operators to build hyperscale data centers strategically next to untapped renewable energy zones to anchor this new digital demand entirely to clean power generation.
Allowing the tech industry to eat up a significant share of the remaining global carbon budget directly undermines these international climate pacts.
Technology companies and governments hold the power to dictate the environmental footprint of the next digital era. Replacing copper with light and powering it entirely with renewable energy offers a tangible scientific pathway to ensure that the AI revolution does not come at the expense of a livable planet. – fyt.ph