
AI, Energy, and the Geometry of Constraint

December 29, 2025 · 8 min read · The Principal

Public Intelligence Only — This report reflects generalized observations and views of Hampson Strategies as of the publish date. It is not investment, legal, or tax advice, and it is not a recommendation to engage in any transaction or strategy. Use is at your own discretion. For full disclosures, see our Disclosures page.


Why Power and Water Are Symptoms, Not the Root Problem

Abstract

Rapid growth in AI workloads has intensified scrutiny of data-center energy and water consumption. Much of the public discourse frames this challenge as a problem of efficiency—implying that sufficiently optimized models, hardware, or cooling systems can substantially reduce power and water usage without altering system fundamentals.

This paper argues the opposite: energy and water consumption are emergent properties of compute geometry, not independent variables.

Claims of large, order-of-magnitude reductions in power or water usage are only meaningful when accompanied by explicit changes in where, when, and how heat is generated and dissipated across the system. Absent such structural reallocation, efficiency gains tend to relocate losses rather than eliminate them.

The goal of this paper is not to propose a single solution, but to establish a clear analytical framework for evaluating claims, constraints, and tradeoffs in AI infrastructure design.

1. The Constraint Landscape

Modern AI systems operate under a small set of non-negotiable physical constraints:

  • Energy conservation: All computational work ultimately manifests as heat.
  • Thermal dissipation limits: Heat must be removed at a rate proportional to power density.
  • Infrastructure coupling: Cooling, power delivery, and compute placement are inseparable at scale.

These constraints do not disappear with better algorithms or faster chips. They define the feasible solution space.

    Water usage in data centers is best understood within this context. It is not a primary driver of system behavior, but a secondary response to thermal density. Where heat is concentrated, aggressive cooling—often water-intensive—becomes unavoidable.
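The first two constraints can be made concrete with simple arithmetic: essentially every watt drawn by compute hardware exits the facility as heat, and the cooling plant must be sized to remove it. The following sketch uses illustrative, assumed figures (rack power, hall size, and non-IT overhead), not measurements from any specific facility.

```python
# Sketch: electrical power in equals heat out, so cooling capacity is
# dictated by the power budget. All constants below are illustrative
# assumptions, not data from a real deployment.

RACK_POWER_KW = 40.0   # assumed draw of one high-density AI rack
RACKS = 100            # assumed number of racks in the hall
OVERHEAD = 0.3         # assumed non-IT overhead (cooling, power loss)

it_heat_kw = RACK_POWER_KW * RACKS         # all IT power becomes heat
facility_heat_kw = it_heat_kw * (1 + OVERHEAD)

print(f"IT heat load:       {it_heat_kw:.0f} kW")
print(f"Facility heat load: {facility_heat_kw:.0f} kW")
```

Note that no algorithmic improvement changes this identity; it only changes how much work is done per watt, which is the subject of the next section.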

    2. Why "Efficiency" Is an Incomplete Framing

Efficiency improvements are real and valuable. Techniques such as:

  • reduced numerical precision
  • sparsity and pruning
  • workload batching
  • specialized accelerators

can meaningfully reduce energy per operation within specific regimes.

    However, these improvements share a common limitation: they operate within an existing geometry.

    When overall demand grows faster than per-operation efficiency improves—as has been the case for AI workloads—total energy consumption continues to rise. The system becomes more efficient per unit of work while becoming more intensive in aggregate.

    This distinction is often blurred in public discussion.
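The efficiency-versus-aggregate distinction is pure arithmetic, and a short sketch makes it explicit. The growth and efficiency rates below are illustrative assumptions chosen only to show the direction of the effect: when demand grows faster than per-operation energy falls, total energy rises every year.

```python
# Sketch: per-operation efficiency improves each year, but workload
# volume grows faster, so aggregate energy still rises. Both rates
# are illustrative assumptions.

energy_per_op = 1.0      # arbitrary units, baseline year
ops = 1.0                # workload volume, baseline year

EFFICIENCY_GAIN = 0.30   # assumed: 30% less energy per op each year
DEMAND_GROWTH = 1.00     # assumed: workload doubles each year

for year in range(1, 4):
    energy_per_op *= (1 - EFFICIENCY_GAIN)
    ops *= (1 + DEMAND_GROWTH)
    total = energy_per_op * ops
    print(f"year {year}: per-op energy {energy_per_op:.3f}x, "
          f"total energy {total:.2f}x baseline")
```

With these assumed rates, per-operation energy falls to about a third of baseline by year three while total energy nearly triples, which is the pattern the paragraph above describes.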

    3. Where Large Power Reductions Can Exist

    Significant reductions in apparent power usage are possible under constrained conditions, including:

  • fixed-function inference workloads
  • latency-tolerant batch processing
  • accuracy-relaxed applications
  • highly specialized hardware with narrow scope

In these cases, energy reductions are achieved by narrowing the problem, not by escaping the underlying constraints.

    These approaches are valuable—but they do not generalize to the full spectrum of AI computation, particularly training, dynamic workloads, or heterogeneous task environments.

    4. The Geometry Problem

    At scale, AI infrastructure is governed less by individual components than by how those components are arranged.

    Three geometric properties dominate outcomes:

    1. Spatial concentration

    High compute density produces high thermal density. Cooling demand scales non-linearly with concentration.

    2. Temporal synchronization

    Synchronous peak demand drives over-provisioning of both power and cooling capacity.

    3. Centralized topology

    Central aggregation of compute forces centralized heat rejection.

    As long as these properties remain unchanged, reductions in power or water usage will be incremental, not transformational.
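The spatial-concentration property can be sketched numerically. The scenario below spreads the same total heat load across more sites; the per-site threshold below which water-free dry cooling suffices is an illustrative assumption (real limits depend on climate, equipment, and setpoints), but it shows how geometry, not total load, determines the cooling mode.

```python
# Sketch: identical total heat load at different spatial concentrations.
# The dry-cooling density limit is an illustrative assumption.

TOTAL_HEAT_MW = 100.0
DRY_COOLING_LIMIT_MW = 20.0   # assumed per-site ceiling for dry cooling

for sites in (1, 5, 20):
    per_site = TOTAL_HEAT_MW / sites
    mode = ("dry (air) cooling" if per_site <= DRY_COOLING_LIMIT_MW
            else "evaporative (water) cooling")
    print(f"{sites:2d} site(s): {per_site:6.1f} MW/site -> {mode}")
```

The total heat rejected is unchanged in all three cases; only its concentration changes, and with it the cooling technology that becomes viable.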

    5. Water as a Symptom

    Water consumption tracks thermal intensity, not intelligence.

    Cooling systems use water because it is an effective medium for moving heat away from dense sources. When thermal density rises, water usage rises. When thermal density falls—through spatial distribution, temporal smoothing, or alternative dissipation paths—water usage declines naturally.

    This leads to an important conclusion:

    Attempts to "solve" water usage directly, without altering compute geometry, treat the symptom rather than the cause.
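The coupling between heat and water can be quantified directly for evaporative cooling: each kilogram of water evaporated carries away roughly 2.26 MJ of heat (the latent heat of vaporization). The load figure below is an illustrative assumption, but the proportionality is physics, which is why water usage tracks thermal intensity so tightly.

```python
# Sketch: water evaporated scales linearly with heat rejected.
# The thermal load is an illustrative assumption; the latent heat of
# vaporization of water (~2.26 MJ/kg) is a physical constant.

LATENT_HEAT_MJ_PER_KG = 2.26
heat_load_mw = 10.0      # assumed thermal load rejected evaporatively
hours = 24

heat_mj = heat_load_mw * 1e6 * hours * 3600 / 1e6   # MJ per day
water_kg = heat_mj / LATENT_HEAT_MJ_PER_KG

print(f"~{water_kg / 1000:.0f} tonnes of water evaporated per day")
```

Halve the thermal density rejected through the evaporative path and the water figure halves with it; no separate "water optimization" is required or possible beyond that.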

    6. Structural Reallocation as the Primary Lever

    Meaningful system-level reductions in energy and water usage require structural reallocation, including:

  • redistributing compute spatially to lower peak thermal density
  • shifting workloads temporally to reduce synchronization spikes
  • aligning compute execution with ambient or environmental gradients
  • designing demand-matched throughput instead of peak-oriented provisioning

These approaches do not eliminate energy expenditure. They change where and how it is expressed, allowing the system to operate within physical limits with lower external resource intensity.
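Temporal reallocation in particular is easy to quantify: power and cooling plant must be provisioned for the peak, not the average, so smoothing the same total work across time shrinks required capacity. The load profile below is an illustrative assumption.

```python
# Sketch: the same daily energy delivered synchronized vs smoothed.
# Provisioned capacity must cover the peak interval, so smoothing the
# profile reduces it. The profile itself is an illustrative assumption.

synchronized = [10, 10, 10, 100, 100, 10, 10, 10]   # MW per interval
total_work = sum(synchronized)
smoothed = [total_work / len(synchronized)] * len(synchronized)

print(f"energy delivered:    {total_work} MW-intervals (both cases)")
print(f"peak, synchronized:  {max(synchronized)} MW provisioned")
print(f"peak, smoothed:      {max(smoothed):.1f} MW provisioned")
```

The energy expended is identical in both cases, consistent with the point above: reallocation changes where and when the expenditure is expressed, not whether it occurs.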

    7. Evaluating Claims Responsibly

    Any claim of large-scale power or water reduction should be evaluated against a simple set of questions:

  • Which constraint is being relaxed?
  • What tradeoff is being accepted?
  • Where does the displaced energy go?
  • Does the approach generalize beyond a narrow workload class?

If these questions cannot be answered explicitly, the claim is incomplete.

    Conclusion

    AI infrastructure does not face a standalone water problem, nor a purely efficiency problem. It faces a geometry problem—one rooted in how energy, heat, and computation are arranged under immutable physical constraints.

    Progress will not come from ignoring these constraints, but from designing systems that respect them.

    Efficiency matters. Cooling matters. But structure decides.

    Author's Note

    This paper is intended as a framing document. It does not prescribe a single implementation, but provides a lens through which competing approaches can be evaluated rigorously and honestly.
