Task list:
Determine the best way to reduce the amount of power and water consumed by LLM and AI systems.
Design a more efficient algorithm that consumes less power, and therefore less water.
Reduce redundant computations in the algorithm (for example, by caching repeated results) to lower power usage.
Adjust the amount of computation based on the complexity of the input data.
Establish an effective way to reduce the power draw of the hardware.
Use more power-efficient GPUs, CPUs, and integrated circuits.
Use various attributes of the final solution state to guide earlier decisions made along the solution path:
Develop an accurate and efficient way to track water consumption per LLM system, which can be used to identify the specific components that consume excess water.
Implement a compatible alternative cooling system that uses either a more efficient design requiring less water, or more efficient CPUs and GPUs that produce less heat and therefore need less cooling.
Ensure that the new cooling solution has a reasonable ROI and remains viable under future regulatory changes and advances in LLM computing capabilities.
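The tracking task above can be sketched in code. A common proxy for water use is the facility's Water Usage Effectiveness (WUE), measured in liters of water per kWh of IT energy; the `WaterTracker` class, the default WUE value, and the component names below are illustrative assumptions, not measured data:

```python
from dataclasses import dataclass, field

@dataclass
class WaterTracker:
    """Estimate water use of an LLM workload from its energy draw.

    wue_l_per_kwh is the facility's Water Usage Effectiveness in liters
    per kWh; 1.8 is an assumed placeholder that should be replaced with
    the data center's measured value.
    """
    wue_l_per_kwh: float = 1.8
    total_liters: float = 0.0
    by_component: dict = field(default_factory=dict)

    def record(self, component: str, energy_kwh: float) -> float:
        """Convert one component's energy reading to estimated liters."""
        liters = energy_kwh * self.wue_l_per_kwh
        self.by_component[component] = self.by_component.get(component, 0.0) + liters
        self.total_liters += liters
        return liters

    def worst_offenders(self, n: int = 3) -> list:
        """Rank components by estimated water use, highest first."""
        return sorted(self.by_component.items(),
                      key=lambda kv: kv[1], reverse=True)[:n]
```

Feeding per-component energy readings into `record` builds a breakdown that `worst_offenders` can rank, pointing at the parts of the system that consume the most water.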
Goals:
Average data center water consumption is between 1 million and 5 million gallons a day. We aim to bring the highest-consuming centers down to at most 3.5 million gallons a day.
Ideally the design would avoid water-based cooling entirely, but for practicality we will pursue a minimal water-based design rather than eliminating water altogether.
We prioritize water consumption over raw performance; even so, we require at most a 5% performance degradation relative to current water-based cooling systems.
The design targets large data centers, specifically those housing LLMs, given their growing demand and scale as the technology advances.
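As a quick sanity check on the first goal, bringing a worst-case center from 5 million down to 3.5 million gallons a day is a 30% reduction; the figures below simply restate the goal's own numbers:

```python
GALLONS_BASELINE = 5_000_000   # current worst-case daily consumption (goal's upper bound)
GALLONS_TARGET = 3_500_000     # goal: at most this many gallons per day

daily_savings = GALLONS_BASELINE - GALLONS_TARGET            # 1,500,000 gal/day
pct_reduction = daily_savings * 100 / GALLONS_BASELINE       # 30.0 %
annual_savings = daily_savings * 365                         # 547,500,000 gal/year
```

Even hitting the target only at the worst-case centers saves on the order of half a billion gallons per center per year.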
Paths that would need to be eliminated to reach desired goals:
Any approach that increases water usage in LLM data centers must be eliminated, since the goal is to reduce the water these centers consume.
ROI must match or exceed current industry standards; lowering the ROI of LLM centers is non-negotiable, because if our system does not perform economically, its reduction in water use and emissions will not be helpful either.
Cost limitations would have to be eliminated, since in most cases the best technology tends to be far more efficient than older technology, especially in the LLM space; removing cost barriers would make a decrease in water consumption easier to achieve.
Maintaining relative efficiency within a 0-5% performance reduction is ideal; any performance reduction greater than 5% must therefore be eliminated, since the system must still perform well when deployed.
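The last constraint is straightforward to encode as an acceptance check when evaluating candidate cooling designs; the function name and the throughput-based performance metric are assumptions for illustration:

```python
def within_performance_budget(baseline_tps: float, candidate_tps: float,
                              max_degradation: float = 0.05) -> bool:
    """Accept a candidate design only if its measured throughput
    (tokens/sec, an assumed metric) stays within the allowed 5% drop
    relative to the current water-cooled baseline."""
    return candidate_tps >= baseline_tps * (1.0 - max_degradation)
```

A design benchmarked at 96 tokens/sec against a 100 tokens/sec water-cooled baseline passes; one at 90 tokens/sec would be rejected under this constraint.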