It's odd to me that people talk about data centers in terms of megawatts, a measure of electric power. For one example among many, a Bloomberg article from March about Microsoft cancelling data center plans began:
Microsoft Corp. has walked away from new data center projects in the US and Europe that would have amounted to a capacity of about 2 gigawatts of electricity, according to TD Cowen analysts, who attributed the pullback to an oversupply of the clusters of computers that power artificial intelligence.
"Data center projects that...amounted to a capacity of about 2 gigawatts of electricity" is a nonsensical statement. The (technical) capacities of a data center have to do with storage, compute, transmission, and latency. I understand there's probably some Fermi calculation along the lines of converting electric power to compute capacity using the TDP of NVIDIA's latest GPUs or something like that. Nevertheless, it's misleading to speak this way, not to mention lazy. It is oversimplifying in a bad way, treating data centers as utilities that supply a commodity when that is just not the case (at least not with AI, where prices for services still fluctuate fairly wildly).