In fairness, your calculation looks at the most expensive element of the DC but ignores all of the associated parts required to utilize the H100: CPU, memory, cooling, etc. Not to say that flips the calculation (I don't have the answer), but it does leave a lot of power out.
Let's be generous and pretend the rest of the hardware is free, but double the energy budget of the H100 to account for all of it along with cooling. You're still at only ~$1k/yr; $10k over 10 years, or about 25% of the TCO (ignoring all other costs).
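For concreteness, a quick back-of-the-envelope sketch of that ~25% figure, using the $30k purchase price and the doubled power draw assumed above (these are the figures from this thread, not measured data):

```python
# Rough TCO split: H100 capex vs. a doubled energy budget over 10 years.
# All figures are the assumptions stated above, not measured data.
capex = 30_000       # H100 purchase price, USD
power_w = 500 * 2    # H100 draw doubled to cover CPU, memory, cooling, etc.
price_kwh = 0.10     # USD per kWh
years = 10

energy_cost = (power_w / 1000) * 24 * 365 * years * price_kwh  # ~$8,760
tco = capex + energy_cost
print(f"energy: ${energy_cost:,.0f} ({energy_cost / tco:.0%} of TCO), "
      f"capex: ${capex:,} ({capex / tco:.0%} of TCO)")
```

Even with the doubled energy budget, capex is still roughly three quarters of the 10-year total under these assumptions.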
An H100 costs ~$30k new and uses 500W of power.
500W for a year is about 4,500 kWh, which at $0.10/kWh is ~$450/year if run at full utilization (unrealistic).
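As a sanity check on that arithmetic, a minimal sketch using the quoted wattage and electricity price:

```python
# Annual energy use and cost of a single H100 at full utilization,
# per the figures quoted above (500W, $0.10/kWh).
power_kw = 0.5             # 500 W
hours_per_year = 24 * 365  # 8,760 h
price_kwh = 0.10           # USD per kWh

kwh_per_year = power_kw * hours_per_year  # ~4,380 kWh (~4,500 as stated)
print(f"{kwh_per_year:,.0f} kWh/yr -> ${kwh_per_year * price_kwh:,.0f}/yr")
```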
The TCO of an AI data center should be entirely dominated by capex depreciation.