We would be remiss not to consider the consequences as tools such as OpenAI's ChatGPT and Google's Bard proliferate and introduce machine intelligence to everyday people. That includes how our data centers are evolving amid the rapid growth in data that needs to be stored, processed, managed, and transferred. By Dr. Michael Lebby.
AI could be the Achilles' heel for data centers unable to evolve in the face of the massive datasets AI requires. The article focuses on:
- From the Agora to hyper-connected global markets: the rise of AI and modulators
- Survival by the numbers: measuring the strain of AI
- Avoiding data traffic jams
- Gauging the impact of AI
- Alleviating data center strain
If we look at the growth of computing power in high-performance computational processing systems over the past 60 years, we see that this growth initially doubled every 3-5 years. From about 2020 onwards, the growth rate has increased by over an order of magnitude, or 10X, to a doubling of computational power every 3-4 months (measured in petaflops, a metric of computational processing magnitude).
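To get a feel for what that change in doubling period means, here is a rough back-of-the-envelope sketch. The specific midpoint periods (4 years and 3.5 months) are assumptions chosen for illustration, not figures from the article:

```python
def growth_factor(years: float, doubling_period_years: float) -> float:
    """Total growth multiple after `years`, given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

# Pre-2020 regime: doubling roughly every 3-5 years (4 used as a midpoint).
legacy = growth_factor(5, 4.0)

# Post-2020 regime: doubling roughly every 3-4 months (3.5 months assumed).
modern = growth_factor(5, 3.5 / 12)

print(f"Five-year growth, old regime: {legacy:.1f}x")
print(f"Five-year growth, new regime: {modern:,.0f}x")
```

Under these assumptions, five years of the old regime yields only a ~2.4X increase, while the new regime compounds through roughly 17 doublings to a factor above 100,000X, which is the scale of demand data centers are being asked to absorb.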
While AI is expected to mature and accelerate in popularity, the impact on data centers is serious and will impose an incredible level of strain on future data center architecture. Five negative impacts are outlined in this article, with one alleviation being the design and implementation of very high-performance polymer optical modulators, which have already demonstrated the ability to modulate light faster, reduce power consumption, and fit in a tiny footprint the size of a grain of salt. Good read!
[Read More]