Design & Reuse

AI Drives Data Centers to the Edge

eetimes.com, May 11, 2020 – 

Data centers are expanding to the network edge to meet demand by artificial intelligence and other applications requiring fast response times not available from traditional data center architectures.

The problem with traditional architectures is their centralized framework. Data often travels hundreds of miles from the network edge to centralized data centers, then back again. That's fine when you're dealing with email, Google, Facebook and other applications delivered via the cloud. Human perception is slow enough that we can't register the lag between, say, clicking on an email message in a browser and the message opening.

But AI and other emerging applications – Internet of things (IoT), cloud-based gaming, virtual reality – require much lower latency, the delay between a network request and its response. That means data center processing must move to the network edge. Edge computing can take place in small data centers, roughly the size of shipping containers, rather than the warehouse-sized edifices that currently power the cloud.
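The distance argument can be made concrete with a back-of-the-envelope propagation calculation. The sketch below assumes light in optical fiber travels at roughly two-thirds the vacuum speed of light; the specific distances are illustrative, and real round trips add routing, queuing and processing delays on top of this physical floor:

```python
# Minimum round-trip propagation delay over fiber, as a function of
# distance to the data center. Real-world latency is higher: this is
# only the speed-of-light floor.

SPEED_OF_LIGHT_KM_S = 299_792                      # vacuum speed of light
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # ~200,000 km/s in fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

# Illustrative comparison: a distant regional data center vs. a nearby
# edge site. Distances are assumptions for the sake of the example.
for label, km in [("regional data center, 800 km away", 800),
                  ("edge data center, 15 km away", 15)]:
    print(f"{label}: at least {round_trip_ms(km):.2f} ms")
```

Even before any processing, a data center hundreds of kilometers away imposes several milliseconds of unavoidable delay per round trip, while a nearby edge site stays well under a millisecond, which is why latency-sensitive workloads push computing toward the edge.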

Startups such as EdgeMicro and Vapor.io are deploying these "mini data centers."

Data center operators can still use their traditional structures, with fast networks and other hardware and software required to ensure the speedy response times needed for edge applications.

And edge data centers can reside on enterprise premises, or in exotic locations such as mines, ships and oilfields.
