Connecting the Future: How Connectivity and AI Unlock New Potential

1.2.7  Edge Computing and IoT Integration

As processing capacity and speed requirements grow, performing AI computation in consolidated data centers and the cloud becomes increasingly inefficient or impractical. Rather than accepting transmission delays for these critical tasks, edge computing reduces latency by moving data processing and analysis much closer to the data source, shortening the distance and time required for data transfer. Devices such as smartphones and IoT smart sensors can also serve as useful mid-points in this data flow: they enable upstream applications that filter and prioritize data before sending it to centralized servers, and downstream applications that perform real-time processing at the source.
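As a rough illustration of that upstream/downstream split, the sketch below shows an edge node that summarizes sensor readings locally and forwards only prioritized results. The sensor feed, alert threshold, and upload endpoint are hypothetical placeholders, not a specific product or API.

# Minimal sketch of an edge node: process readings at the source, forward
# only prioritized summaries upstream. All names and values are illustrative.
import json
import statistics
import urllib.request

ALERT_THRESHOLD = 75.0                        # hypothetical priority cutoff
UPSTREAM_URL = "https://example.com/ingest"   # placeholder central endpoint

def read_sensor_window() -> list[float]:
    """Stand-in for a local IoT sensor driver returning recent samples."""
    return [68.2, 71.5, 80.1, 69.9]

def process_locally(samples: list[float]) -> dict:
    """Downstream, real-time work done at the source: summarize the window."""
    return {
        "mean": statistics.mean(samples),
        "peak": max(samples),
        "alert": max(samples) > ALERT_THRESHOLD,
    }

def forward_upstream(summary: dict) -> None:
    """Upstream path: send only the filtered, prioritized summary, not raw data."""
    body = json.dumps(summary).encode()
    req = urllib.request.Request(UPSTREAM_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    summary = process_locally(read_sensor_window())
    if summary["alert"]:          # transmit only when the data is worth sending
        forward_upstream(summary)

The design point is that the raw sample stream never leaves the device; only a small, already-prioritized summary crosses the network.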
Gartner predicts that 75% of data generated by companies will be created and processed outside traditional data centers or cloud environments by 2025 as the computing power of smaller devices continues to grow.[33] This expanding market for edge computation promises proportionally greater impact in under-resourced markets and rural communities, where lower investment returns on large data centers may hinder construction. By processing data at the source rather than consolidating it in data lakes, edge devices can reduce the investment costs of traditional data infrastructure. The shortened data-processing journey also has security benefits: it minimizes the risks of transmitting sensitive information, such as personal health records or financial transactions, over long distances.[34] However, it also underscores the importance of securing dispersed edge computing devices and network endpoints, which can become vulnerable entry points if not properly protected.
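One hardening step of the kind gestured at above can be sketched under illustrative assumptions (a per-device shared key and a simple JSON envelope; nothing here reflects a particular vendor's API): the edge device signs each outbound payload so the central collector can check which endpoint sent it and that it was not altered in transit.

# Minimal sketch of endpoint hardening for a dispersed edge device: sign each
# outbound payload with a per-device key so the collector can verify its origin.
import hashlib
import hmac
import json
import time

DEVICE_KEY = b"per-device-secret-provisioned-at-install"  # hypothetical secret

def sign_payload(device_id: str, payload: dict) -> dict:
    """Attach a timestamp and an HMAC-SHA256 tag to a reading before upload."""
    envelope = {"device": device_id, "ts": int(time.time()), "data": payload}
    message = json.dumps(envelope, sort_keys=True).encode()
    envelope["sig"] = hmac.new(DEVICE_KEY, message, hashlib.sha256).hexdigest()
    return envelope

def verify_payload(envelope: dict) -> bool:
    """Server-side check that the message came from a device holding the key."""
    received_sig = envelope.pop("sig")
    message = json.dumps(envelope, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(received_sig, expected)

if __name__ == "__main__":
    signed = sign_payload("pump-station-07", {"pressure_kpa": 412.6})
    print(verify_payload(dict(signed)))  # True for an untampered envelope

Message signing is only one layer; key provisioning, rotation, and transport encryption would still be needed for devices deployed in the field.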
Still, powerful miniature computing elements will likely debut at high price points, so private-sector development efforts should balance affordability with capability in AI-enabled edge devices to improve access for a broad range of consumers rather than maximize pure computing power.

1.2.8  Reconfiguring Network Architecture

Modernizing network architecture for an AI economy can start with reconfiguring existing network designs. The current cloud environment is often costly: achieving interconnectivity between carrier-neutral data centers and multiple hyperscalers requires purchasing many routes and ports connected to different locations, which inevitably drives up costs and increases network latency. Some global communication and Internet service providers have begun moving away from this classical design by upgrading their equipment and creating a direct



