Why Data Centers Are The ‘New Wow’ for Developers and Software Architects

Developers are always looking for the ‘next big thing’ for their careers. NVIDIA's Ami Badani shows IDN how 5G, AI and autonomous technologies are set to revolutionize data centers, app development – and programmer opportunities.




“Adaptability is about the powerful difference between adapting to cope and adapting to win” - business strategist and author Max McKeown.

 

It’s a powerful message for businesses, as an explosion of rich content, AI and machine learning applications force companies large and small to dramatically reimagine how data centers process and store an increasing deluge of digital information.

 

Accelerating change across industries such as telecommunications, retail and financial services is creating a performance tax on traditional data centers, where the CPU handles all of the compute tasks. Cloud-based technologies such as virtualization and containers require a more sophisticated approach to networking, security, and storage.

 

This additional functionality consumes an increasing percentage of the server’s CPU cores, especially as network speeds increase. Few companies can afford to pay that increasing tax on compute loads, so many server makers, including Asus, Atos, Dell Technologies and Lenovo, have announced plans to integrate Data Processing Units (DPUs) that will share aspects of the heavy lifting.

Requirements for the Modern Data Center Open Developer Opportunities

The modern data center will require developers to architect software in which certain tasks are offloaded from CPUs onto GPUs, SmartNICs and programmable DPUs. For example, AI training and inferencing, sophisticated graphics, and big data analysis can be performed much more efficiently by GPUs than by CPUs.
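The GPU side of this offload pattern can be sketched in a few lines. The sketch below is illustrative, not a specific NVIDIA API: it assumes the optional CuPy library for the GPU path and falls back to NumPy so the same code runs on CPU-only machines.

```python
# Minimal sketch of CPU-to-GPU offload for a compute-heavy task.
# Assumes the optional CuPy library (an assumption, not required by
# the article); falls back to NumPy so the sketch runs anywhere.
import numpy as np

try:
    import cupy as xp  # GPU-backed, NumPy-compatible array library
    ON_GPU = True
except ImportError:
    xp = np            # CPU fallback keeps the sketch runnable
    ON_GPU = False

def batched_matmul(a, b):
    """Offload a large matrix multiply to the GPU when one is available."""
    result = xp.asarray(a) @ xp.asarray(b)
    # Copy the result back to host memory when it lives on the GPU.
    return result.get() if ON_GPU else result

a = np.random.rand(256, 512).astype(np.float32)
b = np.random.rand(512, 128).astype(np.float32)
out = batched_matmul(a, b)
print(out.shape)  # (256, 128)
```

The same shape of code applies to the AI workloads named above: the framework hides the device placement, and the CPU is left free for other work.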

 

Similarly, workloads related to networking, security, and storage are much better suited to running on a DPU instead of the server CPU. This expanded application acceleration will be needed for all enterprise workloads to provide an extra measure of performance, efficiency, and protection against threats like hackers.

 

As virtualization and scalability become more prevalent, CPUs will benefit from the decreased load, which will allow businesses to accelerate services.

 

For software developers, these changes will open a vast new world of opportunities.

Two Key ‘Reimagine’ Opportunities for Software Architects, Developers

At NVIDIA, we see two key areas that will empower developers and software architects to reimagine how they work and help them deliver new and innovative processes to their employers: accelerating AI and security at the edge.

 

Accelerating AI
Enterprises are increasingly trying to solve big problems to create new revenue opportunities. Machine learning and artificial intelligence are critical to that effort.

 

Take deep learning recommendation engines in the world of retail. Developers can write applications that offload recommendation training algorithms from the CPU to the GPU and offload data searching, movement or encryption to the DPU. These offloads free up the CPU to handle mission-critical tasks such as reducing customer churn and increasing sales by pairing the perfect sweater and shoes with a shirt a shopper may be considering purchasing.

 

Offloading AI work to the GPU (and AI infrastructure work to the DPU) frees the server CPU cores, which can then run many more new and interesting applications that are best handled by the CPU.

 

Security at the Edge
Companies will begin defining what “the edge” is and how much security is needed to stay reasonably safe at the edge. Autonomous driving is essentially a data center in the car, allowing the AI to make instantaneous decisions locally, while also being able to report back to the central data center for training and improved driving algorithms.

 

The same thing is occurring with robots in the factory, cameras in the warehouse, and medical equipment in the hospital, where there will be inference learning at the edge and iterative model training at the core. Just as 4G wireless spawned transformational change in transportation with Lyft and Uber, 5G will bring similarly transformational services and capabilities. For wireless carriers, new software will synchronize the timing of the tower signal and baseband units so that massive amounts of data can be downloaded regularly for inference learning.
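The edge-inference/core-training loop described above can be sketched as follows. Everything here is hypothetical scaffolding (the class names and the simple threshold "model" are stand-ins); a real deployment would run a trained neural network at the edge and use a proper transport back to the data center.

```python
# Illustrative sketch of the edge/core split: an edge node makes
# instantaneous local decisions while buffering raw samples, and a
# central "core" periodically retrains and pushes an updated model.
# The mean + 2*sigma threshold stands in for a real trained model.
from statistics import mean, stdev

class EdgeNode:
    def __init__(self, threshold):
        self.threshold = threshold
        self.buffer = []            # samples queued for the core

    def infer(self, reading):
        """Instant local decision -- no round trip to the data center."""
        self.buffer.append(reading)
        return reading > self.threshold

class Core:
    def retrain(self, samples):
        """Refit the model from data reported back by edge nodes."""
        return mean(samples) + 2 * stdev(samples)

edge, core = EdgeNode(threshold=10.0), Core()
decisions = [edge.infer(x) for x in [3.1, 4.0, 2.8, 15.0, 3.5]]
edge.threshold = core.retrain(edge.buffer)   # periodic model update
print(decisions)  # [False, False, False, True, False]
```

The key property is the split itself: inference stays local and fast, while the slower, data-hungry training step happens centrally and flows back as an updated model.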

 

As 5G networks roll out and edge servers become more prevalent, these new nodes will be vulnerable to breaches. Machine learning can be used to identify abnormal traffic on the network, signaling a breach, and then virtual firewall software running on DPUs can report and isolate it before it expands across the network.  
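The detect-and-isolate flow can be sketched in miniature. The z-score detector below is an illustrative stand-in for a trained ML model, and the isolate step stands in for DPU-hosted virtual firewall rules; none of this reflects a specific product API.

```python
# Toy sketch of the breach-containment flow described above: flag
# abnormal traffic with a simple statistical model, then "isolate"
# the offending host before the breach spreads across the network.
from statistics import mean, stdev

def find_anomalies(packets_per_sec, z_cutoff=2.0):
    """Return indices of hosts whose traffic deviates sharply from the norm."""
    mu, sigma = mean(packets_per_sec), stdev(packets_per_sec)
    return [i for i, rate in enumerate(packets_per_sec)
            if abs(rate - mu) / sigma > z_cutoff]

def isolate(host_id, blocked):
    """Stand-in for pushing a quarantine rule to DPU-hosted firewall software."""
    blocked.add(host_id)

traffic = [120, 115, 130, 125, 118, 9500, 122, 127]  # host 5 is compromised
blocked = set()
for host in find_anomalies(traffic):
    isolate(host, blocked)
print(blocked)  # {5}
```

Running the detector on the DPU rather than the host CPU means the containment logic keeps working even if the server itself is compromised.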

Conclusion: The Quest for Large-Scale, Big Data Processing Software Solutions

The key for developers looking for exciting new ways to solve large-scale big-data processing problems is to think about solution architectures that offload, accelerate and isolate server workloads.

 

This includes moving highly parallel tasks to the GPU and moving infrastructure-related workloads to the DPU. Security, networking, storage and server management tools like agentless telemetry are all ripe targets for software-defined, hardware-accelerated change.

 

Opportunities will abound. Companies that are reluctant to spend time and resources investing in their own AI-enabled applications or accelerated infrastructure, whether for financial reasons or otherwise, will begin turning to third-party providers for experimentation and implementation.

 

Developers will become key enablers of the modern data center by offering innovations that open access to accelerated software, infrastructure and security. The opportunities are enormous.

 


Ami Badani is vice president of marketing at NVIDIA, where she focuses on data center technologies and market-driven solutions. She comes to NVIDIA after serving as CMO of Cumulus Networks, a provider of enterprise-class networking software, which was acquired by NVIDIA in 2020.

 

[Ed. Note: NVIDIA is a provider of software libraries that take advantage of GPU accelerators and also offers an SDK for applications for DPUs.] 

 



