Advancing Application Performance with NVMe Storage, Part 3


On a more practical note, the final article in this series looks at how different industries can directly benefit from NVMe storage.


NVMe Storage Use Cases

NVMe storage's high performance, combined with the capacity and data-availability advantages that shared NVMe storage holds over local SSDs, makes it a compelling fit for AI/ML infrastructures of any size. Several AI/ML-focused use cases are worth highlighting.

  • Financial Analytics – Financial services and financial technology (FinTech) firms are increasingly turning to automation and artificial intelligence to drive their investment decision-making. Using a mix of historical data and financial modeling, a single shared-storage platform can provide the horsepower required to predict future investment strategies for their financial customers.

  • Image Recognition in Manufacturing – Manufacturers have long used automation on their production lines to increase output, scaling from hundreds of units to thousands or even millions of units per hour. The financial impact of a quality issue on the production line can be devastating if it is not caught quickly. Real-time image recognition on photos of manufactured parts is essential both for determining whether a part meets the required quality standards and for catching systemic quality issues as they emerge.

  • Car Services – Ride-sharing apps have created a new paradigm in public transit, letting riders and drivers connect quickly and easily as needed. Ride-sharing companies use AI/ML for traffic modeling to position drivers where they are most needed, based on both past and current ride requests. This increases drivers' potential revenue by reducing drive times and raises customer satisfaction through shorter wait times, both of which improve the revenue potential of the ride-sharing company itself.

Beyond AI/ML, one vendor also provides more generalized computing services for its customers: storage capacity for cloud services, using OpenStack and Kubernetes in conjunction with NVMe storage for high-performance storage. The vendor also leverages NVMe storage for big data analytics, running Spark applications that perform several kinds of analytics tasks, such as SQL queries and data mining.
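As an illustrative sketch of how such NVMe-backed capacity might be consumed in Kubernetes (the vendor's actual configuration is not described in this article), a cluster can expose NVMe-oF volumes through a StorageClass that analytics pods then claim. The `nvme-of` class name and the provisioner string below are assumptions, not a real driver:

```yaml
# Hypothetical StorageClass backed by an NVMe-oF CSI driver.
# The provisioner name is illustrative only.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: nvme-of
provisioner: example.com/nvme-of-csi
---
# A claim that Spark or other analytics pods could mount.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: analytics-scratch
spec:
  accessModes: ["ReadWriteOnce"]
  storageClassName: nvme-of
  resources:
    requests:
      storage: 500Gi
```

The point of the StorageClass indirection is that pods stay unaware of the transport (NVMe/TCP, RDMA, etc.); the CSI driver handles dynamic provisioning of volumes over the fabric.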

Summary: Benefits of NVMe Storage for AI/ML

NVMe storage is well suited to a wide range of AI/ML workloads, particularly machine learning across multiple applications. With NVMe storage, you can:

  • Create and manage larger shared data sets for training – By separating storage capacity from the compute nodes, data sets for machine learning training can scale up to 1 PB. As the data set grows and more NVMe storage is brought online, performance grows as well instead of being capped by legacy storage-controller bottlenecks.

  • Overcome the capacity limitations of local SSDs in GPU nodes – With limited physical space for SSD media, GPU nodes can hold only so much data locally. With shared NVMe storage, NVMe volumes can be dynamically provisioned over high-performance Ethernet or InfiniBand networks.

  • Accelerate epoch time of machine learning by as much as 10x – By leveraging high-performance NVMe-oF, NVMe storage eliminates the latency bottlenecks of older storage protocols and unleashes the parallelism inherent to the NVMe protocol. Every GPU node has direct, parallel access to the media at the lowest possible latency.

  • Improve the utilization of GPUs – GPUs sitting idle while they wait for data are costly. By offloading storage access to otherwise-idle CPUs and delivering storage performance at the speed of local SSD, NVMe storage ensures that GPU nodes stay busy with fast access to data.
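The last point above can be sketched in plain Python: a bounded prefetch queue lets I/O for the next batch proceed while the current batch is being processed, which is the principle that keeps accelerators from idling when storage is fast enough to stay ahead. The function names and dummy batches are illustrative, not from any real framework:

```python
# Illustrative sketch (not vendor code): overlap storage reads with
# compute by prefetching batches on a background thread.
import queue
import threading
import time

def load_batch(i):
    """Stand-in for reading one training batch from NVMe storage."""
    time.sleep(0.01)          # simulated I/O latency
    return [i] * 4            # dummy batch of 4 items

def prefetcher(num_batches, q):
    for i in range(num_batches):
        q.put(load_batch(i))  # reads run ahead of compute
    q.put(None)               # sentinel: no more batches

def train(num_batches=8):
    q = queue.Queue(maxsize=2)  # bounded buffer of ready batches
    threading.Thread(
        target=prefetcher, args=(num_batches, q), daemon=True
    ).start()
    processed = 0
    while (batch := q.get()) is not None:
        # Stand-in for a GPU step; in practice this runs concurrently
        # with the next batch's I/O instead of waiting for it.
        processed += len(batch)
    return processed

print(train())  # 8 batches of 4 items -> 32
```

The bounded queue (`maxsize=2`) is the key design choice: it caps memory used by in-flight batches while still hiding storage latency behind compute.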

ai, big data, data storage, ml, nvme, performance

Opinions expressed by DZone contributors are their own.
