AIStor extends the S3 API to include the ability to PROMPT objects

REDWOOD CITY, Calif., Nov. 13, 2024 /PRNewswire/ -- MinIO, the leader in high-performance object storage, today announced the release of AIStor, an evolution of its flagship Enterprise Object Store designed for the exascale data infrastructure challenges presented by modern AI workloads. AIStor provides new features, along with performance and scalability improvements, to enable enterprises to store all AI data in one infrastructure.

Recent research from MinIO underscores the importance of object storage in AI and ML workloads. Polling more than 700 IT leaders, MinIO found that the top three reasons motivating organizations to adopt object storage were to support AI initiatives and to deliver performance and scalability modeled after the public clouds. MinIO is the only storage provider solving this new class of AI-scale problems in private cloud environments. This has driven the company to build new AI-specific features, while also enhancing and refining existing functionality, specifically tailored to the scale of AI workloads.

"The launch of AIStor is an important milestone for MinIO. Our object store is the standard in the private cloud and the features we have built into AIStor reflect the needs of our most demanding and ambitious customers," said AB Perisamy, co-founder and CEO at MinIO. "It is not enough to just protect and store data in the age of AI, storage companies like ours must facilitate an understanding of the data that resides on our software. AIStor is the realization of this vision and serves both our IT audience and our developer community."

While MinIO was already designed to manage data at exascale, these advances make it easier to extend the power of AI across applications, introducing new functionality without needing to manage separate tools for analysis.


Of particular note is the introduction of a new S3 API, promptObject. This API enables users to "talk" to unstructured objects in the same way one would engage an LLM, moving the storage world from a PUT and GET paradigm to a PUT and PROMPT paradigm. Applications can use promptObject through function calling with additional logic, and calls can be chained so that multiple objects are addressed at the same time. For example, when querying a stored MRI scan, one can ask "where is the abnormality?" or "which region shows the most inflammation?" and promptObject will return the answer. The potential applications of this extension are nearly limitless. Application developers can dramatically expand the capabilities of their applications without requiring domain-specific knowledge of RAG models or vector databases, which will simplify AI application development while simultaneously making it more powerful.
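The release does not spell out the wire format for promptObject, so the following is a minimal illustrative sketch rather than the documented API: the endpoint URL, request body, response field, and object names are placeholder assumptions, and S3-style request signing is omitted.

```python
# Illustrative sketch only: the endpoint, request shape, and response field
# below are assumptions, not MinIO's documented promptObject API, and
# S3-style request signing is omitted for brevity.
import requests

AISTOR_ENDPOINT = "https://aistor.example.internal"     # placeholder deployment URL
BUCKET, KEY = "radiology", "scans/patient-042/mri.dcm"  # hypothetical object

def prompt_object(bucket: str, key: str, prompt: str) -> str:
    """Ask a natural-language question about a stored object (PUT and PROMPT)."""
    resp = requests.post(
        f"{AISTOR_ENDPOINT}/{bucket}/{key}",
        json={"prompt": prompt},          # assumed request body
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json().get("answer", "")  # assumed response field

if __name__ == "__main__":
    print(prompt_object(BUCKET, KEY, "Which region shows the most inflammation?"))
```

In this PUT-and-PROMPT model, the question is sent to the object itself rather than to a separate retrieval pipeline, which is what removes the need for application-side RAG plumbing.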

Additional new and enhanced capabilities in MinIO AIStor include:

  • AIHub: a private Hugging Face API-compatible repository for storing AI models and datasets directly in AIStor, enabling enterprises to create their own data and model repositories on the private cloud or in air-gapped environments without changing a single line of code. This eliminates the risk of developers leaking sensitive datasets or models (a client-side sketch follows this list).
  • Updated Global Console: a completely redesigned user interface for MinIO that provides extensive capabilities for Identity and Access Management (IAM), Information Lifecycle Management (ILM), load balancing, firewall, security, caching and orchestration, all through a single pane of glass. The updated console features a new MinIO Kubernetes operator that further simplifies the management of large scale data infrastructure where there are hundreds of servers and tens of thousands of drives.
  • Support for S3 over Remote Direct Memory Access (RDMA): enables customers to take full advantage of their high-speed (400GbE, 800GbE, and beyond) Ethernet investments for S3 object access by leveraging RDMA's low-latency, high-throughput capabilities, and provides performance gains required to keep the compute layer fully utilized while reducing CPU utilization.
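As a sketch of how the AIHub item above might be used from the client side, the snippet below points standard huggingface_hub tooling at a private endpoint via the HF_ENDPOINT environment variable; the endpoint URL and repository ID are placeholder assumptions, not values from the release, and token handling is omitted.

```python
# Sketch only: assumes AIStor's AIHub exposes a Hugging Face-compatible API at a
# placeholder endpoint; the repo ID is hypothetical and auth/token setup is omitted.
import os

# Point Hugging Face tooling at the private, in-cluster hub instead of huggingface.co.
os.environ["HF_ENDPOINT"] = "https://aistor.example.internal"

from huggingface_hub import HfApi, snapshot_download

api = HfApi()

# Push a locally fine-tuned model into the private hub (hypothetical repo ID).
api.create_repo("acme/llama-finetune", repo_type="model", exist_ok=True)
api.upload_folder(
    folder_path="./checkpoints/final",
    repo_id="acme/llama-finetune",
    repo_type="model",
)

# Later, pull the same model back down on another machine inside the air gap.
local_dir = snapshot_download("acme/llama-finetune", repo_type="model")
print("Model files downloaded to", local_dir)
```

Because the endpoint override is the only change, existing training and serving code that already uses the Hugging Face libraries can stay as-is, which is the "without changing a single line of code" claim in practice.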
"AMD EPYC processors provide unparalleled core density, exceptional storage performance and I/O, aligned with high-performance compute requirements," said Rajdeep Sengupta, Director of Systems Engineering, AMD. "We have deployed the MinIO offering to host our big data platform for structured, unstructured, and multimodal datasets. Our collaboration with MinIO optimizes AIStor to fully leverage our advanced enterprise compute technologies and address the growing demands of data center infrastructure."

The AIStor launch follows a series of major announcements in the AI storage space, starting with the release of MinIO DataPod, its reference architecture for large-scale AI data infrastructure. MinIO also demonstrated its compute extensibility with announcements supporting optimizations for Arm®-based chipsets and a customer win powering the Intel® Tiber™ AI Cloud. Collectively, these announcements frame MinIO's leadership in the AI storage space and its place at the center of the AI ecosystem.

For more on the research, see the MinIO blog.

To learn more about AIStor, visit www.min.io or read more on the blog at blog.min.io/aistor

About MinIO

MinIO is pioneering high-performance object storage for AI/ML and modern data lake workloads. The software-defined, Amazon S3-compatible object storage system is used by more than half of the Fortune 500. With 1.5B+ Docker downloads, MinIO is the fastest-growing cloud object storage company and is consistently ranked by industry analysts as a leader in object storage. Founded in November 2014, the company is backed by Intel Capital, Softbank Vision Fund 2, Dell Technologies Capital, Nexus Venture Partners, General Catalyst and key angel investors.

Media Contact:

Tucker Hallowell

[email protected]