Simplismart has built the fastest inference engine, beating the likes of TogetherAI and FireworksAI with less than $1 million in funding; its platform lets enterprises deploy AI with control over cost and performance tradeoffs, helping them justify the ROI of their AI-powered projects

Bangalore, India, Oct. 17, 2024 (GLOBE NEWSWIRE) -- OpenAI is projected to generate over $10 billion in revenue next year, a clear sign that the adoption of generative AI is accelerating. Yet most companies struggle to deploy large AI models in production: with the steep costs and complexities involved, an estimated 90% of machine learning projects never make it to production. Addressing this pressing issue, Simplismart is today announcing a $7 million funding round for its infrastructure that enables organizations to deploy AI models seamlessly. Just as the shift to cloud computing relied on tools like Terraform, and mobile app development was fueled by Android, Simplismart is positioning itself as the critical enabler for AI's transition into mainstream enterprise operations.

The Series A funding round was led by Accel with participation from Shastra VC, Titan Capital, and high-profile angels, including Akshay Kothari, Co-Founder of Notion. This tranche, more than ten times the size of the company's previous round, will fuel R&D and growth for its enterprise-focused MLOps orchestration platform.

The company was co-founded in 2022 by Amritanshu Jain, who tackled cloud infrastructure challenges at Oracle Cloud, and Devansh Ghatak, who honed his expertise in search algorithms at Google Search. In just two years, with under $1 million in initial funding, Simplismart has outperformed public benchmarks by building the world's fastest inference engine. The engine allows organizations to run machine learning models at lightning speed, significantly boosting performance while driving down costs.

Simplismart's inference engine gives users optimized performance across all their model deployments. For example, its software-level optimization runs Llama 3.1 (8B) at an impressive throughput of more than 440 tokens per second. While most competitors focus on hardware optimization or cloud computing, Simplismart has engineered this breakthrough in speed within a comprehensive MLOps platform tailored for on-prem enterprise deployments, agnostic to the choice of model and cloud platform.


"Building generative AI applications is a core need for enterprises today. However, the adoption of generative AI is far behind the rate of new developments. It's because enterprises struggle with four bottlenecks: lack of standardized workflows, high costs leading to poor ROI, data privacy, and the need to control and customise the system to avoid downtime and limits from other services,” said Amritanshu Jain, Co-Founder and CEO at Simplismart

Simplismart's platform offers organizations a declarative language (similar to Terraform) that simplifies fine-tuning, deploying, and monitoring genAI models at scale. Third-party APIs often bring concerns around data security, rate limits, and a lack of flexibility, while deploying AI in-house comes with its own set of hurdles: access to computing power, model optimisation, scaling infrastructure, CI/CD pipelines, and cost efficiency, all requiring highly skilled machine learning engineers. Simplismart's end-to-end MLOps platform standardizes these orchestration workflows, allowing teams to focus on their core product needs rather than spending countless engineering hours building this infrastructure.
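To illustrate the general idea of declarative deployment (a hypothetical sketch in Python, not Simplismart's actual syntax or API), a model deployment can be described as data rather than imperative code: the model, the hardware, the scaling policy, and the monitoring thresholds, with an orchestrator left to reconcile real infrastructure against that desired state.

    # Hypothetical illustration only; names and fields are assumptions,
    # not Simplismart's real configuration schema.
    from dataclasses import dataclass, field

    @dataclass
    class ScalingPolicy:
        min_replicas: int = 1
        max_replicas: int = 4
        target_gpu_utilization: float = 0.7  # scale out above this threshold

    @dataclass
    class Deployment:
        model: str                            # open-source checkpoint to serve
        gpu: str                              # accelerator type requested
        quantization: str = "none"            # optional optimisation knob
        scaling: ScalingPolicy = field(default_factory=ScalingPolicy)
        p95_latency_budget_ms: int = 200      # alert if p95 latency exceeds this

    # The deployment is declared as desired state; an orchestrator (not shown)
    # would provision, scale, and monitor infrastructure to match it.
    spec = Deployment(model="llama-3.1-8b-instruct", gpu="A100-80GB",
                      quantization="fp8")
    print(spec)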

Amritanshu Jain added: "Until now, enterprises could leverage off-the-shelf capabilities to orchestrate their MLOps workloads since the quantum of workloads, be it the size of data, model or compute required, was small. As the models get larger and the workload increases, it will be imperative to have command over the orchestration workflows. Every new technology goes through the same cycle: exactly what Terraform did for cloud, Android Studio for mobile, and Databricks/Snowflake did for data."

"As GenAI undergoes its Cambrian explosion moment, developers are starting to realise that customizing & deploying open-source models on their infrastructure carries significant merit; it unlocks control over performance, costs, customizability over proprietary data, flexibility in the backend stack, and high levels of privacy/security”, said Anand Daniel, Partner at Accel. "We were happy to see that Simplismart's team saw this opportunity quite early, but what blew us away was how their tiny team had already begun serving some of the fastest-growing GenAI companies in production. It furthered our belief that Simplismart has a shot at winning in the massive but fiercely competitive global AI infrastructure market.”

Solving MLOps workflows will allow more enterprises to deploy genAI applications with greater control, managing the tradeoff between performance and cost to suit their needs. Simplismart believes that providing enterprises with granular Lego blocks to assemble their inference engine and deployment environments is key to driving adoption.

Ends 

Notes to the editor

Media images can be found here

About Simplismart

Simplismart (https://www.simplismart.ai/) is a cloud- and model-agnostic MLOps workflow orchestration platform that helps organizations fine-tune, deploy, and observe models at scale using a declarative, standardized language similar to Terraform (developed by HashiCorp). Its suite includes the fastest inference engine globally, enabling performant serving for open-source models across modalities. Simplismart fits in as the tooling layer that helps organizations navigate the grid-search hell of monitoring cost vs. latency tradeoffs for genAI models in production. The platform also includes many other features that help enterprises manage their entire production pipeline with a single tool: scaling, benchmarking, managing SLAs, etc.

About Accel 

Accel is a global venture capital firm that aims to be the first partner to exceptional teams everywhere (Facebook, Flipkart, etc.), from inception through all phases of private company growth. Accel has been operating in India since 2008, and its investments include companies like BookMyShow, Browserstack, Flipkart, Freshworks, FalconX, Infra.Market, Chargebee, Clevertap, Cure Fit, Musigma, Moneyview, Mensa Brands, Myntra, Moglix, Ninjacart, Swiggy, Stanza Living, Urban Company, Zetwerk, and Zenoti, among many others. We help ambitious entrepreneurs build iconic global businesses. For more, visit: www.accel.com

CONTACT: For further information please contact the Simplismart press office: Bilal Mahmood on [email protected] or +44 (0) 771 400 7257.