success story

MLOps-enabled bottle manufacturing platform

Optimizing operations for beverage bottle manufacturing units with model training workflows built on Machine Learning Operations (MLOps) strategies and AI

challenge

The changing needs of the market demanded an upgrade of the existing systems. The client's existing AI implementation was disconnected from the data ingested from the edge device into the AWS storage bucket; part of the model training pipeline ran on the Azure cloud platform and the rest ran on-premises. The client needed to streamline these workflows and troubleshoot edge scenarios. Additionally, attaching the commercial model training platform license to the ML solution was a manual process. Lastly, the pipeline had to be built as a multi-tenant platform that the client's customers could use.

solution

Nagarro partnered with the client to build a custom, MLOps-enabled AI workflow for deployment on AWS. The solution was designed to be serverless, with compute costs incurred on a pay-as-you-go model. Data pre-processing was done on a Spark EMR cluster, and multiple model training runs were performed on AWS SageMaker. Deployment to the edge device (using SageMaker Neo, Edge Manager, and IoT Greengrass) is currently in progress. In the last phase, data ingestion and deployment will be performed on the cloud, as sketched below.
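
To make the serverless, pay-as-you-go training step more concrete, here is a minimal sketch of a Lambda-style handler that launches a SageMaker training job on demand via the standard boto3 API. It is an illustration under stated assumptions, not the client's actual implementation: the bucket names, training image URI, IAM role ARN, and event fields are hypothetical placeholders.

import time
import boto3

sagemaker = boto3.client("sagemaker")

def handler(event, context):
    # Tenant and job metadata are assumed to arrive in the triggering event
    # (e.g., from S3 or EventBridge); field names here are illustrative.
    tenant = event.get("tenant", "default")
    job_name = f"bottle-train-{tenant}-{int(time.time())}"

    # Launch a transient SageMaker training job: compute is billed only
    # while the job runs, which is what keeps the design pay-as-you-go.
    sagemaker.create_training_job(
        TrainingJobName=job_name,
        AlgorithmSpecification={
            "TrainingImage": event["training_image_uri"],  # custom training container (placeholder)
            "TrainingInputMode": "File",
        },
        RoleArn=event["execution_role_arn"],  # placeholder execution role
        InputDataConfig=[{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                # Output prefix of the Spark EMR pre-processing step (placeholder bucket).
                "S3Uri": f"s3://example-preprocessed-data/{tenant}/train/",
                "S3DataDistributionType": "FullyReplicated",
            }},
        }],
        OutputDataConfig={"S3OutputPath": f"s3://example-model-artifacts/{tenant}/"},
        ResourceConfig={"InstanceType": "ml.m5.xlarge", "InstanceCount": 1, "VolumeSizeInGB": 50},
        StoppingCondition={"MaxRuntimeInSeconds": 3600},
    )
    return {"training_job_name": job_name}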

outcome

Nagarro’s serverless MLOps implementation of the client’s AI model has given the client full control over the various AI steps in the end-to-end pipeline. The custom-tailored pipeline components are configurable to the solution's requirements. The multi-tenant application, built on MLOps and DevOps best practices and infrastructure-as-code schemes, has allowed the client to evaluate the total cost of ownership of the platform and thereby define the business strategies for offering the solution to end customers.
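
One way to picture the configurable, multi-tenant pipeline components described above is a per-tenant configuration object that parametrizes each run. The schema below is an illustrative assumption, not the platform's actual configuration model: isolating each tenant behind its own S3 prefixes and compute sizing keeps customer data separated and makes per-tenant cost of ownership straightforward to attribute.

from dataclasses import dataclass

@dataclass(frozen=True)
class TenantPipelineConfig:
    # Hypothetical per-tenant settings used to parametrize pipeline runs.
    tenant_id: str
    raw_data_prefix: str          # where edge-device data is ingested
    processed_data_prefix: str    # EMR pre-processing output
    model_artifact_prefix: str    # SageMaker training output
    training_instance_type: str = "ml.m5.xlarge"
    max_runtime_seconds: int = 3600

def config_for(tenant_id: str, bucket: str = "example-mlops-platform") -> TenantPipelineConfig:
    # Derive isolated, tenant-scoped prefixes from a shared bucket (placeholder name).
    return TenantPipelineConfig(
        tenant_id=tenant_id,
        raw_data_prefix=f"s3://{bucket}/{tenant_id}/raw/",
        processed_data_prefix=f"s3://{bucket}/{tenant_id}/processed/",
        model_artifact_prefix=f"s3://{bucket}/{tenant_id}/models/",
    )

In such a scheme, the same infrastructure-as-code templates and training handler can be reused for every customer, with only the tenant configuration changing per deployment.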