Model Serving

AI Red Hat

OpenShift AI helps you set up an MLOps “conveyor belt”

Many developers struggle with putting a model into production. The difficulty of updating models in production often leads to hesitancy and delays. MLOps is about getting models into production without the process being painful or risky. With Red Hat OpenShift AI, data scientists can create models with their preferred tooling, create pipelines to […]

Read More
AI Red Hat

OpenShift AI as a foundation

Red Hat’s AI platform enables the development and deployment of models across the public cloud, data center, and edge. It manages cluster resource requests, such as scaling GPUs up and down, and fosters collaboration between developers and data scientists. It does this by expanding the DevOps tooling provided within OpenShift with additional capabilities such as model serving […]

Read More
AI Open Source Red Hat

A collaborative platform with OpenShift AI

Red Hat OpenShift AI provides a secure, reliable, and consistent platform across the hybrid cloud. Data scientists, MLOps engineers, and application developers can work together on the same cloud-native platform, leading to faster business insights with trusted outcomes. It delivers innovative AI/ML tooling from the open source world with the support and security of Red […]

Read More
AI Generative AI Red Hat

Unlock the Power of Generative AI with the Right Platform Stack

Taneem Ibrahim and Selbi Nuyryev describe the work in the Open Data Hub community on building an AI/ML stack for generative AI and foundation models. The stack combines Project CodeFlare with open source technologies such as Caikit, KubeRay, Text Generation Inference Server (TGIS), and KServe to prompt-tune pre-trained models and provide resource efficiency across the hybrid cloud. […]
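
The serving layer of this stack, KServe, exposes deployed models over the standard Open Inference (v2) REST protocol. Below is a minimal sketch of calling such an endpoint from Python; the endpoint URL, model name, and input tensor are hypothetical placeholders rather than values from the talk.

```python
import requests

# Hypothetical KServe inference endpoint exposed by an InferenceService.
# Replace the host and model name with your own deployment's values.
ENDPOINT = "https://flan-t5-demo.example.com"
MODEL_NAME = "flan-t5-small"

# KServe v2 (Open Inference Protocol) REST request: a list of named
# input tensors, each with a shape, datatype, and data payload.
payload = {
    "inputs": [
        {
            "name": "text_input",
            "shape": [1],
            "datatype": "BYTES",
            "data": ["Summarize: KServe serves models across the hybrid cloud."],
        }
    ]
}

response = requests.post(
    f"{ENDPOINT}/v2/models/{MODEL_NAME}/infer",
    json=payload,
    timeout=30,
)
response.raise_for_status()

# The response mirrors the request: a list of named output tensors.
for output in response.json()["outputs"]:
    print(output["name"], output["data"])
```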

Read More
AI Generative AI Red Hat

Get the Scoop on OpenShift AI from Red Hat’s Chris Wright

Chris Wright, Red Hat CTO, describes Red Hat OpenShift AI and how it extends Red Hat’s application platform to help data scientists and developers collaborate to bring AI-enabled applications into the enterprise. Leveraging DevOps principles, OpenShift AI works across the entire model development lifecycle to help AI/ML teams build, deploy, and monitor models across the […]

Read More
AI Red Hat

What is OpenShift AI?

Will McGrath gives a high-level overview of Red Hat OpenShift AI, a portfolio of products built on OpenShift that allows you to build, train, deploy, and life-cycle manage models across the hybrid cloud. As a core offering within the OpenShift AI family, Red Hat OpenShift Data Science provides MLOps tooling such as model […]

Read More
Databricks MLOps

Model Serving on the Lakehouse

Learn about model serving from a Databricks Lakehouse in this video from Databricks. Model Serving is built into the Databricks Lakehouse Platform and integrates with your lakehouse data, offering automatic lineage, governance, and monitoring across the data, feature, and model lifecycle. Simplify model deployment, reduce infrastructure overhead, and accelerate time to production. With built-in auto-scaling […]
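
For context, a deployed Databricks Model Serving endpoint is queried over REST at the workspace’s serving-endpoints invocations path. The sketch below assumes a hypothetical workspace URL, endpoint name, and access token, and a model that scores tabular records; treat it as an illustrative outline rather than the exact flow shown in the video.

```python
import os
import requests

# Hypothetical workspace URL and endpoint name; substitute your own.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
ENDPOINT_NAME = "churn-classifier"

# A personal access token (or service principal token) authorizes the call.
headers = {
    "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
    "Content-Type": "application/json",
}

# Score two records; 'dataframe_records' is one of the accepted input
# formats for serving endpoints backed by MLflow pyfunc models.
payload = {
    "dataframe_records": [
        {"tenure_months": 3, "monthly_charges": 70.5},
        {"tenure_months": 48, "monthly_charges": 20.0},
    ]
}

response = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers=headers,
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["predictions"])
```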

Read More