MLOps

AI MLOps Red Hat

AI 101: What’s the difference between DevOps and MLOps?

Prasanth Anbalagan (Senior Principal Technical Marketing Manager for AI) will share the differences and similarities between DevOps and MLOps.

Read More
AI MLOps

How to Navigate MLOps with Third-Party Foundation Models

In the fast-moving ecosystem of artificial intelligence (AI) and machine learning (ML), the integration of third-party foundation models into the development pipeline has emerged as a pivotal strategy for accelerating innovation and enhancing model performance. However, effectively incorporating these advanced models into the Machine Learning Operations (MLOps) workflow requires a nuanced understanding of both the technical […]

Read More
AI Open Source Red Hat

Unlocking the Potential of AI: The Role of Consistent Platforms

AI isn’t going away. OpenShift AI helps customers solve some of their challenges by operationalizing both traditional and generative AI projects. Why should customers think of Red Hat for AI? Steven Huels, GM for Red Hat’s AI Business Unit, describes how OpenShift AI provides a consistent and flexible AI platform for an ever-changing ecosystem as […]

Read More
AI Generative AI Large Language Models MLOps

LLMOps Demystified: A Deep Dive into Large Language Model Operations

Machine learning operations (MLOps) is an important process for keeping machine learning applications operational, but before you apply the same process to your large language models (LLMs), Martin explains why and how LLMs need to be treated differently, and introduces the process known as LLMOps.

Read More
AI MLOps Open Source Red Hat

Breaking Barriers: How OpenShift AI Connects MLOps and DevOps

The use of generative AI to create meaningful services that help businesses and people has accelerated over the last year. OpenShift AI bridges the gap between data scientists who are creating the models in the MLOps world and the application developers creating applications in the DevOps world. Learn more: https://red.ht/openshift_ai

Read More
AI MLOps

What is AIOps and How is it Different from MLOps?

BAILeY explains AIOps and what makes it different from MLOps.

Read More
AI Red Hat

OpenShift AI helps you set up an MLOps “conveyor belt”

Many developers struggle with deploying a model. The difficulty of updating models in production often leads to hesitancy and delays. MLOps deals with putting models into production without the process being painful or risky. With Red Hat OpenShift AI, data scientists can create models with their preferred tooling, create pipelines to […]

Read More
AI Red Hat

OpenShift AI as a foundation

Red Hat’s AI platform enables the development and deployment of models across the public cloud, data center, and edge. It manages cluster resource requests, such as scaling GPUs up and down, and fosters collaboration between developers and data scientists. It does all of this by extending the DevOps tooling provided within OpenShift, adding capabilities such as model serving […]

Read More
AI Open Source Red Hat

A collaborative platform with OpenShift AI

Red Hat OpenShift AI provides a secure, reliable, and consistent platform across the hybrid cloud. Data scientists, MLOps engineers, and application developers can work together on the same cloud-native platform, leading to faster business insights with trusted outcomes. The platform provides innovative AI/ML tooling from the open source world, but with the support and security of Red […]

Read More
AI Large Language Models MLOps

Foundation Models in the Modern Data Stack

This video is from MLOps.community. Abstract: As Foundation Models (FMs) continue to grow in size, innovations continue to push the boundaries of what these models can do on language and image tasks. This talk describes our work on applying foundation models to structured data tasks like data linkage, cleaning, and querying. We discuss the […]

Read More