Really?! Is there a native way to deploy and lifecycle-manage (LCM) VMs, and to add self-service on top? In this post I will describe the art of the possible using the following tools:

- RHDH: Red Hat Developer Hub (Open source project: Backstage)
- OCP Virtualization: Red Hat OpenShift Virtualization (Open source project: KubeVirt)
- AAP: Red Hat Ansible Automation Platform (Open source projects: Ansible / AWX)
- RHEL BootC: Image mode for Red Hat Enterprise Linux (Open source project: bootc)
- GitOps: Red Hat OpenShift GitOps (Open source project: ArgoCD)
- Quay Registry, or any other OCI-compliant registry

All of these projects can run on Red Hat OpenShift (Open source project: OKD), on another Kubernetes distribution, or on VMs; you pick your underlying infrastructure. For this post I have used OpenShift for simplicity of deployment and tool integration, so I could focus narrowly on the use cases instead of on deploying the tools. The main goal here is to: easily deploy and lifecycle applications and stuff ...
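To make "natively deploying a VM" a little more concrete before diving in, here is a minimal sketch of the kind of object the self-service flow ultimately lands on the cluster: a KubeVirt VirtualMachine custom resource, submitted here with the Kubernetes Python client. This is not the post's own automation; the namespace, VM name, and container-disk image are illustrative placeholders.

```python
# Minimal sketch: create a KubeVirt VirtualMachine via the Kubernetes API.
# Names, namespace, and the image URL below are hypothetical examples.
from kubernetes import client, config

def create_vm(name: str = "demo-vm", namespace: str = "demo") -> dict:
    """Create a small KubeVirt VirtualMachine backed by a container disk."""
    config.load_kube_config()  # use config.load_incluster_config() inside a pod

    vm_manifest = {
        "apiVersion": "kubevirt.io/v1",
        "kind": "VirtualMachine",
        "metadata": {"name": name, "namespace": namespace},
        "spec": {
            "running": True,
            "template": {
                "metadata": {"labels": {"kubevirt.io/domain": name}},
                "spec": {
                    "domain": {
                        "devices": {
                            "disks": [
                                {"name": "containerdisk", "disk": {"bus": "virtio"}}
                            ]
                        },
                        "resources": {"requests": {"memory": "1Gi"}},
                    },
                    "volumes": [
                        {
                            "name": "containerdisk",
                            # e.g. a bootc-built disk image pulled from an OCI registry such as Quay
                            "containerDisk": {"image": "quay.io/example/rhel-bootc-disk:latest"},
                        }
                    ],
                },
            },
        },
    }

    # VirtualMachine is a custom resource, so it goes through the custom objects API.
    return client.CustomObjectsApi().create_namespaced_custom_object(
        group="kubevirt.io",
        version="v1",
        namespace=namespace,
        plural="virtualmachines",
        body=vm_manifest,
    )

if __name__ == "__main__":
    create_vm()
```

In the flow this post builds toward, a developer would not call the API directly like this; the RHDH self-service layer, GitOps, and Ansible would produce and lifecycle an equivalent resource behind the scenes.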
It is mid-2025 and the cogs of AI are turning at full speed. So we (Mobin and I) decided to do our own AI project. We called it "IntelliLogs".

IntelliLogs at a glance: Demo: https://www.youtube.com/watch?v=OXMlORwyMQk

In this post I will describe why we did what we did, what it is that we did, and how we did it. I will also share my personal experience. I hope this will, at least, be an interesting read.

Table of contents:

- Why IntelliLogs
- What is IntelliLogs
- How IntelliLogs was developed
- Future of IntelliLogs
- Conclusion
- References

Why IntelliLogs:

My personal motivations 💪 for this were:

- Explore and experience what an AI app looks like from an architectural and engineering perspective
- Explore the realm of huge LLMs (e.g. GPT-4.1-170B, Gemini Pro) vs small LLMs (e.g. granite-7b, gemma-4b)
- Explore the possibilities of tuning or making a model without being a data scientist: how easy or hard it is, what tools are available, etc.

We also wanted to tackle a "not too far from ...