
Argo Ecosystem: Argo CD, Argo Workflows, Argo Events, Argo Rollouts, Argo Everything

In this extra video for our Argo CD Lightning course, let's look at the Argo Ecosystem in general, and learn which components of this ecosystem help with which challenges in Cloud Native world. Hire us to solve all your DevOps, Cloud Native and Public Cloud challenges: https://mkdev.me 💌 Subscribe to mkdev dispatch and get a chance to get personalized mug: https://mkdev.me/dispatch 🎙️ Check out our bi-weekly podcast DevOps Accents, directly from mkdev co-founders: https://mkdev.me/podcast #containers #kubernetes #openshift #cicd #argoevents #automation #argocd #argoworkflows #argorollouts

mkdev

10 months ago

It was a sunny day in Munich, well, sunny by Munich standards, anyway. I was happily sitting in my chair, feeling fulfilled by the fact that our Argo CD lightning course was complete and ready for the public. I just needed a confirmation from Alex that the course was released on one popular video course platform. And then I saw this. Apparently, the Argo CD lightning course is not long enough by the flawed standards of some services. Which means it's time to talk about the Argo ecosystem.

Argo CD is just one of many tools under the Argo umbrella. By this point, you should be well aware of what Argo CD is: it's a GitOps deployment tool. If we simplify, the only thing Argo CD does is take Kubernetes manifests and apply them to one or more Kubernetes clusters. It cannot do anything more complex than this.
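To make that concrete, here is a minimal sketch of an Argo CD Application manifest; the repository URL, path and namespaces are made-up placeholders, not something from the course:

```yaml
# Minimal Argo CD Application sketch; repo URL, path and namespaces are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/example/gitops-repo.git  # hypothetical GitOps repository
    targetRevision: main
    path: apps/my-app
  destination:
    server: https://kubernetes.default.svc   # the cluster Argo CD runs in
    namespace: my-app
  syncPolicy:
    automated:        # keep the cluster in sync with Git automatically
      prune: true
      selfHeal: true
```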
But a fancy kubectl apply alone is not enough to build a complete, majestic pipeline. We need to build, test, package and push the new application version. To help with these activities, there is the Argo Workflows project. The idea behind it is to give you the building blocks to run any kind of workflow on top of your Kubernetes clusters. Some of the use cases include machine learning jobs and infrastructure automation, but most interesting to us is that it can be used to build CI/CD pipelines. Argo Workflows has many batteries included: a separate CLI, a web dashboard, access control, retries, various hooks, artifact support and so on. You can build quite powerful workflows on top of this system, and the documentation claims it can scale to thousands of workflows a day, with tens of thousands of nodes.
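As a rough illustration, here is a minimal Workflow sketch with two sequential container steps. The images and commands are placeholders, and a real CI pipeline would also clone the repository and handle artifacts:

```yaml
# Minimal Argo Workflows sketch with two sequential steps; images and commands are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: ci-example-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: build          # first step: build the code
            template: build
        - - name: test           # second step: run the tests
            template: test
    - name: build
      container:
        image: golang:1.22       # hypothetical build image
        command: [sh, -c]
        args: ["echo 'building the application...'"]
    - name: test
      container:
        image: golang:1.22
        command: [sh, -c]
        args: ["echo 'running the tests...'"]
```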
The only battery that is not really included in Argo Workflows is a way to trigger those workflows. Sure, it's fun to submit workflows with the CLI, but that's not how you do it in real life. In real life, at least for a CI/CD pipeline, you want your workflows to be triggered automatically, on some event, like an opened pull request.

Worry not, Argo has you covered on this front as well. There is a separate Argo Events project, the whole purpose of which is to receive and process events. It supports more than 20 different event sources, starting from simple ones, like GitHub webhook events, and including cloud-provider-specific ones, like SQS or GCP Pub/Sub. You can even write your own custom event processors. Argo Events has a relatively simple architecture. Event Sources process events from external systems and write them into the event bus, powered by NATS or, optionally, Kafka. Sensors process those events and trigger something: they can trigger a Lambda function, send a Kafka message and, most importantly, trigger Argo Workflows.
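To make that architecture more concrete, here is a hedged sketch of the two pieces: an EventSource listening for GitHub webhooks and a Sensor that submits a Workflow when an event arrives. Names like github-events, example-org and ci-pipeline are invented for the example, and exact fields may vary between Argo Events versions:

```yaml
# Sketch of an EventSource receiving GitHub webhooks; org, repo, URL and secret names are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: github-events
spec:
  github:
    example:
      repositories:
        - owner: example-org
          names:
            - example-repo
      webhook:
        endpoint: /push
        port: "12000"
        method: POST
        url: https://webhooks.example.com   # hypothetical public endpoint for GitHub to call
      events:
        - push
        - pull_request
      apiToken:
        name: github-access                 # hypothetical Secret holding a GitHub token
        key: token
---
# Sketch of a Sensor that submits an Argo Workflow when the event above fires.
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: github-sensor
spec:
  dependencies:
    - name: github-dep
      eventSourceName: github-events
      eventName: example
  triggers:
    - template:
        name: run-ci
        argoWorkflow:
          operation: submit                 # create a new Workflow when the dependency is met
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: ci-
              spec:
                workflowTemplateRef:
                  name: ci-pipeline         # hypothetical WorkflowTemplate with the actual pipeline
```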
If you want a whole CI/CD system, you will need both Argo Events and Argo Workflows. They share a single interface, so to the end user they look like one well-integrated system, while the underlying infrastructure is a bit more decoupled. You'd need to configure Argo Events to process events from GitHub, Bitbucket or GitLab and to create Argo Workflows, which would then clone the repository, build and test the code and, probably, push some changes to a GitOps-style repository, so that Argo CD picks up and rolls out the changes.

Talking about rollouts, there is another Argo project called Argo Rollouts. Rollouts is a much smaller project than Workflows, Events or CD. In a nutshell, Argo Rollouts is a replacement for Kubernetes Deployments. It's a separate CustomResourceDefinition that gives you the ability to perform canary and blue-green deployments by integrating with external systems for ingress and metrics. It also has a few smaller features, like built-in deployment notifications and a simple deployment dashboard.
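As a sketch, a Rollout looks almost exactly like a Deployment, plus a strategy section; the image, labels and canary steps below are placeholders:

```yaml
# Minimal Argo Rollouts canary sketch; image, labels and step weights are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Rollout
metadata:
  name: my-app
spec:
  replicas: 5
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: example/my-app:1.0.0   # hypothetical application image
  strategy:
    canary:
      steps:
        - setWeight: 20                 # send 20% of traffic to the new version
        - pause: {duration: 10m}        # wait before continuing the rollout
        - setWeight: 50
        - pause: {duration: 10m}
```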
Finally, if you are just looking for a simpler way to update your deployments to a new image version, there is a separate Argo CD component named Argo CD Image Updater. It will connect to your registry, discover new tags and update your applications to use the new tag; or, optionally, it will push a commit with the new image tag to the repository where your Argo CD Application is defined.
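Image Updater is configured through annotations on the Application itself. Here is a hedged sketch; the image name and the "app" alias are placeholders, and the git write-back method is the commit-based mode mentioned above:

```yaml
# Sketch of Argo CD Image Updater annotations on an Application; image name and alias are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: my-app
  namespace: argocd
  annotations:
    argocd-image-updater.argoproj.io/image-list: app=example/my-app
    argocd-image-updater.argoproj.io/app.update-strategy: semver   # follow semantic version tags
    argocd-image-updater.argoproj.io/write-back-method: git        # commit the new tag back to Git
```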
To sum it up, the Argo ecosystem is quite large and diverse. Argo CD is probably its most popular part, but as of today Argo covers all the main use cases around building and deploying your applications, and it can also serve as a generic job execution engine and event processor. You might start with just Argo CD for some GitOps automation, but at some point you might benefit from Argo Workflows and Events. The obvious benefit is that you will be using a collection of tools from the same ecosystem, tools that are built to work with each other, which is a bit better than duct-taping multiple unrelated open source projects together.

That's it for this video. If you liked it, please press the like button and subscribe to our YouTube channel. We also have a bi-weekly newsletter, mkdev dispatch, about all things DevOps, with insights directly from the co-founders of mkdev, and a bi-weekly podcast, DevOps Accents, where we discuss industry news and trends, as well as share our experience and learnings. Thanks for watching!
