```sh
kubectl create namespace argo
kubectl apply -n argo -f https://raw.githubusercontent.com/argoproj/argo/stable/manifests/install.yaml
```
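Once the controller is installed, a first Workflow can be submitted directly with kubectl. The manifest below is a minimal sketch following the upstream hello-world example; it assumes the `argo` namespace created by the commands above.

```sh
# Submit a minimal hello-world Workflow to the argo namespace.
# kubectl create is used (not apply) because the manifest relies on generateName.
kubectl create -n argo -f - <<EOF
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-
spec:
  entrypoint: whalesay
  templates:
  - name: whalesay
    container:
      image: docker/whalesay:latest
      command: [cowsay]
      args: ["hello world"]
EOF
```

The optional `argo` CLI provides a more convenient interface for the same task, e.g. `argo submit -n argo --watch hello-world.yaml` to follow progress and `argo list -n argo` to list workflows.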
KubeCon 2018 in Seattle was the biggest KubeCon yet, with 8,000 developers attending. We connected with many existing and new Argoproj users and contributors, and gave away a lot of Argo T-shirts at our booth sponsored by Intuit!
We were also super excited to see KubeCon presentations about Argo by Argo developers, users and partners.
- CI/CD in Light Speed with K8s and Argo CD
  - How Intuit uses Argo CD.
- Automating Research Workflows at BlackRock
  - Why BlackRock created Argo Events and how they use it.
- Machine Learning as Code
  - How Kubeflow uses Argo Workflows as its core workflow engine and Argo CD to declaratively deploy ML pipelines and models.
If your organization actively uses Argo and would be interested in participating in the Argo Community, please ask a representative to contact [email protected] for additional information.
Argoproj is a collection of tools for getting work done with Kubernetes.
- Argo Workflows - Container-native Workflow Engine
- Argo CD - Declarative GitOps Continuous Delivery
- Argo Events - Event-based Dependency Manager
- Argo Rollouts - Deployment CR with support for Canary and Blue Green deployment strategies
Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo Workflows is implemented as a Kubernetes CRD (Custom Resource Definition).
- Define workflows where each step in the workflow is a container.
- Model multi-step workflows as a sequence of tasks or capture the dependencies between tasks using a graph (DAG); a minimal DAG sketch follows the feature list below.
- Easily run compute intensive jobs for machine learning or data processing in a fraction of the time using Argo Workflows on Kubernetes.
- Run CI/CD pipelines natively on Kubernetes without configuring complex software development products.
- Designed from the ground up for containers without the overhead and limitations of legacy VM and server-based environments.
- Cloud agnostic and can run on any Kubernetes cluster.
- Easily orchestrate highly parallel jobs on Kubernetes.
- Argo Workflows puts a cloud-scale supercomputer at your fingertips!
- DAG or Steps based declaration of workflows
- Artifact support (S3, Artifactory, HTTP, Git, raw)
- Step level input & outputs (artifacts/parameters)
- Loops
- Parameterization
- Conditionals
- Timeouts (step & workflow level)
- Retry (step & workflow level)
- Resubmit (memoized)
- Suspend & Resume
- Cancellation
- K8s resource orchestration
- Exit Hooks (notifications, cleanup)
- Garbage collection of completed workflows
- Scheduling (affinity/tolerations/node selectors)
- Volumes (ephemeral/existing)
- Parallelism limits
- Daemoned steps
- DinD (docker-in-docker)
- Script steps
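To make the steps/DAG and parameterization features above concrete, here is a minimal sketch of a diamond-shaped DAG with a workflow-level parameter, modeled after the upstream dag-diamond example; the task names, image, and parameter names are illustrative.

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-diamond-
spec:
  entrypoint: diamond
  arguments:
    parameters:
    - name: message          # workflow-level parameter, overridable at submit time
      value: hello
  templates:
  - name: diamond
    dag:
      tasks:
      - name: A
        template: echo
        arguments:
          parameters: [{name: msg, value: "{{workflow.parameters.message}} A"}]
      - name: B
        dependencies: [A]    # B and C run in parallel once A completes
        template: echo
        arguments:
          parameters: [{name: msg, value: "{{workflow.parameters.message}} B"}]
      - name: C
        dependencies: [A]
        template: echo
        arguments:
          parameters: [{name: msg, value: "{{workflow.parameters.message}} C"}]
      - name: D
        dependencies: [B, C] # D waits for both branches
        template: echo
        arguments:
          parameters: [{name: msg, value: "{{workflow.parameters.message}} D"}]
  - name: echo
    inputs:
      parameters:
      - name: msg
    container:
      image: alpine:3.7
      command: [echo, "{{inputs.parameters.msg}}"]
```

The same `echo` template could instead be wired together with a `steps` block for sequential or parallel groups of steps; the DAG form is generally the better fit when tasks have non-trivial fan-in/fan-out dependencies.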
As the Argo Community grows, we'd like to keep track of our users. Please send a PR with your organization name.
Currently officially using Argo:
- Adevinta
- Admiralty
- Adobe
- Alibaba Cloud
- BlackRock
- Canva
- Codec
- Commodus Tech
- CoreFiling
- Cratejoy
- Cyrus Biotechnology
- Datadog
- DataStax
- Equinor
- Fairwinds
- Gardener
- GitHub
- Gladly
- HOVER
- IBM
- InsideBoard
- Interline Technologies
- Intuit
- Karius
- KintoHub
- Localytics
- Maersk
- Max Kelsen
- Mirantis
- NVIDIA
- OVH
- Preferred Networks
- Quantibio
- Red Hat
- SAP Fieldglass
- SAP Hybris
- Styra
- Threekit
- Tiger Analytics
- Argo Ansible role: Provisioning Argo Workflows on OpenShift
- Argo Workflows vs Apache Airflow
- CI/CD with Argo on Kubernetes
- Running Argo Workflows Across Multiple Kubernetes Clusters
- Open Source Model Management Roundup: Polyaxon, Argo, and Seldon
- Producing 200 OpenStreetMap extracts in 35 minutes using a scalable data workflow
- Argo integration review
- TGI Kubernetes with Joe Beda: Argo workflow system
- Community meeting minutes and recordings
- Argo GitHub: https://github.com/argoproj
- Argo website: https://argoproj.github.io/
- Argo Slack: click here to join