Márton is a Google Developer Expert (GDE) on Cloud, a Champion in the Cloud Innovators program, a top user on Stack Overflow with 195k reputation, a software developer, an international speaker, and a passionate mountain hiker.
An active contributor to open-source projects such as the Beanstalkd console and the Riak admin interface. An expert in cloud technologies, scalability, high availability (HA), serverless, Cloud Run, Vertex AI, and database systems such as BigQuery, Redis, MySQL, and Elasticsearch. A mentor and consultant for startups.
Vertex AI: Pipelines for your MLOps workflows
In recent years, one of the biggest trends in application development has been the rise of Machine Learning solutions, tools, and managed platforms. Vertex AI is a managed, unified ML platform for all your AI workloads. On the MLOps side, Vertex AI Pipelines lets you take experiment pipelining beyond the classic build, train, evaluate, and deploy cycle. It is engineered for data scientists and data engineers, and it's a tremendous help for teams that don't have DevOps or sysadmin engineers, as infrastructure management overhead is almost completely eliminated.
Using practical examples, we will demonstrate how Vertex AI Pipelines scores high on developer experience, how it fits custom ML needs, and how to analyze the results. It is a toolset for a fully fledged machine learning workflow: a sequence of steps in the model development and deployment cycle, such as data preparation/validation, model training, hyperparameter tuning, model validation, and model deployment. Vertex AI comes with all the standard resources plus an ML metadata store, a fully managed feature store, and a fully managed pipeline runner.
Vertex AI Pipelines is a managed serverless toolkit, which means you don't have to fiddle with infrastructure or back-end resources to run workflows.
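To make the step sequence concrete, here is a minimal, library-free Python sketch of the pipeline idea: each workflow stage is a plain function, and the pipeline is simply their ordered composition. All function names and the toy "model" are illustrative assumptions; a real Vertex AI pipeline is defined with the Kubeflow Pipelines (KFP) SDK, where each step runs as a container and the platform wires outputs to inputs for you.

```python
# A library-free sketch of the classic ML workflow as a pipeline.
# Each stage consumes the previous stage's output; the step names
# are illustrative placeholders, not the KFP/Vertex SDK API.

def prepare_data(raw):
    # Data preparation/validation: drop obviously bad rows.
    return [x for x in raw if x is not None]

def train_model(dataset):
    # Stand-in "training": the model is just the mean of the data.
    return {"mean": sum(dataset) / len(dataset)}

def evaluate_model(model, dataset):
    # Model validation: mean absolute error against the data.
    mae = sum(abs(x - model["mean"]) for x in dataset) / len(dataset)
    return {"mae": mae}

def deploy_model(model, metrics, threshold=10.0):
    # Deploy only if validation passed, mirroring a conditional step.
    return "deployed" if metrics["mae"] < threshold else "rejected"

def run_pipeline(raw):
    # The pipeline is just the ordered wiring of the steps above.
    dataset = prepare_data(raw)
    model = train_model(dataset)
    metrics = evaluate_model(model, dataset)
    return deploy_model(model, metrics)

print(run_pipeline([1.0, None, 2.0, 3.0]))  # → deployed
```

In Vertex AI Pipelines, the same wiring is expressed declaratively, which is what lets the platform schedule each step on managed infrastructure and record every artifact in the ML metadata store.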