Mukesh Kumar
Member since 2022
Bronze League
3900 points
This workload aims to upskill Google Cloud partners to perform specific tasks associated with priority workloads. Learners will migrate data from PostgreSQL to Cloud SQL using the Database Migration Service.
This accelerated on-demand course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud, with a focus on Compute Engine. Through a combination of video lectures, demos, and hands-on labs, participants explore and deploy solution elements, including infrastructure components such as networks, virtual machines, and application services. You will learn how to use Google Cloud through the console and Cloud Shell. You'll also learn about the role of a cloud architect, approaches to infrastructure design, and virtual networking configuration with Virtual Private Cloud (VPC), Projects, Networks, Subnetworks, IP addresses, Routes, and Firewall rules.
This course teaches you how to create an image captioning model by using deep learning. You learn about the different components of an image captioning model, such as the encoder and decoder, and how to train and evaluate your model. By the end of this course, you will be able to create your own image captioning models and use them to generate captions for images.
This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course is estimated to take approximately 45 minutes to complete.
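As a rough illustration of the kind of model the course discusses, the sketch below loads a pretrained BERT encoder with a classification head using the open-source Hugging Face Transformers library (not part of this course, which uses Google Cloud tooling); the model name and label count are illustrative assumptions.

# A minimal sketch of BERT-based text classification, assuming the Hugging
# Face transformers and torch packages are installed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. positive/negative sentiment
)

# BERT reads the whole sentence at once; self-attention lets every token
# attend to every other token in both directions.
inputs = tokenizer("The self-attention mechanism is elegant.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities from the (not yet fine-tuned) head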
This course will help ML Engineers, Developers, and Data Scientists implement Large Language Models for Generative AI use cases with Vertex AI. The first two modules of this course contain links to videos and prerequisite course materials that will build your knowledge foundation in Generative AI. Please do not skip these modules. The advanced modules in this course assume you have completed these earlier modules.
The Generative AI Explorer - Vertex Quest is a collection of labs on how to use Generative AI on Google Cloud. Through the labs, you will learn how to use the models in the Vertex AI PaLM API family, including text-bison, chat-bison, and textembedding-gecko. You will also learn about prompt design and best practices, and how prompts can be used for ideation, text classification, text extraction, text summarization, and more. You will also learn how to tune a foundation model by training it via Vertex AI custom training and deploy it to a Vertex AI endpoint.
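For orientation, here is a minimal sketch of calling a text-bison model through the Vertex AI SDK for Python; the project ID, region, prompt, and parameter values are placeholders, and the labs walk through the full setup themselves.

# A minimal sketch of a Vertex AI PaLM API text generation call, assuming the
# google-cloud-aiplatform SDK is installed and credentials are configured.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")  # placeholder project/region

model = TextGenerationModel.from_pretrained("text-bison")
response = model.predict(
    "Suggest three names for a flower shop that sells bouquets of dried flowers.",
    temperature=0.2,          # lower values give more deterministic output
    max_output_tokens=256,
)
print(response.text)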
This course will introduce you to the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You will learn how attention works, and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering. This course is estimated to take approximately 45 minutes to complete.
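As a rough sketch of the idea, the NumPy snippet below implements scaled dot-product attention, the core computation the course builds on; the shapes and values are illustrative only.

# A minimal sketch of scaled dot-product attention using NumPy.
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Return the attention output and the attention weights."""
    d_k = queries.shape[-1]
    # Similarity between each query and every key, scaled by sqrt(d_k).
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over the key dimension turns scores into weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors.
    return weights @ values, weights

# Toy example: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
output, attn = scaled_dot_product_attention(q, k, v)
print(output.shape, attn.shape)  # (3, 8) (3, 4)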
Earn a skill badge by completing the Introduction to Generative AI, Introduction to Large Language Models and Introduction to Responsible AI courses. By passing the final quiz, you'll demonstrate your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This course introduces Vertex AI Studio, a tool to interact with generative AI models, prototype business ideas, and launch them into production. Through an immersive use case, engaging lessons, and a hands-on lab, you’ll explore the prompt-to-product lifecycle and learn how to leverage Vertex AI Studio for Gemini multimodal applications, prompt design, prompt engineering, and model tuning. The aim is to enable you to unlock the potential of gen AI in your projects with Vertex AI Studio.
This course gives you a synopsis of the encoder-decoder architecture, which is a powerful and prevalent machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you'll code a simple implementation of the encoder-decoder architecture from scratch in TensorFlow for poetry generation.
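For a sense of the structure the lab builds, here is a minimal Keras sketch of an encoder-decoder model; the vocabulary size, layer widths, and layer choices are assumptions, not the lab's actual code.

# A minimal sketch of an encoder-decoder (seq2seq) model in TensorFlow/Keras.
import tensorflow as tf

VOCAB, EMBED, UNITS = 5000, 128, 256  # illustrative sizes

# Encoder: embed the source sequence and summarize it into a final state.
enc_inputs = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(VOCAB, EMBED)(enc_inputs)
_, enc_state = tf.keras.layers.GRU(UNITS, return_state=True)(enc_emb)

# Decoder: generate the target sequence, initialized with the encoder state.
dec_inputs = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(VOCAB, EMBED)(dec_inputs)
dec_out = tf.keras.layers.GRU(UNITS, return_sequences=True)(dec_emb, initial_state=enc_state)
logits = tf.keras.layers.Dense(VOCAB)(dec_out)

model = tf.keras.Model([enc_inputs, dec_inputs], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()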
This is an introductory-level microlearning course aimed at explaining what responsible AI is, why it's important, and how Google implements responsible AI in its products. It also introduces Google's 7 AI principles.
This is an introductory-level microlearning course that explores what large language models (LLMs) are, the use cases where they can be utilized, and how you can use prompt tuning to enhance LLM performance. It also covers Google tools to help you develop your own Gen AI apps.
This course introduces diffusion models, a family of machine learning models that have recently shown promise in the image generation space. Diffusion models draw inspiration from physics, specifically thermodynamics. Within the last few years, diffusion models have become popular in both research and industry. Diffusion models underpin many state-of-the-art image generation models and tools on Google Cloud. This course introduces you to the theory behind diffusion models and how to train and deploy them on Vertex AI.
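To give a flavor of the theory, the sketch below implements the forward (noising) process that diffusion models learn to reverse; the variance schedule, step count, and array shapes are illustrative assumptions.

# A minimal sketch of the diffusion forward process using NumPy.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)      # linear variance schedule (illustrative)
alphas_bar = np.cumprod(1.0 - betas)    # cumulative signal retention

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(a_bar_t) * x_0, (1 - a_bar_t) * I)."""
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise

rng = np.random.default_rng(0)
x0 = rng.normal(size=(32, 32))          # stand-in for an image
print(q_sample(x0, t=10, rng=rng).std(), q_sample(x0, t=999, rng=rng).std())
# At small t the sample still resembles x0; at large t it is almost pure noise.
# A denoising network is then trained to run this process in reverse.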
This is an introductory-level microlearning course aimed at explaining what Generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers Google tools to help you develop your own Gen AI apps.
Through presentations, demonstrations, and whiteboard sessions, this course provides comprehensive skills for VM migration, from the initial assessment through the final implementation.
This is the second course in the Data to Insights course series. Here we will cover how to ingest new external datasets into BigQuery and visualize them with Looker Studio. We will also cover intermediate SQL concepts like multi-table JOINs and UNIONs, which will allow you to analyze data across multiple data sources. Note: Even if you have a background in SQL, there are BigQuery specifics (like handling query cache and table wildcards) that may be new to you. After completing this course, enroll in the Achieving Advanced Insights with BigQuery course.
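As a taste of the intermediate SQL covered, the sketch below runs a multi-table JOIN through the BigQuery Python client; the project, dataset, table, and column names are hypothetical placeholders.

# A minimal sketch of a multi-table JOIN in BigQuery, assuming the
# google-cloud-bigquery package is installed and credentials are configured.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

query = """
SELECT o.order_id, o.order_total, c.country
FROM `my-project.sales.orders` AS o
JOIN `my-project.sales.customers` AS c
  ON o.customer_id = c.customer_id
WHERE c.country = 'US'
LIMIT 10
"""

for row in client.query(query).result():
    print(row.order_id, row.order_total, row.country)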
Welcome to Cloud Composer, where we discuss how to orchestrate data lake workflows with Cloud Composer.
In this course, we examine the common challenges faced by data analysts and how to solve them with the big data tools on Google Cloud. You'll pick up some SQL along the way and become very familiar with using BigQuery and Dataprep to analyze and transform your datasets. This is the first course of the From Data to Insights with Google Cloud series. After completing this course, enroll in the Creating New BigQuery Datasets and Visualizing Insights course.
This quest offers hands-on practice with Cloud Data Fusion, a cloud-native, code-free, data integration platform. ETL Developers, Data Engineers, and Analysts can greatly benefit from the pre-built transformations and connectors to build and deploy their pipelines without worrying about writing code. This Quest starts with a quickstart lab that familiarizes learners with the Cloud Data Fusion UI. Learners then get to try running batch and real-time pipelines as well as using the built-in Wrangler plugin to perform some interesting transformations on data.
In this introductory-level course, you get hands-on practice with Google Cloud’s fundamental tools and services. Optional videos provide more context and review for the concepts covered in the labs. Google Cloud Essentials is a recommended first course for the Google Cloud learner - you can come in with little or no prior cloud knowledge, and come out with practical experience that you can apply to your first Google Cloud project. From writing Cloud Shell commands and deploying your first virtual machine, to running applications on Kubernetes Engine or with load balancing, Google Cloud Essentials is a prime introduction to the platform’s basic features.
This course explores how to leverage Looker to create data experiences and gain insights with modern business intelligence (BI) and reporting.
This advanced-level quest is unique among the other catalog offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer Certification. From BigQuery to Dataprep to Cloud Composer, this quest is composed of specific labs that will put your Google Cloud data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation, too. The exam is quite challenging and external studying, experience, and/or background in cloud data engineering is recommended. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of the Engineer Data in Google Cloud quest to receive an exclusive Google Cloud digital badge.
This course continues to explore the implementation of data load and transformation pipelines for a BigQuery Data Warehouse using Cloud Data Fusion.
Welcome to Cloud Data Fusion, where we discuss how to use Cloud Data Fusion to build complex data pipelines.
This advanced-level Quest builds on its predecessor Quest, and offers hands-on practice on the more advanced data integration features available in Cloud Data Fusion, while sharing best practices to build more robust, reusable, dynamic pipelines. Learners get to try out the data lineage feature as well to derive interesting insights into their data’s history.
In this course, you learn how to do the kind of data exploration and analysis in Looker that would formerly be done primarily by SQL developers or analysts. Upon completion of this course, you will be able to leverage Looker's modern analytics platform to find and explore relevant content in your organization’s Looker instance, ask questions of your data, create new metrics as needed, and build and share visualizations and dashboards to facilitate data-driven decision making.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity, access, and management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
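For context on the Beam model the course reviews, here is a minimal word-count pipeline in the Apache Beam Python SDK; it runs locally on the DirectRunner by default, and moving it to Dataflow is largely a matter of pipeline options rather than code changes. The input strings are illustrative.

# A minimal sketch of an Apache Beam pipeline, assuming apache-beam is installed.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # defaults to the local DirectRunner
    (
        pipeline
        | "Create" >> beam.Create(["good day", "good night", "bad day"])
        | "Split" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )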
In this course, you will get hands-on practice working through real-world challenges faced when building streaming data pipelines. The primary focus is on managing continuous, unbounded data with Google Cloud products.
In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting. Get hands-on practice using Dataflow for Apache Beam and Serverless for Apache Spark (Dataproc Serverless) for implementation, and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. A basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.
While the traditional approaches of using data lakes and data warehouses can be effective, they have shortcomings, particularly in large enterprise environments. This course introduces the concept of a data lakehouse and the Google Cloud products used to create one. A lakehouse architecture uses open-standard data sources and combines the best features of data lakes and data warehouses, which addresses many of their shortcomings.
Incorporating machine learning into data pipelines increases the ability to extract insights from data. This course covers ways machine learning can be included in data pipelines on Google Cloud. For little to no customization, this course covers AutoML. For more tailored machine learning capabilities, this course introduces Notebooks and BigQuery machine learning (BigQuery ML). Also, this course covers how to productionalize machine learning solutions by using Vertex AI.
Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML.
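As a rough illustration of the BigQuery ML portion, the sketch below creates and queries a logistic regression model from Python; the project, dataset, table, and column names are hypothetical placeholders.

# A minimal sketch of training and using a BigQuery ML model, assuming the
# google-cloud-bigquery package is installed and credentials are configured.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

create_model_sql = """
CREATE OR REPLACE MODEL `my-project.demo.purchase_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['purchased']) AS
SELECT country, pageviews, time_on_site, purchased
FROM `my-project.demo.web_sessions`
"""
client.query(create_model_sql).result()  # wait for training to finish

# Score new rows with ML.PREDICT once the model exists.
predict_sql = """
SELECT *
FROM ML.PREDICT(MODEL `my-project.demo.purchase_model`,
  (SELECT country, pageviews, time_on_site FROM `my-project.demo.web_sessions` LIMIT 5))
"""
for row in client.query(predict_sql).result():
    print(dict(row))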
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API.
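For a sense of what calling these ML APIs looks like, here is a minimal Cloud Natural Language API sentiment request in Python; it assumes credentials are already configured, and the input text is arbitrary.

# A minimal sketch of a Cloud Natural Language API call, assuming the
# google-cloud-language package is installed.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="The lab instructions were clear and easy to follow.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
sentiment = client.analyze_sentiment(request={"document": document}).document_sentiment
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")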
Complete the introductory Implementing Cloud Load Balancing for Compute Engine skill badge to demonstrate skills in the following: creating and deploying virtual machines in Compute Engine and configuring network and application load balancers.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.