Sarvesh
Member since 2024
Diamond League
64920 points
Complete the introductory Implementing Cloud Load Balancing for Compute Engine skill badge to demonstrate skills in the following: creating and deploying virtual machines in Compute Engine and configuring network and application load balancers.
Earn a skill badge by completing The Basics of Google Cloud Compute course, where you learn how to work with virtual machines (VMs), persistent disks, and web servers using Compute Engine.
Complete the introductory Monitor and Manage Google Cloud Resources skill badge to demonstrate skills in the following: granting and revoking IAM permissions; installing monitoring and logging agents; creating, deploying, and testing an event-driven Cloud Run function.
Complete the introductory Monitoring in Google Cloud skill badge course to demonstrate skills in the following: using Cloud Monitoring tools to monitor resources on Google Cloud.
Earn a skill badge by completing the Analyze Sentiment with Natural Language API quest, where you learn how the API derives sentiment from text.
Complete the introductory Create a Secure Data Lake on Cloud Storage skill badge course to demonstrate skills in the following: securing and configuring a Cloud Storage bucket, using Gemini for text generation, managing IAM access control, and establishing a Dataplex lake for data governance.
Complete the introductory Secure BigLake Data skill badge course to demonstrate skills with IAM, BigQuery, BigLake, and Dataplex to create and secure BigLake tables.
Earn a skill badge by completing the Implement Cloud Storage and Data Protection Solutions course, where you learn to create a Cloud Storage bucket, use the command line, and protect objects with Bucket Lock.
Earn a skill badge by completing the Implement Event-Driven Messaging and Automation Workflows skill badge course, where you learn how to use Pub/Sub through the Cloud console, how Cloud Scheduler jobs can save you effort, and when Pub/Sub Lite can save you money on high-volume event ingestion.
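The publish/subscribe pattern at the heart of this course decouples message producers from consumers. This toy in-memory sketch illustrates the pattern only — the class and method names are invented here and are not the google-cloud-pubsub client API, which talks to a managed, distributed service:

```python
from collections import defaultdict
from typing import Callable

class MiniPubSub:
    """Toy in-memory illustration of the publish/subscribe pattern.

    Conceptual sketch only: the real Pub/Sub service is accessed
    through the google-cloud-pubsub client library.
    """

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[str], None]) -> None:
        # Register a callback to receive every message on the topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: str) -> int:
        # Fan the message out to all subscribers; return delivery count.
        for callback in self._subscribers[topic]:
            callback(message)
        return len(self._subscribers[topic])

bus = MiniPubSub()
received = []
bus.subscribe("orders", received.append)
bus.publish("orders", "order-123 created")
```

The key property shown is that the publisher never references its subscribers directly — it only names a topic, which is what lets Pub/Sub (and Cloud Scheduler jobs publishing to it) scale producers and consumers independently.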
Earn a skill badge by completing the Configure your Workplace: Google Workspace for IT Admins quest, where you will try out the Admin role for Workspace and learn to provision groups, manage applications and security, and administer Meet. A skill badge is an exclusive digital badge issued by Google Cloud that recognizes your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge quest and the final assessment challenge lab to receive a digital badge that you can share with your network.
Earn an introductory skill badge by completing the Implement Cloud Collaboration and Productivity Workflows course, where you will get introduced to Google's collaborative platform and learn to use Gmail, Calendar, Meet, Drive, Sheets, and AppSheet.
Complete the intermediate Use Functions, Formulas and Charts in Google Sheets skill badge course to demonstrate skills in the following: analyzing data with functions; visualizing data using charts; and searching, validating, formatting, and displaying data.
Earn a skill badge by completing the Analyze Speech and Language with Google APIs quest, where you learn how to use the Natural Language and Speech APIs in real-world settings.
Complete the intermediate Perform Predictive Data Analysis in BigQuery skill badge course to demonstrate skills in the following: creating datasets in BigQuery by importing CSV and JSON files; and harnessing the power of BigQuery with sophisticated SQL analytical concepts, including using BigQuery ML to train an expected goals model on soccer event data and evaluate the impressiveness of World Cup goals.
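A BigQuery ML training statement for an expected goals model of the kind this course describes looks roughly like the following. The dataset, table, and column names here are hypothetical stand-ins, not the course's actual schema:

```python
# Hypothetical BigQuery ML statement training an expected goals (xG)
# model: logistic regression predicting whether a shot scores.
# Dataset/table/column names are illustrative, not the course's schema.
TRAIN_XG_MODEL = """
CREATE OR REPLACE MODEL `soccer.xg_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['is_goal']) AS
SELECT
  shot_distance_m,
  shot_angle_deg,
  is_goal
FROM `soccer.shot_events`
"""

# Once trained, ML.PREDICT scores new shots with the model.
SCORE_SHOTS = """
SELECT *
FROM ML.PREDICT(MODEL `soccer.xg_model`,
                (SELECT shot_distance_m, shot_angle_deg
                 FROM `soccer.new_shots`))
"""
```

The appeal of this approach is that both training and inference stay inside SQL, so no data leaves BigQuery and no separate model-serving infrastructure is needed.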
Earn the intermediate skill badge by completing the Classify Images with TensorFlow on Google Cloud course, where you learn how to use TensorFlow and Vertex AI to create and train machine learning models. You primarily interact with Vertex AI Workbench user-managed notebooks.
Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML.
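An extract, transform, load (ETL) workflow of the sort described above reduces to three small steps. This self-contained sketch uses toy in-memory data in place of Cloud Storage, Dataflow, and BigQuery, purely to show the shape of the pipeline:

```python
import csv
import io

# Extract: in production this would read from Cloud Storage; an
# in-memory CSV keeps the sketch self-contained.
RAW = "user_id,amount\n1,10.5\n2,3.0\n1,4.5\n"

def extract(raw: str) -> list:
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> dict:
    # Aggregate total amount per user (the kind of step Dataflow
    # would run at scale over unbounded input).
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict) -> list:
    # Load: in production this would write rows to a BigQuery table;
    # here we just emit sorted (user_id, total) pairs.
    return sorted(totals.items())

result = load(transform(extract(RAW)))
```

Keeping each stage a pure function, as here, mirrors how Dataflow pipelines compose transforms and makes each step independently testable.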
This course provides hands-on experience with Google Cloud's Search for Retail, focusing on practical skills in setting up and managing retail search functionalities using APIs and console configurations. Participants will engage with real-world scenarios to learn how to import product data, manage user events, configure search parameters, and optimize search results within a retail environment.
In this course, you'll use text embeddings for tasks like classification, outlier detection, text clustering and semantic search. You'll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) solutions, such as for question-answering systems, using Google Cloud's Vertex AI and Google Cloud databases.
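Semantic search over embeddings boils down to nearest-neighbor lookup by cosine similarity. This standalone sketch uses made-up 3-dimensional vectors in place of real embeddings (which a production system would obtain from an embedding model, e.g. on Vertex AI):

```python
import math

# Toy embedding table -- the vectors are invented for illustration;
# real text embeddings are produced by a model and have hundreds of
# dimensions.
DOCS = {
    "cats are small pets": [1.0, 0.0, 0.0],
    "dogs are loyal pets": [0.7, 0.7, 0.0],
    "the stock market fell": [0.0, 0.0, 1.0],
}

def cosine(a, b):
    # Cosine similarity: dot product normalized by vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, docs):
    # Return the document whose embedding is closest to the query's.
    return max(docs, key=lambda d: cosine(query_vec, docs[d]))

# A query embedding that lands near the "cats" region of the toy space.
best = semantic_search([0.9, 0.1, 0.0], DOCS)
```

In a RAG system, the retrieved document would then be inserted into the LLM prompt as grounding context before generating an answer.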
Demonstrate the ability to create and deploy deterministic virtual agents using Dialogflow CX, and augment responses by grounding results on your own data, integrating with Vertex AI Agent Builder data stores and leveraging Gemini for summarization. You will use the following technologies and Google Cloud services: Vertex AI Agent Builder, Dialogflow CX, and Gemini.
In this course, you will get hands-on experience working through real-world challenges faced when building streaming data pipelines. The primary focus is on managing continuous, unbounded data with Google Cloud products.
Demonstrate your ability to implement updated prompt engineering techniques and utilize several of Gemini's key capabilities, including multimodal understanding and function calling. Then integrate generative AI into a RAG application deployed to Cloud Run. This course contains labs that are to be used as a test environment; they are deployed to test your understanding as a learner within a limited scope. These technologies can be used with fewer limitations in a real-world environment.
This course introduces diffusion models, a family of machine learning models that has recently generated considerable excitement in the field of image generation. Diffusion models draw inspiration from physics, specifically thermodynamics. In recent years, diffusion models have become popular in both research and industry. Diffusion models underpin many of the cutting-edge image generation tools and models on Google Cloud. In this course, you'll learn the theory behind diffusion models and how to train and deploy them on Vertex AI.
This course introduces Generative AI Studio, a product on Vertex AI that helps you prototype generative AI models so you can use and customize them to fit your needs. Through a demo of the product itself, you'll learn what Generative AI Studio is, what its features and options are, and how to use it. The course ends with a hands-on lab to practice what you've learned and a quiz to test your knowledge.
Earn a skill badge by passing the final quiz, demonstrating your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
Text Prompt Engineering Techniques introduces different strategic approaches and techniques to use when writing prompts for text-based generative AI tasks.
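One such technique is few-shot prompting: showing the model a handful of worked examples before the real input. This small builder is a generic illustration — the example reviews and the task are invented here, and the resulting string could be sent to any LLM:

```python
# Minimal few-shot prompt builder illustrating one common text prompt
# engineering technique. The examples and task are invented for
# illustration; any LLM client could consume the resulting string.
FEW_SHOT_EXAMPLES = [
    ("The food was amazing!", "positive"),
    ("I waited an hour and left.", "negative"),
]

def build_few_shot_prompt(examples, new_input):
    # Show the model the pattern first, then ask it to continue it.
    lines = ["Classify the sentiment of each review."]
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}")
    lines.append(f"Review: {new_input}\nSentiment:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(FEW_SHOT_EXAMPLES, "Great service, will return.")
```

Ending the prompt mid-pattern (after "Sentiment:") nudges the model to complete it with a label in the same format as the examples.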
As the use of enterprise Artificial Intelligence and Machine Learning continues to grow, so too does the importance of building it responsibly. A challenge for many is that talking about responsible AI can be easier than putting it into practice. If you’re interested in learning how to operationalize responsible AI in your organization, this course is for you. In this course, you will learn how Google Cloud does this today, together with best practices and lessons learned, to serve as a framework for you to build your own responsible AI approach.
This course, Integrate Vertex AI Search and Conversation into Voice and Chat Apps, is composed of a set of labs that give you hands-on experience interacting with new generative AI technologies. You will learn how to create end-to-end search and conversational experiences by following examples. These technologies complement predefined intent-based chat experiences created in Dialogflow with LLM-based, generative answers that can be grounded in your own data. They also allow you to provide enterprise-grade search experiences for internal and external websites to search documents, structured data, and public websites.
In this course, you'll learn how to create an image captioning model using deep learning. You'll learn about the different components of an image captioning model, such as the encoder and the decoder, and how to train the model and evaluate its performance. By the end of this course, you'll be able to create image captioning models and use them to generate captions for images.
This course introduces the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You'll learn about the different components of the Transformer architecture, such as the attention mechanism, and its role in building the BERT model. You'll also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course takes approximately 45 minutes to complete.
This course gives you a brief overview of the encoder-decoder architecture, a powerful and widely used machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You'll learn about the different components of the encoder-decoder architecture, how to train these models, and how to use them. In the accompanying lab walkthrough, you'll code in TensorFlow a simple use case of the encoder-decoder architecture: writing a poem from scratch.
This is a focused introductory course that explains what AI ethics is, why it matters, and how Google follows AI ethics principles in its products. It also presents Google's 7 AI Principles.
This course introduces the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You'll learn how attention works and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering.
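The core of the attention mechanism can be written in a few lines. This pure-Python sketch computes scaled dot-product attention for a single query vector; real models run the same computation as batched tensor operations over many queries at once:

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    exps = [math.exp(x - max(xs)) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(query, keys, values):
    """attention(q, K, V) = softmax(q . K / sqrt(d)) . V for one query.

    Pure-Python sketch of the mechanism; production implementations
    use batched tensor ops (e.g. in TensorFlow or JAX).
    """
    d = len(query)
    # Score each key against the query, scaled by sqrt(dimension).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted sum of value vectors: the parts of the input the
    # network "focuses" on receive the largest weights.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
out = scaled_dot_product_attention([1.0, 0.0], keys, values)
```

Because the query aligns with the first key, the output is pulled toward the first value vector, which is exactly the "focusing" behavior the course describes.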
Complete the intermediate Build a Data Warehouse with BigQuery skill badge course to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery.
In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting. Get hands-on practice using Dataflow (for Apache Beam) and Dataproc Serverless (for Apache Spark) for implementation, and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. A basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.
This course covers how to implement the various flavors of production ML systems: static, dynamic, and continuous training; static and dynamic inference; and batch and online processing. You delve into TensorFlow abstraction levels, the various options for doing distributed training, and how to write distributed training models with custom estimators. This is the second course of the Advanced Machine Learning on Google Cloud series. After completing this course, enroll in the Image Understanding with TensorFlow on Google Cloud course.
Earn the intermediate skill badge by completing the Build and Deploy Machine Learning Solutions on Vertex AI skill badge course, where you learn how to use Google Cloud's Vertex AI platform, AutoML, and custom training services to train, evaluate, tune, explain, and deploy machine learning models.
Complete the intermediate Create ML Models with BigQuery ML skill badge to demonstrate skills in creating and evaluating machine learning models with BigQuery ML to make data predictions.
Dataprep is Google's self-service data preparation tool built in collaboration with Alteryx. Learn the basics of cleaning and preparing data for analysis and visualization, all in the Google ecosystem. In this course, you will learn how to connect Dataprep to your data in Cloud Storage and BigQuery, clean data using the interactive UI, profile the data, and publish your results back into the Google ecosystem. You will learn the basics of data transformation, including filtering values, reshaping the data, combining multiple datasets, deriving new values, and aggregating your dataset.
In this advanced-level quest, you will learn how to harness serious Google Cloud computing power to run big data and machine learning jobs. The hands-on labs will give you use cases, and you will be tasked with implementing big data and machine learning practices utilized by Google's very own Solutions Architecture team. From running BigQuery analytics on tens of thousands of basketball games to training TensorFlow image classifiers, you will quickly see why Google Cloud is the go-to platform for running big data and machine learning jobs.
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API.
This course takes a real-world approach to the ML Workflow through a case study. An ML team faces several ML business requirements and use cases. The team must understand the tools required for data management and governance and consider the best approach for data preprocessing. The team is presented with three options to build ML models for two use cases. The course explains why they would use AutoML, BigQuery ML, or custom training to achieve their objectives.
This course explores the benefits of using Vertex AI Feature Store, how to improve the accuracy of ML models, and how to find which data columns make the most useful features. This course also includes content and labs on feature engineering using BigQuery ML, Keras, and TensorFlow.
This course covers building ML models with TensorFlow and Keras, improving the accuracy of ML models and writing ML models for scaled use.
While the traditional approaches of using data lakes and data warehouses can be effective, they have shortcomings, particularly in large enterprise environments. This course introduces the concept of a data lakehouse and the Google Cloud products used to create one. A lakehouse architecture uses open-standard data sources and combines the best features of data lakes and data warehouses, which addresses many of their shortcomings.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
The course begins with a discussion about data: how to improve data quality and perform exploratory data analysis. We describe Vertex AI AutoML and how to build, train, and deploy an ML model without writing a single line of code. You will understand the benefits of BigQuery ML. We then discuss how to optimize a machine learning (ML) model and how generalization and sampling can help assess the quality of ML models for custom training.
This course introduces Google Cloud's AI and machine learning (ML) capabilities, with a focus on developing both generative and predictive AI projects. It explores the various technologies, products, and tools available throughout the data-to-AI lifecycle, empowering data scientists, AI developers, and ML engineers to enhance their expertise through interactive exercises.