Dominique Lambert
Member since 2021
Silver League
50,328 points
This course explains how to build complex multi-agent systems with the Google Agent Development Kit. You'll build virtual customer service agents equipped with tools, and define how agents interact through subordinate relationships and workflows. You'll run agents locally and deploy them to Vertex AI Agent Engine to run them through managed agent workflows, with Agent Engine handling infrastructure decisions and resource scaling. Note that these labs were built on a pre-release version of this product; as we make maintenance updates, the labs may lag behind at times.
NotebookLM is an AI-powered collaborator that helps you do your best thinking. After uploading your documents, NotebookLM becomes an instant expert in those sources so you can read, take notes, and collaborate with it to refine and organize your ideas. NotebookLM Pro gives you everything already included with NotebookLM, as well as higher utilization limits, access to premium features, and additional sharing options and analytics.
Gemini Enterprise combines Google's search and AI assistance capabilities so that enterprise employees can enter keywords in a single search bar to find specific information across document repositories, email, conversations, ticketing systems, and other data sources. The Gemini Enterprise assistant also helps people brainstorm, research information, outline documents, and take other actions, such as inviting colleagues to calendar events, speeding up knowledge work and collaboration of all kinds. (Note: Gemini Enterprise was previously called Google Agentspace, and this course may refer to the product by its former name.)
Complete the intermediate "Explore Generative AI with the Gemini API in Vertex AI" skill badge course to demonstrate the following skills: using the Gemini API to generate text and to analyze images and videos for stronger content creation, and applying function-calling techniques. This course shows you how to put advanced Gemini techniques to work, use multimodal content generation, and raise the potential of your AI projects.
(This course was previously named Multimodal Prompt Engineering with Gemini and PaLM.) This course teaches how to use Vertex AI Studio, a Google Cloud console tool for rapidly prototyping and testing generative AI models. You learn to test sample prompts, design your own prompts, and customize foundation models to handle tasks that meet your application's needs. Whether you are looking for text, chat, code, image, or speech generation experiences, Vertex AI Studio offers you an interface to work with and APIs to integrate into your production application.
This course on Integrate Vertex AI Search and Conversation into Voice and Chat Apps is composed of a set of labs that give you hands-on experience interacting with new generative AI technologies. You will learn how to create end-to-end search and conversational experiences by following examples. These technologies complement predefined intent-based chat experiences created in Dialogflow with LLM-based generative answers that can be grounded in your own data. They also allow you to provide enterprise-grade search experiences for internal and external websites to search documents, structured data, and public websites.
In this course, you'll use text embeddings for tasks like classification, outlier detection, text clustering and semantic search. You'll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) solutions, such as for question-answering systems, using Google Cloud's Vertex AI and Google Cloud databases.
This course explores Google Cloud technologies to create and generate embeddings. Embeddings are numerical representations of text, images, video and audio, and play a pivotal role in many tasks that involve the identification of similar items, like Google searches, online shopping recommendations, and personalized music suggestions. Specifically, you’ll use embeddings for tasks like classification, outlier detection, clustering and semantic search. You’ll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) systems and question-answering solutions, on your own proprietary data using Google Cloud’s Vertex AI.
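The semantic-search idea behind these courses can be sketched without any cloud services: embed documents and a query as vectors, then rank documents by cosine similarity. In this minimal sketch the vectors are made up for illustration; in practice they would come from an embedding model (such as one hosted on Vertex AI).

```python
import numpy as np

# Toy corpus: in a real system, each vector would be produced by an
# embedding model. These 3-dimensional values are purely illustrative.
corpus = {
    "shipping policy": np.array([0.90, 0.10, 0.00]),
    "refund policy":   np.array([0.84, 0.16, 0.06]),
    "store hours":     np.array([0.00, 0.10, 0.90]),
}
# Hypothetical embedding of the query "how do returns work?"
query = np.array([0.85, 0.15, 0.05])

def cosine(a, b):
    # Cosine similarity: 1.0 means identical direction, 0.0 means orthogonal.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank documents by similarity to the query embedding, best first.
ranked = sorted(corpus, key=lambda doc: cosine(query, corpus[doc]), reverse=True)
print(ranked[0])  # the most semantically similar document
```

In a RAG system, the top-ranked documents retrieved this way would be passed to an LLM as grounding context for answer generation.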
This course introduces AI search technologies, tools, and applications. Topics include semantic search with vector embeddings; hybrid search mechanisms that combine semantic and keyword approaches; and building grounded AI agents with retrieval-augmented generation (RAG) to minimize AI hallucinations. You'll get hands-on with Vertex AI Vector Search to build an intelligent search engine.
This course will help ML Engineers, Developers, and Data Scientists implement Large Language Models for Generative AI use cases with Vertex AI. The first two modules of this course contain links to videos and prerequisite course materials that will build your knowledge foundation in Generative AI. Please do not skip these modules. The advanced modules in this course assume you have completed these earlier modules.
This course explains how to create an image captioning model using deep learning. You'll learn about the different components of an image captioning model, such as the encoder and decoder, and how to train and evaluate your model. By the end of this course, you'll be able to create your own image captioning model and use it to generate captions for images.
This course introduces diffusion models, a family of machine learning models that have recently shown great promise in the image generation space. Diffusion models draw their concepts from physics, particularly thermodynamics. Over the past few years, they have become a hot topic in both academia and industry. At Google Cloud, diffusion models underpin many state-of-the-art image generation models and tools. This course introduces the theory behind diffusion models and explains how to train and deploy them on Vertex AI.
This course introduces the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You'll learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how the architecture is used to build the BERT model. You'll also learn about the various tasks BERT can be used for, such as text classification, question answering, and natural language inference. The course is estimated to take approximately 45 minutes.
This course gives you an overview of the encoder-decoder architecture, a powerful and prevalent machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You'll learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you'll code a simple implementation of the encoder-decoder architecture from scratch in TensorFlow to generate poetry.
This course introduces the attention mechanism and explains how this powerful technique lets neural networks focus on specific parts of an input sequence. You'll also learn how attention works and how it can be used to improve performance on a variety of machine learning tasks, including machine translation, text summarization, and question answering.
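The core computation the course describes, scaled dot-product attention, can be sketched in a few lines of NumPy. The shapes and random inputs below are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Scores measure how relevant each key is to each query;
    # dividing by sqrt(d_k) keeps the softmax well-behaved.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights          # weighted sum of values

# Tiny example: 3 tokens, 4-dimensional representations.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(w.shape)  # one attention distribution per token: (3, 3)
```

Each row of `w` shows how much one token "attends" to every other token; the output mixes value vectors according to those weights.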
Earn a skill badge by passing the final quiz; doing so demonstrates your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This course explains Gemini in BigQuery, a suite of AI-assisted features that support the data-to-AI workflow, including data exploration and preparation, code generation and troubleshooting, and workflow discovery and visualization. Through conceptual explanations, practical use cases, and hands-on labs, the course helps data practitioners improve their productivity and accelerate pipeline development.
Text Prompt Engineering Techniques introduces different strategic approaches and techniques to apply when writing prompts for text-based generative AI tasks.
This course introduces Vertex AI Studio, a tool you can use to interact with generative AI models, prototype business ideas, and launch them into production. Through an immersive use case, engaging lessons, and hands-on labs, you'll explore the prompt-to-production lifecycle and learn how to use Vertex AI Studio for Gemini multimodal applications, prompt design, prompt engineering, and model tuning. The goal of this course is to enable you to unlock the potential of generative AI in your own projects with Vertex AI Studio.
As enterprises continue to expand their use of AI and machine learning, developing these technologies responsibly grows ever more important. For many companies, talking about responsible AI is not hard; putting it into practice is the real challenge. If you want to learn how to operationalize responsible AI in your organization, this course will help. You'll learn about the strategies, best practices, and lessons learned that Google Cloud currently applies, giving your organization a solid foundation for practicing responsible AI.
This introductory microlearning course explains what responsible AI is, why it matters, and how Google implements it in its own products. It also introduces Google's 7 AI Principles.
A Business Leader in Generative AI can articulate the capabilities of core cloud Generative AI products and services and understand how they benefit organizations. This course provides an overview of the types of opportunities and challenges that companies often encounter in their digital transformation journey and how they can leverage Google Cloud's generative AI products to overcome these challenges.
In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.
Complete the introductory "Build a Data Mesh with Dataplex" skill badge course to demonstrate the following skills: building a data mesh with Dataplex to maintain data security and support data governance and discovery on Google Cloud. You'll practice and test your skills, including tagging assets in Dataplex, assigning IAM roles, and assessing data quality.
In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting. Get hands-on practice using Dataflow for Apache Beam and Serverless for Apache Spark (Dataproc Serverless) for implementation, and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. A basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
In this second installment of the Dataflow course series, we are going to be diving deeper on developing pipelines using the Beam SDK. We start with a review of Apache Beam concepts. Next, we discuss processing streaming data using windows, watermarks and triggers. We then cover options for sources and sinks in your pipelines, schemas to express your structured data, and how to do stateful transformations using State and Timer APIs. We move onto reviewing best practices that help maximize your pipeline performance. Towards the end of the course, we introduce SQL and Dataframes to represent your business logic in Beam and how to iteratively develop pipelines using Beam notebooks.
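The fixed-window grouping described above can be sketched without the Beam SDK: each element is assigned to the window whose start is its event timestamp rounded down to a window boundary. The window size and event data here are made up for illustration; real Beam pipelines also need watermarks and triggers to decide when each window's results are emitted.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # fixed one-minute windows (illustrative choice)

# (event_timestamp_seconds, value) pairs, as a streaming pipeline
# might observe them; this data is made up.
events = [(5, "a"), (42, "b"), (61, "c"), (118, "d"), (125, "e")]

def window_start(ts, size=WINDOW_SECONDS):
    # A fixed window assigns an element to the window starting at
    # its timestamp rounded down to the nearest window boundary.
    return ts - (ts % size)

windows = defaultdict(list)
for ts, value in events:
    windows[window_start(ts)].append(value)

print(dict(windows))  # {0: ['a', 'b'], 60: ['c', 'd'], 120: ['e']}
```

Beam's sliding and session windows follow the same pattern with different assignment rules; the course covers how watermarks and triggers then control when grouped results fire.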
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity, access, and management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
This workload aims to upskill Google Cloud Partners on what Google Cloud VMware Engine is, its design considerations, and basic implementation. It will also cover interoperability with on-premises vSphere and real-world use cases for deployment. Finally, it will cover how VMware Engine compares to other cloud providers' VMware-as-a-Service offerings and future plans for VMware Engine.
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Learners will get hands-on practice using Vertex AI Feature Store's streaming ingestion at the SDK layer.
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Machine Learning Engineering professionals use tools for continuous improvement and evaluation of deployed models. They work with (or can be) Data Scientists, who develop models, to enable velocity and rigor in deploying the best performing models.
This course helps developers customize Chronicle and augment its abilities with third party integrations.
This course helps you understand how to use Chronicle to properly handle security incidents.
This course will familiarize you with the core functionality of Chronicle, including the user interface, connections, and settings.
Get hands-on experience applying and building rules for Chronicle. You learn what YARA-L is and how to customize and create event rules.
Learn the technical aspects you need to know about Chronicle and how it can help you detect and action threats.
Earn a DRI badge by completing the Infra Foundations - Implementing Least Privilege for Service Accounts quest, where you demonstrate your ability to manage service accounts, assign IAM roles, set up and use impersonation, and implement logging sinks that target GCS buckets. When you complete this activity, you can earn the badge displayed above! View all the badges you have earned by visiting your profile page.
While the traditional approaches of using data lakes and data warehouses can be effective, they have shortcomings, particularly in large enterprise environments. This course introduces the concept of a data lakehouse and the Google Cloud products used to create one. A lakehouse architecture uses open-standard data sources and combines the best features of data lakes and data warehouses, which addresses many of their shortcomings.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
Complete the intermediate "Build Infrastructure with Terraform on Google Cloud" skill badge course to demonstrate the following knowledge and skills: applying infrastructure-as-code (IaC) principles with Terraform, provisioning and managing Google Cloud resources with Terraform configurations, managing state effectively (local and remote), and modularizing Terraform code for reuse and maintainability.
This course teaches participants how to build highly reliable and efficient solutions on Google Cloud using proven design patterns. It is a continuation of the "Architecting with Google Compute Engine" or "Architecting with Google Kubernetes Engine" courses and assumes hands-on experience with the technologies covered in either of those courses. Through a combination of presentations, design activities, and hands-on labs, participants learn to define and balance business and technical requirements to design Google Cloud deployments that are highly reliable, highly available, secure, and cost-effective.
Complete the Build a Google Cloud Network course to earn a skill badge. This course covers multiple ways to deploy and monitor applications, including reviewing IAM roles and adding/removing project access, creating Virtual Private Cloud networks, deploying and monitoring Compute Engine VMs, writing SQL queries, and deploying applications in multiple ways with Kubernetes.
Welcome to Getting Started with Google Kubernetes Engine. Kubernetes is a software layer that sits between your applications and your hardware infrastructure. If this technology interests you, this course is for you. Google Kubernetes Engine brings you Kubernetes as a managed service on Google Cloud. The goal of this course is to introduce the basics of Google Kubernetes Engine (often shortened to GKE) and how to containerize applications so they run on Google Cloud. The course starts with a quick introduction to Google Cloud, followed by introductions to containers, Kubernetes, Kubernetes architecture, and Kubernetes operations.
Complete the introductory "Implement Load Balancing on Compute Engine" skill badge course to demonstrate the following skills: creating and deploying virtual machines on Compute Engine, and configuring network and application load balancers.
Complete the Set Up an App Dev Environment on Google Cloud course to earn a skill badge. In this course, you'll learn how to build and connect storage-centric cloud infrastructure using the basic capabilities of the following technologies: Cloud Storage, Identity and Access Management, Cloud Functions, and Pub/Sub.
This on-demand accelerated course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud. Through a combination of video lectures, demonstrations, and hands-on labs, participants explore and deploy solution elements, including securely interconnecting networks, load balancing, autoscaling, infrastructure automation, and managed services.
This on-demand accelerated course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud, with a focus on Compute Engine. Through a combination of video lectures, demonstrations, and hands-on labs, participants explore and deploy solution elements, including infrastructure components such as networks, systems, and application services. The course also covers deploying practical solutions, including customer-supplied encryption keys, security and access management, quotas and billing, and resource monitoring.
This on-demand accelerated course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud, with an emphasis on Compute Engine. Through a combination of video lectures, demonstrations, and hands-on labs, participants explore and deploy solution elements, including infrastructure components such as networks, virtual machines, and application services. You'll learn how to use Google Cloud through the console and Cloud Shell. You'll also learn about the role of a cloud architect, approaches to infrastructure design, and virtual networking configuration with Virtual Private Cloud (VPC), projects, networks, subnetworks, IP addresses, routes, and firewall rules.
Google Cloud Fundamentals: Core Infrastructure introduces important concepts and terminology you'll encounter when working with Google Cloud. Through videos and hands-on labs, this course presents and compares many of Google Cloud's computing and storage services, along with important resource and policy management tools.
Welcome, Gamers! Today's game is all about experimenting with BigQuery for Machine Learning! Use real-life case studies to learn various concepts of BQML and have fun. Take labs to earn points. The faster you complete the lab objectives, the higher your score.