Serverless Data Processing with Dataflow: Batch Analytics Pipelines with Dataflow (Python) reviews

6770 reviews

ERROR: Professional Data Engineer Certification - Serverless Data Processing with Dataflow: Develop Pipelines - Serverless Data Processing with Dataflow - Batch Analytics Pipelines with Dataflow (Python) '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)'

Angel M. · Reviewed 10 hours ago

EZHUMALAI A. · Reviewed 12 hours ago

Ferdie O. · Reviewed 18 hours ago

Spent two hours getting a "not sufficient workers/resources in zone/region" message, but was restricted from selecting another zone.

Ferdie O. · Reviewed 19 hours ago

EZHUMALAI A. · Reviewed 1 day ago

Vignesh T. · Reviewed 1 day ago

The Dataflow jobs failed with "Startup of the worker pool in us-east1 failed to bring up any of the desired 1 workers. This is likely a quota issue or a Compute Engine stockout. The service will retry." There was also an SSL certificate error when reaching the GCS bucket, which I solved by running "gcloud auth application-default login", clicking the link, and pasting back the code from it. Thirdly, part 2 of the lab required dill imports, which I installed with the following command: pip install apache-beam[dill]

Sayed Fawad Ali S. · Reviewed 1 day ago
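The workarounds described in the review above can be collected into a short setup script — a sketch only, assuming a lab VM or Workbench notebook with the gcloud SDK available; the dill extra name is quoted from the review, so verify it against your Beam version:

```shell
# 1. SSL CERTIFICATE_VERIFY_FAILED against the GCS bucket: refresh
#    application-default credentials, then follow the printed link and
#    paste back the verification code.
gcloud auth application-default login

# 2. Part 2 of the lab needs dill; install it via the Beam extra the
#    reviewer used (quote it so the shell does not glob the brackets).
pip install 'apache-beam[dill]'
```

Both commands must be run inside the lab's Python virtual environment (df-env) for the pip install to be picked up by the pipeline.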

Spent two hours getting a "not sufficient workers/resources in zone/region" message, but was restricted from selecting another zone.

Ferdie O. · Reviewed 3 days ago

Had many errors running the pipeline due to missing certificates. Fixed it by adding: export GCE_METADATA_MTLS_MODE=none

Martin H. · Reviewed 4 days ago
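For notebook-based runs, the same fix can be applied from Python before the pipeline starts — a minimal sketch, assuming the auth stack reads the variable at request time (it must be set before the first metadata request is made):

```python
import os

# Mirror the reviewer's `export GCE_METADATA_MTLS_MODE=none`: disable mTLS
# when the Google auth libraries contact the GCE metadata server.
os.environ["GCE_METADATA_MTLS_MODE"] = "none"

print(os.environ["GCE_METADATA_MTLS_MODE"])  # none
```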

vicente b. · Reviewed 5 days ago

The lab is very cool, but in all of the course labs I hit the same problem: I cannot run Dataflow jobs in the specified regions and zones because no workers are available there, even after running the jobs multiple times. It completely ruins the learning experience, because I am not able to finish any lab even though they are very well prepared. Please add the option to switch to different regions and zones, or reserve some compute quota for Qwiklabs.

Jan K. · Reviewed 6 days ago
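The region switch the review asks for could be sketched as a retry loop over fallback regions — purely illustrative, since the lab currently restricts region selection; the region list, exception class, and launch_job callable are all hypothetical, not part of the lab code:

```python
# Hypothetical sketch: retry a job launch across fallback regions when a
# region's resource pool is exhausted (the ZONE_RESOURCE_POOL_EXHAUSTED /
# stockout errors reported in these reviews).
FALLBACK_REGIONS = ["us-east1", "us-central1", "us-west1"]

class ResourcePoolExhausted(Exception):
    """Stand-in for the capacity/stockout error surfaced by Dataflow."""

def run_with_fallback(launch_job, regions=FALLBACK_REGIONS):
    last_error = None
    for region in regions:
        try:
            return launch_job(region)          # success: stop retrying
        except ResourcePoolExhausted as exc:
            last_error = exc                   # stockout: try next region
    raise last_error                           # every region was exhausted

# Example: the first region is exhausted, the second succeeds.
def fake_launch(region):
    if region == "us-east1":
        raise ResourcePoolExhausted("ZONE_RESOURCE_POOL_EXHAUSTED")
    return f"job started in {region}"

print(run_with_fallback(fake_launch))  # job started in us-central1
```

In a real pipeline the launch step would set the chosen region via Beam's GoogleCloudOptions (the `--region` pipeline option) before submitting the job.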

Getting the ZONE_RESOURCE_POOL_EXHAUSTED error frequently.

Ashok K. · Reviewed 6 days ago

Gustavo L. · Reviewed 7 days ago

Allan L. · Reviewed 8 days ago

Harsh A. · Reviewed 8 days ago

Guilherme A. · Reviewed 14 days ago

Wayne F. · Reviewed 14 days ago

Luis Antonio C. · Reviewed 14 days ago

Luis Antonio C. · Reviewed 15 days ago

Already submitted feedback for this lab, with the issues that I've encountered.

Rafael D. · Reviewed 15 days ago

WARNING:urllib3.connectionpool:Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)'))': /computeMetadata/v1/instance/service-accounts/default/?recursive=true
Traceback (most recent call last):
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/solution/batch_user_traffic_pipeline.py", line 99, in <module>
    run()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/solution/batch_user_traffic_pipeline.py", line 78, in run
    (p | 'ReadFromGCS' >> beam.io.ReadFromText(known_args.input_path)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 808, in __init__
    self._source = self._source_class(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/textio.py", line 144, in __init__
    super().__init__(
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 127, in __init__
    self._validate()
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/options/value_provider.py", line 193, in _f
    return fnc(self, *args, **kwargs)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filebasedsource.py", line 190, in _validate
    match_result = FileSystems.match([pattern], limits=[1])[0]
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystems.py", line 240, in match
    return filesystem.match(patterns, limits)
  File "/home/jupyter/training-data-analyst/quests/dataflow_python/3_Batch_Analytics/lab/df-env/lib/python3.10/site-packages/apache_beam/io/filesystem.py", line 779, in match
    raise BeamIOError("Match operation failed", exceptions)
apache_beam.io.filesystem.BeamIOError: Match operation failed with exceptions {'gs://qwiklabs-gcp-04-497cbdcab972/events.json': RefreshError(TransportError("Failed to retrieve https://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Compute Engine Metadata server unavailable. Last exception: HTTPSConnectionPool(host='metadata.google.internal', port=443): Max retries exceeded with url: /computeMetadata/v1/instance/service-accounts/default/?recursive=true (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1017)')))"))}

Mykola O. · Reviewed 15 days ago

Mara Malina F. · Reviewed 16 days ago

Daniela L. · Reviewed 16 days ago

Gabriela C. · Reviewed 17 days ago

Luis Antonio C. · Reviewed 20 days ago

We cannot certify that published reviews come from consumers who purchased or used the products. Reviews are not verified by Google.