Running Jobs on Managed Apache Spark Reviews

46090 reviews

Manh H. · Reviewed almost 2 years ago

Praba V. · Reviewed almost 2 years ago

Tristan C. · Reviewed almost 2 years ago

Emily D. · Reviewed almost 2 years ago

Leonel A. · Reviewed almost 2 years ago

Manjunath K. · Reviewed almost 2 years ago

Leonardo H. · Reviewed almost 2 years ago

Hitesh K. · Reviewed almost 2 years ago

Vishnuvardhan P. · Reviewed almost 2 years ago

Mallikarjun S. · Reviewed almost 2 years ago

kishore m. · Reviewed almost 2 years ago

Leandro G. · Reviewed almost 2 years ago

Digvijay P. · Reviewed almost 2 years ago

Thomas G. · Reviewed almost 2 years ago

Olivier M. · Reviewed almost 2 years ago

Sotiria S. · Reviewed almost 2 years ago

Sushrut A. · Reviewed almost 2 years ago

Yuliia M. · Reviewed almost 2 years ago

Apporv D. · Reviewed almost 2 years ago

ok

Phoutthakone B. · Reviewed almost 2 years ago

There were some errors in the last part.

SERGIO ENRIQUE Y. · Reviewed almost 2 years ago

Excellent experience.

Diego C. · Reviewed almost 2 years ago

Luigino N. · Reviewed almost 2 years ago

Hello, I found an issue where I had to replace the name "sparktodp" with the correct name of my cluster ("cluster-e93e") in the command and script below:

export DP_STORAGE="gs://$(gcloud dataproc clusters describe sparktodp --region=us-east1 --format=json | jq -r '.config.configBucket')"

#!/bin/bash
gcloud dataproc jobs submit pyspark \
  --cluster sparktodp \
  --region us-east1 \
  spark_analysis.py \
  -- --bucket=$1
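The fix described in this review generalizes: rather than hardcoding "sparktodp", pass the cluster name in as a parameter so the same script works on any lab-assigned cluster. A minimal sketch, assuming bash; `build_submit_cmd` is a hypothetical helper, and the cluster name, region, and bucket shown are examples, not values from the lab:

```shell
#!/bin/bash
# Hypothetical helper: build the "gcloud dataproc jobs submit pyspark"
# command line for a given cluster, region, and staging bucket, instead
# of hardcoding the cluster name "sparktodp".
build_submit_cmd() {
  local cluster="$1" region="$2" bucket="$3"
  printf 'gcloud dataproc jobs submit pyspark --cluster %s --region %s spark_analysis.py -- --bucket=%s\n' \
    "$cluster" "$region" "$bucket"
}

# Example: substitute your own cluster name (e.g. "cluster-e93e").
build_submit_cmd "cluster-e93e" "us-east1" "gs://demo-bucket"
```

Printing the assembled command (or running the script with `bash -x`) before submitting makes it easy to confirm the cluster name was substituted everywhere, which is exactly the mismatch this review hit.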

Crhistian S. · Reviewed almost 2 years ago

Great learning experience!

Jitendra J. · Reviewed almost 2 years ago

We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.