How to install Python packages at runtime in Dataproc Serverless

In Dataproc, we have the metadata option to install Python packages on a Dataproc cluster created through Composer.
How can we do the same for Dataproc Serverless while running a job?

 


In Dataproc Serverless, unlike traditional Dataproc clusters, you cannot install Python packages through cluster metadata at runtime. However, you can still manage Python dependencies using the following methods:

  1. Pre-installing Packages in a Custom Container Image:

    • Create a Docker image with the necessary Python packages and dependencies.
    • Push the image to Artifact Registry (or Container Registry).
    • When submitting your Dataproc Serverless batch, point it at the custom image URI, for example with the --container-image flag of gcloud dataproc batches submit or the container_image field of the batch's runtime config (see the first sketch after this list).
    • This approach ensures a consistent environment for your jobs and eliminates the need to install packages during each job execution.
  2. Installing Packages Within the Job Script:

    • Incorporate pip install commands within your job script using Python's subprocess module (see the second sketch after this list).
    • This method adds overhead to every job run and can produce run-to-run inconsistencies if package versions are not pinned.
    • It's a flexible approach but requires careful management of the installation process within the script.
  3. Leveraging Existing Dataproc Serverless Runtime Versions:

    • Dataproc Serverless provides pre-built runtime versions with a variety of Python packages pre-installed.
    • Select a runtime version that already includes the packages you need, and verify it from the job itself if in doubt (see the third sketch after this list).
    • This method is straightforward for dependency management but may not always offer the latest package versions.
  4. Utilizing Initialization Actions (If Supported):

    • As of April 2023, Dataproc Serverless did not support initialization actions the way standard Dataproc clusters do. If that has changed, you could define a script with pip install commands and specify it as an initialization action during job submission.
    • This would allow for dynamic package installation but may add complexity to job configuration.
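
For the custom-container route (method 1), here is a minimal sketch of submitting a PySpark batch that runs on a custom image, using the google-cloud-dataproc Python client. The project, region, bucket, image URI, and batch ID are placeholders; the same thing can be done from the command line with gcloud dataproc batches submit pyspark --container-image=..., assuming your image meets the Dataproc Serverless custom-container requirements.

```python
# Sketch: submit a Dataproc Serverless PySpark batch that runs on a custom image.
# All resource names below (project, region, bucket, image URI) are placeholders.
from google.cloud import dataproc_v1

region = "us-central1"
client = dataproc_v1.BatchControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

batch = dataproc_v1.Batch(
    pyspark_batch=dataproc_v1.PySparkBatch(
        main_python_file_uri="gs://my-bucket/jobs/etl_job.py",
    ),
    runtime_config=dataproc_v1.RuntimeConfig(
        # Image built with your Python dependencies baked in.
        container_image="us-docker.pkg.dev/my-project/my-repo/pyspark-deps:1.0",
    ),
)

# create_batch returns a long-running operation; result() waits for the batch.
operation = client.create_batch(
    parent=f"projects/my-project/locations/{region}",
    batch=batch,
    batch_id="etl-job-with-custom-image",
)
print(operation.result().state)
```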
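For the in-script route (method 2), a minimal sketch is below. The requirement string is just an example; this installs into whichever Python process executes it (typically the driver in a PySpark batch), so it does not by itself make the package available on executors, and it assumes the runtime has network access to PyPI.

```python
# Sketch: install a pinned dependency from inside the job script before importing it.
import subprocess
import sys


def ensure_package(requirement: str) -> None:
    """Install a pip requirement into the interpreter running this script."""
    subprocess.check_call([sys.executable, "-m", "pip", "install", requirement])


# Pin an exact version so repeated runs resolve the same dependency.
ensure_package("requests==2.31.0")  # example requirement, not a recommendation

import requests  # import only after the install above has succeeded

print("installed requests", requests.__version__)
```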
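For method 3, one way to confirm that the runtime version you selected actually ships the packages you rely on is to log their versions from the job itself; the package list below is an example. Pinning the runtime version at submission time (for instance via the batch runtime config's version field, or gcloud's --version flag) keeps those versions stable across runs.

```python
# Sketch: check which of the packages this job relies on are present in the
# selected Dataproc Serverless runtime, and at which versions.
import importlib.metadata as metadata

REQUIRED = ["numpy", "pandas", "pyarrow"]  # example list; adjust to your job

for name in REQUIRED:
    try:
        print(f"{name}=={metadata.version(name)}")
    except metadata.PackageNotFoundError:
        print(f"{name} is not pre-installed in this runtime")
```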

When choosing the best approach, consider factors like the stability of your dependencies, the need for the latest package versions, and the impact on job execution time. Always refer to the latest Google Cloud documentation for current features and best practices.

For more information, you can visit: