googleapis.dataproc.v1 library

Classes

AcceleratorConfig
Specifies the type and number of accelerator cards attached to the instances of an instance group. See GPUs on Compute Engine.
Binding
Associates members with a role.
CancelJobRequest
A request to cancel a job.
Cluster
Describes the identifying information, config, and status of a cluster of Compute Engine instances.
ClusterConfig
The cluster config.
ClusterMetrics
Contains cluster daemon metrics, such as HDFS and YARN stats. Beta Feature: This report is available for testing purposes only. It may be changed before final release.
ClusterOperation
The cluster operation triggered by a workflow.
ClusterOperationMetadata
Metadata describing the operation.
ClusterOperationStatus
The status of the operation.
ClusterSelector
A selector that chooses the target cluster for jobs based on metadata.
ClusterStatus
The status of a cluster and its instances.
DataprocApi
Manages Hadoop-based clusters and jobs on Google Cloud Platform.
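For orientation, a minimal sketch of constructing the client and listing clusters; the key file path, project, and region are placeholders, and the clusters.list signature is assumed to follow the usual googleapis generated form (path parameters as positional arguments):

```dart
import 'dart:convert';
import 'dart:io';

import 'package:googleapis/dataproc/v1.dart';
import 'package:googleapis_auth/auth_io.dart';

Future<void> main() async {
  // Placeholder service-account key file; any cloud-platform-scoped client works.
  final credentials = ServiceAccountCredentials.fromJson(
      jsonDecode(File('service-account.json').readAsStringSync()));
  final client = await clientViaServiceAccount(
      credentials, ['https://www.googleapis.com/auth/cloud-platform']);
  try {
    final api = DataprocApi(client);
    // List clusters in a placeholder project and region.
    final response =
        await api.projects.regions.clusters.list('my-project', 'us-central1');
    for (final cluster in response.clusters ?? <Cluster>[]) {
      print('${cluster.clusterName}: ${cluster.status?.state}');
    }
  } finally {
    client.close();
  }
}
```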
DiagnoseClusterRequest
A request to collect cluster diagnostic information.
DiagnoseClusterResults
The location of diagnostic output.
DiskConfig
Specifies the config of disk options for a group of VM instances.
Empty
A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); } The JSON representation for Empty is an empty JSON object {}.
EncryptionConfig
Encryption settings for the cluster.
Expr
Represents an expression text. Example:
title: "User account presence"
description: "Determines whether the request has a user account"
expression: "size(request.user) > 0"
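For illustration, the same example expressed with the generated Expr class; the setter-style fields title, description, and expression are assumed to follow the standard generated form:

```dart
// Illustrative only; uses Expr from package:googleapis/dataproc/v1.dart.
final presenceCondition = Expr()
  ..title = 'User account presence'
  ..description = 'Determines whether the request has a user account'
  ..expression = 'size(request.user) > 0';
```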
GceClusterConfig
Common config settings for resources of Compute Engine cluster instances, applicable to all instances in the cluster.
GetIamPolicyRequest
Request message for GetIamPolicy method.
HadoopJob
A Cloud Dataproc job for running Apache Hadoop MapReduce (https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/MapReduceTutorial.html) jobs on Apache Hadoop YARN (https://hadoop.apache.org/docs/r2.7.1/hadoop-yarn/hadoop-yarn-site/YARN.html).
HiveJob
A Cloud Dataproc job for running Apache Hive (https://hive.apache.org/) queries on YARN.
InstanceGroupConfig
Optional. The config settings for Compute Engine resources in an instance group, such as a master or worker group.
InstantiateWorkflowTemplateRequest
A request to instantiate a workflow template.
Job
A Cloud Dataproc job resource.
JobPlacement
Cloud Dataproc job placement config.
JobReference
Encapsulates the full scoping used to reference a job.
JobScheduling
Job scheduling options.
JobStatus
Cloud Dataproc job status.
ListClustersResponse
The list of all clusters in a project.
ListJobsResponse
A list of jobs in a project.
ListOperationsResponse
The response message for Operations.ListOperations.
ListWorkflowTemplatesResponse
A response to a request to list workflow templates in a project.
LoggingConfig
The runtime logging config of the job.
ManagedCluster
Cluster that is managed by the workflow.
ManagedGroupConfig
Specifies the resources used to actively manage an instance group.
NodeInitializationAction
Specifies an executable to run on a fully configured node and a timeout period for executable completion.
Operation
This resource represents a long-running operation that is the result of a network API call.
OrderedJob
A job executed by the workflow.
ParameterValidation
Configuration for parameter validation.
PigJob
A Cloud Dataproc job for running Apache Pig (https://pig.apache.org/) queries on YARN.
Policy
Defines an Identity and Access Management (IAM) policy. It is used to specify access control policies for Cloud Platform resources. A Policy consists of a list of bindings. A binding binds a list of members to a role, where the members can be user accounts, Google groups, Google domains, and service accounts. A role is a named list of permissions defined by IAM.
JSON example:
{
  "bindings": [
    {
      "role": "roles/owner",
      "members": [
        "user:mike@example.com",
        "group:admins@example.com",
        "domain:google.com",
        "serviceAccount:my-other-app@appspot.gserviceaccount.com"
      ]
    },
    {
      "role": "roles/viewer",
      "members": ["user:sean@example.com"]
    }
  ]
}
YAML example:
bindings: [...]
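A hedged sketch of building this policy with the generated Binding and Policy classes and applying it via SetIamPolicyRequest; the cluster resource name is a placeholder, and the clusters.setIamPolicy call is assumed to follow the generated request-then-resource signature:

```dart
// Illustrative only: build the policy from the JSON example above and
// apply it to a cluster resource (resource name is a placeholder).
Future<void> applyOwnerAndViewer(DataprocApi api, String resource) async {
  final policy = Policy()
    ..bindings = [
      Binding()
        ..role = 'roles/owner'
        ..members = [
          'user:mike@example.com',
          'group:admins@example.com',
          'domain:google.com',
          'serviceAccount:my-other-app@appspot.gserviceaccount.com',
        ],
      Binding()
        ..role = 'roles/viewer'
        ..members = ['user:sean@example.com'],
    ];
  final updated = await api.projects.regions.clusters
      .setIamPolicy(SetIamPolicyRequest()..policy = policy, resource);
  print('New policy etag: ${updated.etag}');
}
```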
ProjectsLocationsResourceApi
ProjectsLocationsWorkflowTemplatesResourceApi
ProjectsRegionsClustersResourceApi
ProjectsRegionsJobsResourceApi
ProjectsRegionsOperationsResourceApi
ProjectsRegionsResourceApi
ProjectsRegionsWorkflowTemplatesResourceApi
ProjectsResourceApi
PySparkJob
A Cloud Dataproc job for running Apache PySpark (https://spark.apache.org/docs/0.9.0/python-programming-guide.html) applications on YARN.
QueryList
A list of queries to run on a cluster.
RegexValidation
Validation based on regular expressions.
SetIamPolicyRequest
Request message for SetIamPolicy method.
SoftwareConfig
Specifies the selection and config of software inside the cluster.
SparkJob
A Cloud Dataproc job for running Apache Spark (http://spark.apache.org/) applications on YARN.
SparkSqlJob
A Cloud Dataproc job for running Apache Spark SQL (http://spark.apache.org/sql/) queries.
Status
The Status type defines a logical error model that is suitable for different programming environments, including REST APIs and RPC APIs. It is used by gRPC (https://github.com/grpc). The error model is designed to be: simple to use and understand for most users, and flexible enough to meet unexpected needs.
Overview: The Status message contains three pieces of data: error code, error message, and error details. The error code should be an enum value of google.rpc.Code, but it may accept additional error codes if needed. The error message should be a developer-facing English message that helps developers understand and resolve the error. If a localized user-facing error message is needed, put the localized message in the error details or localize it in the client. The optional error details may contain arbitrary information about the error. There is a predefined set of error detail types in the package google.rpc that can be used for common error conditions.
Language mapping: The Status message is the logical representation of the error model, but it is not necessarily the actual wire format. When the Status message is exposed in different client libraries and different wire protocols, it can be mapped differently. For example, it will likely be mapped to some exceptions in Java, but more likely mapped to some error codes in C.
Other uses: The error model and the Status message can be used in a variety of environments, either with or without APIs, to provide a consistent developer experience across different environments. Example uses of this error model include:
- Partial errors. If a service needs to return partial errors to the client, it may embed the Status in the normal response to indicate the partial errors.
- Workflow errors. A typical workflow has multiple steps. Each step may have a Status message for error reporting.
- Batch operations. If a client uses batch request and batch response, the Status message should be used directly inside batch response, one for each error sub-response.
- Asynchronous operations. If an API call embeds asynchronous operation results in its response, the status of those operations should be represented directly using the Status message.
- Logging. If some API errors are stored in logs, the message Status could be used directly after any stripping needed for security/privacy reasons.
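In this library the error model typically surfaces as the error field of a long-running Operation. A sketch of polling an operation and inspecting its Status, assuming the generated operations.get(name) form and a placeholder operation name:

```dart
// Sketch: poll a long-running operation and report its Status, if any.
Future<void> waitForOperation(DataprocApi api, String operationName) async {
  var op = await api.projects.regions.operations.get(operationName);
  while (op.done != true) {
    await Future.delayed(const Duration(seconds: 5));
    op = await api.projects.regions.operations.get(operationName);
  }
  final error = op.error; // A Status message when the operation failed.
  if (error != null) {
    // code is a google.rpc.Code value; message is a developer-facing string.
    print('Operation failed: ${error.code} ${error.message}');
  } else {
    print('Operation succeeded.');
  }
}
```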
SubmitJobRequest
A request to submit a job.
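A sketch of assembling and submitting a job with the generated classes; the cluster name, jar URI, main class, and arguments are placeholders, and the jobs.submit signature is assumed to take the request followed by the project and region:

```dart
// Illustrative: submit a SparkJob to an existing cluster.
Future<Job> submitSparkJob(DataprocApi api, String projectId, String region) {
  final request = SubmitJobRequest()
    ..job = (Job()
      ..placement = (JobPlacement()..clusterName = 'my-cluster')
      ..sparkJob = (SparkJob()
        ..mainClass = 'org.example.WordCount'
        ..jarFileUris = ['gs://my-bucket/wordcount.jar']
        ..args = ['gs://my-bucket/input', 'gs://my-bucket/output']));
  return api.projects.regions.jobs.submit(request, projectId, region);
}
```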
TemplateParameter
A configurable parameter that replaces one or more fields in the template. Parameterizable fields:
- Labels
- File uris
- Job properties
- Job arguments
- Script variables
- Main class (in HadoopJob and SparkJob)
- Zone (in ClusterSelector)
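For illustration, a parameter that substitutes the cluster-selector zone, assuming the generated field names (name, description, fields, validation) and a ValueValidation allow-list:

```dart
// Illustrative: a template parameter that substitutes the target zone,
// restricted to an allowed list of values. Uses the generated classes
// from package:googleapis/dataproc/v1.dart.
final zoneParam = TemplateParameter()
  ..name = 'ZONE'
  ..description = 'Zone used by the cluster selector.'
  ..fields = ['placement.clusterSelector.zone']
  ..validation = (ParameterValidation()
    ..values =
        (ValueValidation()..values = ['us-central1-a', 'us-central1-b']));
```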
TestIamPermissionsRequest
Request message for TestIamPermissions method.
TestIamPermissionsResponse
Response message for TestIamPermissions method.
ValueValidation
Validation based on a list of allowed values.
WorkflowGraph
The workflow graph.
WorkflowMetadata
Metadata describing a workflow instantiated from a Cloud Dataproc workflow template.
WorkflowNode
The workflow node.
WorkflowTemplate
A Cloud Dataproc workflow template resource.
WorkflowTemplatePlacement
Specifies the workflow execution target. Either managed_cluster or cluster_selector is required.
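A sketch of the two mutually exclusive options, assuming the generated field names clusterSelector and managedCluster; the labels, cluster name, and config are placeholders:

```dart
// Illustrative only; uses the generated classes from
// package:googleapis/dataproc/v1.dart.

// Option 1: select an existing cluster by label.
final selectorPlacement = WorkflowTemplatePlacement()
  ..clusterSelector = (ClusterSelector()..clusterLabels = {'env': 'prod'});

// Option 2: let the workflow create and tear down a managed cluster.
final managedPlacement = WorkflowTemplatePlacement()
  ..managedCluster = (ManagedCluster()
    ..clusterName = 'workflow-cluster'
    ..config = ClusterConfig());
```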
YarnApplication
A YARN application created by a job. Application information is a subset of org.apache.hadoop.yarn.proto.YarnProtos.ApplicationReportProto. Beta Feature: This report is available for testing purposes only. It may be changed before final release.

Constants

USER_AGENT → const String
'dart-api-client dataproc/v1'

Exceptions / Errors

ApiRequestError
Represents a general error reported by the API endpoint.
DetailedApiRequestError
Represents a specific error reported by the API endpoint.
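A sketch of distinguishing the two when calling the API; it assumes DetailedApiRequestError exposes status and message as in other googleapis clients, and uses the generated jobs.list form with placeholder arguments:

```dart
// Illustrative error handling around any API call.
Future<void> listJobsSafely(
    DataprocApi api, String projectId, String region) async {
  try {
    final jobs = await api.projects.regions.jobs.list(projectId, region);
    print('Found ${jobs.jobs?.length ?? 0} job(s).');
  } on DetailedApiRequestError catch (e) {
    // Structured error reported by the endpoint (e.g. HTTP 403 or 404).
    print('API error ${e.status}: ${e.message}');
  } on ApiRequestError catch (e) {
    // General failure without structured detail.
    print('Request failed: ${e.message}');
  }
}
```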