bigframes._config.BigQueryOptions#
- class bigframes._config.BigQueryOptions(credentials: Credentials | None = None, project: str | None = None, location: str | None = None, bq_connection: str | None = None, use_regional_endpoints: bool = False, application_name: str | None = None, kms_key_name: str | None = None, skip_bq_connection_check: bool = False, *, allow_large_results: bool = False, ordering_mode: Literal['strict', 'partial'] = 'strict', client_endpoints_override: dict | None = None, requests_transport_adapters: Sequence[Tuple[str, BaseAdapter]] = (), enable_polars_execution: bool = False)[source]#
Encapsulates configuration for working with a session.
- property allow_large_results: bool#
DEPRECATED: Checks the legacy global setting for allowing large results. Use bpd.options.compute.allow_large_results instead.
Warning: Accessing bpd.options.bigquery.allow_large_results is deprecated and this property will be removed in a future version. The configuration for handling large results has moved.
- Returns:
The value of the deprecated setting.
- Return type:
bool
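The deprecation notice above follows a common Python pattern: a property that emits a warning and forwards to the setting's new home. A minimal sketch of that pattern; the class and attribute names here are illustrative stand-ins, not bigframes internals:

```python
import warnings


class ComputeOptions:
    """Stand-in for the options object that now owns the setting."""
    allow_large_results: bool = False


class BigQueryOptionsSketch:
    """Illustrative only: shows how a deprecated property can forward."""

    def __init__(self, compute: ComputeOptions):
        self._compute = compute

    @property
    def allow_large_results(self) -> bool:
        # Warn on every access, then delegate to the new location.
        warnings.warn(
            "bpd.options.bigquery.allow_large_results is deprecated; "
            "use bpd.options.compute.allow_large_results instead.",
            FutureWarning,
            stacklevel=2,
        )
        return self._compute.allow_large_results
```

Reading the property still returns the live value, so existing code keeps working while the warning nudges callers toward the new option.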
- property application_name: str | None#
The application name to append to the user-agent sent to Google APIs. The recommended format is "application-name/major.minor.patch_version" or "(gpn:PartnerName;)" for official Google partners.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.application_name = "my-app/1.0.0"
- Returns:
Application name as a string if exists; otherwise None.
- Return type:
None or str
- property bq_connection: str | None#
Name of the BigQuery connection to use in the form <PROJECT_NUMBER/PROJECT_ID>.<LOCATION>.<CONNECTION_ID>.
You either need to create the connection in a location of your choice, or you need the Project Admin IAM role to enable the service to create the connection for you.
If this option is not provided, or the project or location is missing, the session uses its default project/location/connection_id as the default connection.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.bq_connection = "my-project.us.my-connection"
- Returns:
Name of the BigQuery connection as a string; otherwise None.
- Return type:
None or str
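The connection name format described above can be validated with a small helper. This is an illustrative sketch, not a bigframes function; bigframes does its own parsing internally:

```python
def parse_bq_connection(name: str) -> tuple[str, str, str]:
    """Split a '<project>.<location>.<connection_id>' connection name.

    Illustrative helper only; raises ValueError for malformed names.
    """
    parts = name.split(".")
    if len(parts) != 3 or not all(parts):
        raise ValueError(
            f"expected '<project>.<location>.<connection_id>', got {name!r}"
        )
    project, location, connection_id = parts
    return project, location, connection_id
```

For example, `parse_bq_connection("my-project.us.my-connection")` yields the project, location, and connection ID as separate strings.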
- property client_endpoints_override: dict#
Sets the BigQuery client endpoint addresses directly as a dict. Possible keys are "bqclient", "bqconnectionclient", and "bqstoragereadclient".
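For example, an override dict with the three recognized keys might look like the following; the URLs are placeholders, so substitute your own (e.g. private service) endpoints before assigning it to bpd.options.bigquery.client_endpoints_override:

```python
# Placeholder URLs: replace with your own endpoints before use.
endpoints_override = {
    "bqclient": "https://bigquery.example-endpoint.googleapis.com",
    "bqconnectionclient": "https://bigqueryconnection.example-endpoint.googleapis.com",
    "bqstoragereadclient": "https://bigquerystorage.example-endpoint.googleapis.com",
}

# Only these three keys are recognized, per the description above.
assert set(endpoints_override) == {
    "bqclient", "bqconnectionclient", "bqstoragereadclient"
}
```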
- property credentials: Credentials | None#
The OAuth2 credentials to use for this client.
Examples:
>>> import bigframes.pandas as bpd
>>> import google.auth
>>> credentials, project = google.auth.default()
>>> bpd.options.bigquery.credentials = credentials
- Returns:
google.auth.credentials.Credentials if exists; otherwise None.
- Return type:
None or google.auth.credentials.Credentials
- property enable_polars_execution: bool#
If True, uses Polars to execute some simple query plans locally.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.enable_polars_execution = True
- property kms_key_name: str | None#
Customer managed encryption key used to control encryption of the data-at-rest in BigQuery. This is of the format projects/PROJECT_ID/locations/LOCATION/keyRings/KEYRING/cryptoKeys/KEY.
For more information, see Customer-managed Cloud KMS keys: https://cloud.google.com/bigquery/docs/customer-managed-encryption
Make sure the project used for BigQuery DataFrames has the Cloud KMS CryptoKey Encrypter/Decrypter IAM role in the key's project. For more information, see Assign the Encrypter/Decrypter role: https://cloud.google.com/bigquery/docs/customer-managed-encryption#assign_role
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.kms_key_name = "projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key"
- Returns:
Name of the customer managed encryption key as a string; otherwise None.
- Return type:
None or str
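The key resource name follows the fixed pattern given above, so it can be assembled from its components. An illustrative helper, not part of bigframes:

```python
def kms_key_resource_name(project_id: str, location: str,
                          key_ring: str, key: str) -> str:
    """Build the Cloud KMS key resource name in the format
    projects/PROJECT_ID/locations/LOCATION/keyRings/KEYRING/cryptoKeys/KEY.
    """
    return (
        f"projects/{project_id}/locations/{location}"
        f"/keyRings/{key_ring}/cryptoKeys/{key}"
    )
```

Building the name this way avoids typos in the path segments when the components come from configuration.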
- property location: str | None#
Default location for jobs, datasets, and tables.
For more information, see BigQuery locations: https://cloud.google.com/bigquery/docs/locations
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.location = "US"
- Returns:
Default location as a string; otherwise None.
- Return type:
None or str
- property ordering_mode: Literal['strict', 'partial']#
Controls whether total row order is always maintained for DataFrame/Series.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.ordering_mode = "partial"
- Returns:
A literal string value of either strict or partial ordering mode.
- Return type:
Literal
- property project: str | None#
Google Cloud project ID to use for billing and as the default project.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.project = "my-project"
- Returns:
Google Cloud project ID as a string; otherwise None.
- Return type:
None or str
- property requests_transport_adapters: Sequence[Tuple[str, BaseAdapter]]#
Transport adapters for requests-based REST clients such as the google-cloud-bigquery package.
For more details, see the requests guide to transport adapters.
Examples:
Increase the connection pool size using the requests HTTPAdapter.
>>> import requests.adapters
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.requests_transport_adapters = (
...     ("http://", requests.adapters.HTTPAdapter(pool_maxsize=100)),
...     ("https://", requests.adapters.HTTPAdapter(pool_maxsize=100)),
... )
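Under the hood, such (prefix, adapter) pairs are mounted on a requests Session by URL prefix. A standalone sketch of that mechanism using requests directly, independent of bigframes:

```python
import requests
import requests.adapters

# A larger connection pool helps when many threads share one client.
adapter = requests.adapters.HTTPAdapter(pool_maxsize=100)

session = requests.Session()
# Mount by URL prefix, mirroring the (prefix, adapter) pairs above.
session.mount("http://", adapter)
session.mount("https://", adapter)
```

Requests dispatches each URL to the adapter mounted on its longest matching prefix, so both schemes above share the enlarged pool.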
- property skip_bq_connection_check: bool#
Forcibly use the BigQuery connection.
Setting this flag to True skips creating the BigQuery connection and checking or setting IAM permissions on it. If the BigQuery connection (default or user-provided) does not exist, or does not have the permissions needed for BigQuery DataFrames operations, a runtime error is raised.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.skip_bq_connection_check = True
- Returns:
True if the connection check is skipped, i.e. the BigQuery connection is not created and its permissions are not verified; otherwise False.
- Return type:
bool
- property use_regional_endpoints: bool#
Flag to connect to regional API endpoints for BigQuery API and BigQuery Storage API.
Note
Use of regional endpoints is a feature in Preview and available only in regions “europe-west3”, “europe-west8”, “europe-west9”, “me-central2”, “us-central1”, “us-central2”, “us-east1”, “us-east4”, “us-east5”, “us-east7”, “us-south1”, “us-west1”, “us-west2”, “us-west3” and “us-west4”.
Requires that location is set. For supported regions (https://cloud.google.com/bigquery/docs/regional-endpoints), for example europe-west3, specify location='europe-west3' and use_regional_endpoints=True, and BigQuery DataFrames connects to the BigQuery endpoint bigquery.europe-west3.rep.googleapis.com. For unsupported regions, for example asia-northeast1, specifying location='asia-northeast1' with use_regional_endpoints=True falls back to the global endpoint bigquery.googleapis.com, which does not guarantee that the request stays within the location in transit.
Examples:
>>> import bigframes.pandas as bpd
>>> bpd.options.bigquery.location = "europe-west3"
>>> bpd.options.bigquery.use_regional_endpoints = True
- Returns:
A boolean value, where True indicates that regional endpoints are used for the BigQuery and BigQuery Storage APIs; otherwise global endpoints are used.
- Return type:
bool