bigframes.pandas.DataFrame.to_pickle#

DataFrame.to_pickle(path, *, allow_large_results=None, **kwargs) → None[source]#

Pickle (serialize) object to file.

Examples:

>>> import bigframes.pandas as bpd
>>> df = bpd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
>>> gcs_bucket = "gs://bigframes-dev-testing/sample_pickle_gcs.pkl"
>>> df.to_pickle(path=gcs_bucket)

Parameters:
  • path (str) – File path where the pickled object will be stored.

  • allow_large_results (bool, default None) – If not None, overrides the global setting to allow or disallow large query results over the default size limit of 10 GB.
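Because the GCS example above requires project credentials, a local round trip with plain pandas can illustrate the same serialization behavior; this is a sketch, and the temporary file path used here is purely illustrative, not part of the BigQuery DataFrames API:

```python
import os
import tempfile

import pandas as pd

# Build a small DataFrame matching the example above.
df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "sample.pkl")
    df.to_pickle(path)               # serialize to a pickle file
    restored = pd.read_pickle(path)  # deserialize it back

# The restored DataFrame is identical to the original.
print(restored.equals(df))
```

The same `to_pickle`/`read_pickle` pairing applies when the path is a `gs://` URI, provided the environment has access to the bucket.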