python 2.7 - How to upload large file from desktop to BigQuery programmatically? -


I am trying to upload big CSV files from the desktop to BigQuery (using Python), and I am looking for a programmatic way to do it. I have used the Cloud SDK Shell, but I am looking for a web-based custom solution.

The scenario: the user selects a CSV file using a UI developed in GWT (the FileUpload widget). There is a limit of 32 MB on the POST body size. [What is the maximum "POST" size we can have?] How do I send the data of the selected CSV file to an App Engine Python script that inserts it into BigQuery? I tried multipart upload, but how do I direct it to the Python script rather than the servlet? Kindly suggest whether such a redirect is possible.

The whole web application needs to be deployed on App Engine, along with the GWT and Python code.

Also, is there any way to develop the entire web interface in Python and utilize multipart upload? (This has to be hosted on App Engine.)

Thanks in advance.

Large files should be uploaded to Google Cloud Storage, and loaded into BigQuery from there. GCS supports a resumable upload protocol that allows you to upload a big file in chunks, which makes the upload process much more robust against flaky connection issues. A high-level description of the process can be found here: https://cloud.google.com/storage/docs/concepts-techniques#resumable
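To illustrate the chunked phase of that protocol, here is a minimal sketch (shown in modern Python rather than the 2.7 of the question). It assumes you have already initiated a resumable session and obtained its session URI; `upload_in_chunks` and `content_range` are illustrative helper names, not part of any Google library. Per the GCS docs, each chunk except the last must be a multiple of 256 KiB, and the server answers 308 for an incomplete but accepted chunk.

```python
import os
import urllib.error
import urllib.request

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB; a multiple of 256 KiB as GCS requires


def content_range(start, end, total):
    """Build the Content-Range header for one chunk (byte offsets are inclusive)."""
    return "bytes %d-%d/%d" % (start, end, total)


def upload_in_chunks(session_uri, path):
    """PUT a local file to an already-initiated resumable session, chunk by chunk."""
    total = os.path.getsize(path)
    with open(path, "rb") as f:
        offset = 0
        while offset < total:
            chunk = f.read(CHUNK_SIZE)
            end = offset + len(chunk) - 1
            req = urllib.request.Request(
                session_uri,
                data=chunk,
                method="PUT",
                headers={"Content-Range": content_range(offset, end, total)},
            )
            try:
                urllib.request.urlopen(req)  # 200/201 once the final chunk lands
            except urllib.error.HTTPError as e:
                if e.code != 308:  # 308 = "resume incomplete": keep sending
                    raise
            offset += len(chunk)
```

Because each chunk is a separate request, a dropped connection only costs you the current chunk: you can query the session for the last byte received and resume from there.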

You asked about doing this from within a browser application. The Google Cloud Storage documentation for the JavaScript client can be found here: https://cloud.google.com/storage/docs/json_api/v1/json-api-javascript-samples

With this you should be able to have client code upload the file straight to Google Cloud Storage. From there, an App Engine application can load the data into BigQuery.
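For that last step, the App Engine side submits a load job to the BigQuery v2 API referencing the GCS object. Below is a sketch of the job body; `make_load_job` is an illustrative helper, and the project, dataset, table, and `gs://` URI are placeholders you would supply. The body would be passed to `jobs.insert` (e.g. via `service.jobs().insert(projectId=..., body=job).execute()` with the google-api-python-client).

```python
def make_load_job(project_id, dataset_id, table_id, gcs_uri):
    """Build a jobs.insert request body that loads a CSV from Cloud Storage."""
    return {
        "configuration": {
            "load": {
                # The file already sitting in GCS, e.g. "gs://my-bucket/data.csv"
                "sourceUris": [gcs_uri],
                "sourceFormat": "CSV",
                "skipLeadingRows": 1,  # assumes the CSV has a header row
                "destinationTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": table_id,
                },
                # Append to the table if it already exists
                "writeDisposition": "WRITE_APPEND",
            }
        }
    }
```

Since the load job reads directly from GCS, the App Engine request that submits it is tiny and finishes quickly, so the 32 MB POST limit never comes into play.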

python-2.7 google-bigquery
