
BigQuery on Python: upload_from_file job not done #3343

@patitonav

Description


I am running a test to upload a .json file into a BigQuery table and I can't get the job to complete.
Here is my code:

import pprint
from google.cloud import bigquery

def stream_data2(dataset_name, table_name, json_data):
    bigquery_client = bigquery.Client()
    dataset = bigquery_client.dataset(dataset_name)
    table = dataset.table(table_name)

    # Reload the table to get the schema.
    table.reload()

    job = table.upload_from_file(
        json_data, source_format='NEWLINE_DELIMITED_JSON',
        encoding='UTF-8',
        create_disposition='CREATE_IF_NEEDED',
        write_disposition='WRITE_APPEND')
    pprint.pprint(job)


# DATASET_NAME and TABLE_NAME are constants defined elsewhere.
with open('clarin.json', 'rb') as json_data:
    stream_data2(DATASET_NAME, TABLE_NAME, json_data)
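
For reference, the job returned by upload_from_file can be polled until BigQuery marks it done; this is a minimal sketch, assuming the 0.2x-era client where the job object exposes reload(), state, and errors (wait_for_job is just an illustrative helper name, not part of the library):

    import time

    def wait_for_job(job, timeout=60, poll_interval=1.0):
        # Poll the load job until BigQuery reports it as DONE,
        # then surface any row-level errors it accumulated.
        deadline = time.time() + timeout
        while time.time() < deadline:
            job.reload()  # refresh job state from the API
            if job.state == 'DONE':
                if job.errors:
                    raise RuntimeError(job.errors)
                return job
            time.sleep(poll_interval)
        raise RuntimeError('job did not finish within %s seconds' % timeout)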

The job failed with these errors:

> Errors:
> file-00000000: JSON table encountered too many errors, giving up. Rows: 1; errors: 1. (error code: invalid)
> file-00000000: JSON parsing error in row starting at position 0: . No such field: custom. (error code: invalid)

This makes no sense to me, because I created the table from a JSON file with the same structure.
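
The "No such field: custom" error seems to mean a top-level key in the file is missing from the table's schema. One way to double-check is to diff the reloaded schema against the keys of the first row; a sketch, assuming table.schema is a list of SchemaField objects with a name attribute (as in the client version above) and using the same DATASET_NAME / TABLE_NAME constants:

    import json
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.dataset(DATASET_NAME).table(TABLE_NAME)
    table.reload()

    # Field names BigQuery actually has for this table.
    schema_fields = set(field.name for field in table.schema)

    # Top-level keys in the first row of the newline-delimited file.
    with open('clarin.json', 'rb') as f:
        first_row = json.loads(f.readline())

    print('in file but not in schema:', set(first_row) - schema_fields)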

Any suggestions?

Thanks
