
Error tokenizing data. C error: out of memory pandas python, large file csv


Try reading the file in chunks:

import pandas as pd

# Read the CSV in chunks of 20,000 rows, then concatenate them into one DataFrame
mylist = []
for chunk in pd.read_csv('train_2011_2012_2013.csv', sep=';', chunksize=20000):
    mylist.append(chunk)
big_data = pd.concat(mylist, axis=0)
del mylist
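Note that appending every chunk and concatenating still builds the full DataFrame in memory, so this mainly helps when the error happens during parsing. If the whole dataset does not fit in RAM, a rough sketch like the following processes each chunk as it is read instead ('some_column' is a hypothetical placeholder for a numeric column in your file):

import pandas as pd

# Aggregate each chunk as it is read, so only one chunk is held in memory at a time.
# 'some_column' is a hypothetical placeholder, not a column from the original question.
totals = []
for chunk in pd.read_csv('train_2011_2012_2013.csv', sep=';', chunksize=20000):
    totals.append(chunk['some_column'].sum())
grand_total = sum(totals)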


You may try setting error_bad_lines=False when reading the CSV file, i.e.

import pandas as pd

df = pd.read_csv('my_big_file.csv', error_bad_lines=False)
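Note that error_bad_lines is deprecated in recent pandas releases (and removed in pandas 2.0); the equivalent there is the on_bad_lines parameter. Keep in mind that either way malformed lines are silently dropped. A sketch for newer pandas:

import pandas as pd

# For pandas >= 1.3, where error_bad_lines is deprecated:
# 'skip' drops malformed lines instead of raising an error.
df = pd.read_csv('my_big_file.csv', on_bad_lines='skip')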


This error can also be caused by a very large chunksize (here chunksize=20000000). Decreasing it fixed the issue in my case. In ℕʘʘḆḽḘ's solution the chunksize is also much smaller, which might be what did the trick.
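For example, a minimal sketch with a much smaller chunksize (the file name and chunk size here are illustrative, not taken from the question):

import pandas as pd

# A smaller chunksize bounds how much the parser holds in memory at once.
for chunk in pd.read_csv('my_big_file.csv', chunksize=1000000):
    print(chunk.shape)  # replace with your own per-chunk processing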