Hi, I'm having a problem while populating a table.
I'm using web2py and moving from SQLite in development to MySQL in production. Until now I've populated a table when initializing a fresh database directly from web2py, looping over each record, roughly like the sketch below.
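This is a minimal sketch of that loop, not my real code (db is the DAL connection; mytable and field1..field5 are placeholders for my actual table and its five columns):

    # Sketch of my initialization loop; records holds ~50,000 rows
    # already parsed from my data file.
    for rec in records:
        db.mytable.insert(field1=rec[0], field2=rec[1], field3=rec[2],
                          field4=rec[3], field5=rec[4])
    db.commit()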
There are about 50,000 records with five fields of type varchar and int.
After moving to MySQL I found that this procedure failed due to a timeout, so I tried LOAD DATA LOCAL INFILE instead. It worked with a short test set, so I generated the full set and it loaded fine; the file size was 1.9 MB.
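The statement I run from the mysql client looks roughly like this (the file path, table and column names here are placeholders, not my real ones):

    LOAD DATA LOCAL INFILE '/home/dido/records.csv'
    INTO TABLE mytable
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    (field1, field2, field3, field4, field5);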
But for several reasons I need an additional field, made by concatenating two of the other fields. In my original code I had set things up so that web2py computed it for me. Now that I'm injecting the data directly, to avoid having to compute it after the insert, I generated another full set containing all the fields, which grew the file size to about 3.3 MB.
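For context, the computed field in my web2py model looked roughly like this (again, table and field names are placeholders):

    # Sketch of the model: web2py fills 'combined' on insert by
    # concatenating two of the other fields.
    db.define_table('mytable',
        Field('part_a', 'string'),
        Field('part_b', 'string'),
        Field('combined', 'string',
              compute=lambda r: r['part_a'] + r['part_b']))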
And now I'm getting an error immediately after starting the query. I've tried various things, but the only way to get the insert to succeed is to reduce the file size. I've also tried tweaking the connection like this:
    mysql -hmysql.server -udido 'dido$db_here' -p --max_allowed_packet=16M --connect_timeout=20 --local-infile=1
...without luck, and now I'm out of options.
My question is: is there some configuration setting I can change to let this run, or should I split the query in half? 50,000 records and 3.3 MB don't seem like a huge amount of data to me.
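If splitting is the answer, I suppose something like this would cut the generated file in two for two separate LOAD DATA runs (file names are placeholders):

    # Split the generated data file into two halves.
    with open('records_full.csv') as f:
        lines = f.readlines()
    half = len(lines) // 2
    for i, chunk in enumerate((lines[:half], lines[half:])):
        with open('records_part%d.csv' % i, 'w') as out:
            out.writelines(chunk)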
Thanks, Diego