sql - Perl: many INSERT statements into a MySQL database -


I have Perl code that looks like this:

@valid = grep { defined($column_mapping{ $headers[$_] }) } 0 .. $#headers;

...

$sql = sprintf 'insert into tablename ( %s ) values ( %s )',
    join( ',', map { $column_mapping{$_} } @headers[@valid] ),
    join( ',', ('?') x scalar @valid );
$sth = $dbh->prepare($sql);

...

@row = split /,/, <input>;
$sth->execute( @row[@valid] );

(Taken from mob's answer to a previous question.)

That dynamically builds an SQL INSERT statement from CSV data, and allows CSV files with the proper headers to have their column mapping picked up automatically.
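For context, here is a minimal, self-contained sketch of how that header-to-column mapping drives the generated SQL. The hash keys, table name, and sample header line are hypothetical placeholders, not taken from the original script.

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical mapping from CSV header names to table column names.
my %column_mapping = (
    'First Name' => 'first_name',
    'Last Name'  => 'last_name',
    'E-mail'     => 'email',
);

# A sample header row; only headers present in %column_mapping are kept.
my @headers = ( 'First Name', 'Last Name', 'Ignored Column', 'E-mail' );
my @valid   = grep { defined $column_mapping{ $headers[$_] } } 0 .. $#headers;

my $sql = sprintf 'insert into tablename ( %s ) values ( %s )',
    join( ',', map { $column_mapping{$_} } @headers[@valid] ),
    join( ',', ('?') x scalar @valid );

# Prints: insert into tablename ( first_name,last_name,email ) values ( ?,?,? )
print "$sql\n";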

I have been looking for examples of how to make an INSERT statement insert multiple rows of data at once.

My Perl script needs to run around a few hundred million INSERT statements, and doing one at a time seems very slow, since the server it is running on has only 6 GB of RAM and a slowish internet connection.

Is there a way I can upload more than one row of data at a time? So that one INSERT statement uploads maybe 50 rows, or 100 rows, at once? I can't find out how to do this with Perl DBI.

my $sql_values = join( ', ', ('(?, ?, ?)') x scalar(@array) );

As said before, you can then flatten the row data and pass it to execute.
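Putting that together, here is a minimal sketch of batching CSV rows into one multi-row INSERT and flattening the bind values for execute. The connection details, table name, column names, and batch size are assumptions for illustration, not taken from the question.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details.
my $dbh = DBI->connect( 'DBI:mysql:database=test;host=localhost',
    'user', 'password', { RaiseError => 1, AutoCommit => 0 } );

my $batch_size = 100;    # rows per INSERT statement
my @batch;               # each element is an arrayref of column values

while ( my $line = <STDIN> ) {
    chomp $line;
    push @batch, [ split /,/, $line ];

    if ( @batch == $batch_size ) {
        insert_batch( $dbh, \@batch );
        @batch = ();
    }
}
insert_batch( $dbh, \@batch ) if @batch;    # leftover rows
$dbh->commit;
$dbh->disconnect;

sub insert_batch {
    my ( $dbh, $rows ) = @_;

    # One "(?, ?, ?)" tuple per row, joined with commas between tuples.
    my $tuple  = '(' . join( ', ', ('?') x scalar @{ $rows->[0] } ) . ')';
    my $values = join( ', ', ($tuple) x scalar @$rows );

    # Hypothetical table and column names.
    my $sql = "insert into tablename ( col_a, col_b, col_c ) values $values";
    my $sth = $dbh->prepare($sql);

    # Flatten the array of arrayrefs into one long list of bind values.
    $sth->execute( map { @$_ } @$rows );
}

In practice, batches of 50 to a few hundred rows per statement usually give a large speedup over row-by-row inserts, subject to MySQL's max_allowed_packet limit; turning AutoCommit off and committing periodically, as sketched above, tends to help further.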

