fr3nd/mysqlpdump


Description

MySQL Parallel Dump

Multi-threaded mysqldump is no longer a utopia: mysqlpdump can dump all your tables and databases in parallel, so it can be much faster on systems with multiple CPUs.

By default it stores each table in a separate file. It can also write the whole dump to stdout, although this is not recommended because it can consume all the memory on your system if your tables are large.
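
For illustration, here is a minimal sketch of the general approach (it is not the actual mysqlpdump source): a small pool of worker threads takes table names from a queue and runs one mysqldump process per table, each writing to its own file. The user, password, database name and thread count below are placeholders, and the sketch is written for a current Python rather than the Python 2.4 / MySQL-python combination that mysqlpdump itself targets.

  # sketch_parallel_dump.py -- illustrative sketch only, not the mysqlpdump source.
  # One mysqldump process per table, driven by a small pool of worker threads;
  # each table ends up in its own .sql file.

  import queue
  import subprocess
  import threading

  USER, PASSWORD, DATABASE = "root", "password", "mydb"  # placeholder credentials
  THREADS = 4                                            # how many dumps run at once

  def list_tables():
      # Ask the mysql client for the table names in DATABASE.
      out = subprocess.check_output(
          ["mysql", "-u", USER, "-p" + PASSWORD, "-N", "-e", "SHOW TABLES", DATABASE])
      return out.decode().split()

  def worker(tasks):
      # Pull table names off the queue and dump each one to its own file.
      while True:
          try:
              table = tasks.get_nowait()
          except queue.Empty:
              return
          with open("%s.%s.sql" % (DATABASE, table), "wb") as f:
              subprocess.check_call(
                  ["mysqldump", "-u", USER, "-p" + PASSWORD, DATABASE, table],
                  stdout=f)

  def main():
      tasks = queue.Queue()
      for table in list_tables():
          tasks.put(table)
      workers = [threading.Thread(target=worker, args=(tasks,)) for _ in range(THREADS)]
      for t in workers:
          t.start()
      for t in workers:
          t.join()

  if __name__ == "__main__":
      main()

Because each mysqldump runs as a separate process, the operating system can schedule the dumps on different CPUs, which is where the speed-up over a single mysqldump comes from.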

History

I saw an interesting post on the MySQL Performance Blog with some suggestions for improving mysqldump.

Here is my effort to implement some of those suggestions.

Requirements

  • Python 2.4
  • MySQL-python module

Usage

Simplest usage (will save a file for each table):

mysqlpdump.py -u root -p password

Save compressed files (gzip) to /tmp/dumps and pass "--skip-opt" to mysqldump:

mysqlpdump.py -u root -p password -d /tmp/dumps/ -g -P "--skip-opt"

Output to stdout and use 20 threads:

mysqlpdump.py -u root -p password --stdout -t 20

Be more verbose:

mysqlpdump.py -u root -p password -v

Exclude "mysql" and "test" table from dumping:

mysqlpdump.py -u root -p password -e mysql -e test

Only dump the "mysql" table:

mysqlpdump.py -u root -p password -i mysql
