
Importing very large database (Wikipedia) (1 reply)

I'm trying to create a Wikipedia copy and I'm almost done, but I'm having problems with the largest file.

Wikipedia offers database dumps for download, so everybody can create a copy of Wikipedia.

You can find example files here:
http://dumps.wikimedia.org/enwiki/20091103/

I've imported almost all of them but am stuck with the biggest one (pagelinks). I let the import run for nearly five days, then I had to stop it. I think the import is slow because of the settings of my MySQL server, but I don't know what I should change. I'm using the standard Ubuntu MySQL config on a machine with a decent processor and 4 GB of RAM. Could someone help me out with a suitable configuration for my system? I tried some sample configs for large servers from the net, but the result was that my server didn't start because of a socket error and I had to start over from scratch...
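For what it's worth, this is roughly what I've been thinking of putting into /etc/mysql/my.cnf, based on generic advice I found for a dedicated 4 GB machine. The values are my own guesses, not something I've tested, so corrections are very welcome:

    [mysqld]
    # Give InnoDB most of the RAM on a dedicated 4 GB box (guessed value)
    innodb_buffer_pool_size = 2G
    # Larger log files mean fewer checkpoints during bulk loads (guessed value;
    # on 5.0/5.1 changing this needs a clean shutdown and removing the old ib_logfile*)
    innodb_log_file_size = 256M
    # Relax durability during the import only; set back to 1 afterwards
    innodb_flush_log_at_trx_commit = 2
    # Only relevant for MyISAM tables and index repair
    key_buffer_size = 512M
    # Allow the big extended INSERT statements from the dump
    max_allowed_packet = 64M

One thing I suspect about the sample "huge" configs from the net: they override the socket and pid paths that Ubuntu/Debian expects under /var/run/mysqld, which may be what caused my socket error.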

If you can, please have a look at some of the dumps, because I think the problem is related to how they are built: some of the smaller files took very long to import, while some of the larger ones were imported in a few minutes.
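In case it matters, this is how I've been running the big import from the mysql client. The SET statements are the bulk-loading tips from the MySQL manual for InnoDB tables; I'm assuming pagelinks is InnoDB and that the file name is the one from the dump directory, gunzipped first:

    -- Relax per-statement overhead for this session only
    SET autocommit = 0;
    SET unique_checks = 0;
    SET foreign_key_checks = 0;

    -- Load the dump (file name assumed from the 20091103 dump directory)
    SOURCE enwiki-20091103-pagelinks.sql;

    COMMIT;

    -- Restore normal behaviour
    SET unique_checks = 1;
    SET foreign_key_checks = 1;

If the table turns out to be MyISAM instead, ALTER TABLE pagelinks DISABLE KEYS before the import and ENABLE KEYS afterwards seems to be the equivalent trick.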

Thanks in advance for any help!
