MySQL: create index on 1.4 billion records

Posted by SiLent SoNG on Server Fault, 2010-05-10.

I have a table with 1.4 billion records. The table structure is as follows:

CREATE TABLE text_page (
    text VARCHAR(255),
    page_id INT UNSIGNED
) ENGINE=MYISAM DEFAULT CHARSET=ascii

The requirement is to create an index over the column text.

The table size is about 34G.

I have tried to create the index with the following statement:

ALTER TABLE text_page ADD KEY ix_text (text)

After 10 hours of waiting, I finally gave up on this approach.

Is there any workable solution to this problem?

UPDATE: the table is unlikely to be updated, inserted into, or deleted from. The reason for creating an index on the column text is that this kind of SQL query will be executed frequently:

SELECT page_id FROM text_page WHERE text = ?

UPDATE: I have solved the problem by partitioning the table.

The table is partitioned into 40 pieces on column text. Creating the index on the partitioned table then takes about 1 hour to complete.

It seems that MySQL index creation becomes very slow when the table gets very large, and partitioning breaks the table into smaller chunks.
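For reference, a minimal sketch of that approach, assuming MySQL 5.1+ and plain KEY partitioning (the post does not say which partitioning scheme was actually used):

-- Split the table into 40 partitions hashed on the text column.
-- Each partition is stored as a separate MyISAM table underneath,
-- so the later index build sorts much smaller chunks of data.
ALTER TABLE text_page
    PARTITION BY KEY (text)
    PARTITIONS 40;

-- Build the index; it is created partition by partition.
ALTER TABLE text_page ADD KEY ix_text (text);

Because the frequent query filters with WHERE text = ?, partitioning by KEY on text also lets the optimizer prune the lookup to a single partition.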
