How to handle a large table in MySQL?

Posted by Frantz Miccoli on Stack Overflow
Published on 2010-05-24T18:09:50Z

I have a database that stores items and their properties. The set of properties is extensible, so a join table stores the value of each property associated with an item.

CREATE TABLE `item_property` (
    `property_id` int(11) NOT NULL,
    `item_id` int(11) NOT NULL,
    `value` double NOT NULL,
    PRIMARY KEY  (`property_id`,`item_id`),
    KEY `item_id` (`item_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;

This database has two goals:

- Storing data (top priority; it has to be very fast, and I would like to perform many inserts, hundreds, within a few seconds).
- Retrieving data (selects by item_id and property_id). This is second priority; it can be slower, but not so slow that it ruins my use of the database.
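For the insert-heavy side of a workload like this, one common pattern (a sketch of the general technique, not the poster's actual code; the values are purely illustrative) is to batch many rows into a single multi-row INSERT and wrap batches in one transaction, which amortizes per-statement and per-commit overhead:

```sql
-- Hypothetical batch: many rows per INSERT statement, one COMMIT per batch.
START TRANSACTION;
INSERT INTO item_property (property_id, item_id, value) VALUES
    (1, 1001, 0.5),
    (1, 1002, 0.7),
    (2, 1001, 3.14);
-- ... a few hundred rows per statement ...
COMMIT;
```

With InnoDB's default `innodb_flush_log_at_trx_commit = 1`, each commit forces a log flush, so committing once per batch rather than once per row can make a large difference.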

Currently this table holds 1.6 billion entries, and a simple count can take up to 2 minutes. Inserting isn't fast enough to be usable.
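On InnoDB, `SELECT COUNT(*)` has to scan an index, which explains the 2-minute count at this scale. If an approximate figure is acceptable, the optimizer's own row estimate is available instantly (a sketch of the trade-off; the estimate can be off by a large margin):

```sql
-- Approximate row count from InnoDB table statistics: returns immediately,
-- but is an estimate, not an exact count.
SELECT TABLE_ROWS
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = DATABASE()
  AND TABLE_NAME = 'item_property';
```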

I'm using Zend_Db to access my data, and I would really prefer solutions that don't require developing anything on the PHP side.

Thanks for your advice!

© Stack Overflow or respective owner
