Is there a better way to page data than using LIMIT?

  • #1

    Hi,

    The subject says it all, really. I'm curious whether, on relatively large tables, there is a better way to get 'paged' data than using the LIMIT clause.

    For example I have a comments table, it contains around 300k comments. The table structure is:

    comment_id INT
    item_ref INT
    author_ref INT
    date_created DATETIME
    status ENUM('pending','live','deleted')

    The actual comment text is held in comment_body:

    comment_ref INT
    message TINYTEXT

    And finally I've a comment_stats table:

    comment_ref INT
    total_views INT

    I split the 'total views' count out into a separate table because it receives a large number of UPDATEs, whereas the main comment and comment_body tables are heavily read but rarely modified.
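    For example, bumping a view count only ever touches the narrow stats table (the comment_ref value here is just illustrative):

        -- Hot-path write: touches only comment_stats, so the
        -- read-heavy comment and comment_body tables stay untouched
        UPDATE comment_stats
        SET total_views = total_views + 1
        WHERE comment_ref = 12345;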

    Keeping the comment text itself out of the main comment table keeps its row format fixed-length. comment_body is LEFT JOINed on comment_id = comment_ref.

    comment_id is my primary key (auto inc, unsigned)

    To page through the comments, which are displayed ordered by date_created, I'm using LIMIT X,Y to show 50 comments per page. The item_ref field links each comment to the item being commented on, so for example some items may only have a couple of comments, while others have thousands.
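    The query for a page looks roughly like this (the item_ref value, the 'live' filter and the offset are just examples):

        SELECT c.comment_id, c.author_ref, c.date_created, b.message
        FROM comment c
        LEFT JOIN comment_body b ON b.comment_ref = c.comment_id
        WHERE c.item_ref = 42
          AND c.status = 'live'
        ORDER BY c.date_created DESC
        LIMIT 150, 50;  -- page 4, at 50 comments per page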

    I program with PHP 5 (PDO extension), and I'm having to run my query once with a COUNT(*) at the start to get back the table totals, and then again with LIMIT at the end to get the block of rows that I want to display. EXPLAIN shows that it is checking ALL comments matching item_ref, even though it only returns the 50 I require.
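    So every page view costs two round trips, roughly:

        -- First query: total, used to work out how many pages exist
        SELECT COUNT(*)
        FROM comment
        WHERE item_ref = 42;

        -- Second query: the actual page, as shown above
        -- (SELECT ... ORDER BY date_created DESC LIMIT 150, 50)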

    Is there a better way to do this? I'm perfectly happy to move my query into a stored procedure if it will help (and use a cursor to loop through the data?), but is the problem more intrinsic than that? Perhaps the solution isn't MySQL-based at all; maybe I should just cache, say, the first few pages' worth of results for each item and not worry about the MySQL hit?

    Either way, I'd truly appreciate your advice.

  • #2
    Sorry, forgot to add: we're using MySQL 5.0.27.



    • #3
      You can try using SQL_CALC_FOUND_ROWS.
      However, you have to test it and see what performance you get, and also be aware of this bug: http://bugs.mysql.com/bug.php?id=18454
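      Roughly like this, reusing the paging query from the first post (note that FOUND_ROWS() must be called on the same connection, immediately after):

          SELECT SQL_CALC_FOUND_ROWS c.comment_id, c.date_created, b.message
          FROM comment c
          LEFT JOIN comment_body b ON b.comment_ref = c.comment_id
          WHERE c.item_ref = 42
          ORDER BY c.date_created DESC
          LIMIT 150, 50;

          -- Total number of rows the query would have matched without the LIMIT
          SELECT FOUND_ROWS();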



      • #4
        Consider denormalising a little bit.

        You do an expensive count just to see how many pages there are.
        The paging isn't the performance problem, but the count potentially is.

        But if instead you were to store num_comments in the items table, you could work out how many pages there are from that, and jump straight to the query with a LIMIT.
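        Something like this (I'm assuming an items table keyed by item_id; adjust the names to your schema):

            -- One-off: add the counter and backfill it
            ALTER TABLE items
              ADD COLUMN num_comments INT UNSIGNED NOT NULL DEFAULT 0;

            UPDATE items
            SET num_comments = (SELECT COUNT(*)
                                FROM comment
                                WHERE comment.item_ref = items.item_id);

            -- Then keep it current from the application whenever
            -- a comment is posted (or deleted)
            UPDATE items SET num_comments = num_comments + 1 WHERE item_id = 42;

        The page count is then just CEIL(num_comments / 50), with no COUNT(*) at request time.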

