Comments

  1. pahud says

    Hi, I’ve read the slides and have some questions about the LIMIT.

    As you said in “More about LIMIT”:
    Protect your application from large limits
    – People may not go to page 500, but search engine bots may well do.

    If I really need to do “select a,b,c from table where key=’val’ limit 1000,10” for paging, is there any way to avoid the slow query?

  2. Peter says

    Pahud,

    In this case the query will need to traverse those 1000 rows and discard them, unless you have static data and can pre-create positions for all key lookups.

    You can speed things up by using a covering index – have a key on (key,id) and initially fetch only the ids, then fetch the row data for just the 10 rows you will display. This can be done as an extra query (good if you have something like memcached) or as a single query.
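
    A single-query version could look like this sketch (assuming the table is named t, the filter column is key_col, and id is the primary key, with an index on (key_col, id)):

    SELECT t.a, t.b, t.c
    FROM t
    JOIN (
        SELECT id
        FROM t
        WHERE key_col = 'val'
        ORDER BY id
        LIMIT 1000, 10   -- offset scan happens on the narrow (key_col, id) index only
    ) AS ids USING (id);

    The inner query reads just the covering index to find the 10 ids, so only 10 full rows are ever fetched from the table.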

  3. runnr says

    Hi Peter,

    Although I think I understand the performance issue of using large limits, I don’t completely understand the solution you propose. I think I have read about this somewhere but can’t really recall. It would be great if you could shed some more light on your solution.

    Thanks!
