
MySQL Query Needs Optimizations

https://www.devze.com 2023-03-29 03:18 Source: web

I have this users table with:

id: int(255)
name: char(100)
last_comment_target: int(100)
last_comment_date: datetime

This table has around 1.3 million rows.

The PRIMARY KEY and BTREE indexes are on id, last_comment_target, and last_comment_date.
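
For reference, a minimal DDL sketch matching the description above, assuming the PRIMARY KEY is on id and that last_comment_target and last_comment_date each carry their own single-column BTREE index (the actual index layout is not stated, so the real table may differ, e.g. with composite indexes):

CREATE TABLE users (
    id                  INT NOT NULL,
    name                CHAR(100),
    last_comment_target INT,
    last_comment_date   DATETIME,
    PRIMARY KEY (id),
    KEY idx_last_comment_target (last_comment_target),  -- assumed single-column index
    KEY idx_last_comment_date   (last_comment_date)      -- assumed single-column index
) ENGINE=InnoDB;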

And, I am trying to perform a range query:

SELECT * FROM users 
WHERE id IN (1,2,3,5,...[around 5000 ids]) 
AND last_comment_target > 0 
ORDER BY last_comment_date DESC LIMIT 0,20;

Sometimes the query can take as long as 3 seconds. I wonder if there are better ways to optimize this query, or if it can be rewritten.

Thank you so much for your help.


SELECT u.* 
FROM 
    users u
    JOIN (
        SELECT 1 id 
        UNION ALL
        SELECT 2 id
        UNION ALL
        -- ... one SELECT ... UNION ALL per id ...
        SELECT 5000 id
    ) ids ON ids.id = u.id
WHERE 
    u.last_comment_target > 0 
ORDER BY 
    u.last_comment_date DESC 
LIMIT 0, 20;
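
The idea is that the derived list of ids is joined to users one row at a time via the primary key, rather than being evaluated as one large IN range. To see which access type MySQL actually chooses, either version can be prefixed with EXPLAIN; a minimal sketch with just a few ids (the 5000-row UNION would be generated programmatically):

EXPLAIN
SELECT u.*
FROM
    users u
    JOIN (
        SELECT 1 id UNION ALL
        SELECT 2 id UNION ALL
        SELECT 3 id
    ) ids ON ids.id = u.id
WHERE
    u.last_comment_target > 0
ORDER BY
    u.last_comment_date DESC
LIMIT 0, 20;

The type column of the EXPLAIN output is where range, ref, or eq_ref shows up, as discussed below.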


Thanks to everyone who has contributed.

@Karolis points out an alternative that uses a JOIN instead of a range lookup.

So, basically:

SELECT * FROM users WHERE id IN (1,2,3,...[5000 ids]) AND last_comment_target > 0

yields an access type of range in the EXPLAIN output. The 5000 ids can be generated from another table.

When I switched the above to:

SELECT *
FROM users u
INNER JOIN user_friends uf ON u.id = uf.to_id
AND u.last_comment_target > 0
AND uf.from_id = [id];

It yields two access types in the EXPLAIN output, ref and eq_ref, which are faster than range for this query.
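
For ref on user_friends and eq_ref on users to appear, user_friends needs an index whose leading column is from_id (users.id is already the primary key, which is what enables eq_ref). The user_friends schema is not shown in the thread, so the following is only a sketch of an index that would support that plan, plus the EXPLAIN used to check it:

-- Hypothetical index; the actual user_friends indexes are not shown in the thread.
ALTER TABLE user_friends ADD INDEX idx_from_to (from_id, to_id);

EXPLAIN
SELECT *
FROM users u
INNER JOIN user_friends uf ON u.id = uf.to_id
AND u.last_comment_target > 0
AND uf.from_id = 123;   -- 123 stands in for the [id] parameter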

Query execution time dropped from 3+ seconds to around 0.2 seconds.

So, the lesson learned on my end: try to use a JOIN instead of a large IN (range) lookup if you have another table you can derive the ids from.
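
Note that the JOIN rewrite above omits the ORDER BY ... LIMIT from the original query. Assuming the original paging requirement (the 20 most recent rows by last_comment_date) still applies, the complete version would presumably look like this (123 again standing in for the [id] parameter):

SELECT u.*
FROM users u
INNER JOIN user_friends uf ON u.id = uf.to_id
WHERE u.last_comment_target > 0
  AND uf.from_id = 123
ORDER BY u.last_comment_date DESC
LIMIT 0, 20;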
