Block images, CSS or JS if the user did not visit my site

https://www.devze.com 2023-04-11 09:18 Source: web
I suddenly get a lot of requests to my site (wget, curl etc...).

I do not want any of these requests to be executed unless a user has visited my site at least once using a valid browser (like Firefox or Chrome).

Is there an Apache Module to do this? What can I do?


The concept is simple: create a token that will be used in the files you want to protect. This token (a key like "abc123", saved in the session, not in cookies) is checked in every file you load. That way, if the token does not match, you can redirect to a "page not found" or "access denied" page.

Set up the token in index.php:

<?php
  session_start();
  header("Cache-Control: no-cache, must-revalidate");
  header("Expires: Mon, 10 Oct 2005 05:00:00 GMT");
  $_SESSION['siteToken'] = "abc123";
?>

<html>
<head> 
  <link rel="stylesheet" type="text/css" href="/style.css" />
</head>
<body>

Now for CSS and JavaScript files you need to check the token to make sure it's set and it's the right value.

// style.php (served as style.css through the rewrite rule below)
<?php
  session_start();
  header("Content-type: text/css");
  header("Cache-Control: no-cache, must-revalidate");
  header("Expires: Mon, 10 Oct 2005 05:00:00 GMT");
  if (!isset($_SESSION['siteToken']) || $_SESSION['siteToken'] !== "abc123") {
    session_regenerate_id();
    exit; // or redirect
  } 
?>
body { background-color: #000; color: #fff; }
etc...

You do the same thing for the JavaScript file.
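For example, the JavaScript wrapper could look like this (a sketch; the file name script.php and the placeholder script body are assumptions carried over from the CSS example):

```php
<?php
  // script.php -- served as script.js via a rewrite rule
  session_start();
  header("Content-type: application/javascript");
  header("Cache-Control: no-cache, must-revalidate");
  header("Expires: Mon, 10 Oct 2005 05:00:00 GMT");
  if (!isset($_SESSION['siteToken']) || $_SESSION['siteToken'] !== "abc123") {
    session_regenerate_id();
    exit; // or redirect
  }
?>
console.log("site scripts loaded");
```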

The next step is to update your .htaccess so that the CSS and JavaScript files are parsed by PHP:

RewriteEngine on
RewriteBase /
RewriteRule ^style\.css$ style.php [NC,L]
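If you protect the JavaScript file the same way, add a matching rule (the names script.js and script.php are assumed here, not part of the original answer):

```
RewriteRule ^script\.js$ script.php [NC,L]
```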

In addition, you can add this to your .htaccess to block bad bots from querying files:

SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
SetEnvIfNoCase User-Agent "^libwww-perl" bad_bot
#etc...
Deny from env=bad_bot
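Note that `Deny from env=...` is Apache 2.2 syntax. On Apache 2.4, a sketch of the equivalent (assuming mod_authz_core is loaded, which it is by default) would be:

```
<RequireAll>
  Require all granted
  Require not env bad_bot
</RequireAll>
```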

Now as for the images. This solution works as well, but it is slower than parsing the CSS and JavaScript files. The logic is the same, but instead of echoing the content you have to read the file from disk (using readfile()), and you have to set the Content-Type header based on the file's extension.
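As a sketch of that idea (the file name image.php, the images/ directory, and the extension whitelist are assumptions for illustration, not part of the original answer):

```php
<?php
  // image.php -- serve an image only when the session token is valid
  session_start();
  if (!isset($_SESSION['siteToken']) || $_SESSION['siteToken'] !== "abc123") {
    header("HTTP/1.1 403 Forbidden");
    exit;
  }

  // Map allowed extensions to their Content-Type headers
  $types = array("jpg" => "image/jpeg", "png" => "image/png", "gif" => "image/gif");

  $file = basename($_GET['file']); // strip any path components
  $ext  = strtolower(pathinfo($file, PATHINFO_EXTENSION));

  if (!isset($types[$ext]) || !is_file(__DIR__ . "/images/" . $file)) {
    header("HTTP/1.1 404 Not Found");
    exit;
  }

  header("Content-Type: " . $types[$ext]);
  readfile(__DIR__ . "/images/" . $file); // stream the file to the client
?>
```

A rewrite rule such as `RewriteRule ^images/(.+)$ image.php?file=$1 [NC,L]` (again, an assumed layout) would route image requests through this script.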

The alternative is to add this to your .htaccess file:

RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?domain\.com [NC]
RewriteRule \.jpg$ - [NC,F,L]

This is not bulletproof, though: the Referer header is trivial to spoof (curl and wget can both set it).
