Setting Up an Apache2 Server for WordPress on Ubuntu: A Walkthrough
A guide to setting up an Apache2 server for a WordPress install
With a recent increase in bad bot traffic and DDoS attacks, alongside increasingly aggressive AI crawlers, we’ve had to explore non-destructive and sustainable methods to mitigate requests that excessively drain server resources.
Using Cloudways servers provides us with a great UI to monitor traffic, activity, and error logs.


This UI equips us with all the necessary information to promptly address any issues. It provides a list of IPs generating an excessive number of requests, which we can then add to .htaccess using “Deny from [IP Address]” to alleviate the strain on resources, particularly CPU.
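As a sketch, a handful of offending IPs (the addresses below are placeholders from the documentation range, not real offenders) can be denied in .htaccess like so. Note that `Deny from` is Apache 2.2 syntax, still honoured on Apache 2.4 through mod_access_compat; the 2.4-native equivalent uses `Require`:

```apacheconf
# Apache 2.2 style (works on 2.4 via mod_access_compat)
<IfModule mod_access_compat.c>
Order Allow,Deny
Allow from all
Deny from 203.0.113.10
Deny from 203.0.113.45
</IfModule>

# Apache 2.4 native equivalent
<IfModule mod_authz_core.c>
<RequireAll>
    Require all granted
    Require not ip 203.0.113.10
    Require not ip 203.0.113.45
</RequireAll>
</IfModule>
```

Use one style or the other, not both; mixing 2.2 and 2.4 access directives in the same scope can produce confusing results.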
After restoring resource levels, we can proactively address server requests. In addition, while configuring our .htaccess, we can block all well-known bad bot traffic with rewrite rules.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(BadBot|AnotherBadBot|YetAnotherBadBot).*$ [NC]
RewriteRule .* - [F,L]
</IfModule>
Curated lists of bad bots are available from CrowdSec. Depending on your security plugin, you can access these lists via an API key provided by CrowdSec, ensuring continuous updates.
Regarding IP blocking to prevent DDoS attacks, it’s a double-edged sword. From time to time, a bot may generate even more requests than usual. In such cases, you may add its IP to your “Deny from” list, or even block a range of IPs if there are multiple offending addresses. Typically, blocking a range involves appending a Classless Inter-Domain Routing (CIDR) suffix to the offending IP.
For example, if the offending IP is 111.111.111.111 and there are other IPs such as 111.111.111.112, 111.111.111.113, 111.111.111.114, and so on, we can block the range like so: 111.111.111.0/24. A major drawback, however, arises when the requests are from a good bot cycling through a large range of IPs and generating more requests than your server can handle.
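Continuing the example above, a single CIDR rule in .htaccess covers the whole 111.111.111.0–111.111.111.255 range (this is a sketch using the same placeholder addresses as the text):

```apacheconf
# Block the entire /24 range (256 addresses) in one directive
<IfModule mod_access_compat.c>
Order Allow,Deny
Allow from all
Deny from 111.111.111.0/24
</IfModule>
```

A /24 masks the last octet; narrower suffixes such as /28 (16 addresses) let you block a smaller slice when only a few neighbouring IPs are misbehaving.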
In such instances, we can limit the number of requests from any bot or IP. You can achieve this using the robots.txt or your .htaccess.
In the case of your robots.txt, the aim is to target good bot traffic rather than bad bot traffic, as only good bots tend to adhere to the directives in this file.
# START YOAST BLOCK
# ---------------------------
User-agent: *
Disallow:
# Prevent crawling of certain pages
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /cgi-bin/
Disallow: /comments/
Disallow: /*?s=
# Crawl delay for all bots (not all bots support this)
Crawl-delay: 10
# Specify sitemap location
Sitemap: https://example.com/sitemap_index.xml
# ---------------------------
# END YOAST BLOCK
# Specific bot directives
User-agent: AhrefsBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: MJ12bot
Disallow: /
# Additional specific bots to disallow
User-agent: BLEXBot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: Yandex
Disallow: /
For blocking DDoS-style traffic server-side within the .htaccess, we’ll utilize rewrite rules to manage this, enabling us to handle these via FTP if any issues arise.
This will block specific bad bots based on User-Agent.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(SCspider|Textbot|s2bot|MJ12bot|YandexBot|SemrushBot|AspiegelBot|BLEXBot|webmeup-crawler|oBot|Semrush|SiteExplorer|BaiDuSpider).*$ [NC]
RewriteRule .* - [F,L]
</IfModule>
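.htaccess has no native per-IP request counter, but if mod_ratelimit is available you can at least throttle the bandwidth at which any single response is served, which blunts aggressive crawlers without blocking them outright. This is a sketch; the 400 KiB/s figure is an arbitrary example, not a recommendation:

```apacheconf
<IfModule mod_ratelimit.c>
    # Throttle each response to roughly 400 KiB/s
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 400
</IfModule>
```

Because this slows every visitor, it is usually wrapped in a `<FilesMatch>` or applied only to heavy assets rather than the whole site.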
These methods should not significantly harm your site, but we can explore another potential threat to server resources: site cloning. In some cases, your site’s resources may be used elsewhere, or in the worst-case scenario, your entire site could be cloned, negatively impacting your rankings. In this context, however, we’re focusing on resource conservation, specifically on media hotlinking.
This piece of code in your .htaccess blocks other sites from displaying your work via your media links. Even when an image is shown elsewhere, the request still comes to your server and hits your resources, so to prevent this we add:
# Prevent hotlinking for images and PDFs,
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.co\.uk [NC]
RewriteCond %{HTTP_REFERER} !^https?://example\.co\.uk [NC]
RewriteRule \.(jpg|jpeg|png|gif|pdf)$ - [F,NC]
</IfModule>
Explanation of preventing hotlinking: the first condition permits requests with an empty referer (direct visits and some privacy-conscious clients), the next two permit requests whose referer is your own domain, with or without www, and the final rule returns 403 Forbidden for any image or PDF request from anywhere else.
What we have gone through in this post should be enough to protect your server, but for more persistent DDoS attacks you may want to look into further measures: implementing rate limiting, using Web Application Firewalls (WAFs), geo-blocking and IP blacklisting, or a DNS-level mitigation service, as these services can absorb and filter out malicious traffic before it reaches your server, effectively mitigating large-scale DDoS attacks.
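As one example of the rate limiting mentioned above, if mod_evasive is installed, a server-level configuration along these lines temporarily blocks IPs that hammer the same page or the site as a whole. The thresholds here are illustrative assumptions, not recommendations; tune them against your real traffic:

```apacheconf
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    DOSPageCount        5      # max requests for the same page per page interval
    DOSSiteCount        100    # max requests site-wide per site interval
    DOSPageInterval     1      # page-count interval, in seconds
    DOSSiteInterval     1      # site-count interval, in seconds
    DOSBlockingPeriod   60     # how long (seconds) an offending IP stays blocked
</IfModule>
```

Unlike the .htaccess snippets earlier, this belongs in the main server or virtual host configuration and requires the module to be installed and enabled.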
If you want to move away from prebuilt or bought themes and want an award-winning, Belfast-based web design and development studio to deliver a project of similar quality, get in touch with the Wibble team to find out how we can help with your website requirements.