Corporate WordPress-based websites are as open to malicious attack as any others, with close to 100,000 blogs already compromised in some way. Fortunately, WordPress offers several easy steps you can take to safeguard yourself effectively against such attacks.
Passwords and usernames
So-called ‘brute force’ attacks happen all the time, and not just against WordPress sites. In these, unsophisticated hackers simply try their luck by guessing usernames and passwords, so an obvious first step in thwarting them is to choose both with a bit of intelligence. If your website has been consistently attacked in this way, consult one of the published lists of passwords most commonly tried by attackers and make sure yours is not on it.
Most sites that get hacked in this way have easy-to-guess passwords and usernames, where just a modicum of creativity on the part of the owners would have prevented the break-in.
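The simplest defence against password guessing is a long, randomly generated password rather than anything memorable. As a quick sketch of what that means in practice (shown in Python purely for illustration; a password manager does the same job), the standard `secrets` module can produce one:

```python
import secrets
import string

# Character pool: letters, digits and punctuation.
alphabet = string.ascii_letters + string.digits + string.punctuation

# 20 cryptographically secure random characters -- well beyond
# anything a dictionary-based brute-force attack will guess.
password = "".join(secrets.choice(alphabet) for _ in range(20))
print(password)
```

The key point is `secrets` rather than `random`: the former draws from the operating system's secure randomness source, which is what you want for credentials.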
One common problem experienced by WordPress site owners is the number of visitors arriving on the site who are actually searching for something completely unrelated to its services and products.
For example, due to a pure coincidence in words used in particular posts, hundreds of visitors may turn up who are actually looking for some form of adult content, and a business obviously doesn’t want this sort of uninvited guest unless it is in that line of trade anyway.
An anthropology site may be replete with references to various ‘fetishes’ and ‘adult’ chimpanzees, but these keywords could as easily be picked up on by those searching for something else as by students of the subject.
Some visitors are simply unwanted, and if certain searches persistently cause problems it is worth blocking them. A search engine such as Google will index whatever words and phrases it finds on the site, and you have little control over that from its end. Fortunately, there are several tools at your end that can be used for filtering.
A robots.txt file can be used to ask search engines not to crawl specified URLs. The only drawback is that the file has to be edited by hand each time you post something you want kept out of the search engines.
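For example, a robots.txt file placed at the site root might look like the following (the paths are hypothetical placeholders for whatever content you want excluded):

```text
# Applies to all well-behaved crawlers
User-agent: *

# Hypothetical paths to keep out of search results
Disallow: /private-notes/
Disallow: /drafts/
```

Note that robots.txt is advisory: reputable crawlers honour it, but it is not an access control and malicious bots are free to ignore it.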
Search engine and SEO plug-ins sometimes allow you to exclude specific content from what the search engines index and thereby make available to unwanted visitors. WordPress also has its own built-in visibility control, easily reached through the admin panel’s privacy settings (in newer versions, Settings → Reading, under ‘Search engine visibility’). Here you can stipulate whether you want your website or blog to be visible to all and sundry, which of course includes the major search engines like Google, Technorati and Bing as well as the archivers. If you don’t want this to be the case, you can tighten these settings to ensure that your unwanted visitors browse for their obscure objects of desire elsewhere.
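Under the hood, discouraging search engines largely amounts to emitting a robots meta tag in each page’s head, and SEO plug-ins that exclude individual posts typically do the same (the exact `content` values vary by plug-in, but these two directives are standard):

```text
<meta name="robots" content="noindex, nofollow">
```

`noindex` asks the engine not to list the page in results; `nofollow` asks it not to follow the page’s links.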