If you own or manage a website, you need to protect it. No matter the size or purpose of your site, there are three basic defense in depth principles to follow: prevent, detect, and recover. There is a common myth among website owners and managers that their sites are unlikely to be targeted by hackers. I'm often asked, “Why would someone target my website?” or “Why would a hacker take over my site?” The answer depends, but there are many reasons your site could be a target, the most common being financial gain, access to computing resources, hacktivism, and bragging rights. Some people believe that if a website’s traffic is low, little or no financial or personal information is collected on the site, or the site is not well known, then it won't be of interest to hackers. I can understand the rationale and why many think their websites don’t offer enough value for hackers to make the effort. Unfortunately, that couldn’t be farther from reality. Tens of thousands of websites are hacked every day, and every website is a potential (and perhaps an eventual) target of a hack.
Another common misconception is that humans behind computers manually hack into each website, which further fuels the belief that no one would actually take the time to break into a little-known blog. That’s probably true of humans, but most hacks are performed by bots: automated software applications that scan the web searching for known vulnerabilities. Hackers design these bots to find and exploit weaknesses in your site to inject and distribute malware, send spam, steal traffic, host phishing pages, and more. So while you may not face a directly targeted attack, without taking the right steps your site could easily become the victim of an attack of opportunity. Your site may not generate as much traffic as Google, but it also likely does not have the defenses and security measures Google has in place. Hackers often take over many smaller sites that are easier to infiltrate to achieve larger scale, and the ability of bots to indiscriminately scan a massive number of websites in very little time makes virtually every site on the internet a target.
Defense in depth
There is a principle in security called defense in depth. The idea is that layered security mechanisms increase the security of the system as a whole. If one layer fails, the next layer will continue to protect you. While there is no solution that can completely eliminate the risk of an attack, there are steps you can take to greatly reduce that risk. Consider the examples and recommendations below in each of the major areas of defense as you design and evaluate your layered security approach:
Enable automatic updates: It’s hard to believe, but well-known websites have suffered preventable data breaches, affecting millions of users, through known vulnerabilities in web applications. Automatic updates are crucial because they often include fixes for security holes discovered by the software's developer, and they allow the software to be patched before a vulnerability can be exploited.
Use known open source code: Open source code offers security and stability benefits over other types of code because anyone can view and modify it. Other users of the code often discover and fix errors or omissions that the original author(s) may have missed. Beyond that, open source code allows users to continue to improve and build on the code even if the original authors discontinue support.
Perform a code review: A code review is the process of inspecting a body of code to find possible errors or omissions and to improve it before it is deployed. Performing this process can dramatically limit risk, improve the quality of the code, and help you become more familiar with the code you use and will likely need to maintain.
Use strong passwords and two-factor authentication: Always use strong passwords for your CMS, FTP, and other website tools, platforms, and applications. Each sign-in presents another area of vulnerability for your website, and weak passwords leave admin accounts open to automated brute force attacks. In addition to strong passwords, many site management tools offer a two-factor authentication (2FA) option, which should be enabled to ensure greater security for website admin areas.
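Many 2FA implementations use time-based one-time passwords (TOTP, RFC 6238). To illustrate how those short numeric codes are derived, here is a minimal sketch using only the Python standard library; the base32 key shown is the published RFC test key, not a real credential:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238) from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() if t is None else t) // step   # 30-second window
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                               # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: the ASCII key "12345678901234567890" in base32.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))  # 94287082
```

An authenticator app and the server each run this derivation independently from a shared secret; a login succeeds only when the code the user submits matches the server's.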
Use HTTPS: Adopt Hypertext Transfer Protocol Secure (HTTPS) on your website, which secures data between the user’s device and your website via the Transport Layer Security (TLS) protocol. TLS provides protection in three ways: (1) encrypting the data exchanged between the user’s device and the site, (2) detecting and preventing data from being modified or corrupted in transit, and (3) authenticating both the server and the client before data is exchanged.
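On the client side, most of those TLS guarantees come down to enforcing certificate validation, hostname checking, and a modern protocol version. A short sketch using Python's `ssl` module:

```python
import ssl

# create_default_context() enables certificate verification and hostname
# checking by default; we additionally refuse legacy protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.verify_mode == ssl.CERT_REQUIRED)  # True: peer certificates are validated
print(context.check_hostname)                    # True: names must match the certificate
```

To use it, wrap a socket with `context.wrap_socket(sock, server_hostname="example.com")` before sending any data; a mismatched or untrusted certificate then fails the handshake instead of silently connecting.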
Use anti-automation tools/techniques: Insufficient anti-automation occurs when a process designed to be performed only by a human (e.g. signing up for an account or posting on a forum) can be automated by an attacker. Without anti-automation tools or techniques on your site, such as a CAPTCHA or rate limiting, a bot could repeatedly execute thousands of tasks or requests.
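Alongside CAPTCHAs, rate limiting is one of the most common anti-automation techniques. Here is a minimal token-bucket limiter sketched in Python, with an injectable clock so the behavior is easy to follow (a production limiter would also track buckets per client IP or account):

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: refill `rate` tokens per second, hold at
    most `capacity`, and spend one token per allowed request."""

    def __init__(self, rate, capacity, now=None):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic() if now is None else now

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Refill tokens for the time elapsed since the last request.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A burst of five requests at the same instant: only the first three
# (the bucket capacity) get through; a bot's fourth request is refused.
bucket = TokenBucket(rate=1.0, capacity=3, now=0.0)
print([bucket.allow(now=0.0) for _ in range(5)])  # [True, True, True, False, False]
```

After the burst, tokens refill at one per second, so a patient human gets through while a flood of automated requests stays capped.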
Traffic monitoring: Analyzing network traffic is an important way to keep a network secure. Because so much online traffic today is encrypted, you’ll need a tool that decrypts the traffic before analyzing it for full visibility. Once the data is analyzed, malicious or corrupted traffic can be blocked or rerouted, and legitimate traffic can be re-encrypted to maintain security before transport continues.
Web application firewall: A web application firewall (WAF) is placed in front of a web application to monitor all traffic to and from the application, and it filters or blocks harmful traffic based on a set of rules to prevent common attacks such as cross-site scripting (XSS), SQL injection, and exploitation of security misconfigurations.
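To make the rule-based filtering concrete, here is a deliberately simplified sketch in Python. The two patterns are illustrative toy signatures only; real WAFs (for example ModSecurity with the OWASP Core Rule Set) use far more sophisticated, context-aware rules:

```python
import re

# Toy rule set for illustration only; not production signatures.
RULES = {
    "sql_injection": re.compile(r"(\bUNION\b.+\bSELECT\b|\bOR\b\s+1\s*=\s*1|--)",
                                re.IGNORECASE),
    "xss": re.compile(r"(<script\b|\bonerror\s*=|javascript:)", re.IGNORECASE),
}

def inspect(query_string):
    """Return the names of the rules a request's parameter string trips."""
    return [name for name, pattern in RULES.items() if pattern.search(query_string)]

print(inspect("id=1 OR 1=1"))                  # ['sql_injection']
print(inspect("q=<script>alert(1)</script>"))  # ['xss']
print(inspect("q=hello"))                      # []
```

A request that trips any rule would be blocked or logged before it ever reaches the application, which is exactly the "in front of the application" placement described above.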
Use antivirus: Antivirus software can help keep your website secure by detecting and removing malware on your website. It typically includes features such as website file scanning, file change monitoring, and code analysis, which can prevent hackers from gaining access to your administrator panel and keep your website from infecting visitors.
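The file change monitoring mentioned above can be sketched with nothing but cryptographic hashes: record a baseline digest for every file, then diff a later snapshot against it. A minimal Python illustration:

```python
import hashlib
import os
import tempfile

def snapshot(root):
    """Map each file path under `root` to its SHA-256 digest."""
    hashes = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                hashes[path] = hashlib.sha256(f.read()).hexdigest()
    return hashes

def changed_files(baseline, current):
    """Paths that were added, removed, or modified since the baseline."""
    return sorted(path for path in baseline.keys() | current.keys()
                  if baseline.get(path) != current.get(path))

# Demo in a throwaway directory: take a baseline, tamper with a file, diff.
with tempfile.TemporaryDirectory() as root:
    page = os.path.join(root, "index.html")
    with open(page, "w") as f:
        f.write("<html>hello</html>")
    baseline = snapshot(root)
    with open(page, "a") as f:
        f.write("<script>/* injected */</script>")
    print(changed_files(baseline, snapshot(root)))  # the tampered file's path
```

Run on a schedule, a diff against the last known-good baseline flags injected scripts or altered admin files as soon as they appear.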
Vulnerability scanning: Network vulnerability scanning is intended to detect exploitable weaknesses in a computer, network, or other networked equipment. Countermeasures can then be put in place to close those security holes and maintain the integrity of the network and its devices.
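One small piece of what a vulnerability scanner does is checking installed software versions against known advisories. The advisory data and package name below are hypothetical placeholders; real scanners pull from feeds such as the NVD:

```python
# Hypothetical advisory feed: versions of "examplelib" below 2.5.0 are flagged.
ADVISORIES = {
    "examplelib": [("<", (2, 5, 0))],
}

def parse(version):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def vulnerable(package, version):
    """True if any advisory for `package` covers the installed `version`."""
    for op, bound in ADVISORIES.get(package, []):
        if op == "<" and parse(version) < bound:
            return True
    return False

print(vulnerable("examplelib", "2.4.9"))  # True: below the patched release
print(vulnerable("examplelib", "2.5.0"))  # False: patched
```

The same compare-against-advisories loop, pointed at a real feed and a real inventory of installed software, is the core of dependency and network vulnerability scanning.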
Backups: A backup or identical copy of your site data allows you to easily and quickly restore your data if the primary version is corrupted, lost, or otherwise inaccessible. You should conduct backups on a routine basis, ideally as often as you make changes to your site, to ensure you always have the latest version.
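A routine backup can be as simple as a timestamped archive of the site's document root. A minimal Python sketch using the standard library (the directory names are placeholders; in practice you would also copy archives off the web server):

```python
import os
import shutil
import tempfile
import time

def backup_site(site_dir, backup_dir):
    """Create a timestamped .tar.gz archive of `site_dir` inside `backup_dir`
    and return the archive's path."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    base = os.path.join(backup_dir, "site-" + stamp)
    return shutil.make_archive(base, "gztar", root_dir=site_dir)

# Demo with throwaway directories standing in for the document root and store.
with tempfile.TemporaryDirectory() as site, tempfile.TemporaryDirectory() as store:
    with open(os.path.join(site, "index.html"), "w") as f:
        f.write("<html></html>")
    archive = backup_site(site, store)
    print(os.path.basename(archive))  # e.g. site-20240101-120000.tar.gz
```

Scheduling a script like this (for example via cron) after every deploy keeps a restorable copy on hand, so recovery is an extract rather than a rebuild.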
At StackPath, we provide services that help you stay safe online. Our web application firewall can detect and block automated services from reaching your site and, by doing so, prevent any attempt to scan for vulnerabilities on your site's software. Our network is fully secured and monitored regularly to prevent any unauthorized usage of protocols or services on StackPath assets. Our team of professionals is here to assist you with recommendations for best practices in web security. Our goal is to secure the internet.