By Richard Yew, Principal Product Manager, Verizon Digital Media Services
On a spring day just a few months ago, a company was hit by a massive distributed denial of service (DDoS) attack. Its website traffic spiked to 1,500 times the normal load and continued to see spikes exceeding five million requests per minute for an incredible 18 hours.
In many ways, this was a textbook DDoS attack. Originating from Italy, Spain and France, it aimed to overload the website infrastructure, denying or slowing access to the site for legitimate visitors. However, there was one critical distinction between this attack and the headline-grabbing DDoS attacks of the recent past, such as the Mirai botnet: it targeted a different and much more vulnerable tier of the company's website architecture.
In the not-so-distant past, brute force attacks against Layers 3 and 4 – the network and transport layers of the internet – were devastating. These multi-gigabit-, or even terabit-per-second, attacks attempted to inflict damage through a shotgun approach: sending a large volume of traffic indiscriminately, hoping to hit vulnerable systems by chance.
But Layer 3 and 4 attacks are not as effective as they used to be because content delivery networks (CDNs) have massive capacity capable of absorbing sudden spikes in network traffic.
As defenses evolve, so do the threats from sophisticated attackers. They are now employing more targeted attacks against the application layer, or Layer 7. A Layer 7 attack may look like a legitimate HTTP request, and it doesn't take thousands of infected machines to launch; it only requires a small number of resources, an automated script and knowledge of a web application's bottlenecks. Though this type of attack takes a bit more expertise than a brute force Layer 3 or 4 attack, when executed well, its effectiveness and high ROI mean that it is likely to become more common.
Out of all the elements of your website architecture, your database is likely to be the most fragile. It has a comparatively small transactions-per-second limit. When that limit is exceeded, the database overloads and freezes. When this happens, requests from legitimate users get queued up, causing delays or even timeouts.
Application layer DDoS attacks take advantage of this vulnerability. To find the right spot to target, an attacker will look for parts of your web application that often require queries to your application or database backend. On an e-commerce site, this might be a page that makes API calls to load a list of products, pricing, and product availability. On a password-protected site, that might be a login request that checks credentials from the request body against known usernames and passwords. But instead of making the request once or twice – a normal browsing behavior – a botnet involved in an application layer attack can make the request hundreds of thousands of times per second to overwhelm your backend services.
Without a mechanism to track the rate of HTTP requests from a client, each of these can appear to be a legitimate request, and a traditional DDoS mitigation system or firewall that inspects an individual HTTP request won't detect and mitigate it. To protect your website and applications from barrages of malicious requests, you need to employ a strategy known as rate limiting.
It's vital to control the rate of backend requests or login attempts at the edge to limit the damage from application DDoS attacks. The first step is to determine which parts of your website or application are most vulnerable to a DDoS attack. Once you have found those pages or API endpoints that involve backend queries, you can then determine the maximum allowed request rate for them.
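One common way to arrive at a starting limit is to measure how fast legitimate clients actually make requests during normal traffic and set the threshold well above that level. The sketch below illustrates the idea with made-up per-client rates; the percentile, safety factor and function name are all hypothetical choices, not a prescribed formula.

```python
import math

# Illustrative data: observed per-client request rates (requests/second)
# for one endpoint during normal traffic. These numbers are made up.
observed_rates = [0.2, 0.5, 1.0, 1.1, 0.8, 2.0, 1.5, 3.0, 0.9, 2.5]

def suggest_limit(rates, percentile=0.99, safety_factor=3):
    """Pick a limit comfortably above what nearly all legitimate clients reach.

    Takes a high percentile of the observed rates, then multiplies by a
    safety factor so bursts of normal behavior don't trip the rule.
    """
    ranked = sorted(rates)
    idx = min(len(ranked) - 1, math.ceil(percentile * len(ranked)) - 1)
    return math.ceil(ranked[idx] * safety_factor)

print(suggest_limit(observed_rates))
```

The result is only a starting point; each endpoint's limit should be tuned against real traffic and revisited as usage patterns change.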
The specific limit you choose may vary based on the type of application. For example, if you know that a typical user physically cannot submit login credentials to your login endpoint more than five times per second, then you know that any user request more frequent than this is likely malicious. You can, for example, set the rate limit on your login endpoint at five per second and lock out any client – that is, an IP address or an IP and user-agent pair – that violates that rule. However, you might place a higher limit on how often a user can refresh your homepage or product details page, since these pages make fewer backend queries or can be served from CDN cache. Even for the same URL, it might make sense to set a different limit for GET requests vs. POST requests.
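These per-endpoint rules can be pictured as a table of (method, path) limits enforced by a sliding-window counter keyed on the client's identity. The following is a minimal sketch, not any particular product's implementation; the endpoint paths and limit values are hypothetical, echoing the examples above.

```python
import time
from collections import defaultdict, deque

# Hypothetical per-endpoint limits: (HTTP method, path) -> max requests
# per second. Values mirror the article's examples and are illustrative.
LIMITS = {
    ("POST", "/login"): 5,     # a real user can't submit credentials faster
    ("GET", "/"): 50,          # cacheable pages tolerate a much higher rate
    ("GET", "/products"): 20,
}

class SlidingWindowLimiter:
    """Tracks recent request timestamps per client and endpoint."""

    def __init__(self, limits, window=1.0):
        self.limits = limits
        self.window = window  # seconds
        self.history = defaultdict(deque)  # (client, method, path) -> times

    def allow(self, client, method, path, now=None):
        """Return True if the request is within the endpoint's limit."""
        limit = self.limits.get((method, path))
        if limit is None:
            return True  # no rule configured for this endpoint
        now = time.monotonic() if now is None else now
        q = self.history[(client, method, path)]
        # Drop timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= limit:
            return False  # rate exceeded; the caller applies the penalty
        q.append(now)
        return True
```

The client key can be an IP address alone or an IP and user-agent pair, as described above; a sixth login attempt within the same second would be rejected while requests to unlisted endpoints pass through untouched.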
You can also set different penalties for clients who violate rate limiting rules on different parts of the application. In one place, a violation might result in their subsequent requests getting blocked for five minutes; in another, you might redirect them to a CAPTCHA page.
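Those differentiated penalties amount to a small policy table consulted whenever a rule is violated. Here is one possible sketch, assuming hypothetical endpoints and durations; the function names and the 429/302 responses are illustrative choices, not a specific product's behavior.

```python
import time

# Hypothetical penalty policies per endpoint; names and durations are
# illustrative only.
PENALTIES = {
    ("POST", "/login"): {"action": "block", "duration": 300},  # 5-min lockout
    ("GET", "/products"): {"action": "captcha"},               # challenge page
}

blocked_until = {}  # client -> timestamp after which requests are allowed again

def on_violation(client, method, path, now=None):
    """Return the (status, body-or-location) the edge serves for a violation."""
    now = time.time() if now is None else now
    policy = PENALTIES.get((method, path), {"action": "block", "duration": 60})
    if policy["action"] == "block":
        blocked_until[client] = now + policy["duration"]
        return 429, "Too Many Requests"
    # CAPTCHA challenge: redirect the client instead of blocking outright.
    return 302, "/captcha?return=" + path

def is_blocked(client, now=None):
    """Check whether a previously penalized client is still locked out."""
    now = time.time() if now is None else now
    return blocked_until.get(client, 0) > now
```

A timed block is the harsher response, suited to sensitive endpoints like login, while a CAPTCHA gives a human who merely refreshed too aggressively a way back in.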
Beyond e-commerce and financial services, another interesting use case for rate limiting is protecting intellectual property. Rate limiting for image downloads can protect an image-rich website from excessive content scraping. It can also help software or media download websites reduce unnecessary bandwidth usage by preventing repeated downloads. Because rate limiting is not tied to any particular application, it can be applied across all of these use cases.
Returning to the company I talked about at the beginning of this blog, rate limits were in place when they were attacked. As a result, policies were automatically enforced to mitigate the attack. Over the duration of the attack, rate limiting was estimated to have prevented more than eight hours of company downtime. That's an entire business day that the company's website could have been unavailable!
For rate limiting to work best, you must be in control. That's why Verizon Digital Media Services' rate limiting products have a high degree of flexibility. You can set multiple rules that interact with each other, creating a sophisticated multilayer defense against application DDoS. Other advantages include a high degree of configurability, flexible sampling windows and an advanced real-time analytics dashboard. You can also manage your rate limiting configurations and access real-time logs via API.
With more Layer 7 DDoS attacks all but a certainty, consider complementing your network layer protection with rate limiting. Take it from the company that avoided an eight-hour outage, you won't regret the investment.
Get in touch with us today to learn how our CDN's layered defenses protect your websites and mobile apps from cybercriminals who are working to steal your intellectual property, tarnish your brand image and damage your online business.