An increasing number of DDoS attacks are aimed directly at Layer 7, the application layer, and they tend to be disguised to look like legitimate traffic. Layer 7 is often targeted after attacks at Layers 3 and 4 have failed. Mitigating a Layer 3 or Layer 4 attack always comes down to a simple question of network capacity: who has more, the attacker or the mitigation service?
Application Layer 7 attacks, however, are typically structured to overload particular elements of the application server infrastructure. Even simple attacks can be highly effective: hammering a login page with random user IDs and passwords, for example, can push CPUs and databases to a critical level. Mitigation success therefore does not depend on how much capacity you have, but on how smart and sophisticated your security technology is and how effectively it can be applied.
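To make the login-page example concrete, the sketch below shows one common defense: a per-IP sliding-window rate limiter that refuses excess attempts before the expensive password check ever runs. This is a minimal illustration; the class and parameter names are invented here, not taken from any particular product.

```python
# Minimal sketch of per-client rate limiting for a login endpoint.
# Names (LoginRateLimiter, max_attempts, window_seconds) are illustrative.
import time
from collections import defaultdict, deque

class LoginRateLimiter:
    """Sliding-window limiter: cap login attempts per client IP so that
    random user-ID/password floods cannot saturate the CPU or database."""

    def __init__(self, max_attempts: int = 5, window_seconds: float = 60.0):
        self.max_attempts = max_attempts
        self.window_seconds = window_seconds
        self._attempts: dict[str, deque] = defaultdict(deque)

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        window = self._attempts[client_ip]
        # Drop timestamps that have aged out of the window.
        while window and now - window[0] > self.window_seconds:
            window.popleft()
        if len(window) >= self.max_attempts:
            return False  # Throttle: skip the expensive password check entirely.
        window.append(now)
        return True

limiter = LoginRateLimiter()
for attempt in range(7):
    print(attempt, limiter.allow("203.0.113.9"))  # attempts 5 and 6 are refused
```

The key design point is that the refusal happens before any database lookup or password hash, so the attack never reaches the expensive part of the stack.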
Mitigating Layer 7 attacks relies largely on accurately profiling incoming traffic to distinguish between humans, human-like bots, and hijacked web browsers. The mitigation process is therefore more complicated than the attack itself. Done correctly, the defense remains invisible, which is why these attacks generate few headlines and little public attention.
Sometimes attackers will use a volumetric application layer attack, such as an HTTP flood, as a ruse to mask other, more highly targeted attacks. Relatively primitive bots with high network utilization can power extremely large volumetric attacks while consuming minimal computing resources. Protection services with comparably high capacity can stop volumetric attacks at the outermost layer, filtering out the white noise such an attack generates and allowing IT staff to focus on protecting the entire system.
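As a rough illustration of that outermost filtering layer, the hypothetical sketch below counts requests per source over short intervals and drops sources whose rate exceeds a threshold. Real protection services apply far more sophisticated logic; all names and thresholds here are assumptions for the example.

```python
# Hypothetical sketch of edge-layer flood filtering: count requests per
# source over a short interval and drop sources exceeding a rate threshold.
import time
from collections import Counter

class FloodFilter:
    """Flags sources whose request rate exceeds `threshold_rps`, so the
    'white noise' of a volumetric HTTP flood is dropped at the edge."""

    def __init__(self, threshold_rps: float = 100.0, interval: float = 1.0):
        self.threshold_rps = threshold_rps
        self.interval = interval
        self._counts: Counter = Counter()
        self._window_start = time.monotonic()
        self.blocked: set[str] = set()

    def should_drop(self, source_ip: str) -> bool:
        now = time.monotonic()
        if now - self._window_start >= self.interval:
            # New measurement window: block sources that exceeded the rate.
            # (In this sketch, blocked sources stay blocked indefinitely.)
            for ip, count in self._counts.items():
                if count / self.interval > self.threshold_rps:
                    self.blocked.add(ip)
            self._counts.clear()
            self._window_start = now
        if source_ip in self.blocked:
            return True
        self._counts[source_ip] += 1
        return False
```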
Client classification can be used, for instance, to identify and filter out these types of bots by comparing signatures and examining their attributes, such as IP address, cookie support, JavaScript footprint, and so on. This helps protection services distinguish human traffic from bot traffic, separate "good" bots from "bad" ones, and identify AJAX and API traffic. Good bots include monitoring tools, search engine crawlers, and other agents necessary for operations, SEO, and other critical site functionality.
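A simplified sketch of such attribute-based classification might look like the following. The signal names, allowlist, and scoring weights are invented for illustration; a production service would use far richer signals and would verify crawler identity rather than trusting user-agent strings.

```python
# Illustrative sketch of client classification by attribute scoring.
# All field names and weights here are assumptions, not from any product.
from dataclasses import dataclass

@dataclass
class ClientSignals:
    user_agent: str
    supports_cookies: bool
    executed_js_challenge: bool   # i.e., showed a JavaScript footprint
    ip_on_denylist: bool

KNOWN_GOOD_BOTS = {"Googlebot", "UptimeMonitor"}  # hypothetical allowlist

def classify(signals: ClientSignals) -> str:
    # Real services also verify claimed crawler identity (e.g., via reverse
    # DNS), since user-agent strings are trivially spoofed.
    if any(bot in signals.user_agent for bot in KNOWN_GOOD_BOTS):
        return "good-bot"  # monitoring tools, search engines, etc.
    if signals.ip_on_denylist:
        return "bad-bot"
    score = 0
    score += 1 if signals.supports_cookies else -1
    score += 2 if signals.executed_js_challenge else -2
    return "human" if score > 0 else "bad-bot"

print(classify(ClientSignals("Mozilla/5.0", True, True, False)))  # -> human
print(classify(ClientSignals("curl/8.0", False, False, False)))   # -> bad-bot
```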
Always ask your provider whether Layer 7 protection is included as part of their DDoS protection service.