Akamai's 'Content Protector' safeguards against web scraping
Akamai Technologies has unveiled a product aimed at stopping malicious scraping attacks while preserving the beneficial data traffic that enterprises rely on to grow their businesses. The new offering, Content Protector, is designed to thwart evasive scrapers, optimising detection rates whilst minimising false negatives and without increasing false positives.
Content scraping has become a persistent issue for online enterprises. Scraper bots are a legitimate part of digital commerce ecosystems, helping search engines discover content, highlighting products on comparison sites and gathering data for customer communications. They are also exploited for harmful activity, however, ranging from competitive undercutting and pre-attack reconnaissance for inventory-hoarding attacks to the counterfeiting of goods and website content.
A major concern for firms is the constant pinging of their sites by scrapers, which degrades website performance and drives away frustrated customers. Moreover, scrapers have grown considerably more sophisticated in recent years, making detection and mitigation far more challenging.
Content Protector is purpose-built for organisations aiming to safeguard their intellectual property, protect their reputation and preserve their revenue potential against these sophisticated scrapers, offering features tailored to identify and combat such intrusions.
Its capabilities start with protocol-level assessment, which evaluates how the client establishes its connection with the server across the different layers of the Open Systems Interconnection (OSI) model, verifying that the negotiated parameters match those expected of standard web browsers and mobile applications. At the application level, it checks whether the client can execute business logic written in JavaScript, collecting device and browser traits and user preferences and comparing these with the protocol-level data for consistency.
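To make the cross-layer consistency idea concrete, the following is a minimal sketch in TypeScript of how protocol-level signals might be checked against application-level traits. Akamai has not published Content Protector's internals, so every type, field and rule here is a hypothetical illustration of the general technique, not the product's actual logic.

```typescript
// Hypothetical sketch only: all types, field names and rules are invented
// to illustrate cross-layer consistency checking, not Akamai's actual logic.

interface ProtocolSignals {
  userAgent: string;       // browser claimed in the HTTP User-Agent header
  tlsFingerprint: string;  // e.g. a JA3-style hash of the TLS ClientHello
}

interface ApplicationSignals {
  jsExecuted: boolean;      // did the client run the JavaScript challenge?
  reportedPlatform: string; // what the browser itself reports (navigator.platform)
}

// Fingerprints that real browsers are known to produce (invented values).
const BROWSER_TLS_FINGERPRINTS = new Set<string>([
  "chrome-tls-profile",
  "firefox-tls-profile",
  "safari-tls-profile",
]);

function isConsistent(proto: ProtocolSignals, app: ApplicationSignals): boolean {
  // A client that never executes the JavaScript challenge cannot be a
  // standard browser, whatever its headers claim.
  if (!app.jsExecuted) return false;

  // Many scraping libraries forge headers easily but present a distinctive,
  // non-browser TLS handshake at the protocol level.
  if (!BROWSER_TLS_FINGERPRINTS.has(proto.tlsFingerprint)) return false;

  // The platform claimed at the protocol level should agree with what the
  // browser reports at the application level.
  const claimsWindows = proto.userAgent.includes("Windows");
  const reportsWindows = app.reportedPlatform.startsWith("Win");
  return claimsWindows === reportsWindows;
}

// A client sending browser-like headers but skipping the JavaScript
// challenge is flagged as inconsistent.
console.log(isConsistent(
  { userAgent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", tlsFingerprint: "curl-tls-profile" },
  { jsExecuted: false, reportedPlatform: "" },
)); // -> false
```

The underlying observation is that scraping tools can forge HTTP headers easily but find it much harder to reproduce a real browser's TLS handshake and JavaScript environment, so disagreement between the layers is a strong bot signal.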
Interaction analysis then distinguishes human users from bot traffic by monitoring standard user-peripheral interactions such as touchscreen activity, keyboard usage and mouse movements. The user's journey through the website is also observed, with particular attention to targeting behaviour typical of botnets. Finally, a risk classification system provides deterministic, actionable risk evaluations, categorising traffic as low, medium or high risk based on the anomalies detected during assessment.
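In the same hypothetical spirit, the sketch below shows how interaction counts and journey data could feed a three-tier low/medium/high classification; the thresholds, field names and the "/inventory" path are all invented for illustration.

```typescript
// Hypothetical sketch only: thresholds, field names and paths are invented
// to illustrate a low/medium/high risk classification, not Akamai's logic.

type RiskTier = "low" | "medium" | "high";

interface SessionSignals {
  mouseMoves: number;     // pointer events observed during the session
  keyPresses: number;     // keyboard events observed
  touchEvents: number;    // touchscreen events observed
  pagesVisited: string[]; // the user's journey through the site, in order
}

function classifyRisk(s: SessionSignals): RiskTier {
  const peripheralActivity = s.mouseMoves + s.keyPresses + s.touchEvents;

  // Botnets often jump straight to a high-value page (here, an invented
  // "/inventory" path) without the browsing a human visitor would do.
  const directToTarget =
    s.pagesVisited.length === 1 && s.pagesVisited[0].startsWith("/inventory");

  if (peripheralActivity === 0 || directToTarget) return "high";
  if (peripheralActivity < 10) return "medium";
  return "low";
}

// A session with no peripheral activity that requested only the inventory
// page is classified as high risk.
console.log(classifyRisk({
  mouseMoves: 0,
  keyPresses: 0,
  touchEvents: 0,
  pagesVisited: ["/inventory"],
})); // -> "high"
```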
Senior Vice President and General Manager of Application Security at Akamai, Rupesh Chokshi, highlighted the damaging implications of content scraping for businesses: "This includes competitors undercutting your offers, slower sites that lead customers to get frustrated and leave, and brand damage from counterfeiters passing off subpar goods as your legitimate merchandise." He added that Content Protector "helps demonstrate the direct business value of security while enabling business leaders to grow their digital businesses."