Azure Active Directory Application Proxy crawler robots behavior

As part of our continuous effort to improve the security posture of applications published by Azure AD Application Proxy, we have started blocking web crawler robots from indexing and archiving your applications.

Every time a web crawler robot tries to retrieve the robots settings for a published application, the proxy replies with a robots.txt file that has the following content:

        User-agent: *

        Disallow: /
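
These two directives disallow every path for every user agent. As a quick sanity check, you can feed the same rules to Python's standard-library robots.txt parser (a minimal sketch; the user-agent strings and paths below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content served by Application Proxy, as shown above.
robots_lines = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# Any crawler is disallowed from fetching any path.
print(rp.can_fetch("Googlebot", "/"))        # False
print(rp.can_fetch("bingbot", "/any/path"))  # False
```

Well-behaved crawlers honor these directives, so the published application stays out of search indexes and web archives.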


No action is needed to turn this on. All Application Proxy customers will automatically get this functionality.
