Qualys ValidateRequest ‘Finding’ Is an Annoying PCI Problem

Uh oh. A post about compliance. That means it’s a rant, because I think compliance is dumb. I love parts of the security community, like Defcon/BSides/CTFs/the movie Hackers and stuff like that, but I also hate more security people than not. Sorry.

A lot of sites use Qualys scans as part of PCI, so I’m doubtless not the only person who has run into this. If you’re running ASP.NET 3 you’ll get the finding described here: https://community.qualys.com/docs/DOC-3495

Some excerpts:

What versions of Microsoft ASP.NET are vulnerable?
Microsoft has confirmed that ASP.NET versions 1 and 2 are both vulnerable.

Additionally, Qualys has confirmed that ASP.NET version 3 is also vulnerable, as it includes the vulnerable component from version 2 by default. We have tested this in our Labs and confirmed the exploit works on a fully patched version 3.

What versions of Microsoft ASP.NET are not vulnerable?
ASP.NET version 4 is not vulnerable, as it does not use the vulnerable ‘ValidateRequest’ Filter.

Applications that have been securely coded, and have custom filtering in place above and beyond the ValidateRequest Filter, may not be vulnerable.

Since this is being detected based upon the .NET Framework Version, shouldn’t this be reported as a Potential Vulnerability?
After an in-depth investigation, including discussions with the original publisher, the vendor, and a thorough review of the two published CVE’s, we have decided that QID 90780 would be better represented as a Potential Vulnerability, and so we have reclassified it as such.

Our detection is based on the remote capability to identify the active framework running on the system. While this does accurately validate the framework version, it does not accurately confirm the presence of XSS, which applies to a higher layer and would be dependent upon several other factors such as web application coding practices, input sanitization, form submissions, etc.

In many cases, although someone may be running the vulnerable framework, they may have additional custom built filters in place which mitigate the risk and ensure that XSS is not possible on the target system.

In summary, the presence of the Vulnerable Framework actively running is the basis of this vulnerability, which could potentially allow additional attacks such as cross-site scripting. However, this detection is not actively confirming the presence of cross-site scripting, and so we believe this is most accurately marked as a Potential Vulnerability.

As a side note, since PCI Requires that both Actual & Potential Vulnerabilities be remediated the same, this is still a PCI Failing Vulnerability.

So while upgrading .NET versions is a great idea, especially to 4.5, which has some awesome security improvements, this Qualys scan issue seems totally bogus.

  1. I talk about how ValidateRequest works here. It is not meant to stop all flavors of XSS (see the sketch after the decompiled code below). But even if ValidateRequest completely failed and were bypassable in every case (which, as far as I know, it is not), why should that by itself cause a site to fail PCI, when most other frameworks do not have this sort of WAF in place at all? If ValidateRequest were always bypassable, the web application would simply be on the same footing as every other framework. Should all Java sites fail because they don’t have something like ValidateRequest built in?
  2. So Qualys Labs has determined that version 3 is vulnerable but version 4 is not, huh? It turns out that when I load every version from .NET 2 through .NET 4.5 into ILSpy, ValidateRequest has not changed. At all. Here is what the check looks like in every one of those versions. Maybe the problem lies somewhere else, but I’m skeptical.
    // System.Web.CrossSiteScriptingValidation
    internal static bool IsDangerousString(string s, out int matchIndex)
    {
    	matchIndex = 0;
    	int startIndex = 0;
    	while (true)
    	{
    		int num = s.IndexOfAny(CrossSiteScriptingValidation.startingChars, startIndex);
    		if (num < 0)
    		{
    			break;
    		}
    		if (num == s.Length - 1)
    		{
    			return false;
    		}
    		matchIndex = num;
    		char c = s[num];
    		if (c != '&')
    		{
    			if (c == '<' && (CrossSiteScriptingValidation.IsAtoZ(s[num + 1]) || s[num + 1] == '!' || s[num + 1] == '/' || s[num + 1] == '?'))
    			{
    				return true;
    			}
    		}
    		else
    		{
    			if (s[num + 1] == '#')
    			{
    				return true;
    			}
    		}
    		startIndex = num + 1;
    	}
    	return false;
    }
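
To make point 1 above concrete, here is a small standalone sketch. It is my own re-implementation with my own test payloads, not the framework’s internal class, but it mirrors the decompiled logic above: the only things flagged are a ‘<’ followed by a letter, ‘!’, ‘/’, or ‘?’, and an “&#” sequence. Classic tag injection gets caught; attribute-context and javascript: URL payloads sail straight through, which is why ValidateRequest was never a complete XSS defense to begin with.

    // Sketch only: a re-implementation of the decompiled check above,
    // run against a few payloads to show what it does and does not flag.
    using System;

    internal static class ValidateRequestSketch
    {
        // Same trigger characters as the framework code.
        private static readonly char[] startingChars = new char[] { '<', '&' };

        private static bool IsDangerousString(string s)
        {
            int startIndex = 0;
            while (true)
            {
                int num = s.IndexOfAny(startingChars, startIndex);
                if (num < 0 || num == s.Length - 1)
                {
                    return false;
                }
                char next = s[num + 1];
                if (s[num] == '<')
                {
                    // '<' followed by a letter, '!', '/', or '?' is "dangerous".
                    if ((next >= 'a' && next <= 'z') || (next >= 'A' && next <= 'Z') ||
                        next == '!' || next == '/' || next == '?')
                    {
                        return true;
                    }
                }
                else if (next == '#') // s[num] == '&'
                {
                    return true;
                }
                startIndex = num + 1;
            }
        }

        private static void Main()
        {
            // Caught: plain old tag injection.
            Console.WriteLine(IsDangerousString("<script>alert(1)</script>"));    // True
            // Missed: attribute-context injection, no '<' or "&#" needed.
            Console.WriteLine(IsDangerousString("\" onmouseover=alert(1) x=\"")); // False
            // Missed: javascript: URL injected into an href.
            Console.WriteLine(IsDangerousString("javascript:alert(1)"));          // False
        }
    }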
    
    

If I were to guess, I’d say this scan check is just looking at HTTP response headers, and we have no way to know what the root issue really is. That opacity is one of my biggest issues with scanners in general.
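
For what it’s worth, out of the box ASP.NET advertises its runtime version in an X-AspNet-Version response header (and IIS adds X-Powered-By), so a check like that needs nothing more than a banner grab. Something along the lines of this hypothetical snippet is probably close to all it amounts to; the class name and URL are placeholders:

    // Hypothetical banner-grab: read the version headers that ASP.NET and
    // IIS send by default (unless enableVersionHeader has been turned off).
    using System;
    using System.Net.Http;

    internal static class VersionHeaderCheck
    {
        private static void Main()
        {
            using (var client = new HttpClient())
            {
                // Placeholder URL; point it at the site being "scanned".
                HttpResponseMessage response = client.GetAsync("https://example.com/").Result;

                if (response.Headers.TryGetValues("X-AspNet-Version", out var aspNetVersion))
                {
                    Console.WriteLine("X-AspNet-Version: " + string.Join(", ", aspNetVersion));
                }
                if (response.Headers.TryGetValues("X-Powered-By", out var poweredBy))
                {
                    Console.WriteLine("X-Powered-By: " + string.Join(", ", poweredBy));
                }
            }
        }
    }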

More than once, across multiple groups at multiple companies, I’ve seen scrambles to upgrade .NET versions to address this “potential vulnerability”. I’ve seen pentest reports where this was the #1 finding, rated Important, with the impact written up as if it were confirmed XSS. It drives me crazy. I even tweeted about it and everything. End rant.
