Debugging this issue cost me some time today. Enough that I'll never forget how IE handles cookies on (sub)domains that contain underscores.
In hindsight, it seems obvious. In fact, there's even an Internet Explorer FAQ that describes how IE should react when it's presented with a domain or subdomain that contains underscores. Except at the time, I had no idea this was even related.
My problem was that session cookies in PHP would work in Chrome and Firefox, but just refuse to work in Internet Explorer. Even the very latest version of IE, Internet Explorer 11. It's the kind of bug that appears to be by design and will stick around until the end of IE times.
Maybe Project Spartan aka Microsoft Edge will change this ancient behaviour?
What Is This Bug Of Which You Speak?
If the domain or subdomain your web application is running on contains an underscore, Internet Explorer will refuse to store cookies. Any kind of cookie, from session cookies to persistent cookies. Your webserver will reply with a Set-Cookie header and the client will happily ignore it.
This kind of domain name works: sub-domain.example.com
This kind of domain does not: sub_domain.example.com
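To make this concrete, here's a small sketch that approximates IE's behaviour by checking each DNS label against the letter-digit-hyphen character set. The function name, the placeholder domains, and the exact character rule are my own assumptions; Microsoft only documents the underscore explicitly.

```python
import re

# Labels made of letters, digits and hyphens (the classic "LDH" rule);
# IE appears to reject cookies for any hostname label outside this set.
LDH_LABEL = re.compile(r"^[A-Za-z0-9-]+$")

def ie_accepts_cookies(host):
    """Rough approximation of IE's check -- an assumption, not the real code."""
    return all(LDH_LABEL.match(label) for label in host.split("."))

print(ie_accepts_cookies("sub-domain.example.com"))  # True
print(ie_accepts_cookies("sub_domain.example.com"))  # False
```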
If you're working with sessions and session cookies, that's a problem. On every page refresh the client sends an empty Cookie: header, so the server generates a new Set-Cookie header on every request.
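To see why that hurts sessions, here's a toy server-side sketch (the names and cookie format are made up for illustration, not PHP's internals): when a request arrives without a usable session cookie, the server mints a fresh session every time, so state never persists for that visitor.

```python
import secrets

sessions = {}  # toy server-side session store

def handle_request(cookie):
    """Return the Set-Cookie value for one request ('' if the session is known)."""
    if cookie and cookie.startswith("SESSID="):
        sid = cookie[len("SESSID="):]
        if sid in sessions:
            return ""  # client is in an existing session, nothing to set
    sid = secrets.token_hex(8)  # no usable cookie -> brand-new session
    sessions[sid] = {}
    return "SESSID=" + sid

# A client that stores cookies gets one session across requests:
first = handle_request(None)
assert handle_request(first) == ""

# A client that ignores Set-Cookie (IE on an underscore domain) sends
# nothing back, so every refresh creates a fresh session:
print(handle_request(None) != handle_request(None))  # True
```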
Cookies just don't work in IE if your (sub)domain contains an underscore.
What's The Cause?
It's a cookie vulnerability. From 2001. One whose consequences we still experience today.
The original security bulletin describes the problem that was fixed:
This patch eliminates three vulnerabilities affecting Internet Explorer. The first involves how IE handles URLs that include dotless IP addresses.
If a web site were specified using a dotless IP format (e.g., http://031713501415 rather than http://22.214.171.124), and the request were malformed in a particular way, IE would not recognize that the site was an Internet site. Instead, it would treat the site as an intranet site, and open pages on the site in the Intranet Zone rather than the correct zone.
This would allow the site to run with fewer security restrictions than appropriate.
So why does the fix for that CVE still affect us today?
As part of the fix described in kb316112, Microsoft introduced stricter validation of domain names. Essentially, all domain names must now follow the DNS RFCs, a naming syntax whose origins date back to RFC606 (1973) and RFC608 (1974).
Guess what the original DNS syntax does not contain? That's right: underscores.
So Microsoft started blocking cookies for any name that contains invalid DNS characters.
Internet Explorer blocks cookies from a server if the server name contains other characters, such as an underscore character ("_").
Security Patch MS01-055
Here's where I think they went wrong, though.
Underscores are indeed not allowed in host names, but they are allowed in domain names. The difference is the interpretation of a "host name" vs. a "domain name".
RFC2181, published in 1997, clearly states this.
The DNS itself places only one restriction on the particular labels that can be used to identify resource records. That one restriction relates to the length of the label and the full name. [...]
Implementations of the DNS protocols must not place any restrictions on the labels that can be used. In particular, DNS servers must not refuse to serve a zone because it contains labels that might not be acceptable to some DNS client programs.
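That distinction can be made concrete. In the sketch below (my own framing, not taken verbatim from the RFCs), a label such as _dmarc is a perfectly valid DNS label under RFC2181's length-only restriction, yet fails the stricter host name rule that IE appears to apply:

```python
import re

# Letter-digit-hyphen rule for *host name* labels:
HOSTNAME_LABEL = re.compile(r"^[A-Za-z0-9-]+$")

def valid_dns_label(label):
    """RFC2181: DNS itself only restricts label length (1-63 octets)."""
    return 1 <= len(label.encode()) <= 63

def valid_hostname_label(label):
    """Stricter host name syntax -- what IE seems to validate against."""
    return bool(HOSTNAME_LABEL.match(label))

print(valid_dns_label("_dmarc"))       # True  -- DNS happily serves it
print(valid_hostname_label("_dmarc"))  # False -- not a valid host name label
```

Underscore labels are in fact widespread in DNS -- SRV records such as _http._tcp.example.com depend on them -- which is exactly why the DNS protocol itself cannot forbid them.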
To me, it seems like Microsoft introduced the wrong kind of validation and conflated host names with domain names.
So How Do I Fix It?
Just send a Pull Request to the IE11 codebase with the fix!
Well, since that's obviously not an option, there really is no alternative but to avoid underscores in your (sub)domains on the internet. That can be especially annoying with auto-generated subdomains (which is where I ran into this), where an underscore can slip in accidentally and break things for IE users in unexpected ways.
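One defensive measure is to sanitize auto-generated subdomains before they ever reach DNS. The helper below is a sketch under my own naming, not a drop-in fix; it changes the name, so it only helps if you generate (or redirect to) the hyphenated variant consistently.

```python
def safe_subdomain(name):
    """Make an auto-generated subdomain IE-safe: swap underscores for
    hyphens, trim stray leading/trailing hyphens, and lowercase."""
    return name.replace("_", "-").strip("-").lower()

print(safe_subdomain("Customer_Staging"))  # customer-staging
```

On the server side, you could additionally issue a 301 redirect for requests whose Host header contains an underscore, sending IE users to the hyphenated equivalent where cookies actually work.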
For the future, I hope Microsoft reviews this policy of refusing cookies on domains that include an underscore. Maybe they're right to follow the RFCs and standards (although even that is debatable), but they're the only browser vendor that appears to be doing so.
At what point should you abandon principle and instead follow the masses in adopting non-standard practices?