In my application's authentication code, I use FormsAuthentication to handle the most complicated parts, and I've narrowed the cause down to an environment problem with BinaryFormatter on the specific machines running the application.
On some machines the auth process completes properly and my users are logged in. On others, however, BinaryFormatter produces different results from the same inputs (virtually identical, unless I'm missing something), breaking the auth process so users can never log in.
In the good environment, it produces a serialized string about 373 characters long. In the bad environment, the serialized string comes out at 5,024 characters. Herein lies the problem.
Here's how the code is being run:
var formatter = new BinaryFormatter();
using (var buffer = new MemoryStream())
{
    // Serialize the current principal; its size drives the auth cookie size.
    formatter.Serialize(buffer, HttpContext.Current.User);
}
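To see what's actually going in, I can run a quick probe on both machines (SerializationProbe is just a throwaway name for illustration). It logs the concrete principal/identity types and the serialized size, which should show whether the two environments are even serializing the same thing:

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Web;

public static class SerializationProbe
{
    // Logs what is being serialized and how big it comes out, so the
    // output from the "good" and "bad" machines can be compared.
    public static void Dump(TextWriter log)
    {
        var user = HttpContext.Current.User;
        log.WriteLine("Principal type: {0}", user.GetType().FullName);
        log.WriteLine("Identity type:  {0}", user.Identity.GetType().FullName);

        var formatter = new BinaryFormatter();
        using (var buffer = new MemoryStream())
        {
            formatter.Serialize(buffer, user);
            log.WriteLine("Serialized size: {0} bytes", buffer.Length);
        }
    }
}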
This in turn breaks the rest of the authentication process, because it tries to issue a cookie carrying 40,000+ bytes of data, and so no cookie is ever created (it needs to be 4,096 bytes or less for the browser to accept it).
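For context, here's a rough sanity check of the size math (a sketch only, assuming the payload gets text-encoded into the cookie value; base64 is used below for illustration, and FormsAuthentication's hex encoding is even worse, roughly doubling the bytes). Encoding alone inflates the data well before any encryption overhead, so a 5,024-byte payload has no chance of fitting:

using System;

public static class CookieSizeCheck
{
    private const int MaxCookieBytes = 4096;

    // Base64 inflates bytes by roughly 4/3, so a payload must already be
    // comfortably under the limit before encoding and encryption.
    public static bool FitsInCookie(byte[] payload)
    {
        string encoded = Convert.ToBase64String(payload);
        return encoded.Length <= MaxCookieBytes;
    }
}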
My question, and it's not easily testable (tell me about it): what could be different between the two machines to cause such serialization differences? Both are developed on Windows 7 in Visual Studio and run on the built-in Cassini server, but are there other common gotchas that would make Serialize
return such vastly different results?
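Since this isn't easily reproducible, one diagnostic I intend to try is dumping the serialized principal to disk on both machines and diffing the files (PrincipalSnapshot is a hypothetical helper name, not part of my auth code). If one machine is handing the code a heavier concrete principal type than the other, that alone could explain the size gap:

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Web;

public static class PrincipalSnapshot
{
    // Persists the serialized principal so the output from a "good"
    // machine and a "bad" machine can be diffed byte-for-byte.
    public static void Save(string path)
    {
        var formatter = new BinaryFormatter();
        using (var file = File.Create(path))
        {
            formatter.Serialize(file, HttpContext.Current.User);
        }
    }
}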