First, let me give you a little background. At work, we are building a web application, and among other things, we'd like it to be fast. We made many painstaking decisions to build it to be fast. My last post talked about some of the things we were undertaking to make sure it was fast through the pipe.
However, we were having a problem. Page loads were taking over 5 seconds, and that was after the initial first-hit penalty that ASP.Net gives you. I put tracing information in our handler, and it turned out that all of our framework code was executing in 0.05 seconds. That wasn't the problem. The step of getting the base handler (for the handoff to ASP.Net), however, was taking over 5 seconds.
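For illustration, here is a minimal sketch of the kind of timing we added. The handler name and trace category are hypothetical and the framework call is elided, but Stopwatch plus the built-in page trace is the general idea:

    using System.Diagnostics;
    using System.Web;

    // Hypothetical handler name; the real framework call is elided.
    public class TimedHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            Stopwatch sw = Stopwatch.StartNew();

            // ... run the framework code here ...

            sw.Stop();
            // Shows up in the trace output (e.g. trace.axd) when tracing is enabled.
            context.Trace.Write("Timing", "Framework code took " + sw.Elapsed.TotalSeconds + "s");
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }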
We had another interesting problem: the ASP.Net Cache object didn't work. It would be alive for the duration of a page hit, but on the next hit it was gone. My boss suggested that the pages were compiling on every hit, and we ultimately did see evidence of that in the %windir%\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files directory. The question was, "Why?"
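To make the symptom concrete, here is a hypothetical sketch of the behavior we were seeing (the key name is made up). A value inserted into the cache should survive across hits; a recompile recycles the app domain, so the cache comes back empty every time.

    using System;
    using System.Web;

    public static class CacheProbe
    {
        // Inserts a marker into the ASP.Net cache the first time it is called.
        // If every page hit returns a brand-new value, the cache is being wiped between hits.
        public static string GetOrAdd()
        {
            string value = HttpRuntime.Cache["probe"] as string;
            if (value == null)
            {
                value = Guid.NewGuid().ToString();
                HttpRuntime.Cache.Insert("probe", value);
            }
            return value;
        }
    }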
We had wanted the site files to be dynamically compiled for ease of updating, while our framework code remained in pre-compiled .dlls. To troubleshoot, I recommended that we pre-compile the website as well and see what happened. Page hits got down to half a second apiece. Interesting.
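(For the curious, pre-compiling a site with the aspnet_compiler tool that ships with the framework looks roughly like this; the paths below are placeholders, not our actual ones.)

    aspnet_compiler -p "C:\Sites\OurApp" -v / "C:\Sites\OurApp_Precompiled"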
Shortly after, though, I had a brainstorm. We were writing log files into a folder inside the bin directory (a holdover convention from a previous application). Every time that folder changed (which was on every page hit), ASP.Net would sense that the bin directory had changed (since the log folder was inside it and a part of it), decide that the site had changed, and kick off a recompile. It makes perfect sense in retrospect!
We moved the log directory outside of the bin directory, went back to dynamic compilation, and got the same performance improvement we had seen with the pre-compiled version, much to my relief.
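Here is a minimal sketch of the shape of the fix, not our actual logging code. The post above only says the folder moved out of bin; App_Data is just one reasonable place for it to land, since writes there don't touch the monitored bin directory.

    using System.IO;
    using System.Web;

    public static class RequestLogger
    {
        public static void Log(HttpContext context, string message)
        {
            // Before (the problem): something like context.Server.MapPath("~/bin/logs"),
            // which meant every log write touched the monitored bin directory.
            // After: a writable folder outside of bin, so log writes no longer
            // look like a change to the site.
            string logDir = context.Server.MapPath("~/App_Data/logs");
            Directory.CreateDirectory(logDir);
            File.AppendAllText(Path.Combine(logDir, "requests.log"), message + "\r\n");
        }
    }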