I have a server with 5 GB of memory dedicated to a single web application. When the worker process grows to about 400-500 MB, users start getting an OutOfMemoryException followed by "Server Application Unavailable", and then the site becomes available again after a few refreshes. I know there is a setting in machine.config that limits memory usage to 60% of physical memory, but that should still allow far more memory than the process currently caps out at, and I've tried increasing it to 90% to no avail.
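For reference, the setting I changed is (assuming I'm editing the right attribute) the `memoryLimit` attribute on the `processModel` element in machine.config, which is expressed as a percentage of physical memory:

```xml
<!-- machine.config (ASP.NET process model) -->
<system.web>
  <!-- memoryLimit is the percentage of physical memory the worker
       process may consume before ASP.NET recycles it.
       Default is 60; I raised it to 90 with no effect. -->
  <processModel enable="true" memoryLimit="90" />
</system.web>
```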
Does anyone have any suggestions as to the cause, or a way to troubleshoot this?
Thanks.