On Apr 09, Ludovic Dubost wrote:
Right... the XWikiContext with the classCache and archiveCache (which are
needed) is not made to be used for too long without cleaning up.
Maybe there should be some automated cleanup when it gets too big.
A simple, non-invasive solution would be to use an LRUMap instead
of a plain HashMap. An LRUMap can be given a maximum size; when a new
entry is added to a full map, it evicts the least recently used entry.
See the patch below, and a short usage sketch after it.
Index: core/src/main/java/com/xpn/xwiki/XWikiContext.java
===================================================================
--- core/src/main/java/com/xpn/xwiki/XWikiContext.java (revision 2320)
+++ core/src/main/java/com/xpn/xwiki/XWikiContext.java (working copy)
@@ -31,6 +31,7 @@
import com.xpn.xwiki.web.*;
import com.xpn.xwiki.validation.XWikiValidationStatus;
import org.apache.xmlrpc.server.XmlRpcServer;
+import org.apache.commons.collections.map.LRUMap;
import java.net.URL;
import java.util.*;
@@ -65,11 +66,13 @@
private String wikiOwner;
private XWikiDocument wikiServer;
private int cacheDuration = 0;
+ private int classCacheSize = 10;
+ private int archiveCacheSize = 10;
// Used to avoid recursive loading of documents if there is recursive usage of classes
- private Map classCache = new HashMap();
+ private Map classCache = new LRUMap(classCacheSize);
// Used to avoid reloading archives in the same request
- private Map archiveCache = new HashMap();
+ private Map archiveCache = new LRUMap(archiveCacheSize);
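
For illustration, here is a minimal sketch of the eviction behavior
(assuming Commons Collections 3.x, where LRUMap takes its maximum size
in the constructor; the class name and keys below are made up for the demo):

import org.apache.commons.collections.map.LRUMap;

public class LRUMapSketch {
    public static void main(String[] args) {
        // Cap the map at 3 entries.
        LRUMap cache = new LRUMap(3);
        cache.put("a", "1");
        cache.put("b", "2");
        cache.put("c", "3");
        cache.get("a");       // touching "a" marks it most recently used
        cache.put("d", "4");  // map is full: "b", the least recently used entry, is evicted
        System.out.println(cache.containsKey("b")); // false
        System.out.println(cache.containsKey("a")); // true
    }
}

One caveat: LRUMap is not synchronized. If the context is request-scoped,
as the comments in the patch suggest, that should be fine, but it is worth
keeping in mind.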
--
Pablo