Hi,
I have a Groovy script in my XWiki 7.1.1 site that reads files sent to it. I've been
using the fileUpload plugin as shown below.
{{groovy}}
import java.io.InputStream
import java.io.OutputStream
import org.apache.commons.io.IOUtils
import org.apache.commons.fileupload.FileItem

// Function that actually reads the file data from file.getInputStream()
def processFile(String filename, FileItem file, String title) { ... }

def params = request.getParameterNames().toList()
def fileUpload = xwiki.fileupload
def results = [:]

if (!params.contains("title")) {
    results.put('error', "No title specified")
} else {
    fileUpload.loadFileList(2147483648L, 100000,
        (String) (xcontext.getContext().getWiki().Param("xwiki.upload.tempdir")))
    FileItem fileItem = fileUpload.getFileItems().find {
        it.getFieldName().equals("newfile")
    }
    if (fileItem == null) {
        results.put('error', "Couldn't load file - did you forget to choose one?")
    } else {
        results = processFile(fileUpload.getFileName("newfile"), fileItem,
            request.getParameter('title'))
    }
}

// Handle display of results based on how the script was called
def jsontool = new org.xwiki.velocity.tools.JSONTool()
if (request.getParameter('outputSyntax') == 'plain' ||
        request.getParameter('xpage') == 'plain') {
    response.setContentType('application/json')
    if (results.error) {
        response.setStatus(400)
    }
    print jsontool.serialize(results)
} else {
    if (results.error) {
        println "{{error}}${results.error}{{/error}}"
    } else {
        println "{{success}}File processed{{/success}}"
    }
    println "Return to [[Original page>>${request.getHeader('referer')}]]"
}
{{/groovy}}
Files are sent to this page via a form:
<form action="FileProcessor" method="post" enctype="multipart/form-data">
    <input type="file" name="newfile" />
    <input type="hidden" name="title" value="foo" />
    <button type="submit">Upload</button>
</form>
The files I need to read can sometimes be large (~1GB), which is why the call to
fileUpload.loadFileList explicitly passes a large maximum file size (2147483648L, i.e.
2GB - larger than I expect files to reach for the time being). However, this limit seems
to be ignored: when a file exceeds the limit specified in the Maximum Upload Size setting
on xwiki/bin/edit/XWiki/XWikiPreferences?editor=object, the script fails at the
if (!params.contains("title")) check, despite the title definitely being set. My guess
is that an exception thrown by the File Upload plugin prevents the request parameters
from being parsed - when I increase the upload limit from 32MB to 50MB, for instance,
the problem goes away for a 47MB file.
So, problem 1: is there a way to make fileUpload.loadFileList respect the file size
limit I specify? I don't want to allow very large files to be included as attachments,
so it would be nice if I could leave the default setting lower.
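One workaround I'm considering is to bypass the plugin and drive commons-fileupload
directly with explicit limits. This is an untested sketch: the temp directory path is
made up, I'm assuming the script's request object can be passed where an
HttpServletRequest is expected, and it only helps if nothing upstream has already
consumed the multipart body:

{{groovy}}
import org.apache.commons.fileupload.FileUploadBase
import org.apache.commons.fileupload.disk.DiskFileItemFactory
import org.apache.commons.fileupload.servlet.ServletFileUpload

// Buffer items larger than 100000 bytes to disk in the given repository directory
def factory = new DiskFileItemFactory(100000, new File("/tmp/xwiki-uploads"))
def upload = new ServletFileUpload(factory)
upload.setFileSizeMax(2147483648L) // per-file limit (2GB)
try {
    def items = upload.parseRequest(request)
    // ... find the "newfile" item as before ...
} catch (FileUploadBase.FileSizeLimitExceededException e) {
    results.put('error', "File too large: " + e.getMessage())
}
{{/groovy}}

Would that even work from a wiki page, or has the request already been parsed by then?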
Then comes problem 2: even if I do set the maximum size to something very large, I get a
similar problem for files over about 500MB. I've tried uploading such files directly as
attachments, and I end up with a page saying "waiting for server confirmation", followed
by "an error occurred while uploading the file".
Digging around in the Tomcat logs I see exceptions containing:
Caused by: org.apache.commons.fileupload.FileUploadBase$IOFileUploadException: Processing of multipart/form-data request failed. No space left on device
    at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:351) ~[commons-fileupload-1.3.1.jar:1.3.1]
    at com.xpn.xwiki.plugin.fileupload.FileUploadPlugin.loadFileList(FileUploadPlugin.java:248) ~[xwiki-platform-legacy-oldcore-7.1.2.jar:na]
    ... 47 common frames omitted
Caused by: java.io.IOException: No space left on device
    at java.io.FileOutputStream.writeBytes(Native Method) ~[na:1.7.0_79]
    at java.io.FileOutputStream.write(FileOutputStream.java:345) ~[na:1.7.0_79]
    at org.apache.commons.io.output.ThresholdingOutputStream.write(ThresholdingOutputStream.java:129) ~[commons-io-2.4.jar:2.4]
    at org.apache.commons.fileupload.util.Streams.copy(Streams.java:107) ~[commons-fileupload-1.3.1.jar:1.3.1]
    at org.apache.commons.fileupload.util.Streams.copy(Streams.java:70) ~[commons-fileupload-1.3.1.jar:1.3.1]
    at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:347) ~[commons-fileupload-1.3.1.jar:1.3.1]
    ... 48 common frames omitted
Looking at this, I suspect there is an earlier call to loadFileList with default values,
which would also explain my first problem. I can't see where it is being called from -
the crucial information is probably in one of those 47 omitted frames. My guess is that
it's in com.xpn.xwiki.web.Utils.handleMultipart.
Also, looking at
https://github.com/xwiki/xwiki-platform/blob/7392653655d1a8524c67851393b0ac…
it appears that all the files in a request are processed using
org.apache.commons.fileupload.FileUploadBase.parseRequest(), which writes every file to
disk; I'm guessing that is the ultimate problem once files get too big for the temporary
storage. Is there any way I can use the streaming API
(http://commons.apache.org/proper/commons-fileupload/streaming.html) instead?
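In case it helps anyone answering: from that streaming page, I imagine something like
the following (untested, and presumably it only works if XWiki hasn't already consumed
the request body by the time the script runs; processStream is a hypothetical variant of
my processFile that takes an InputStream, and I'm assuming the script's request object
can be passed where an HttpServletRequest is expected):

{{groovy}}
import org.apache.commons.fileupload.FileItemIterator
import org.apache.commons.fileupload.FileItemStream
import org.apache.commons.fileupload.servlet.ServletFileUpload
import org.apache.commons.fileupload.util.Streams

def upload = new ServletFileUpload() // no factory, so nothing is buffered to disk
FileItemIterator iter = upload.getItemIterator(request)
while (iter.hasNext()) {
    FileItemStream item = iter.next()
    InputStream stream = item.openStream()
    if (item.isFormField()) {
        // regular parameters, e.g. "title"
        def value = Streams.asString(stream)
    } else if (item.getFieldName() == "newfile") {
        // read the upload directly from the stream - no temp file needed
        processStream(item.getName(), stream)
    }
    stream.close()
}
{{/groovy}}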
Thanks,
Bryn