I am using piped streams to convert an OutputStream to an InputStream, because the AWS Java SDK does not allow putting objects on S3 from an OutputStream.
I'm using the code below; however, it will intermittently just hang. The code runs in a web application, but there is currently no load on it; I am just trying it out on my personal computer.
ByteArrayOutputStream os = new ByteArrayOutputStream();
PipedInputStream inpipe = new PipedInputStream();
final PipedOutputStream out = new PipedOutputStream(inpipe);
try {
    String xmpXml = "<dc:description>somedesc</dc:description>";
    JpegXmpRewriter rewriter = new JpegXmpRewriter();
    rewriter.updateXmpXml(isNew1, os, xmpXml);
    new Thread(new Runnable() {
        public void run() {
            try {
                // write the original OutputStream to the PipedOutputStream
                System.out.println("starting writeto");
                os.writeTo(out);
                out.close();
                System.out.println("ending writeto");
            } catch (IOException e) {
                System.out.println("Some exception");
            }
        }
    }).start();
    ObjectMetadata metadata1 = new ObjectMetadata();
    metadata1.setContentLength(os.size());
    client.putObject(new PutObjectRequest("test-bucket", "167_sample.jpg", inpipe, metadata1));
} catch (Exception e) {
    System.out.println("Some exception");
} finally {
    isNew1.close();
    os.close();
}
Instead of bothering with the complexities of starting another thread, instantiating two concurrent classes, and passing data from thread to thread, all to work around a minor limitation in the JDK API, you should just create a simple specialization of ByteArrayOutputStream:
class BetterByteArrayOutputStream extends ByteArrayOutputStream {
    // buf and count are protected fields of ByteArrayOutputStream, so a
    // subclass can hand the internal buffer straight to a ByteArrayInputStream.
    public ByteArrayInputStream toInputStream() {
        return new ByteArrayInputStream(buf, 0, count);
    }
}
This converts the buffered data to an input stream without copying it, and without any pipes or extra threads.
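For example, a sketch of how it slots into your code (the S3 calls are shown only as comments, since they need your `client` and `metadata1` from the question; the byte round-trip below is just a stand-in to show the stream works):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

class BetterByteArrayOutputStream extends ByteArrayOutputStream {
    public ByteArrayInputStream toInputStream() {
        // buf and count are protected, so the buffer is wrapped without copying
        return new ByteArrayInputStream(buf, 0, count);
    }
}

public class Demo {
    static String roundTrip(byte[] data) throws IOException {
        BetterByteArrayOutputStream os = new BetterByteArrayOutputStream();
        // In your code this write would be:
        // rewriter.updateXmpXml(isNew1, os, xmpXml);
        os.write(data);

        InputStream in = os.toInputStream(); // no pipe, no extra thread
        // metadata1.setContentLength(os.size());
        // client.putObject(new PutObjectRequest("test-bucket", "167_sample.jpg", in, metadata1));

        byte[] back = new byte[os.size()];
        int n = in.read(back);
        return new String(back, 0, n, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip("jpeg bytes".getBytes(StandardCharsets.UTF_8)));
    }
}
```

Since `updateXmpXml` writes everything into the ByteArrayOutputStream before `putObject` is ever called, the data is already fully in memory; the pipe added nothing except a second thread that could stall.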