I am trying to upload files of up to 20 GB to a database table whose FILE_DATA column is FILESTREAM-enabled. However, I can only upload up to 1 GB; from 2 GB onwards an out-of-memory exception occurs.
SampleFileStream.java
@Entity
@Table(name = "SAMPLE_FILE_STREAM")
public class SampleFileStream extends QDmsDomain
{
    private static final long serialVersionUID = 1L;

    @Id
    @Column(name = "ID")
    @GeneratedValue
    private UUID uuid;

    @NotNull
    @Column(name = "FILE_NAME")
    private String fileName;

    @Lob
    @Column(name = "FILE_DATA")
    private byte[] fileData;

    // getters and setters omitted
}
Service class
@Override
public void saveFile(MultipartFile file)
{
    SampleFileStream sampleFileStream = new SampleFileStream();
    sampleFileStream.setFileName(file.getOriginalFilename());
    try
    {
        // OutOfMemoryError is thrown here: getBytes() loads the whole file into a byte[]
        sampleFileStream.setFileData(file.getBytes());
    } catch (IOException e)
    {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    SampleFileStream savedSampleFileStream = sampleFileStreamRepo.save(sampleFileStream);
}
The out-of-memory error is thrown while calling file.getBytes(). I understand this happens because the entire file is loaded into memory. Is there another way to write the data without calling file.getBytes()? I have seen solutions that stream from a file path, but I don't think that approach is possible with FILESTREAM.
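For reference, this is roughly the streaming approach I have seen suggested for an ordinary varbinary(max) column, using the MultipartFile's InputStream instead of getBytes(). It is only a sketch: the class name StreamingFileService and the injected DataSource are my own placeholders, and I do not know whether a FILESTREAM column can be written this way.

import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.UUID;

import javax.sql.DataSource;

import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;

@Service
public class StreamingFileService
{
    private final DataSource dataSource;

    public StreamingFileService(DataSource dataSource)
    {
        this.dataSource = dataSource;
    }

    public void saveFile(MultipartFile file)
    {
        String sql = "INSERT INTO SAMPLE_FILE_STREAM (ID, FILE_NAME, FILE_DATA) VALUES (?, ?, ?)";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql);
             InputStream in = file.getInputStream())
        {
            // SQL Server converts the string form to a uniqueidentifier
            ps.setString(1, UUID.randomUUID().toString());
            ps.setString(2, file.getOriginalFilename());
            // Hand the InputStream to the driver so it reads the content in chunks
            // instead of materialising the whole file as a byte[]
            ps.setBinaryStream(3, in, file.getSize());
            ps.executeUpdate();
        } catch (Exception e)
        {
            throw new RuntimeException("Could not stream file to SAMPLE_FILE_STREAM", e);
        }
    }
}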
The above code is just for testing whether FILESTREAM works.
Thanks in advance