Out of Memory Error in CSV Export methods with Spring MVC

Our application runs into Out of Memory problems when generating CSV files, particularly with large CSV files of more than 10k rows. We are using Spring Boot 2.0.8 and SuperCSV 2.4.0.

What is the correct way to handle these situations so that our Spring MVC API does not crash with an OutOfMemoryError?

Could SuperCSV be the cause of this issue? I would guess not, but just in case.

I have been reading about @Async. Would it be a good idea to use it on this method so the export runs on a separate thread?

Suppose I have the following method in a controller:

@RequestMapping(value = "/export", method = RequestMethod.GET)
public void downloadData(HttpServletRequest request,HttpServletResponse response) throws SQLException, ManualException, IOException, NoSuchMethodException, InvocationTargetException, IllegalAccessException {

    List<?> data = null;
    data = dataFetchService.getData();

    ICsvBeanWriter csvWriter = new CsvBeanWriter(response.getWriter(), CsvPreference.STANDARD_PREFERENCE);

    //these next lines handle the header
    String[] header = getHeaders(data.get(0).getClass());
    String[] headerLocale = new String[header.length];
    for (int i = 0; i < header.length; i++) {
        headerLocale[i] = localeService.getLabel(this.language, header[i]);
    }

    //fix for excel not opening CSV files with ID in the first cell
    if (headerLocale[0].equals("ID")) {
        //adding a space before ID as ' ID' also helps
        headerLocale[0] = headerLocale[0].toLowerCase();
    }

    csvWriter.writeHeader(headerLocale);

    //the next lines handle the content
    for (Object line : data) {
        csvWriter.write(line, header);
    }

    csvWriter.close();
    response.getWriter().flush();
    response.getWriter().close();
}

The line:

data = dataFetchService.getData();

looks like it may consume a lot of memory. The list could hold millions of records, and if several users export at the same time, that will lead to memory problems.

Since dataFetchService is backed by a Spring Data repository, you should get the number of records it will return and then fetch the data one Pageable at a time.

Example: if there are 20,000 rows in the table, you should fetch 1,000 rows at a time, 20 times, and gradually build up your CSV.

You should also request your data in some defined order, otherwise your CSV may end up in a random order.

Consider implementing PagingAndSortingRepository on your repository.

Sample application

Product.java

import javax.persistence.Entity;
import javax.persistence.Id;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Entity
@Data
@NoArgsConstructor
@AllArgsConstructor
public class Product {

    @Id
    private long id;
    private String name;
}

ProductRepository.java

import org.springframework.data.repository.PagingAndSortingRepository;

public interface ProductRepository extends PagingAndSortingRepository<Product, Long> {
}

MyRest.java

import java.io.IOException;
import java.util.List;
import javax.servlet.http.HttpServletResponse;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.supercsv.io.CsvBeanWriter;
import org.supercsv.io.ICsvBeanWriter;
import org.supercsv.prefs.CsvPreference;

@RestController
@RequiredArgsConstructor
public class MyRest {

    private final ProductRepository repo;

    private final int PAGESIZE = 1000;

    @RequestMapping("/")
    public String loadData() {
        for (int record = 0; record < 10_000; record += 1) {
            repo.save(new Product(record, "Product " + record));
        }
        return "Loaded Data";
    }

    @RequestMapping("/csv")
    public void downloadData(HttpServletResponse response) throws IOException {
        response.setContentType("text/csv");
        String[] header = {"id", "name"};
        ICsvBeanWriter csvWriter = new CsvBeanWriter(response.getWriter(), CsvPreference.STANDARD_PREFERENCE);

        csvWriter.writeHeader(header);

        long numberRecords = repo.count();
        int totalPages = (int) ((numberRecords + PAGESIZE - 1) / PAGESIZE);
        for (int page = 0; page < totalPages; page++) {
            // PageRequest.of expects a zero-based page index, not a record offset
            Pageable sortedByName = PageRequest.of(page, PAGESIZE, Sort.by("name"));
            Page<Product> pageData = repo.findAll(sortedByName);
            writeToCsv(header, csvWriter, pageData.getContent());
        }
        csvWriter.close();
        response.getWriter().flush();
        response.getWriter().close();
    }

    private void writeToCsv(String[] header, ICsvBeanWriter csvWriter, List<Product> pageData) throws IOException {
        for (Product line : pageData) {
            csvWriter.write(line, header);
        }
    }

}

1) Load the data:

curl http://localhost:8080

2) Download the CSV:

curl http://localhost:8080/csv

You should try using setFetchSize, which only brings back a limited number of rows at a time by using cursors at the database end. This increases the number of network round trips, but since I am streaming the download it does not matter much to the user, as they receive the file continuously. I am also using the Servlet 3.0 async feature to free the container worker thread and hand this task over to a separate Spring-managed thread pool. I am using this with a PostgreSQL database and it works like a charm. The MySQL and Oracle JDBC drivers also support this. For data access I am using a raw JdbcTemplate to fetch the data in chunks, together with my custom ResultSet-to-CSV converter and an on-the-fly ZIP converter. To use this with Spring Data repositories, see here.
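
Below is a minimal sketch of that approach, not the answerer's actual code: a JdbcTemplate with a fetch size set so the driver streams rows through a database cursor, wrapped in a StreamingResponseBody so the response is written on a separate Spring-managed thread (the Servlet 3.0 async feature mentioned above). The controller name, the /csv-stream endpoint, and the product table with id and name columns are illustrative assumptions:

import java.io.PrintWriter;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowCallbackHandler;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@RestController
public class ProductCsvStreamController {

    private final JdbcTemplate jdbcTemplate;

    public ProductCsvStreamController(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
        // Fetch rows from the database in chunks of 1000 instead of loading
        // the whole result set into memory. Note: the PostgreSQL driver only
        // honors the fetch size when auto-commit is off, e.g. inside a transaction.
        this.jdbcTemplate.setFetchSize(1000);
    }

    @GetMapping("/csv-stream")
    public ResponseEntity<StreamingResponseBody> downloadCsv() {
        // StreamingResponseBody is executed asynchronously on a Spring-managed
        // thread, freeing the container worker thread while the file streams.
        StreamingResponseBody body = out -> {
            PrintWriter writer = new PrintWriter(out);
            writer.println("id,name");
            // RowCallbackHandler processes one row at a time, so memory usage
            // stays flat. Real code should CSV-escape the values.
            jdbcTemplate.query("SELECT id, name FROM product ORDER BY name",
                    (RowCallbackHandler) rs ->
                            writer.println(rs.getLong("id") + "," + rs.getString("name")));
            writer.flush();
        };
        return ResponseEntity.ok()
                .contentType(MediaType.valueOf("text/csv"))
                .body(body);
    }
}

For very large exports, the same StreamingResponseBody could wrap the output in a ZipOutputStream to produce the on-the-fly ZIP mentioned above.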