A solution for querying hundreds of thousands to millions of rows at once in Java
This is an approach from a utility I wrote a long time ago.
At the time it was used to package a million rows of data into a rar file,
so I went with a fairly crude method. I would welcome pointers from anyone more experienced; is there a better way?
1. First, query all the data in batches; the example uses 10,000 rows per batch.
2. As each batch is fetched, append it to a local file in a fixed format.
3. When the data is needed, read the file back batch by batch to recover the full data set.
This approach is described at:
http://yijianfengvip.blog.163.com/blog/static/175273432201191354043148/
Below is the database query, executed batch by batch:
import java.sql.*;
import java.text.SimpleDateFormat;
import java.util.Date;

public static void getMonthDataList() {
    ResultSet rs = null;
    Statement stat = null;
    Connection conn = null;
    try {
        conn = createConnection();
        if (conn != null) {
            SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd");
            SimpleDateFormat timesdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            String nowDate = sdf.format(new Date());
            Config.lasttimetext = timesdf.format(new Date());
            String lastDate = sdf.format(CreateData.addDaysForDate(new Date(), 30));
            // A forward-only statement would also do here; scroll/updatable is not required.
            stat = conn.createStatement(ResultSet.TYPE_SCROLL_SENSITIVE, ResultSet.CONCUR_UPDATABLE);
            int lastrow = 0;
            int datanum = 0;
            // Count the rows in the 30-day window. NOTE: the dates are concatenated
            // into the SQL string; bind variables (PreparedStatement) would be safer.
            String countsql = "SELECT count(a.id) FROM trip_special_flight a"
                    + " where a.dpt_date >= to_date('" + nowDate + "','yyyy-mm-dd')"
                    + " and a.dpt_date <= to_date('" + lastDate + "','yyyy-mm-dd')";
            rs = stat.executeQuery(countsql);
            while (rs.next()) {
                datanum = rs.getInt(1);
            }
            int onerun = 10000; // batch size: 10,000 rows per query
            int runnum = datanum % onerun == 0 ? datanum / onerun : datanum / onerun + 1;
            for (int r = 0; r < runnum; r++) {
                System.out.println("getMonthDataList--" + datanum + " querying batch " + (r + 1));
                // Oracle rownum pagination: skip the rows already read in earlier batches.
                String sql = "SELECT * FROM (SELECT rownum rn, a.dpt_code, a.arr_code, a.dpt_date, a.airways, a.flight,"
                        + " a.cabin, a.price FROM trip_special_flight a"
                        + " where a.dpt_date >= to_date('" + nowDate + "','yyyy-mm-dd')"
                        + " and a.dpt_date <= to_date('" + lastDate + "','yyyy-mm-dd')"
                        + " order by rownum asc) WHERE rn > " + lastrow;
                stat.setMaxRows(onerun);  // cap this batch at 10,000 rows
                stat.setFetchSize(1000);  // pull 1,000 rows per network round trip
                rs = stat.executeQuery(sql);
                StringBuilder text = new StringBuilder();
                int i = 1;
                while (rs.next()) {
                    text.append(rs.getString(2)).append("|").append(rs.getString(3)).append("|")
                        .append(rs.getDate(4)).append("|").append(rs.getString(5)).append("|")
                        .append(rs.getString(6)).append("|").append(rs.getString(7)).append("|")
                        .append(rs.getString(8)).append("||");
                    if (i % 1000 == 0) { // flush to the temp file every 1,000 rows
                        FileUtil.appendToFile(Config.tempdatafile, text.toString());
                        text.setLength(0);
                    }
                    i++;
                }
                if (text.length() > 10) { // write whatever is left over
                    FileUtil.appendToFile(Config.tempdatafile, text.toString());
                }
                lastrow += onerun;
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        closeAll(rs, stat, conn);
    }
}
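The method above calls several helpers that the original post does not show: createConnection, closeAll, CreateData.addDaysForDate, and FileUtil.appendToFile. Here is a minimal sketch of what they might look like, collected into one hypothetical Helpers class; the connection URL, user, and password are placeholders, and an Oracle JDBC driver is assumed:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.*;
import java.util.Calendar;
import java.util.Date;

public class Helpers {
    // Hypothetical connection settings; replace with your own.
    private static final String URL = "jdbc:oracle:thin:@localhost:1521:orcl";
    private static final String USER = "user";
    private static final String PASSWORD = "password";

    public static Connection createConnection() throws SQLException {
        return DriverManager.getConnection(URL, USER, PASSWORD);
    }

    public static void closeAll(ResultSet rs, Statement stat, Connection conn) {
        try { if (rs != null) rs.close(); } catch (SQLException ignored) { }
        try { if (stat != null) stat.close(); } catch (SQLException ignored) { }
        try { if (conn != null) conn.close(); } catch (SQLException ignored) { }
    }

    // Returns the given date shifted by the given number of days.
    public static Date addDaysForDate(Date date, int days) {
        Calendar cal = Calendar.getInstance();
        cal.setTime(date);
        cal.add(Calendar.DAY_OF_MONTH, days);
        return cal.getTime();
    }

    // Appends text to the file, creating it if necessary.
    public static void appendToFile(String path, String text) {
        try (BufferedWriter out = new BufferedWriter(new FileWriter(path, true))) {
            out.write(text);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Note that appendToFile as sketched reopens the file on every call, which is simple but slow; keeping a single BufferedWriter open for the whole run would cut that overhead considerably.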
After the data has been written to the temp file, use the method below to read the large data file back.
BUFFER_SIZE sets the read buffer size, and Config.tempdatafile is the path of the file.
Source blog:
http://yijianfengvip.blog.163.com/blog/static/175273432201191354043148/
package com.yjf.util;

import java.io.File;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Test {
    public static void main(String[] args) throws Exception {
        final int BUFFER_SIZE = 0x300000; // 3 MB buffer
        // Source blog: http://yijianfengvip.blog.163.com/blog/static/175273432201191354043148/
        File f = new File(Config.tempdatafile);
        int len = 0;
        long start = System.currentTimeMillis();
        // Map the file in eight segments, last segment first. Note that the integer
        // division can drop a few trailing bytes if the length is not divisible by 8.
        for (int z = 8; z > 0; z--) {
            try (RandomAccessFile raf = new RandomAccessFile(f, "r")) {
                MappedByteBuffer inputBuffer = raf.getChannel()
                        .map(FileChannel.MapMode.READ_ONLY, f.length() * (z - 1) / 8, f.length() / 8);
                byte[] dst = new byte[BUFFER_SIZE]; // read 3 MB at a time
                for (int offset = 0; offset < inputBuffer.capacity(); offset += BUFFER_SIZE) {
                    // Length of this chunk: a full buffer, or the remainder at the end.
                    int length = Math.min(BUFFER_SIZE, inputBuffer.capacity() - offset);
                    for (int i = 0; i < length; i++) {
                        dst[i] = inputBuffer.get(offset + i);
                    }
                    len += new String(dst, 0, length).length();
                    System.out.println(new String(dst, 0, length).length() + "-" + (z - 1) + "-" + (8 - z + 1));
                }
            }
        }
        System.out.println(len);
        long end = System.currentTimeMillis();
        System.out.println("Reading the file took " + (end - start) + " ms");
    }
}
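For comparison, a plain buffered read of the same temp file is often competitive with memory mapping for a single sequential pass and is simpler to get right. A minimal sketch, assuming Config.tempdatafile is a plain String path as above:

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class PlainReadTest {
    public static void main(String[] args) throws IOException {
        final int BUFFER_SIZE = 0x300000; // same 3 MB chunk size as above
        long start = System.currentTimeMillis();
        long total = 0;
        try (BufferedInputStream in =
                 new BufferedInputStream(new FileInputStream(Config.tempdatafile))) {
            byte[] buf = new byte[BUFFER_SIZE];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n; // process the chunk here instead of just counting bytes
            }
        }
        long end = System.currentTimeMillis();
        System.out.println("Read " + total + " bytes in " + (end - start) + " ms");
    }
}

One caveat that applies to both readers: chunks are cut on byte boundaries, so a multi-byte character can be split across two chunks. If the file holds non-ASCII text, decode with a CharsetDecoder or read line by line instead of converting raw byte slices to Strings.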
Original post: http://blog.csdn.net/yjflinchong/article/details/7287648