
MySQL: retrieve a large select by chunks

Problem description

I have selected more than 70 million rows.

I'd like to save the selected data into one large CSV file on win2012 R2.

Q: How to retrieve the data from MySQL in chunks for better performance?

Because when I try to save the whole large select at once, I get an out-of-memory error.

Answer


                  You could try using the LIMIT feature. If you do this:

                  SELECT * FROM MyTable ORDER BY whatever LIMIT 0,1000
                  


                  You'll get the first 1,000 rows. The first LIMIT value (0) defines the starting row in the result set. It's zero-indexed, so 0 means "the first row". The second LIMIT value is the maximum number of rows to retrieve. To get the next few sets of 1,000, do this:

                  SELECT * FROM MyTable ORDER BY whatever LIMIT 1000,1000 -- rows 1,001 - 2,000
                  SELECT * FROM MyTable ORDER BY whatever LIMIT 2000,1000 -- rows 2,001 - 3,000
                  


                  And so on. When the SELECT returns no rows, you're done.


                  This isn't enough on its own though, because any changes done to the table while you're processing your 1K rows at a time will throw off the order. To freeze the results in time, start by querying the results into a temporary table:

                  CREATE TEMPORARY TABLE MyChunkedResult AS (
                    SELECT *
                    FROM MyTable
                    ORDER BY whatever
                  );
                  


                  Side note: it's a good idea to make sure the temporary table doesn't exist beforehand:

                  DROP TEMPORARY TABLE IF EXISTS MyChunkedResult;
                  


                  At any rate, once the temporary table is in place, pull the row chunks from there:

                  SELECT * FROM MyChunkedResult LIMIT 0, 1000;
                  SELECT * FROM MyChunkedResult LIMIT 1000,1000;
                  SELECT * FROM MyChunkedResult LIMIT 2000,1000;
                  -- ... and so on.
                  


                  I'll leave it to you to create the logic that will calculate the limit value after each chunk and check for the end of results. I'd also recommend much larger chunks than 1,000 records; it's just a number I picked out of the air.
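That loop could be sketched in Python roughly as follows. This is a minimal illustration, not the answerer's code: it uses the standard DB-API cursor pattern (shown here with the stdlib `sqlite3` module as a stand-in for a MySQL driver such as `mysql-connector-python`, which exposes the same `cursor`/`execute`/`fetchall` interface), and the table and file names are placeholders.

```python
import csv
import sqlite3  # stand-in for a MySQL driver; any DB-API 2.0 connection works the same way

def export_chunked(conn, table, out_path, chunk_size=1000):
    """Pull rows from `table` in LIMIT/OFFSET chunks and stream them to a CSV file.

    Keeps only one chunk in memory at a time, which is the point of the
    chunked approach for very large result sets.
    """
    cur = conn.cursor()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        offset = 0
        while True:
            cur.execute(
                f"SELECT * FROM {table} LIMIT ? OFFSET ?", (chunk_size, offset)
            )
            rows = cur.fetchall()
            if not rows:            # empty result set: we're done
                break
            if offset == 0:         # write the header once, from the first chunk
                writer.writerow(d[0] for d in cur.description)
            writer.writerows(rows)
            offset += chunk_size

# Tiny demo with an in-memory database standing in for the temporary table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyChunkedResult (id INTEGER, val TEXT)")
conn.executemany(
    "INSERT INTO MyChunkedResult VALUES (?, ?)",
    [(i, f"row{i}") for i in range(2500)],
)
export_chunked(conn, "MyChunkedResult", "out.csv", chunk_size=1000)
```

With a real MySQL driver, the placeholder syntax is `%s` rather than `?`, and you would pick a much larger `chunk_size`, as the answer suggests.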


                  Finally, it's good form to drop the temporary table when you're done:

                  DROP TEMPORARY TABLE MyChunkedResult;
                  

