MySQL ERROR 1118 (42000) Row size too large

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/25163614/

Date: 2020-08-31 12:57:13  Source: igfitidea

ERROR 1118 (42000) Row size too large

Tags: mysql, sql

Asked by Dee

I know that this question has been asked (and answered) many times, but none of them appear to be the same problem that I am seeing...


The table that is giving me problems only has two columns: the first field is an integer, the second field is longtext. Here is a portion of a dump file from MySQL 5.5.30:


-- MySQL dump 10.13  Distrib 5.5.30, for Linux (x86_64)
--
-- Host: localhost    Database: mydatabasename
-- ------------------------------------------------------
-- Server version   5.5.30-log

DROP TABLE IF EXISTS `large_file`;
/*!40101 SET @saved_cs_client     = @@character_set_client */;
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE `large_file` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `data` longtext,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=59 DEFAULT CHARSET=latin1;
/*!40101 SET character_set_client = @saved_cs_client */;

-- Dumping data for table `large_file`
--

LOCK TABLES `large_file` WRITE;
/*!40000 ALTER TABLE `large_file` DISABLE KEYS */;
INSERT INTO `large_file` VALUES(38,'GyUtMTIzNDVYQ... <large data> ...);
/*!40000 ALTER TABLE `large_file` ENABLE KEYS */;
UNLOCK TABLES;

As you can see, this dump file came from MySQL 5.5.30, and I can import this data into 5.5.30. But when I try to import it into 5.6.x, I get the ERROR 1118 (42000) Row size too large error.


The data going into the large_file table is (relatively) large; values range in size from 15 MB to about 25 MB. The data is all ASCII (base64 encoded).

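To confirm how large the stored values actually are on the 5.5 server (table and column names taken from the dump above), a query along these lines can be used:

```sql
-- Report each row's payload in MB, largest first.
SELECT id,
       ROUND(LENGTH(`data`) / 1024 / 1024, 1) AS data_mb
FROM   large_file
ORDER  BY data_mb DESC;

-- The single largest value is what matters for redo-log sizing (see below).
SELECT ROUND(MAX(LENGTH(`data`)) / 1024 / 1024, 1) AS max_blob_mb
FROM   large_file;
```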

Other posters have had issues with a very large number of columns, but I have only two columns in this table.


The longtext type should be capable of storing approximately 4 GB, and that has been the case with 5.5.30, but I am finding the migration to 5.6.x difficult.


Can anyone offer insight into why this is happening? Or, how I can work around it?


Thanks in advance!


Accepted answer by Brian Ridgeway

Check that the innodb_log_file_size setting is sufficiently large: at least 10 times the largest BLOB data size found in the rows of the table, plus the length of other variable-length fields.


The following is from the MySQL 5.6 Release Notes:


InnoDB Notes


  • Important Change: Redo log writes for large, externally stored BLOB fields could overwrite the most recent checkpoint. The 5.6.20 patch limits the size of redo log BLOB writes to 10% of the redo log file size. The 5.7.5 patch addresses the bug without imposing a limitation. For MySQL 5.5, the bug remains a known limitation.

    As a result of the redo log BLOB write limit introduced for MySQL 5.6, innodb_log_file_size should be set to a value greater than 10 times the largest BLOB data size found in the rows of your tables plus the length of other variable length fields (VARCHAR, VARBINARY, and TEXT type fields). Failing to do so could result in “Row size too large” errors. No action is required if your innodb_log_file_size setting is already sufficiently large or your tables contain no BLOB data. (Bug #16963396, Bug #19030353, Bug #69477)

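Applied to the question's numbers (blobs of 15 to 25 MB), ten times the largest value is roughly 250 MB, so any comfortably larger redo log should do. A sketch of the my.cnf change (512M is an illustrative figure, not a tuned recommendation):

```ini
[mysqld]
# Must exceed 10x the largest BLOB/TEXT value in any row
# (~25 MB here, so > 250 MB), per the 5.6.20 redo-log write limit.
innodb_log_file_size = 512M
```

In 5.6 and later, InnoDB recreates the ib_logfile* files at the new size after a clean shutdown and restart; on older versions the old log files had to be removed manually first.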

Answered by Solomous

I had this issue with MySQL 5.7 (OS X 10.11).


The following worked although it may not be ideal.


In my.cnf, add:


innodb_strict_mode = 0    
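Since innodb_strict_mode is a dynamic variable, the same workaround can also be tried at runtime before running the import, without editing the config file. Note that this only suppresses the strict check; it does not address the redo-log sizing from the accepted answer:

```sql
-- Disable strict mode for new connections (requires sufficient privileges):
SET GLOBAL innodb_strict_mode = 0;

-- Or only for the current session:
SET SESSION innodb_strict_mode = 0;
```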

Answered by ?? ???

I got: ERROR 1118 (42000) at line 1852: Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.

On Ubuntu 16.04, edit /etc/mysql/mysql.conf.d/mysqld.cnf and add:

[mysqld]
innodb_log_file_size = 512M
innodb_strict_mode = 0

It works!


Answered by Muralidhar Bopparaju

I ran into an identical problem importing BLOB data from 5.5 into 5.6.22, where the blobs were at most about 70 MB. However, while increasing innodb_log_file_size did the trick, in my case I had to increase it to 10 GB (I tried in increments of 1 GB, starting from 1 GB), which is far more than 10 times the maximum BLOB size.