MySQL 5.7.12 import cannot create a JSON value from a string with CHARACTER SET 'binary'

Disclaimer: this page is a translation of a popular Stack Overflow Q&A, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): Stack Overflow. Original: http://stackoverflow.com/questions/38078119/

Tags: mysql, json, import, mysql-json

Asked by Danny Bevers

I exported my database with JSON columns in it. After I migrated to a new server, my import crashed every time with an error like:

cannot create a JSON value from a string with CHARACTER SET 'binary'

On Stack Overflow I found this post, but it didn't work for me: mysqlimport issues "set @@character_set_database=binary" which prevents loading json values

The file is 2GB, so it isn't possible to open it in an editor.

Does anyone have an idea how to import my database file?

Answer by Lorcan O'Neill

You can apply a regex to the exported SQL text to convert the binary strings into an insertable format. This was my quick-and-dirty fix when I faced this issue.

Find:    (X'[^,\)]*')
Replace: CONVERT($1 using utf8mb4)

Applying this regex means

INSERT INTO json_table (json_column) VALUES (X'7B22666F6F223A2022626172227D');

will now become

INSERT INTO json_table (json_column) VALUES (CONVERT(X'7B22666F6F223A2022626172227D' using utf8mb4));
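
Since a 2GB dump can't be opened in an editor, the same substitution can be applied from the command line. A minimal sketch, assuming GNU sed and a dump file named dump.sql (sed references the captured group as \1 rather than $1):

sed -E "s/(X'[^,)]*')/CONVERT(\1 using utf8mb4)/g" dump.sql > dump_fixed.sql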

Answer by Henry

I had this problem with exports made by Sequel Pro. I unchecked the Output BLOB fields as hex option and the problem went away. Visually inspecting the export showed legible JSON instead of binary.

Answer by swayamraina

I faced the same issue today. Below are the findings for my case:

I asked one of my friends to generate an SQL dump for me to import. He used sequel-pro to generate the dump (export the database). When I did the import it threw an error:

Cannot create a JSON value from a string with CHARACTER SET 'binary'

So there was an issue with the generated dump: all the json fields were converted to some raw format, i.e. instead of the value being

"{'key1':'value1', 'key2':'value2'}"

it was,

X'nfdsklsdsklnfjkbvkjsdbvkjhdfsbvkjdsbnvljkdsbvkjhdfbvkjdfbvjkdfb'

So, when importing the dump, i.e. running the insert statements, mysql could not process the data as it was not of json type.

Here is a link to the reported bug:
https://github.com/sequelpro/sequelpro/issues/2397

You need to uncheck the Output BLOB fields as hex option.

Answer by Lnr

This worked for me (I had control over the export to the sql file as well). There are lots of caveats; e.g. I knew that the fields would never be bigger than 1000 characters and wouldn't contain any non-ASCII chars. Please do comment and tell me all the reasons why this is so bad though :)

Before export

alter table <table> modify <json_column> varchar(1000);

Then after import

alter table <table> modify <json_column> json;
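
A sketch of the full round trip with the stock client tools, assuming hypothetical database, table, and column names:

# on the source server, before the export
mysql -u root -p source_db -e "ALTER TABLE my_table MODIFY json_column VARCHAR(1000);"

# dump, then load into the target server
mysqldump -u root -p source_db > dump.sql
mysql -u root -p target_db < dump.sql

# on the target server, after the import, restore the JSON type
mysql -u root -p target_db -e "ALTER TABLE my_table MODIFY json_column JSON;"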

Answer by Kyogo Mochida

A vim version of Lorcan O'Neill's answer:

vi xxxx.sql
:%s/\(X'[^,\)]*'\)/CONVERT(\1 using utf8mb4)/g

Answer by Peter

For those using Sequel Pro around June 2019: in addition to unchecking the "Output BLOB fields as hex" option (as mentioned above), you also need to use the nightly build, which added support for JSON types two years ago. This support has not yet made it into an official release.

Answer by Emre

This odd issue was occurring when running a simple UPDATE query:

update some_table set json_attr = '{"test":168}' where id = 123456;

Restarting MySQL fixed it. I was not able to pinpoint the cause.

Edit: We are using Aurora. It looks like it was related to a weird configuration on our side where the same instance handled both master and slave/reader connections.

Answer by Swarup Bam

Changing the collation to utf8_general_ci worked for me.

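A minimal sketch of that change, assuming a hypothetical table named my_table (CONVERT TO rewrites the stored data as well as the column metadata):

ALTER TABLE my_table CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;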

Answer by Andrew Burns

I had this problem with a dump. I was able to fix it by changing this line in the dump file from:

/*!40101 SET NAMES binary*/;

to

/*!40101 SET NAMES utf8mb4*/;
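
On a dump too large to open in an editor, the same edit can be made in place. A minimal sketch, assuming GNU sed and a dump file named dump.sql:

sed -i 's/SET NAMES binary/SET NAMES utf8mb4/' dump.sql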

Answer by Moonchild

For those like me who arrived here using Symfony 4 / Doctrine: for some reason the same entity can be resolved into either a longtext MySQL type storing JSON or a json MySQL type storing JSON. Manually setting the longtext MySQL type resolved the problem in my particular case.

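A minimal sketch of forcing the column type on the MySQL side, with hypothetical table and column names:

ALTER TABLE my_entity MODIFY json_attr LONGTEXT;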