
mongoimport Failure parsing JSON

Stack Overflow user
Asked on 2011-06-20 07:35:28
4 answers · 6K views · 0 followers · 5 votes

Running mongoimport in a pipeline fed by the GitHub API, I get an Assertion: 10340:Failure parsing JSON string error, as follows:

lsoave@ubuntu:~/rails/github/gitwatcher$ curl https://api.github.com/users/lgs/repos | mongoimport -h localhost -d gitwatch_dev -c repo -f repositories
connected to: localhost
 % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                Dload  Upload   Total   Spent    Left  Speed
 0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0Mon Jun 20 00:56:01 Assertion: 10340:Failure parsing JSON string near: [
100 22303  100 22303    0     0  31104      0 --:--:-- --:--:-- --:--:--  111k
0x816d8a1 0x8118814 0x84b357a 0x84b5bb8 0x84adc65 0x84b2ee1 0x60bbd6 0x80f5bc1
mongoimport(_ZN5mongo11msgassertedEiPKc+0x221) [0x816d8a1]
mongoimport(_ZN5mongo8fromjsonEPKcPi+0x3b4) [0x8118814]
mongoimport(_ZN6Import9parseLineEPc+0x7a) [0x84b357a]
mongoimport(_ZN6Import3runEv+0x1a98) [0x84b5bb8]
mongoimport(_ZN5mongo4Tool4mainEiPPc+0x1ce5) [0x84adc65]
mongoimport(main+0x51) [0x84b2ee1]
/lib/tls/i686/cmov/libc.so.6(__libc_start_main+0xe6) [0x60bbd6]
mongoimport(__gxx_personality_v0+0x3f1) [0x80f5bc1]
exception:Failure parsing JSON string near: [
[
...
...
Mon Jun 20 00:45:20 Assertion: 10340:Failure parsing JSON string near: "name": "t
0x816d8a1 0x8118814 0x84b357a 0x84b5bb8 0x84adc65 0x84b2ee1 0x126bd6 0x80f5bc1
mongoimport(_ZN5mongo11msgassertedEiPKc+0x221) [0x816d8a1]
mongoimport(_ZN5mongo8fromjsonEPKcPi+0x3b4) [0x8118814]
mongoimport(_ZN6Import9parseLineEPc+0x7a) [0x84b357a]
mongoimport(_ZN6Import3runEv+0x1a98) [0x84b5bb8]
mongoimport(_ZN5mongo4Tool4mainEiPPc+0x1ce5) [0x84adc65]
mongoimport(main+0x51) [0x84b2ee1]
/lib/tls/i686/cmov/libc.so.6(__libc_start_main+0xe6) [0x126bd6]
mongoimport(__gxx_personality_v0+0x3f1) [0x80f5bc1]
exception:Failure parsing JSON string near: "name": "t
"name": "tentacles"
...
...

See the full trace here: http://pastie.org/2093486. In any case, the JSON I get from the GitHub API seems fine ( curl https://api.github.com/users/lgs/repos ):

[
 {
    "open_issues": 0,
    "watchers": 3,
    "homepage": "http://scrubyt.org",
    "language": null,
    "forks": 1,
    "pushed_at": "2009-02-25T22:49:08Z",
    "created_at": "2009-02-25T22:22:40Z",
    "fork": true,
    "url": "https://api.github.com/repos/lgs/scrubyt",
    "private": false,
    "size": 188,
    "description": "A simple to learn and use, yet powerful web scraping toolkit!",
    "owner": {
     "avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2
Fgravatar-140.png",
     "login": "lgs",
     "url": "https://api.github.com/users/lgs",
     "id": 1573
    },
    "name": "scrubyt",
    "html_url": "https://github.com/lgs/scrubyt"
 },
...
...
]

Here is a snippet of it: http://www.pastie.org/2093524
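One quick way to back up the claim that the payload itself is well-formed is to run it through a JSON validator; `python3 -m json.tool` exits non-zero on malformed input. A minimal stand-in document is used here instead of the full GitHub response:

```shell
# Sanity check: json.tool exits non-zero on malformed JSON, so a zero
# exit (and pretty-printed output) confirms the document parses.
printf '[{"name": "scrubyt", "forks": 1}]' | python3 -m json.tool
```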

If I try specifying csv format instead, it works:

lsoave@ubuntu:~/rails/github/gitwatcher$ curl https://api.github.com/users/lgs/repos | mongoimport -h localhost -d gitwatch_dev -c repo -f repositories --type csv
connected to: localhost
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 22303  100 22303    0     0  23914      0 --:--:-- --:--:-- --:--:--  106k
imported 640 objects
lsoave@ubuntu:~/rails/github/gitwatcher$ 

4 Answers

Stack Overflow user

Accepted answer

Answered on 2011-06-21 02:37:54

OK, here is what seems to be going on. First, I removed all the newlines in the JSON, which reduced the number of errors from n (where n = number of lines) to 1. Then I had to wrap the JSON array in another field, and after that it worked. I think mongoimport's JSON mode is designed around output from mongoexport, so most likely you can't use it to import arbitrary JSON directly. But if you want to, what I did here is what your code would have to do before invoking the import utility.
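The two preprocessing steps described above can be sketched in the shell; the field name somedata is arbitrary, and a tiny stand-in document is used in place of the full GitHub payload:

```shell
# Sketch of the two steps from this answer: flatten the JSON to one
# line with tr, then wrap the top-level array in a field named
# "somedata" (any name works) with sed.
json='[
 {"name": "scrubyt"}
]'
wrapped=$(printf '%s' "$json" | tr -d '\n' | sed -e 's/^/{"somedata":/' -e 's/$/}/')
echo "$wrapped"
```

The wrapped output can then be piped to mongoimport as in the question.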

I used just one record while I was testing. Here it is without newlines:

[{"url":"https://api.github.com/repos/lgs/scrubyt", "pushed_at": "2009-02-25T22:49:08Z","homepage": "http://scrubyt.org",  "forks": 1,"language": null,"fork": true,"html_url": "https://github.com/lgs/scrubyt","created_at": "2009-02-25T22:22:40Z", "open_issues": 0,"private": false,"size": 188,"watchers": 3,"owner": {"url": "https://api.github.com/users/lgs","login": "lgs","id": 1573,"avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"},"name": "scrubyt","description": "A simple to learn and use, yet powerful web scraping toolkit!"}]

Then I wrapped it with somedata (you can use any name here):

{somedata:[{"url":"https://api.github.com/repos/lgs/scrubyt", "pushed_at": "2009-02-25T22:49:08Z","homepage": "http://scrubyt.org",  "forks": 1,"language": null,"fork": true,"html_url": "https://github.com/lgs/scrubyt","created_at": "2009-02-25T22:22:40Z", "open_issues": 0,"private": false,"size": 188,"watchers": 3,"owner": {"url": "https://api.github.com/users/lgs","login": "lgs","id": 1573,"avatar_url": "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"},"name": "scrubyt","description": "A simple to learn and use, yet powerful web scraping toolkit!"}]}

I could also see the record in Mongo:

> db.repo.findOne()
{
    "_id" : ObjectId("4dff91d29c73f72483e82ef2"),
    "somedata" : [
        {
            "url" : "https://api.github.com/repos/lgs/scrubyt",
            "pushed_at" : "2009-02-25T22:49:08Z",
            "homepage" : "http://scrubyt.org",
            "forks" : 1,
            "language" : null,
            "fork" : true,
            "html_url" : "https://github.com/lgs/scrubyt",
            "created_at" : "2009-02-25T22:22:40Z",
            "open_issues" : 0,
            "private" : false,
            "size" : 188,
            "watchers" : 3,
            "owner" : {
                "url" : "https://api.github.com/users/lgs",
                "login" : "lgs",
                "id" : 1573,
                "avatar_url" : "https://secure.gravatar.com/avatar/9c7d80ebc20ab8994e51b9f7518909ae?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-140.png"
            },
            "name" : "scrubyt",
            "description" : "A simple to learn and use, yet powerful web scraping toolkit!"
        }
    ]
}

Hope this helps!

Votes: 2

Stack Overflow user

Answered on 2012-01-20 19:10:10

Using "mongoimport --jsonArray ..." worked for me.
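For reference, the question's pipeline with --jsonArray added might look like the sketch below (the -f flag is dropped, since with --jsonArray the field names come from the documents themselves). It is wrapped in a function and guarded so it only runs where mongoimport is actually installed:

```shell
# The original pipeline with --jsonArray, which tells mongoimport to
# accept a top-level JSON array directly.
import_repos() {
  curl -s https://api.github.com/users/lgs/repos \
    | mongoimport -h localhost -d gitwatch_dev -c repo --jsonArray
}
# Guarded: skip quietly if mongoimport (or a local mongod) is absent.
if command -v mongoimport >/dev/null 2>&1; then
  import_repos || echo "import failed (is mongod running?)"
fi
```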

Votes: 25

Stack Overflow user

Answered on 2012-10-04 14:20:20

This worked fine for me after removing any '\n'. On Linux you can use tr: cat file.json | tr -d '\n' > file.json
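The newline-stripping step can be demonstrated on a throwaway file (the file names here are placeholders). Note that the output goes to a second file: redirecting back onto the input file would truncate it before tr reads it.

```shell
# Strip newlines from a JSON file; write to a second file because
# "tr -d '\n' < file.json > file.json" would truncate the input first.
printf '{\n "name": "scrubyt"\n}\n' > file.json
tr -d '\n' < file.json > file.oneline.json
cat file.oneline.json   # the JSON is now on a single line
rm -f file.json file.oneline.json
```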

Votes: 1
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/6405727
