Batch Importer – Part 1

Data is everywhere… all around us, but sometimes the medium it is stored in can be a problem when analyzing it. Chances are you have a ton of data sitting around in a relational database in your current application… or you have begged, borrowed or scraped to get the data from somewhere and now you want to use Neo4j to find how this data is related.

Michael Hunger wrote a batch importer to load CSV data into Neo4j quickly, but for some reason it hasn’t received a lot of love. We’re going to change that today, and I’m going to walk you through getting your data out of tables and into nodes and relationships.

Let’s clone the project and jump in.

git clone git://github.com/jexp/batch-import.git
cd batch-import

It uses Maven, so if you haven’t already, go ahead and install it:

sudo apt-get install maven2

Now let’s assemble the project per the instructions:

mvn clean compile assembly:single

If you did it right, you should see:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 47 seconds
[INFO] Finished at: Tue Feb 28 15:50:14 UTC 2012
[INFO] Final Memory: 13M/33M
[INFO] ------------------------------------------------------------------------

Awesome… let’s create some test data. Michael packed in a data generator; let’s compile it and run it:

javac ./src/test/java/TestDataGenerator.java -d .
java TestDataGenerator

It will take a little while, and then you should see this:

Creating 7500000 Nodes and 41242882 Relationships took 13 seconds.

Really? Where?

ls -al
-rw-r--r--  1 max max  111388909 2012-02-28 16:11 nodes.csv
-rw-r--r--  1 max max 1217775358 2012-02-28 16:11 rels.csv

So what’s in nodes.csv?

head -5 nodes.csv

Node    Rels    Property
0       4       TEST
1       0       TEST
2       1       TEST
3       1       TEST

The format is property_1, property_2, property_3 separated by tabs: the first line is a header naming the node properties, and each line after it holds one node’s property values.
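
If you wanted to roll your own nodes.csv from scratch, it’s nothing fancy: a header line of property names followed by one tab-separated line of values per node. Here’s a quick sketch in Ruby (the my_nodes.csv filename and the name/age properties are made up for illustration):

# Sketch: write a tiny nodes.csv by hand.
nodes = [
  ["Neo",      29],
  ["Trinity",  28],
  ["Morpheus", 48]
]

File.open("my_nodes.csv", "w") do |file|
  # First line: the property names, tab separated.
  file.puts ["name", "age"].join("\t")
  # Then one line per node: that node's property values, tab separated.
  nodes.each { |node| file.puts node.join("\t") }
end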

And what’s in rels.csv?

head -5 rels.csv

Start   Ende    Type    Property
5496772 6842185 FIVE    Property
7416995 6166503 FOUR    Property
6712458 6853172 THREE   Property
1291639 296708  TWO     Property

The format is start node reference, end node reference, relationship type, and property_1, also separated by tabs.
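
Building your own rels.csv works the same way. As far as I can tell, the start and end columns refer back to nodes by the order they appear in nodes.csv, the third column is the relationship type, and anything after that becomes a relationship property. A sketch with invented data and a made-up my_rels.csv filename:

# Sketch: write a tiny rels.csv by hand.
rels = [
  [1, 2, "KNOWS", 1999],
  [2, 3, "KNOWS", 2003]
]

File.open("my_rels.csv", "w") do |file|
  # Header: the first three columns are positional, the rest name relationship properties.
  file.puts ["start", "end", "type", "since"].join("\t")
  # One line per relationship, tab separated.
  rels.each { |rel| file.puts rel.join("\t") }
end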

Now we are ready to try out this test data. Run the command:

java -server -Xmx4G -jar target/batch-import-jar-with-dependencies.jar target/db nodes.csv rels.csv 

…and go grab a soda or cup of coffee unless you happen to like watching dots on the screen, as this will take a minute or three depending on your hardware. If you are doing this test on an EC2 c1.medium instance it ain’t gonna work (trust me, I know), so do it on a box with at least 4 GB of RAM:

Importing 7500000 Nodes took 17 seconds
Lots of dots....
Importing 41242882 Relationships took 164 seconds
203 seconds
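
One more thing before we move on: the importer picks up a batch.properties file if one exists (you’ll see “Using Existing Configuration File” in its output), and the defaults are sized for a small graph. If your import is much bigger than this test set, bump the memory-mapped buffer settings; something along these lines, tuned to your own store sizes and hardware, is a reasonable starting point:

use_memory_mapped_buffers=true
neostore.nodestore.db.mapped_memory=100M
neostore.relationshipstore.db.mapped_memory=500M
neostore.propertystore.db.mapped_memory=1G
neostore.propertystore.db.strings.mapped_memory=200M
neostore.propertystore.db.arrays.mapped_memory=0M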

OK, so where is it?

ls -al target/db

-rw-r--r-- 1 max max   67500025 2012-02-28 08:58 neostore.nodestore.db
-rw-r--r-- 1 max max 1998458182 2012-02-28 08:58 neostore.propertystore.db
-rw-r--r-- 1 max max 1361015130 2012-02-28 08:58 neostore.relationshipstore.db
...and a bunch of other files.

Great. Now, assuming you have my Neography gem installed, let’s get a fresh copy of Neo4j and put our new database in there.

echo "require 'neography/tasks'" >> Rakefile
rake neo4j:install
mv target/db neo4j/data/graph.db
rake neo4j:start

Go to your Neo4j Dashboard and take a look.
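
If you’d rather check from Ruby than eyeball the dashboard, Neography can ask the REST API for a node directly. A quick sanity check (node 1 is just an arbitrary pick):

require 'neography'

# Talks to the default Neo4j REST endpoint at http://localhost:7474
neo = Neography::Rest.new
puts neo.get_node(1).inspect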

Now everything should be working correctly. In Part 2 of this series, I’ll show you how to write some SQL queries to get your own relational data into these CSV files.


46 thoughts on “Batch Importer – Part 1”

  1. […] you’ve been following along, we got Michael’s Batch Importer, compiled it, created some test data, ran it and saw […]

  2. […] Batch Importer – Part 1: CSV files. […]

  3. […] the end of February, we took a look at Michael Hunger’s Batch Importer. It is a great tool to load millions of nodes and relationships into Neo4j quickly. The only thing […]

  4. […] recall, I’ve had three blog posts about the Batch Importer. In the first one, I showed you how to install the Batch Importer, in the second one, I showed you how to use data in your relational database to generate the csv […]

  5. Hi Max,

    My thesis work requires filling a Neo4j server instance with at least 1M nodes (+ their relationships) as quickly as possible. (I am using Neo4j server instead of embedded as I need to communicate between servers running on different machines.)

    I tried REST API batch operations (via Neography) but I realised that it is not the way to go. Then I found your entry and now I am trying to use batch-importer. It works, but it takes too much time. My testbed is an AWS Large instance with 7.5GB RAM and 2 virtual cores.

    As a comparison: you have written that “Importing 7500000 Nodes took 17 seconds”; the same value for me is 8 times larger, 138 seconds.

    The batch importer has been running for 2.5 hours, still printing dots, but the last and only thing it printed out was “Importing 7500000 Nodes took 138 seconds”.

    Do you have any idea what slows down the operation?
    Could you please share your test configuration…

    Thanks a lot for your great blog and for neography…

    • Hi Volkan,

      Did you ever solve this issue? I’m facing the exact same problem: I want to add a lot of data into a remote Neo4j Server instance and I don’t want to / can’t shut down the DB for that or take the embedded approach. Did you have any luck in the end?

      Thanks!

      Erik

  6. maxdemarzi says:

    Volkan,

    2.5 hours? Something is not right. Do your nodes and relationships have a ton of properties? Can you check inside the graph.db folder being created and see the file sizes growing? Are you indexing (that’s a bit slower than creating nodes and relationships)? Post your answers on the neo4j google forum and we’ll figure this out.

    Thanks,
    Max

  7. Keith Strickland says:

    Hi Max,

    I was trying to install this using maven as your instructions suggest but I’m getting the following error:

    C:\Users\GBS\git\batch-import>mvn clean compile assembly:single
    [INFO] Scanning for projects…
    [INFO]
    [INFO] ------------------------------------------------------------------------
    [INFO] Building Simple Batch Importer 0.1-SNAPSHOT
    [INFO] ------------------------------------------------------------------------
    [WARNING] The POM for org.neo4j:neo4j-kernel:jar:1.8-SNAPSHOT is missing, no dependency information available
    [WARNING] The POM for org.neo4j:neo4j-lucene-index:jar:1.8-SNAPSHOT is missing, no dependency information available
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD FAILURE
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time: 0.453s
    [INFO] Finished at: Sat Oct 20 18:38:24 EDT 2012
    [INFO] Final Memory: 6M/77M
    [INFO] ------------------------------------------------------------------------
    [ERROR] Failed to execute goal on project batch-import: Could not resolve dependencies for project org.neo4j:batch-import:jar:0.1-SNAPSHOT: The following artifacts could not be resolved: org.neo4j:neo4j-kernel:jar:1.8-SNAPSHOT, org.neo4j:neo4j-lucene-index:jar:1.8-SNAPSHOT: Failure to find org.neo4j:neo4j-kernel:jar:1.8-SNAPSHOT in http://m2.neo4j.org/content/repositories/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of Neo4j Snapshots has elapsed or updates are forced -> [Help 1]
    [ERROR]
    [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
    [ERROR] Re-run Maven using the -X switch to enable full debug logging.
    [ERROR]
    [ERROR] For more information about the errors and possible solutions, please read the following articles:
    [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
    C:\Users\GBS\git\batch-import>

    I’ve searched and can’t seem to find anything concerning the error above. Hopefully you can point me in the right direction.

    Thanks

  8. Enzo says:

    Hi
    Tried to compile TestDataGenerator. Initially couldn’t find the file, then found it in /src/test/java/org/neo4j/batchimport/TestDataGenerator.java.
    But then got compile errors:
    ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java:3: package org.junit does not exist
    import org.junit.Ignore;
    ^
    ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java:14: cannot find symbol
    symbol: class Ignore
    @Ignore
    ^
    Help please!!
    Enzo

  9. […] are many technical tools out there (definitely look here, here and here, but I needed something simple. So my friend and colleague Michael Hunger came to the […]

  10. Seb says:

    Hi,

    Is there a release (JAR file) available somewhere? Building it is such a pain… thanks!

    • maxdemarzi says:

      You can grab this one from my public dropbox => https://dl.dropbox.com/u/57740873/batch-import-jar-with-dependencies.jar

      • mentatseb says:

        Thanks! For those who rarely use Maven projects it’s a real help :)

        Anyway, here is how to do it with NetBeans:
        - clone the project
        - open it in NetBeans
        - right-click on the project name, select Properties, then the Actions panel
        - select Build with Dependencies and add this goal to the Execute Goals settings: ‘assembly:single’
        - also add the property Skip Tests
        - press OK
        - right-click again on the project name, select Resolve Problems at the bottom to download the dependencies
        - right-click again and select Build with Dependencies

        cheers,
        Seb

  11. […] so instead of typing out a million node graph, we’ll build a graph generator and use the batch importer to load it into Neo4j. What I want to create is a set of files to feed to the batch-importer. A […]

  12. […] there are technical tutorials that teach us how to do it (for example batch-import, batch importer part, import), but I needed something simpler, so my friend and colleague Michael Hunger came to help me and provided a way to create an Excel spreadsheet to import data into Neo4j. […]

  13. Max, thanks for all these tutorials. Have you noticed that the batch-import tool does not support UTF-8 encoding? No accents, no non-English characters at all; this is a massive problem for many of us. I have already raised the issue on GitHub. Do you have any idea how to make it work?

  14. tameem says:

    Hi,
    I am trying to import 117,000,000+ nodes as well as their relationships and indices on a server, using the batch-import jar and running it from NetBeans. We did it before, but indexing wasn’t implemented correctly, so we are kind of debugging and running again to check why indexing isn’t working, while trying as much as possible on a smaller example (2M nodes without relationships) and then trying the same thing on the big files. The problem is that running this on a server with the big files takes more than 27 hours for each run, and we end up with it not working: oh maybe this is why, run again on a small example, great, looks like we found it, run on the big files, 27 hours later: oh, not working again. My question is: is there a way to speed up the running time on this big example with the aforementioned number of nodes?

    • maxdemarzi says:

      There is a batch.properties file read when you run the batch import. The defaults are for a small graph. Tweak these:


      use_memory_mapped_buffers=true
      neostore.nodestore.db.mapped_memory=100M
      neostore.relationshipstore.db.mapped_memory=500M
      neostore.propertystore.db.mapped_memory=1G
      neostore.propertystore.db.strings.mapped_memory=200M
      neostore.propertystore.db.arrays.mapped_memory=0M
      neostore.propertystore.db.index.keys.mapped_memory=15M
      neostore.propertystore.db.index.mapped_memory=15M

      Set them to much larger values, depending on your expected graph size.

      • tameem says:

        Hello,
        Thanks for your answer. I don’t know what’s going wrong on my machine though: after changing the values to larger ones, importing the nodes is taking more than two and a half hours, when it had been taking one hour before, so it became slower.

  15. tameem says:

    Hello, I am having problems executing queries on an already established graph that has 118 million nodes and 140 million relationships. In the beginning it was a memory problem; then I changed the initmemory and maxmemory options to proper values (on a server with 250GB of RAM), which made life much better. But then, while running the very same queries again that proved this memory change to be effective, they are throwing a memory heap exception, which is driving me crazy. I think the problem is in the buffer size. The Neo4j website speaks about Xmx and the fact it should be increased, but I think there is nothing EXPLICITLY written about how and where to change this value of the heap. The last thing I tried, after some guesses on the extremely vague info they give on the website about that, was adding a wrapper.java.additional=Xmx and wrapper.java.additional=Xss, unfortunately to no avail. It even got worse as far as the Linux command “cat /proc/meminfo” is concerned, as the “buffers” show a smaller value than before. Any directions about how to effectively change the buffer size?

  16. […] Maxdemarzi’s blog left us a detailed steps of importing data as a fresh start, with the tools provided by Michael Hunger. […]

  17. Daniel says:

    Hello people, I have an issue when importing: there comes a point where I get an error that says “The requested operation can not be performed on a file with a user mapped section open”. Could anyone help me?

  18. Shelley says:

    I tried your command “java -server -Xmx4G -jar target/batch-import-jar-with-dependencies.jar target/db nodes.csv rels.csv” in order to try to import the nodes and relationships. I also received an error trying to create the test data, which I resolved by running TestDataGenerator.java in Eclipse (by importing the project as a Maven project). Is there a way I can do the imports in a similar manner in Eclipse?

    • Daniel says:

      Run this command to generate the test data: mvn clean test-compile exec:java -Dexec.mainClass=org.neo4j.batchimport.TestDataGenerator -Dexec.classpathScope=test -Dexec.args=sorted

  19. femvestor says:

    From what I understand from the Neo4j documentation, you can either have your Neo4j embedded or you can call it through REST. Can you create your Neo4j database in an embedded environment (Java API) and then access it through REST?

  20. vaibhav jain says:

    I am trying batch-import but getting this error. I am new to Neo4j; can you help me insert bulk data into Neo4j? Actually I want to do performance testing of Neo4j.

    javac ./src/test/java/TestDataGenerator.java -d .
    javac: file not found: ./src/test/java/TestDataGenerator.java
    Usage: javac <options> <source files>
    use -help for a list of possible options

  21. Ravinder says:

    Hi,
    I tried to import data from a csv file and it ran successfully for 100 nodes/records. But when I try to import 300 nodes/records it imports only 100 nodes. I don’t know why this is happening. Is there any setting that limits the number of nodes to import?

  22. neonewbie says:

    Hello, I’m running the 2.0 branch. After installation I run the Maven command but execution fails on the same files with:
    Caused by: java.lang.IllegalStateException: Index users not configured.
    at org.neo4j.batchimport.Importer.importNodes(Importer.java:102)

    I thought the program would set up the index automatically? I even tried creating the index manually with CREATE INDEX ON :users(name), but it still fails on that piece of code.

    Any suggestions? The indexing functionality looks interesting.

  23. […] DeMarzi did a great series of blog posts on the Neo4j batch […]

  24. kxmehdi says:

    Hi all,
    I am trying to find an importer to load an .owl file into Neo4j.
    Can anyone help me with this?

  25. fsalvador23 says:

    Just to let you know that in Neo4j 2.0 we must set “allow_store_upgrade=true” in neo4j.properties, under the conf folder. Cheers.

  26. Hi,
    I have used different nodes.csv and rels.csv files.
    link: https://gist.github.com/qmaruf/ed69acf8625ac577d578

    Everything seems fine and after importing it shows the following message:


    maruf@leopard:~/Desktop/bi/batch-import$ java -server -Xmx4G -jar target/batch-import-jar-with-dependencies.jar target/db nodes.csv rels.csv
    Usage: Importer data/dir nodes.csv relationships.csv [node_index node-index-name fulltext|exact nodes_index.csv rel_index rel-index-name fulltext|exact rels_index.csv ….]
    Using: Importer target/db nodes.csv rels.csv
    Using Existing Configuration File
    Importing 4 Nodes took 0 seconds
    Importing 4 Relationships took 0 seconds
    Total import time: 1 seconds


    But there is no data in the db. I have tried executing the following Cypher query: “START n=node(*) RETURN n;” and it returns 0 rows. It should show at least 4 nodes according to nodes.csv.

    Am I missing something?
    Eagerly waiting for help.

    Thanks

    • maxdemarzi says:

      Did you copy the graph.db directory made by the batch importer into your neo4j/data directory and restart it?

  27. alexmaddoc says:

    Cannot compile the data generator… it gives me errors:
    root@srv:/home/alex/batch-import# javac ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java -d .
    ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java:3: error: package org.junit does not exist
    import org.junit.Ignore;
    ^
    ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java:14: error: cannot find symbol
    @Ignore
    ^
    symbol: class Ignore
    ./src/test/java/org/neo4j/batchimport/TestDataGenerator.java:29: error: cannot find symbol
    System.out.println("Using: TestDataGenerator "+nodes+" "+relsPerNode+" "+ Utils.join(types, ",")+" "+(sorted?"sorted":""));
    ^
    symbol: variable Utils
    location: class TestDataGenerator
    3 errors

    When trying to run the included generate.sh script, I’m getting:

    root@srv:/home/alex/batch-import# ./generate.sh
    ---------------------------------------------------
    constituent[0]: file:/usr/share/maven2/lib/maven-debian-uber.jar
    ---------------------------------------------------
    java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    at java.lang.AbstractStringBuilder.setLength(AbstractStringBuilder.java:173)
    at java.lang.StringBuffer.setLength(StringBuffer.java:170)
    at org.apache.maven.cli.CLIManager.cleanArgs(CLIManager.java:271)
    at org.apache.maven.cli.CLIManager.parse(CLIManager.java:224)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:119)
    at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
    at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
    at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
    at org.codehaus.classworlds.Launcher.main(Launcher.java:375)

    Any help?
    Running on Ubuntu Server 12.04 LTS
    2.0 branch of batch-import

  28. minh nhut says:

    Need help!
    When I copy all the files inside the folder graph.db and paste them into data/graph.db of Neo4j, and after that start the server, it does not work. Then I create a new graph.db and start the server again and it works fine. Don’t know why?

  29. […] Batch Importer – Part 1. Data is everywhere… all around us, but sometimes the medium it is stored in can be a problem when analyzing it. […]

  30. […] use this Neo4j-Batch-Importer to import CSV files directly into the graph (including indexing), ETL-article by Max de Marzi).So now I had my Gephi project, but how to get it into Neo4j? Well, turns out there is a Gephi […]

  31. […] Neo4j – so how do I do that?There are many technical tools out there (definitely look here, here and here, but I needed something simple. So my friend and colleague Michael Hunger came to the […]

  32. […] to TSV. sql2graph was inspired by Max De Marzi blog posts on using batch-import: part 1 ( https://maxdemarzi.com/2012/02/28/batch-importer-part-1/ ) and part 2 ( https://maxdemarzi.com/2012/02/28/batch-importer-part-2/ ) It operates in […]
