CockroachDB supports importing data from .sql dumps and some .csv files.
Import from Generic SQL Dump
You can execute batches of INSERT statements stored in .sql files (including those generated by cockroach dump) from the command line, importing data into your cluster.
$ cockroach sql --database=[database name] < statements.sql
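As a rough sketch of how such a file might be prepared, the following hypothetical Python snippet (the table name and rows are invented for illustration) groups rows into multi-row INSERT statements of about 500 rows each, which tends to perform better than one statement per row:

```python
# Sketch: group rows into multi-row INSERT statements of ~500 rows each.
# The table name "accounts" and the sample rows are assumptions for illustration.

def batch_inserts(table, rows, batch_size=500):
    """Yield one INSERT statement per batch of up to batch_size rows."""
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        values = ", ".join(
            "(" + ", ".join(str(v) for v in row) + ")" for row in batch
        )
        yield f"INSERT INTO {table} VALUES {values};"

rows = [(n, n * 10) for n in range(1200)]  # 1200 sample rows
statements = list(batch_inserts("accounts", rows))
print(len(statements))  # 1200 rows / 500 per batch -> 3 statements
```

Writing the yielded statements to a .sql file produces input suitable for the cockroach sql command above.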
Structuring each INSERT statement to include approximately 500 rows will provide the best performance.
Import from PostgreSQL Dump
If you're importing data from a PostgreSQL deployment, you can import the .sql file generated by the pg_dump command to more quickly import data.
.sql files generated by pg_dump provide better performance because they use the COPY statement instead of bulk INSERT statements.
Create PostgreSQL SQL File
Which pg_dump command you want to use depends on whether you want to import your entire database or only specific tables:
Entire database:
$ pg_dump [database] > [filename].sql
Specific tables:
$ pg_dump -t [table] [database] > [filename].sql
For more details, see PostgreSQL's documentation on pg_dump.
Reformat SQL File
After generating the .sql file, you need to perform a few editing steps before importing it:
- Remove all statements from the file besides the CREATE TABLE and COPY statements.
- Manually add the table's PRIMARY KEY constraint to the CREATE TABLE statement. This has to be done manually because PostgreSQL attempts to add the primary key after creating the table, but CockroachDB requires the primary key be defined upon table creation.
- Review any other constraints to ensure they're properly listed on the table.
- Remove any unsupported elements, such as arrays.
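For illustration, here is roughly what the primary key edit looks like; the table and column names are assumptions for the example. pg_dump typically emits the primary key as a separate ALTER TABLE statement, which you fold into the CREATE TABLE statement:

```sql
-- Before (typical pg_dump output): primary key added after table creation.
CREATE TABLE customers (
    id integer NOT NULL,
    name text
);
ALTER TABLE ONLY customers
    ADD CONSTRAINT customers_pkey PRIMARY KEY (id);

-- After (edited for CockroachDB): primary key defined inline.
CREATE TABLE customers (
    id integer NOT NULL,
    name text,
    CONSTRAINT customers_pkey PRIMARY KEY (id)
);
```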
Import Data
After reformatting the file, you can import it through psql:
$ psql -p [port] -h [node host] -d [database] -U [user] < [file name].sql
For reference, CockroachDB uses these defaults:
- [port]: 26257
- [user]: root
Import from CSV
You can import numeric data stored in .csv files by executing a bash script that reads values from the files and uses them in INSERT statements.
If your file contains non-numeric data, you can instead convert the .csv file to a .sql file (you can find free conversion software online), and then import the .sql file.
Template
This template reads 3 columns of numerical data and converts them into INSERT statements, but you can easily adapt the variables (a, b, c) to any number of columns.
> \| IFS=","; while read a b c; do echo "INSERT INTO csv VALUES ($a, $b, $c);"; done < test.csv;
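If the bash template doesn't fit your workflow, a similar sketch in Python can do the same conversion outside the SQL shell. The sample data below mirrors the example that follows; the table name csv matches that example:

```python
import csv
import io

# Sketch: read 3 numeric columns from a CSV and emit one INSERT statement
# per row, mirroring the bash template above. The sample data is an
# assumption for illustration.

def csv_to_inserts(lines, table="csv"):
    statements = []
    for row in csv.reader(lines):
        values = ", ".join(v.strip() for v in row)
        statements.append(f"INSERT INTO {table} VALUES ({values});")
    return statements

sample = io.StringIO("12, 13, 14\n10, 20, 30\n")
for stmt in csv_to_inserts(sample):
    print(stmt)
# INSERT INTO csv VALUES (12, 13, 14);
# INSERT INTO csv VALUES (10, 20, 30);
```

You could then pipe the printed statements into cockroach sql, as shown in the generic SQL dump section above.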
Example
In this SQL shell example, \! is used to look at the rows in a CSV file before creating a table, and \| is then used to insert those rows into the table.
> \! cat test.csv
12, 13, 14
10, 20, 30
> CREATE TABLE csv (x INT, y INT, z INT);
> \| IFS=","; while read a b c; do echo "INSERT INTO csv VALUES ($a, $b, $c);"; done < test.csv;
> SELECT * FROM csv;
+----+----+----+
| x | y | z |
+----+----+----+
| 12 | 13 | 14 |
| 10 | 20 | 30 |
+----+----+----+