Hey, newbie! Looks like you’re here for some MySQL export and import to CSV stuff. Is data integration your thing? It reminds me of this awesome “Star Trek: The Next Generation” episode called “The Nth Degree.”
Here’s the deal with that episode:
Lt. Barclay gets zapped by an alien probe while investigating failures in the Argus Array, a deep-space telescope and radio antenna. Instead of killing him, the probe turns him into a super genius. But fear begins to loom as Barclay integrates his higher brain functions into the ship’s computer core. It gets even scarier when his genius takes the ship to the center of the galaxy!
The result?
It seems that their fear was misplaced. It was the work of the Cytherians, a friendly but very advanced alien race. They only want an exchange of information. So, in the end, the Enterprise “imported” advanced information from the Cytherians. At the same time, they “exported” theirs to these aliens.
Now, this isn’t exactly a picture of your everyday MySQL database chores. But I’m sure you’ll agree that, like the Cytherians, you need tools for information exchange.
The good news: MySQL export and import to CSV is nothing like dealing with a positronic brain or warp theory.
Let’s lay in a course for this journey:
- Prepare the Sample Data
- MySQL Export to CSV
  - Using the CLI and SELECT INTO OUTFILE
  - Using dbForge Studio for MySQL to Export to CSV
  - Using MySQL Workbench to Export to CSV
- MySQL Import CSV
  - Using the CLI with LOAD DATA INFILE
  - Using dbForge Studio for MySQL to Import CSV
  - Using MySQL Workbench for MySQL Import CSV
So, we have the command-line interface and two GUI tools in our arsenal.
Ready?
Then, let’s make it so.
Preparing the Data
Much like any mission in Star Trek, this one needs preparation. So, we will prepare our data for the MySQL export and import CSV examples.
But there’s no reason to fret. While we add some sci-fi flair to this article, this is not Temporal Mechanics 101. Driving you crazy is not in our mission parameters.
It’s a simple crew roster. See the structure for yourself:
CREATE TABLE Crew_Roster (
    crew_id INT AUTO_INCREMENT PRIMARY KEY,
    crew_name VARCHAR(50),
    crew_rank VARCHAR(30),
    position VARCHAR(50),
    species VARCHAR(30),
    date_joined DATE
);
Then, we’ll add some of the series’ finest crew members:
INSERT INTO Crew_Roster (crew_name, crew_rank, position, species, date_joined)
VALUES ('Jean-Luc Picard', 'Captain', 'Commanding Officer', 'Human', '2364-01-01'),
('William Riker', 'Commander', 'First Officer', 'Human', '2361-07-08'),
('Data', 'Lieutenant Commander', 'Chief Operations Officer', 'Android', '2338-06-02'),
('Worf', 'Lieutenant', 'Chief Security Officer', 'Klingon', '2364-01-01'),
('Beverly Crusher', 'Commander', 'Chief Medical Officer', 'Human', '2364-01-01'),
('Deanna Troi', 'Lieutenant Commander', 'Ship\'s Counselor', 'Betazoid', '2364-01-01'),
('Geordi La Forge', 'Lieutenant Commander', 'Chief Engineer', 'Human', '2364-01-01'),
('Guinan', 'Civilian', 'Bartender', 'El-Aurian', null),
('Wesley Crusher', 'Ensign', 'Acting Ensign', 'Human', '2364-01-01'),
('Ro Laren', 'Ensign', 'Flight Controller', 'Bajoran', '2368-05-16'),
('Reginald Barclay', 'Lieutenant', 'Holoprogramming Specialist', 'Human', '2366-09-04'),
('Miles O\'Brien', 'Chief Petty Officer', 'Transporter Chief', 'Human', '2364-01-01'),
('Tasha Yar', 'Lieutenant', 'Chief of Security', 'Human', '2364-01-01'),
('Alyssa Ogawa', 'Ensign', 'Nurse', 'Human', '2365-06-09'),
('Keiko O\'Brien', 'Civilian', 'Botanist', 'Human', '2365-06-09'),
('Alexander Rozhenko', 'Civilian', 'Child', 'Klingon', '2367-12-02'),
('Molly O\'Brien', 'Civilian', 'Child', 'Human', '2368-10-19'),
('Robin Lefler', 'Ensign', 'Mission Specialist', 'Human', '2368-01-01');
But we want some variety, so let’s add some aliens to the mix:
CREATE TABLE Alien_Encounters (
    encounter_id INT AUTO_INCREMENT PRIMARY KEY,
    species_name VARCHAR(50),
    first_contact_date DATE,
    location VARCHAR(100),
    encounter_summary TEXT,
    federation_status VARCHAR(30)
);
Then, let’s have some data for it too:
INSERT INTO Alien_Encounters (species_name, first_contact_date, location, encounter_summary, federation_status)
VALUES ('Vulcan', '2063-04-05', 'Bozeman, Montana, Earth', 'First contact with Humans after Zefram Cochrane\'s warp flight', 'Friend'),
('Klingon', '2151-05-05', 'Qo\'noS', 'Early conflicts followed by alliance and membership in Federation', 'Foe turned Friend'),
('Romulan', '2156-03-22', 'Neutral Zone', 'Historical adversaries with sporadic conflicts', 'Foe'),
('Bajoran', '2328-07-28', 'Deep Space 9, Bajoran Wormhole', 'Alliance formed after Cardassian occupation', 'Friend'),
('Ferengi', '2355-12-24', 'Deep Space 9', 'Trade partners with a complex relationship', 'Foe turned Neutral'),
('Borg', '2365-06-16', 'System J-25', 'First encounter with the Borg Collective', 'Foe'),
('Q', '2364-09-03', 'Various', 'Entity of immense power, often tests humanity', 'Neutral'),
('Andorian', '2152-06-07', 'Weytahn', 'Early ally in Federation', 'Friend'),
('Betazoid', '2265-02-17', 'Betazed', 'Allied species with empathic abilities', 'Friend'),
('Tholian', '2268-09-15', 'Tholian space', 'Encounters marked by territorial disputes', 'Foe'),
('Gorn', '2267-10-20', 'Cestus III', 'Encountered in territorial disputes', 'Foe'),
('Trill', '2250-11-11', 'Trill', 'Joined species with symbiotic relationship', 'Friend'),
('Xindi', '2153-04-24', 'Delphic Expanse', 'Early adversaries in Temporal Cold War', 'Foe turned Neutral'),
('Horta', '2267-01-17', 'Janus VI', 'Silicon-based life form encountered in mining colony', 'Friend'),
('Changeling', '2370-03-18', 'Deep Space 9', 'Founders of the Dominion, infiltrated Alpha Quadrant', 'Foe'),
('Vidiian', '2371-04-30', 'Delta Quadrant', 'Encountered in the Delta Quadrant, known for medical knowledge and organ harvesting', 'Foe'),
('Species 8472', '2373-09-19', 'Delta Quadrant', 'Threatening species from fluidic space', 'Foe'),
('Hirogen', '2374-04-03', 'Delta Quadrant', 'Nomadic species known for hunting prey', 'Foe'),
('Kazon', '2371-08-14', 'Delta Quadrant', 'Warring factions encountered in the Delta Quadrant', 'Foe');
Sorry, I can’t vouch for the accuracy of the dates; they’re just there so our data has some. And the alien species are the ones I could think of, not necessarily all from The Next Generation series.
We’re done with what we need for the sample code in MySQL export and import CSV. So, let’s move on.
MySQL Export to CSV
Exporting is like beaming the away team from the ship onto the planet’s surface. In relational database terms, it means extracting data from our database and storing it in another format.
In our case, the target format is a Comma-Separated Values or CSV file. The name can be deceiving: you’re not limited to commas as delimiters. It is still CSV if the delimiter is a tab, semicolon, or pipe symbol. Just check the reference.
CSV is plain text, meaning it’s readable by the naked eye. It’s also universal, which ensures portability between different platforms. This is both good and bad.
Being readable also makes it vulnerable and exploitable, and corruption is possible during transfer. So, handle it wisely. Eagle-eye monitoring is a must at all stages of processing.
So, how do we export data from MySQL?
Using the CLI and SELECT INTO OUTFILE
The CLI will accept the export command and MySQL will make a CSV for you. It’s like saying to the Enterprise Computer: “Computer, show me the latest crew roster aboard the Enterprise.”
BUT (All caps are intentional) you have to type in the commands and know the syntax.
The basic command to export to a CSV file is SELECT INTO OUTFILE. So, check out the syntax in the following subsection.
SELECT INTO OUTFILE Syntax
SELECT <column-list> | *
INTO OUTFILE <full-path-and-filename.CSV>
[<export-options>]
FROM <table-name>
[JOIN clauses]
[WHERE clause]
...
So, it means that it’s a normal SELECT statement. But the crucial part is the INTO OUTFILE <full-path-and-filename> and the <export-options>. That, my friend, will tell MySQL to export the rows into a CSV file.
SELECT INTO OUTFILE Export Options
Meanwhile, check out the <export-options>:
FIELDS Clause
Here you will define what the column data looks like in the CSV file. This clause has the subclauses TERMINATED BY 'string', [OPTIONALLY] ENCLOSED BY 'char', and ESCAPED BY 'char'.
- TERMINATED BY 'string': This is like saying "Hey, each piece of info in a row is separated by this character string." For example, you can use a comma, semicolon, pipe character, or tab. Imagine you’re making a shopping list and separating each item with a comma.
- [OPTIONALLY] ENCLOSED BY 'char': Think of this as wrapping each piece of data in a character, like putting quotes around words. If you use ENCLOSED BY '"', each data piece is surrounded by double quotes. It’s like turning Vulcan, Klingon, Ferengi into "Vulcan", "Klingon", "Ferengi". With the OPTIONALLY keyword, only string or text data is enclosed, not numbers.
- ESCAPED BY 'char': If you set ESCAPED BY '\\', any character that might mess things up (like a quote inside quoted text) gets a backslash stuck in front of it. Check out the example later.
LINES Clause
Here you will define how lines begin and end. This clause has the subclauses TERMINATED BY 'string' and STARTING BY 'string'.
- TERMINATED BY 'string': Think of this as saying, "This is where a line ends and a new one starts." It’s like hitting the Enter key at the end of a line in a text file. If you set LINES TERMINATED BY '\n', each row of data ends with a newline character, so the next row starts on a new line.
- STARTING BY 'string': This is like having a special marker at the beginning of each line. If you use STARTING BY '* ', every line starts with an asterisk and a space.
Now, both the FIELDS and LINES clauses are optional. If you specify neither, MySQL treats it as if you typed:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''
But if you specify both FIELDS and LINES, type FIELDS first, then LINES. Do not switch the order.
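For instance, based on those defaults, a plain export with no options should behave the same as one that spells them out explicitly. A sketch against our sample table (the file path is just an illustration):

```sql
-- Spelling out the defaults; equivalent to omitting FIELDS and LINES entirely.
SELECT *
INTO OUTFILE '/var/lib/mysql-files/crew_roster_defaults.csv'
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''
FROM Crew_Roster;
```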
Things to Remember in Using SELECT INTO OUTFILE
There’s more.
If you want a happy ending with your MySQL export to CSV, here are some things to remember:
- Permissions: Ensure the MySQL user has the FILE privilege, or you can’t create a CSV file.
- File Path: The folder and subfolders must exist, and MySQL must be able to write to them. And the path refers to a drive on the server. So, if you’re connecting to a remote server, don’t expect the CSV file on your workstation’s drive.
- Existing Files: MySQL won’t overwrite an existing file, so ensure the target path doesn’t already have a file with the same name.
With FILE privilege, folder write permissions, and not stepping on someone else’s file, it’s mostly about security. This is like moving people using a transporter in Star Trek. You can’t move people from the ship to a planet when force fields are up.
The permissions and file path requirements are needed for both MySQL export and import CSV, so remember these points.
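One more server-side setting worth knowing: the secure_file_priv system variable restricts where SELECT INTO OUTFILE can write (and where LOAD DATA INFILE can read). A quick way to check your setup; the user and host names below are placeholders:

```sql
-- Where is MySQL allowed to read/write files? An empty string means anywhere;
-- NULL means import/export operations are disabled entirely.
SHOW VARIABLES LIKE 'secure_file_priv';

-- Does the current user have the FILE privilege?
SHOW GRANTS FOR CURRENT_USER();

-- If not, an administrator can grant it (FILE is a global privilege):
GRANT FILE ON *.* TO 'your_user'@'localhost';
```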
SELECT INTO OUTFILE Examples
Let’s dive into some working sample code I made so you can test them on your workstations.
Example 1: Basic Export
The most basic example is to export to CSV without the export options. Check it out below:
SELECT
    crew_name
    ,crew_rank
    ,position
    ,date_joined
INTO OUTFILE '/var/lib/mysql-files/crew_roster.csv'
FROM Crew_Roster;
Note the path /var/lib/mysql-files. I’m using Ubuntu Linux, so it’s not like C:\SampleData\ on Windows. Except for the path, the syntax is the same whether you use Linux or Windows. As mentioned earlier, you can use any folder, like /tmp/ or C:\TEMP\, as long as it exists and MySQL can write to it. Set the permissions needed to write to your intended folder.
Here’s how it happened in the CLI:

MySQL exported 18 rows. So, here’s the output on my end:

As you can see, there are no headers. Also, a tab separates each column of data. And then, null values became \N.
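If you do want column headers, a common workaround (since SELECT ... INTO OUTFILE never writes them) is to prepend a header row with UNION ALL. A sketch against our sample table; the file name is just an illustration:

```sql
-- The first SELECT supplies literal header values; the second supplies the data.
SELECT 'crew_name', 'crew_rank', 'position', 'date_joined'
UNION ALL
SELECT crew_name, crew_rank, position, date_joined
FROM Crew_Roster
INTO OUTFILE '/var/lib/mysql-files/crew_roster_headers.csv'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
```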
Example 2: Using a Comma Delimiter with an Escape Character
This next example will use a comma delimiter. However, the data has a comma in its text value. So, we will use ESCAPED BY. I found this useful in dealing with data that will mess up the CSV output.
Here’s the code:
SELECT *
INTO OUTFILE '/var/lib/mysql-files/alien_foes.csv'
FIELDS TERMINATED BY ',' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM Alien_Encounters
WHERE federation_status='Foe';
This will export all Federation foes from our table.
Check out the output below:

See the parts enclosed by green boxes? That’s the effect of the ESCAPED BY because the data has a comma which is also our delimiter.
Example 3: Using OPTIONALLY ENCLOSED BY and STARTING BY
Enclosing a column value can be a safety net and having a marker at the beginning of a line is a nice visual cue that it’s a row of data. So, let’s have an example that will do that.
Below is the code:
SELECT *
INTO OUTFILE '/var/lib/mysql-files/alien_crew_roster.csv'
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
LINES STARTING BY '* ' TERMINATED BY '\n'
FROM Crew_Roster
WHERE species <> 'Human';
Take a look at the output below:

First, the pipe symbol separates column data. Then, text and date data are enclosed by double quotes, but not the numeric ones. This is the effect of the OPTIONALLY ENCLOSED BY. Finally, each row or line starts with an asterisk and space. This is the effect of the STARTING BY option.
That’s it for using the CLI and SELECT INTO OUTFILE to export data to CSV.
Using dbForge Studio for MySQL to Export to CSV
Before crew members go on away missions, they have a plan with several options to get the results they want. So, this time, let’s look at our next option: dbForge Studio for MySQL.
dbForge Studio for MySQL is a premier MySQL GUI tool, known for its user-friendly interface and top-of-the-line features for administrators, developers, and data analysts. It can also connect to MySQL databases in the cloud and to MariaDB.
And yes, this can do both MySQL export and import CSV using a wizard.
With dbForge Studio for MySQL, you can export from a remote server to your workstation in 14 formats, not just CSV. ODBC is also a target option, which extends the list to any ODBC-compliant format. It also provides several formatting options, including a CSV header. Finally, you can automate the export process if you need to export the same data with the same options repeatedly.
That’s a heck of a feature set to export MySQL data if you ask me.
Let’s get started with the wizard.
STEP 1: Choose the Table to Export
In dbForge Studio for MySQL, make sure the Database Explorer is open. Navigate to the database and table you want to export, right-click it, and select Export Data.

As an alternative, you can right-click the database name from the Database Explorer. Then, select Tasks -> Export Data.
STEP 2: Choose the CSV Format
The Export Data wizard will launch in another window. You need to choose the target format. In our case, it’s the CSV file.

Then, click Next.
STEP 3: Choose the Source Table
Let’s export the same table data we used in SELECT INTO OUTFILE. The next step in the wizard is to choose the source table. If you right-click the database name from the previous step, there’s no selected table. But if you right-click the table name, it will be selected in this step.
See a sample below:

If you don’t need to add another table (you can add more), click Next. You can also Cancel or go Back from this point in the wizard.
STEP 4: Configure the Output File(s)
If you choose several tables in STEP 3, you can export them into a single file or multiple files. But since we selected only one table, the Export data into separated files option is disabled. See an example below:

You have other options, like adding a timestamp to the filename or compressing the output to a Zip file. You can even encrypt it or add a password. By default, the CSV file gets the same name as the table, with a .csv extension.
You can set up another folder to store the CSV file.
When you’re done configuring the output file, click Next. If the file already exists in the target folder, you’ll be prompted to replace it or not. Choosing Yes will overwrite the existing file.
STEP 5: Specify the Format Options
You can relate to this step if you already know SELECT INTO OUTFILE. Here, you set the delimiter, add an enclosure to the column data, Include column headers, and Force field quotes.
Check out the following sample:

In the above image, we chose to include column headers. We also chose a semicolon delimiter and a double-quote enclosure. We didn’t change the encoding and the Force field quotes.
At this point, you can click Next or Export to proceed. Choosing Export will skip the rest of the options and use the default settings.
STEP 6: Configure Data Formats
This part of the wizard has two sections. One shows the column data types that dbForge Studio found based on the table structure. Check out a sample below:

You can also uncheck the columns you don’t want in the CSV file. Moreover, there will be a separate column list for each table you chose to export in STEP 3.
The other part shows the data type formats for dates, integers, currencies, and more. Check out the following sample:

You can still change the formatting. For example, change the Date format from MM/dd/yyyy to yyyy-MM-dd.
If you don’t want to change any other option, you can click Next or Export.
STEP 7: Choose to Export All Rows or a Part of It
Then, you need to choose to export all rows, a range of rows, or selected rows.

Then, click Next or Export.
STEP 8: Choose Error Handling and Logging Options
The final step before export begins is to choose the type of error handling, and if you wish to use a log file. Check out the screenshot below:

In the above illustration, we chose the default error handling and checked the option to write a report to a log file.
You can now click Export or go Back to check the options again.
There will be a progress indicator of your export process. If there are no errors, you will see a window like the one below:

You can open the CSV file by clicking Open result file. Here’s the result from my Windows station:

You can also click Export more, Show log file, or Open result folder.
Automating the MySQL Export
Do you export the same set of tables using the same options on a regular basis?
Then, this one is for you. With dbForge Studio for MySQL, you can automate the export process and run it using a scheduler.
But first, you need to follow the wizard and when you reach the finish line, click Save -> Save Template to save the export template. See a sample below:

Then, name your template file and save it. I named mine crew-roster.det.
Then, you need to configure the command-line options. So, click Save -> Save Command Line. A new window will appear like the one below:

Notice that we used the template file earlier. A password is also supplied so this command-line instruction can connect to the database when it runs. Click Save to save a batch file (.bat) that you can run using a scheduler like Windows Task Scheduler.
After that, it will run on its own on your set schedule.
That’s it for exporting to CSV files using dbForge Studio for MySQL.
Using MySQL Workbench to Export to CSV
Our third and final option to export table data to CSV files is using MySQL Workbench. This GUI tool from Oracle also uses a wizard to help you with MySQL export and import CSV.
Let’s start with exporting data. You can export one table at a time, select the columns you want, and export to CSV and JSON formats. It’s that simple.
Follow along as we guide you through the steps.
STEP 1: Choose the Table to Export
From MySQL Workbench Navigator, right-click the table you wish to export. Then, select Table Data Export Wizard. See a sample below:

This will open a new window for the export wizard.
STEP 2: Choose the Columns and the Range of Rows to Include
The next part is to choose the columns. By default, all columns are selected. You can also specify the range of rows to include: enter the Row Offset to tell the wizard which row to start at, and the Count for the number of rows to include. If you leave them blank, all rows are included.
The sample below will export all rows in the crew roster:

You can still change the table to export and select the columns you want. But if this is enough for you, click Next.
STEP 3: Choose Formatting Options
Choose the target format. In this case, it’s CSV. Then, specify the column and line separator, enclosure, and null handling. Check out the following sample:

This will export to the local machine where Workbench runs. So, just like dbForge Studio for MySQL, the default output is on the client machine.
Click Next to continue. If the CSV file exists in the folder you specified, Workbench will prompt you if you want to overwrite the file or not. Choose Yes to continue.
STEP 4: Run the Export
At this point, you are ready to export. You will see a similar window below:

Clicking Next will start the export process. You can choose to Show Logs or not to see the progress. When done, you will see a window like the one below:

Then, clicking Next will show another window like the one below:

Finally, click Finish to close the wizard. You can open your favorite text editor to open the CSV file outside MySQL Workbench.
See the result below:

The difference between this and the one from dbForge Studio is the path and filename. And the null value became blank.
That’s it for MySQL export using MySQL Workbench. And we’re also done with MySQL export.
MySQL Import CSV
MySQL import CSV is like beaming the crew from a planet back to the ship. Or in MySQL terms, extracting each row in the CSV file and writing it to a MySQL table.
I’m sharing my experience with you using LOAD DATA INFILE in the CLI and the import wizards in dbForge Studio for MySQL and MySQL Workbench.
Let’s begin.
Using the CLI with LOAD DATA INFILE to Import CSV
The opposite of SELECT INTO OUTFILE is LOAD DATA INFILE. This will import CSV data into a MySQL table.
Follow along as the syntax is presented next.
LOAD DATA INFILE Basic Syntax
Here’s how to import data from a CSV file into a MySQL table using LOAD DATA INFILE.
LOAD DATA INFILE <full-path-filename>
[REPLACE | IGNORE]
INTO TABLE <table-name>
[<import-options>]
[IGNORE <number> ROWS]
[(column1,column2,columnN)]
Here’s a short description of each part of the syntax:
- <full-path-filename>: This is the CSV file to import together with its folder location.
- REPLACE: Optional. CSV rows that match existing table rows on a unique key value replace (update) those rows; everything else is inserted. Without REPLACE or IGNORE, a duplicate key value causes an error.
- IGNORE: Optional. CSV rows that duplicate an existing row on a unique key value are discarded and not imported.
- INTO TABLE <table-name>: specifies the target table where the CSV rows will be loaded.
- [<import-options>]: Optional. This is the same as the <export-options> in SELECT INTO OUTFILE. I suggest you use the same options used in the export process to minimize the errors in your MySQL import CSV.
- [IGNORE <number> ROWS]: Optional. This is useful to ignore the first line of the CSV if they are headers and not data. So, that means specifying IGNORE 1 ROWS will start the import at line 2.
- [(column1, column2, columnN)]: Optional. Use this to specify the table columns based on the CSV column arrangement.
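For instance, if a CSV supplies only some columns, or lists them in a different order, the column list tells MySQL how to map them. A sketch, assuming a hypothetical file that holds only name, rank, and species:

```sql
-- Hypothetical CSV layout: crew_name|crew_rank|species
-- Columns not listed (position, date_joined) get their default values.
LOAD DATA INFILE '/var/lib/mysql-files/partial_crew.csv'
INTO TABLE Crew_Roster
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(crew_name, crew_rank, species);
```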
Things to Remember in Using LOAD DATA INFILE
- File Locations and Permissions: It uses the same permissions and file path requirements needed in SELECT INTO OUTFILE.
- Duplicate Rows: When anticipating existing rows in the CSV file, use REPLACE or IGNORE to handle the duplicates.
- Foreign Key Constraints: Extra care is needed with foreign keys. You may need to load the data into a temporary table and check for foreign key conflicts in a script. Some suggest disabling foreign keys before running LOAD DATA INFILE to avoid errors. But that only helps insert/update speed on very large loads; it doesn’t fix the data problem, and the fixes will still be waiting when you re-enable the keys. So, validate and fix the data first, and disable foreign keys only for write speed.
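As a sketch of that staging-table approach — note that the Ships table, the ship_id column, and the file name below are hypothetical, purely for illustration, since our sample tables have no foreign keys:

```sql
-- 1. Stage the CSV in a structure-only copy of the target table.
CREATE TABLE Crew_Roster_Staging LIKE Crew_Roster;

LOAD DATA INFILE '/var/lib/mysql-files/incoming_crew.csv'
INTO TABLE Crew_Roster_Staging
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- 2. Suppose Crew_Roster had a ship_id column referencing Ships(ship_id).
--    Find staged rows that would violate that foreign key:
SELECT s.*
FROM Crew_Roster_Staging s
LEFT JOIN Ships sh ON sh.ship_id = s.ship_id
WHERE sh.ship_id IS NULL;

-- 3. After fixing (or excluding) the offenders, move the clean rows over:
INSERT INTO Crew_Roster
SELECT * FROM Crew_Roster_Staging;
```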
LOAD DATA INFILE Examples
The following are working examples demonstrating the features of LOAD DATA INFILE.
Example 1: Import More Rows to Table
Let’s add more aliens to our table using LOAD DATA INFILE. Check out the CSV file below:

Now, let’s beam up these aliens with this code:
LOAD DATA INFILE '/var/lib/mysql-files/more_aliens.csv'
INTO TABLE Alien_Encounters
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
And the result:

Example 2: Importing CSV with Headers and Existing Key
Let’s have an example of a CSV with column headers. The primary key values also exist and will cause an error.
Here’s the sample data:

Crew IDs 16 and 17 exist in the table with different names. How does LOAD DATA INFILE deal with this?
Here’s the code:
LOAD DATA INFILE '/var/lib/mysql-files/more_crew.csv'
INTO TABLE Crew_Roster
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
The IGNORE 1 ROWS will skip the CSV header and start with line 2.
And the result is a duplicate entry error. See below:

Crew IDs 16 and 17 belong to Alexander and Molly. So, inserting them is a primary key violation.
Example 3: Using REPLACE
The previous example with a duplicate error can be handled with REPLACE if we intend to update the non-key data.
Here’s the modified code with REPLACE.
LOAD DATA INFILE '/var/lib/mysql-files/more_crew.csv'
REPLACE INTO TABLE Crew_Roster
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
The addition of REPLACE before INTO TABLE will fix the duplicate error if the intention is to update.
Check out the result below:

The MySQL import is a success. Notice that the existing rows were deleted first, then the new rows were inserted.
Using dbForge Studio for MySQL to Import CSV
dbForge Studio for MySQL has an Import Data feature that can import from 10 different file formats, including CSV and ODBC. With ODBC, it extends to any ODBC-compliant format.
The import wizard offers settings similar to the export wizard, and you can automate CSV imports, too.
But here, we will import from a CSV. Follow along with the steps.
STEP 1: Right-Click the Table you Want as a Target for MySQL Import CSV
In the Database Explorer, right-click the table you want to insert/update data from a CSV file. Then, select Import Data. Check out the sample below:

STEP 2: Choose the Source CSV File
Then, a new window will appear for the Import Wizard. It will let you choose which file format to use as the source. So, click CSV.

Then, indicate the CSV file. We’re going to use the same file as earlier.
Click Next to continue.
STEP 3: Choose the Destination
Since we chose the table in Step 1, you can click Next or Import to proceed with the default settings. Or, if you got the table wrong in Step 1, you can change it from here. If you wish, you can create a new table based on the CSV file.
See a sample below:

STEP 4: Choose Import Options
This step will let you choose delimiters, specify headers, and enclosure. You can also see a snapshot of the source data with delimiters.
Check it out below:

Our data has no headers. So, I changed the Header position to 0.
Click Next or Import to proceed.
STEP 5: Configure Data Type Formats and Null Handling
This step has 2 parts. First, the data type formats for numbers, dates, and Boolean. You can also specify the null handling. See a sample below:

The next step involves column settings. You can set formats and null handling for each column. See below:

Click Next to continue.
STEP 6: Map the CSV Columns to Table Columns
Mapping is essential to tell dbForge Studio what CSV column goes to what table column. The next step is about this.
See a sample below:

This is also the part where you specify the key for the CSV. You need to specify the key so dbForge Studio can identify the rows to update if there are any.
STEP 7: Choose the Import Mode
This can be Append, Update, Append/Update, Delete, or Repopulate. Choose which of them is best for your scenario.
See a sample below:

In my case here, I chose Append/Update to deal with both inserts and updates.
Step 8: Configure the Output
In this step, you can choose between importing the data directly to the database, writing to a script file, or opening the data import script to the internal editor. See a sample below:

Click Next or Import to proceed.
STEP 9: Configure Error Handling and Logging
This step lets you choose how to handle errors: abort, prompt the user for an action, or ignore all errors. Then, you can choose whether to generate a log file.
See a sample below:

At this point, you can Import, go Back, or Cancel.
If you choose to import, it will show a progress bar so you can see what’s happening. After that, you will see a success window like the one below:

You can close the wizard by clicking Finish. Or, click Import more to import some more data.
Automating the MySQL Import
Just like the data export wizard in dbForge Studio for MySQL, you can also do the same automation in CSV import. You save an import template and save the command-line batch file for unattended execution in a scheduler.
At this point, the MySQL import using dbForge Studio for MySQL is done.
Using MySQL Workbench for MySQL Import CSV
The third and final option to import data from a CSV file to MySQL is using MySQL Workbench.
A few things to remember:
- It can’t handle existing records. You have to truncate the table first.
- Headers are not handled properly. But the import will proceed.
Follow along as we discuss the steps.
STEP 1: Choose a Table
In the MySQL Workbench Navigator, right-click the target table for the MySQL import. See a sample below:

STEP 2: Choose the CSV File
A new window will appear. Here’s the sample screen to choose the CSV file:

Click Next to continue.
STEP 3: Choose an Existing Table or a New Table as Target
The selection will point to the table you right-clicked in STEP 1. But you can choose another table if this is a mistake, or create a new table. Check out a sample below:

Click Next to continue.
STEP 4: Map the Source to Target Columns and Configure Import Options
The next step will let you map the source to target columns. You will also see a snapshot of the data. See a sample below:

Notice the first row became column names. But the import process will always start at row 1.
If you click the wrench icon above, you can set the import options like delimiter, enclosure, etc. See a sample below:

Click Next to continue.
STEP 5: Get Ready to Start the Import
Then, the next step will appear telling you that the import is ready. You can choose to Show Log.

Click Next to continue.
The import process will proceed. If you choose to show the log, you will see where you are in the import process.

The above shows a failed import where a row has a null value. But the rest have been imported just fine.
Click Next to continue. You will see how many rows were imported and how long it took.

Then, click Finish to close the wizard.
That’s it on how to import CSV data to MySQL.
Takeaways in MySQL Export and Import CSV
The MySQL CLI, dbForge Studio for MySQL, and MySQL Workbench are all good tools for MySQL export and import CSV. You’ve learned the steps for each, with examples. It’s like beaming the away team to and from a planet.
And it’s not 23rd-century tech that’s beyond our reach today. You got this.
The most comprehensive and easiest tool for MySQL export and import CSV is dbForge Studio for MySQL. And it’s free to try today. You can play with its other features as well and feel your productivity skyrocket. Download it here.
Tags: database management, import csv into mysql, mariadb, mysql, mysql data export, mysql data import, mysql export csv, mysql export table to csv, mysql export to csv, mysql export to csv command line, mysql import csv Last modified: April 14, 2025



