Importing Structure and Data in phpMyAdmin 3.3.x for Effective MySQL Management

Marc Delisle

October 2010


Mastering phpMyAdmin 3.3.x for Effective MySQL Management


A complete guide to get started with phpMyAdmin 3.3 and master its features

  • The best introduction to phpMyAdmin available
  • Written by the project leader of phpMyAdmin, and improved over several editions
  • A step-by-step tutorial for manipulating data with phpMyAdmin
  • Learn to do things with your MySQL database and phpMyAdmin that you didn't know were possible!


In general, an exported file can be imported either into the same database it came from or into any other database; the XML format is an exception to this, and a workaround is given in the XML section later in this chapter. Also, a file generated by an older phpMyAdmin version should pose no problem to the current version, but the difference between the MySQL version at the time of export and the one at the time of import may play a bigger role regarding compatibility. It's difficult to predict how future MySQL releases will change the language's syntax, which could result in import challenges.

The import feature can be accessed from several panels:

  • The Import menu available from the homepage, the Database view, or the Table view
  • The Import files menu offered inside the Query window

An import file may contain the DELIMITER keyword. This enables phpMyAdmin to mimic the mysql command-line interpreter. The DELIMITER separator is used to delineate the part of the file containing a stored procedure, as these procedures can themselves contain semicolons.
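As a sketch, an import file carrying a stored procedure might use DELIMITER like this (the procedure name and its body are illustrative):

```sql
-- Switch the statement separator so the semicolons inside the
-- procedure body are not taken as end-of-statement markers
DELIMITER //
CREATE PROCEDURE count_books()
BEGIN
  SELECT COUNT(*) FROM book;
END //
-- Restore the default separator for the rest of the file
DELIMITER ;
```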

The default values for the Import interface are defined in $cfg['Import'].

Before examining the actual import dialog, let's discuss some issues regarding limits.

Limits for the transfer

When we import, the source file is usually on our client machine and therefore must travel to the server via HTTP. This transfer takes time and uses resources that may be limited in the web server's PHP configuration.

Instead of using HTTP, we can upload our file to the server by using a protocol such as FTP, as described in the Reading files from a web server upload directory section. This method circumvents the web server's PHP upload limits.

Time limits

First, let's consider the time limit. The $cfg['ExecTimeLimit'] configuration directive assigns, by default, a maximum execution time of 300 seconds (five minutes) to any phpMyAdmin script, including the scripts that process data after the file has been uploaded. A value of 0 removes the limit and, in theory, gives us infinite time to complete the import operation. If the PHP server is running in safe mode, modifying $cfg['ExecTimeLimit'] will have no effect, because the limits set in php.ini or in the user-related web server configuration files (such as .htaccess or the virtual host configuration files) take precedence over this parameter.
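As a sketch, the directive can be raised in phpMyAdmin's configuration file, config.inc.php (the value shown is illustrative):

```php
// Allow phpMyAdmin scripts to run for up to 10 minutes;
// a value of 0 would remove the limit entirely
$cfg['ExecTimeLimit'] = 600;
```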

Of course, the time it effectively takes depends on two key factors:

  • Web server load
  • MySQL server load

The time taken by the file as it travels between the client and the server does not count as execution time, because the PHP script only starts executing after the file has been received on the server. Therefore, the $cfg['ExecTimeLimit'] parameter affects only the time used to process the data (such as decompression or sending it to the MySQL server).

Other limits

The system administrator can use the php.ini file or the web server's virtual host configuration file to control uploads on the server.

The upload_max_filesize parameter specifies the maximum size of a file that can be uploaded via HTTP. This one is obvious, but another, less obvious parameter is post_max_size. As HTTP uploading is done via the POST method, this parameter may limit our transfers as well.

The memory_limit parameter exists to prevent web server child processes from grabbing too much of the server's memory; phpMyAdmin runs inside a child process. Thus, the handling of normal file uploads, especially compressed dumps, can be compromised by giving this parameter a small value. No single preferred value can be recommended here; the value depends on the size of the uploaded data we want to handle and on the size of the physical memory. The memory limit can also be tuned via the $cfg['MemoryLimit'] parameter, as seen in Chapter 6, Exporting Structure and Data (Backup).

Finally, file uploads must be allowed by setting file_uploads to On; otherwise, phpMyAdmin won't even show the Location of the text file dialog. It would be useless to display this dialog, as the connection would be refused later by the PHP component of the web server.
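Putting these directives together, a php.ini fragment tuned for larger imports might look like this (all values are illustrative and should be sized to the dumps you expect to handle):

```ini
; Allow HTTP uploads at all
file_uploads = On
; Maximum size of a single uploaded file
upload_max_filesize = 64M
; POST body limit; must be at least upload_max_filesize
post_max_size = 64M
; Per-script memory ceiling; compressed dumps need extra headroom
memory_limit = 128M
```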

Handling big export files

If the file is too big, there are ways in which we can resolve the situation. If the original data is still accessible via phpMyAdmin, we could use phpMyAdmin to generate smaller CSV export files, choosing the Dump n rows starting at record # n dialog. If that is not possible, we could use a spreadsheet program or a text editor to split the file into smaller sections. Another possibility is to use the upload directory mechanism, which accesses the directory defined in $cfg['UploadDir'].

In recent phpMyAdmin versions, the Partial import feature can also solve this file size problem. By selecting the Allow interrupt... checkbox, the import process will interrupt itself if it detects that it's close to the time limit. We can also specify a number of queries to skip from the start, in case we successfully import a number of rows and wish to continue from that point.

Uploading into a temporary directory

On a server, a PHP security feature called open_basedir (which limits the files that can be opened by PHP to the specified directory tree) can impede the upload mechanism. In this case, or if uploads are problematic for any other reason, the $cfg['TempDir'] parameter can be set to the path of a temporary directory. This is probably a subdirectory of phpMyAdmin's main directory, into which the web server is allowed to put the uploaded file.
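A minimal sketch of such a setting in config.inc.php (the directory name is illustrative and must be writable by the web server):

```php
// Let the web server store uploaded files in a directory that
// open_basedir allows phpMyAdmin to read (path is illustrative)
$cfg['TempDir'] = './tmp/';
```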

Importing SQL files

Any file containing MySQL statements can be imported via this mechanism. This format is more commonly used for backup/restore purposes. The relevant dialog is available in the Database view or the Table view, via the Import subpage, or in the Query window.

There is no relation between the currently selected table (here, author) and the actual contents of the SQL file that will be imported. All of the contents of the SQL file will be imported, and it's those contents that determine which tables or databases are affected. However, if the imported file does not contain any SQL statement to select a database, all statements in the imported file will be executed on the currently selected database.

Let's try an import exercise. First, we make sure that we have a current SQL export of the book table (as explained in Chapter 6, Exporting Structure and Data (Backup)). This export file must contain the structure and the data. Then we drop the book table—yes, really! We could also simply rename it. (See Chapter 9, Performing Table and Database Operations, for the procedure.)

Now it's time to import the file back. We should be on the Import subpage, where we can see the Location of the text file dialog. We just have to hit the Browse button and choose our file.

phpMyAdmin is able to detect which compression method (if any) has been applied to the file. Depending on the phpMyAdmin version and the extensions available in the PHP component of the web server, the formats that the program can decompress vary.

However, to import successfully, phpMyAdmin must be informed of the character set of the file to be imported. The default value is utf8, but if we know that the import file was created with another character set, we should specify it here.

A SQL compatibility mode selector is available at import time. This mode should be adjusted to match the actual data that we are about to import, according to the type of the server where the data was previously exported.

Another option, Do not use AUTO_INCREMENT for zero values, is selected by default. If we have a value of zero in a primary key and we want it to stay zero instead of being auto-incremented, we should use this option.
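This behavior corresponds to MySQL's NO_AUTO_VALUE_ON_ZERO SQL mode; a sketch of the effect (table and values are illustrative):

```sql
-- With this mode set, an explicit 0 in an AUTO_INCREMENT column
-- is stored as 0 instead of being replaced by the next sequence value
SET SESSION sql_mode = 'NO_AUTO_VALUE_ON_ZERO';
INSERT INTO author (id, name) VALUES (0, 'Placeholder Author');
```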

To start the import, we click on Go. The import procedure continues and we receive a message: Import has been successfully finished, 2 queries executed. We can browse our newly-created tables to confirm the success of the import operation.

The file could be imported for testing in a different database or even on another MySQL server.

Importing CSV files

In this section, we will examine how to import CSV files. There are two possible methods—CSV and CSV using LOAD DATA. The first method is implemented internally by phpMyAdmin and is the recommended one for its simplicity. With the second method, phpMyAdmin receives the file to be loaded and passes it to MySQL. In theory, this method should be faster. However, it has more requirements due to MySQL itself (see the Requirements subsection of the CSV using LOAD DATA section).

Differences between SQL and CSV formats

There are some differences between the SQL and CSV formats. The CSV file format contains data only, so we must already have an existing table in place. This table does not need to have the same structure as the original table (from which the data comes); the Column names dialog enables us to choose which columns are affected in the target table.

Because the table must exist prior to the import, the CSV import dialog is available only from the Import subpage in the Table view, and not in the Database view.

Exporting a test file

Before trying an import, let's generate an author.csv export file from the author table. We use the default values in the CSV export options. We can then Empty the author table—we should avoid dropping this table because we still need the table structure.


From the author table menu, we select Import and then CSV:

We can influence the behavior of the import in a number of ways. By default, importing does not modify existing data (based on primary or unique keys). However, the Replace table data with file option instructs phpMyAdmin to use the REPLACE statement instead of the INSERT statement, so that existing rows are replaced with the imported data.

Using Ignore duplicate rows, INSERT IGNORE statements are generated. These cause MySQL to ignore any duplicate key problems during insertion. A duplicate key from the import file does not replace existing data, and the procedure continues for the next line of CSV data.
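As a sketch, here is the difference between the two options in terms of the generated statements (values drawn from the example rows shown later in this section):

```sql
-- Replace table data with file: an existing row with the same
-- primary key is overwritten by the imported row
REPLACE INTO author VALUES ('1', 'John Smith', '+01 445 789-1234');

-- Ignore duplicate rows: a duplicate-key row from the file is
-- silently skipped and the import continues with the next line
INSERT IGNORE INTO author VALUES ('1', 'John Smith', '+01 445 789-1234');
```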

We can also specify the character that terminates each field, the character that encloses data, and the character that escapes the enclosing character. Usually, this is \. For example, with a double quote as the enclosing character, a data field containing a double quote must be expressed as "some data \" some other data".

For Lines terminated by, recent versions of phpMyAdmin offer the auto choice, which should be tried first, as it detects the end-of-line character automatically. We can also specify manually which characters terminate the lines. The usual choice is \n for UNIX-based systems, \r\n for DOS or Windows systems, and \r for Mac-based systems (up to Mac OS 9). If in doubt, we can use a hexadecimal file editor on our client computer (not part of phpMyAdmin) to examine the exact codes.

By default, phpMyAdmin expects a CSV file with the same number of fields and the same field order as the target table. However, this can be changed by entering a comma-separated list of column names in Column names, respecting the source file format. For example, let's say our source file contains only the author ID and the author name information:

"1","John Smith"
"2","Maria Sunshine"

We'd have to put id, name in Column names in order to match the source file.

When we click on Go, the import is executed and we receive a confirmation. We might also see the actual INSERT queries generated if the total size of the file is not too big.

Import has been successfully finished, 2 queries executed.
INSERT INTO `author` VALUES ('1', 'John Smith',
'+01 445 789-1234')# 1 row(s) affected.

INSERT INTO `author` VALUES ('2', 'Maria Sunshine',
'333-3333')# 1 row(s) affected.

CSV using LOAD DATA


With this method, phpMyAdmin relies on the server's LOAD DATA INFILE or LOAD DATA LOCAL INFILE mechanisms to do the actual import, instead of processing the data internally. These statements are the fastest way of importing text in MySQL. They cause MySQL to start a read operation either from a file located on the MySQL server (LOAD DATA INFILE) or from another place (LOAD DATA LOCAL INFILE) which, in this context, is always the web server's file system. If the MySQL server is located on a computer other than the web server, we won't be able to use the LOAD DATA INFILE mechanism.
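As a sketch, a statement of the kind phpMyAdmin builds for the LOCAL variant looks like this (the temporary file name and the field/line options are illustrative):

```sql
LOAD DATA LOCAL INFILE '/opt/php-upload-tmp/phpA1B2C3'
  INTO TABLE author
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
  LINES TERMINATED BY '\n';
```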


Requirements

Relying on the MySQL server has some consequences. Using LOAD DATA INFILE requires that the logged-in user possess the global FILE privilege. Also, the file itself must be readable by the MySQL server's process.
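A sketch of the privilege grant a database administrator would issue (the account name is illustrative):

```sql
-- The FILE privilege is global, so it is granted ON *.*
GRANT FILE ON *.* TO 'marc'@'localhost';
```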

The use of the LOCAL modifier in LOAD DATA LOCAL INFILE must be allowed by both the MySQL server and the MySQL client library used by PHP.

Both LOAD methods are available from the phpMyAdmin LOAD interface, which tries to choose the best default option.

Using the LOAD DATA interface

We select Import from the author table menu. Choosing CSV using LOAD DATA brings up the following dialog:

The available options have already been covered in the CSV section.

In the familiar Location of the text file question, we choose our author.csv file.

Finally, we can choose the LOAD method, as discussed earlier, by selecting the Use LOCAL keyword option. We then click on Go.

If all goes well, we can see the confirmation screen as follows:

This screen shows the exact LOAD DATA LOCAL INFILE statement used. Here is what has happened:

  • We chose author.csv.
  • The contents of this file were transferred over HTTP and received by the web server.
  • The PHP component inside the web server saved this file to a work directory (here /opt/php-upload-tmp/) and gave it a temporary name.
  • phpMyAdmin, informed of the location of this working file, built a LOAD DATA LOCAL INFILE command and sent it to MySQL. Note that just one query was executed, which loaded many rows.
  • The MySQL server read and loaded the contents of the file into our target table. It then returned the number of affected rows (2), which phpMyAdmin displayed in the results page.
Importing other formats

In addition to SQL and CSV formats, phpMyAdmin can import Open Document Spreadsheet, Excel, and XML files. However, these files need to have been exported by phpMyAdmin itself, or to closely follow what phpMyAdmin does when exporting.

Open Document Spreadsheet

By default, when we export via phpMyAdmin in this format, the Put fields names in the first row option is not selected. This means that the exported file contains only data. At import time, a corresponding option, Column names in first row, is available and should not be selected if the file does not contain the column names in its first row.

However, if the exported file does contain the column names, we can select this option. phpMyAdmin will then do the following:

  • Create a table, using the file name (author.ods) as the table name (author)
  • Use the first row's column contents as column names for this table
  • Determine each column's type and appropriate size, based on the data itself
  • Insert the data into the table

Other import options exist to indicate what should be done with empty rows and with data containing percentages or currency values.


Excel

For the Excel format, the technique about column names being part of the export file, as discussed in the previous section, applies as well.

XML

The amount of structural information that can be created by importing an XML file depends on the options that were chosen at export time. Indeed, if the Export Structure Schemas option was selected, the exact CREATE TABLE statement is placed in the exported file. Therefore, the same table structure is available in the restored table.

Likewise, if the Export contents option was selected, the whole data is there in the XML file, ready to be imported. There are no options available at import time; as XML is a self-describing format, phpMyAdmin can correctly interpret what's in the file and react appropriately.

As the original database name is part of the XML export, the current phpMyAdmin version only supports importing an XML file into the database from which the export originated. To import into a different database, we first need to use a text editor and change the database name inside this line:

<pma:database name="marc_book" collation="latin1_swedish_ci"

Reading files from a web server upload directory

To get around cases where uploads are completely disabled by a web server's PHP configuration, or where upload limits are too small, phpMyAdmin can read uploaded files from a special directory located on the web server's file system. This mechanism is applicable for SQL and CSV imports.

We first specify the directory name of our choice in the $cfg['UploadDir'] parameter, for example, './upload'. We can also use the %u string, as seen in Chapter 6, Exporting Structure and Data (Backup), to represent the user's name.
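A sketch of this setting in config.inc.php (directory name as in the example above):

```php
// Read import files from ./upload; with '%u', each user gets
// their own subdirectory, e.g. ./upload/marc
$cfg['UploadDir'] = './upload/%u';
```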

Now let's go back to the Import subpage. We get an error message: The directory you set for upload work cannot be reached.

This error message is expected, as the directory does not exist yet. It is supposed to have been created inside the current phpMyAdmin installation directory. The message might also indicate that the directory exists but can't be read by the web server. (In PHP safe mode, the owner of the directory and the owner of the phpMyAdmin-installed scripts must be the same.)

Using an SFTP or FTP client, we create the necessary directory and can now upload a file there (for example, book.sql), bypassing any PHP timeouts or maximum upload limits. Note that the file itself must have permissions that allow the web server to read it; in most cases, the easiest way is to allow everyone to read the file.

Refreshing the Import subpage brings up the following screen:

Clicking on Go should execute the file.

Automatic decompression is also available for files located in the upload directory. The file names should have extensions such as .bz2, .gz, .sql.bz2, or .sql.gz. Using a double extension (.sql.bz2) is a better way to indicate that a .sql file was produced and then compressed, as we can see all of the steps used to generate this file.

Displaying an upload progress bar

Especially when importing a large file, it's interesting to have visual feedback on the progress of the upload. Please note that the progress bar discussed here informs us only about the uploading part, which is a subset of the whole import operation.

Having a JavaScript-enabled browser is a requirement for this feature. Moreover, the web server's PHP component must have an extension that can report upload progress, such as APC (covered next).

phpMyAdmin uses AJAX techniques to fetch progress information and then displays it as part of the File to import dialog. The number of bytes uploaded, the total number of bytes, and the percentage uploaded are displayed below the bar.

Configuring APC

A few php.ini directives play an important role in upload progress. First, the apc.rfc1867 directive must be set to On or true; otherwise, this extension won't report upload progress to the calling script. When set to On, the extension updates an APC user cache entry with the upload status information.

The frequency of the updates can be set via the apc.rfc1867_freq directive, which can take the form of a percentage of the total file size (for example, apc.rfc1867_freq = "10%") or a size in bytes (the suffixes k for kilobytes, m for megabytes, and g for gigabytes are accepted). A value of 0 indicates that updates should occur as often as possible, which looks interesting but in reality may slow down the upload.
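A php.ini sketch of these two directives (the frequency value is illustrative):

```ini
; Report upload progress to scripts via an APC user cache entry
apc.rfc1867 = On
; Update the progress entry after every 10% of the file
apc.rfc1867_freq = "10%"
```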

This very notion of update frequency explains why the bar progresses in chunks rather than continuously when using this mechanism.

Displaying a character set dialog

The $cfg['AllowAnywhereRecoding'] parameter also affects importing. When set to true, this parameter displays a character set dialog where we can specify the character set in which the file is encoded. We must indicate the correct character set to avoid garbled data after the import operation. The default value of the character set for this dialog is drawn from the $cfg['Import']['charset'] directive.


In this article, we covered:

  • Various options in phpMyAdmin that allow us to import data
  • The different mechanisms involved in importing files
  • The limits that we might hit when performing a transfer, and ways to bypass these limits
