Oracle SQL Loader is an excellent tool for loading large amounts of data into an Oracle table from a variety of input formats.  This blog entry is simply a collection of tips gathered from other posts on the web.  For my example, I needed to load gigabytes of data into Oracle (approximately 7 million records).  Using a Java process was just too slow, so this was a perfect time to use SQL Loader.

Oracle SQL Loader Requirements

  • The Input File(s), in our case a fixed field length file
  • An Oracle Control file that you will create
  • An Oracle table to load the data into

Database Table

Our table was defined as follows:

CREATE TABLE mycustomtable (
 field1 VARCHAR2(32) NOT NULL,
 field2 VARCHAR2(40),
 field3 VARCHAR2(30)
);

The input file that we used was in fixed-field/fixed-record-length format.  You can easily use CSV files as well; refer to the SQL Loader documentation for more on the syntax.

Control File

LOAD DATA
INFILE 'your-input-file.dat'
REPLACE INTO TABLE mycustomtable
(
 field1 POSITION (286:317) CHAR,
 field2 POSITION (107:146) CHAR,
 field3 POSITION (77:106) CHAR
)

The input file contains a number of other fields, but we only want three of them, and the numbers in the control file (for example, 286:317) represent the start and end position of each field in the input file.  REPLACE INTO means we want to empty all data in the table before loading the input file.  If you are loading more than one input file, you can use APPEND INTO instead of REPLACE INTO.  Save this file with a .ctl extension.
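As noted earlier, SQL Loader also handles delimited input.  A CSV version of the control file above might look like the sketch below: the POSITION clauses are replaced by a FIELDS TERMINATED BY clause, and the input file name is a placeholder.

```
LOAD DATA
INFILE 'your-input-file.csv'
REPLACE INTO TABLE mycustomtable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
 field1,
 field2,
 field3
)
```

Here the fields are picked up in the order they appear in each CSV record rather than by byte position.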

The SQL Loader Command

Now run something similar to this:

sqlldr userid=schemausername/password@sid control=your-saved-control-file.ctl log=sqlload.log

Let’s take a closer look at this command:

  1. Replace schemausername with your DB schema user name for your Oracle database.
  2. Replace password with that DB schema user's password.
  3. Replace sid with your Oracle service name, i.e. your database's connect identifier.
  4. your-saved-control-file.ctl is the control file you saved in the previous step.
  5. sqlload.log is any file name you choose for the execution log.
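To keep the credentials and file names from the steps above in one place, you can assemble the command in a small shell script.  This is a sketch; every value is a placeholder to replace with your own.

```shell
#!/bin/sh
# Placeholders -- replace each value with your own.
DB_USER="schemausername"                # DB schema user name
DB_PASS="password"                      # DB schema user's password
DB_SID="sid"                            # Oracle service name
CTL_FILE="your-saved-control-file.ctl"  # control file you saved above
LOG_FILE="sqlload.log"                  # name for the execution log

# Assemble the sqlldr command so it can be reviewed before running.
CMD="sqlldr userid=${DB_USER}/${DB_PASS}@${DB_SID} control=${CTL_FILE} log=${LOG_FILE}"
echo "$CMD"    # review the command, then run it with: eval "$CMD"
```

Echoing the command first is a small safety habit: it lets you confirm the connect string and file names before the load actually touches the table.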

When the command completes, a series of messages will show on the screen indicating how many records were loaded into the table.  It runs through pretty quickly.  When it's done you may see a .bad file, which contains all of the records that did not load into the table (e.g. constraint violations).  You should also review the log file so you can see how many records were successful, any failures, and so on.
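A quick way to pull the summary counts out of the log is to grep for the "Rows ... loaded" lines.  This sketch assumes the typical wording of the SQL Loader log summary; the excerpt below is a hypothetical example, and in practice you would grep sqlload.log itself.

```shell
#!/bin/sh
# Hypothetical excerpt of a SQL Loader log summary (wording assumed from
# typical sqlldr output) -- in practice, grep your real sqlload.log.
log='  7000000 Rows successfully loaded.
  12 Rows not loaded due to data errors.'

# Print just the summary count lines.
echo "$log" | grep -E 'Rows (successfully loaded|not loaded)'
```

If the pattern finds nothing in your log, check the exact wording your SQL Loader version uses for its summary section.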

Hope this helped out!