
5.6. Storing Binary Data

PostgreSQL provides two distinct ways to store binary data. Binary data can be stored in a table using PostgreSQL's binary data type bytea, or by using the Large Object feature, which stores the binary data in a separate table in a special format and refers to it by storing a value of type OID in your table.

In order to determine which method is appropriate you need to understand the limitations of each method. The bytea data type is not well suited for storing very large amounts of binary data. While a column of type bytea can hold up to 1 GB of binary data, it would require a huge amount of memory to process such a large value. The Large Object method for storing binary data is better suited to storing very large values, but it has its own limitations. Specifically, deleting a row that contains a Large Object does not delete the Large Object; deleting the Large Object is a separate operation that needs to be performed. Large Objects also have some security issues, since anyone connected to the database can view and/or modify any Large Object, even if they don't have permission to view or update the row containing the Large Object reference.

7.2 is the first release of the JDBC Driver that supports the bytea data type. Its introduction in 7.2 changed the driver's behavior compared to previous releases. In 7.2 the methods getBytes(), setBytes(), getBinaryStream(), and setBinaryStream() operate on the bytea data type; in 7.1 and earlier they operated on the OID data type associated with Large Objects. It is possible to revert the driver to the old 7.1 behavior by setting the compatible property on the Connection to a value of 7.1.
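
For example, here is a minimal sketch of enabling the 7.1 behavior by passing compatible as a connection property when opening the connection (the URL, user name, and password shown are placeholders):

Properties props = new Properties();
props.setProperty("user", "myuser");
props.setProperty("password", "mypassword");
// Revert to the 7.1 behavior: the byte[] and stream methods then operate on
// Large Objects (OID) instead of bytea
props.setProperty("compatible", "7.1");
Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/mydb", props);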

To use the bytea data type you should simply use the getBytes(), setBytes(), getBinaryStream(), or setBinaryStream() methods.

To use the Large Object functionality you can use either the LargeObject API provided by the PostgreSQL JDBC Driver or the getBlob() and setBlob() methods.

Important: For PostgreSQL, you must access Large Objects within an SQL transaction. You would open a transaction by using the setAutoCommit() method with an input parameter of false.

Note: In a future release of the JDBC Driver, the getBlob() and setBlob() methods may no longer interact with Large Objects and may instead work on the bytea data type, so it is recommended that you use the LargeObject API if you intend to use Large Objects.
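
For example, here is a minimal sketch of reading a Large Object through the standard JDBC Blob interface, assuming the current (7.2) behavior in which getBlob() maps to a Large Object referenced by an OID column; the query reuses the imagesLO table from Example 5-4 below:

conn.setAutoCommit(false);   // Large Objects must be accessed within a transaction
PreparedStatement ps = conn.prepareStatement("SELECT imgOID FROM imagesLO WHERE imgname=?");
ps.setString(1, "myimage.gif");
ResultSet rs = ps.executeQuery();
if (rs.next()) {
    Blob blob = rs.getBlob(1);
    // Blob offsets are 1-based in the JDBC API
    byte[] data = blob.getBytes(1, (int) blob.length());
    // use the data here
}
rs.close();
ps.close();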

Example 5-4. Binary Data Examples

For example, suppose you have a table containing the file name of an image and you also want to store the image in a bytea column:

CREATE TABLE images (imgname text, img bytea);

To insert an image, you would use:

File file = new File("myimage.gif");
FileInputStream fis = new FileInputStream(file);
PreparedStatement ps = conn.prepareStatement("INSERT INTO images VALUES (?, ?)");
ps.setString(1, file.getName());
ps.setBinaryStream(2, fis, (int) file.length());
ps.executeUpdate();
ps.close();
fis.close();

Here, setBinaryStream() transfers a set number of bytes from a stream into the column of type bytea. This could also have been done using the setBytes() method if the contents of the image were already in a byte[].
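
For example, a minimal sketch of the same insert using setBytes(), assuming the image is first read into a byte[]:

File file = new File("myimage.gif");
FileInputStream fis = new FileInputStream(file);
// Read the whole file into memory (fine for small images, not for very large ones)
byte[] imgBytes = new byte[(int) file.length()];
int off = 0;
while (off < imgBytes.length) {
    int n = fis.read(imgBytes, off, imgBytes.length - off);
    if (n < 0) break;
    off += n;
}
fis.close();

PreparedStatement ps = conn.prepareStatement("INSERT INTO images VALUES (?, ?)");
ps.setString(1, file.getName());
ps.setBytes(2, imgBytes);
ps.executeUpdate();
ps.close();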

Retrieving an image is even easier. (We use PreparedStatement here, but the Statement class can equally be used.)

PreparedStatement ps = conn.prepareStatement("SELECT img FROM images WHERE imgname=?");
ps.setString(1, "myimage.gif");
ResultSet rs = ps.executeQuery();
while (rs.next()) {
    byte[] imgBytes = rs.getBytes(1);
    // use the data in some way here
}
rs.close();
ps.close();

Here the binary data was retrieved as a byte[]. You could have used an InputStream object instead.
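
For example, a minimal sketch of the same retrieval using getBinaryStream(), which returns the column value as a stream rather than a byte[] (the 2048-byte buffer size is arbitrary):

PreparedStatement ps = conn.prepareStatement("SELECT img FROM images WHERE imgname=?");
ps.setString(1, "myimage.gif");
ResultSet rs = ps.executeQuery();
while (rs.next()) {
    InputStream in = rs.getBinaryStream(1);
    byte[] buf = new byte[2048];
    int n;
    while ((n = in.read(buf)) > 0) {
        // process the n bytes in buf here
    }
    in.close();
}
rs.close();
ps.close();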

Alternatively, you might be storing a very large file and want to use the LargeObject API to store it:

CREATE TABLE imagesLO (imgname text, imgOID OID);

To insert an image, you would use:

// All LargeObject API calls must be within a transaction
conn.setAutoCommit(false);

// Get the Large Object Manager to perform operations with
LargeObjectManager lobj = ((org.postgresql.PGConnection)conn).getLargeObjectAPI();

// Create a new Large Object
int oid = lobj.create(LargeObjectManager.READ | LargeObjectManager.WRITE);

// Open the Large Object for writing
LargeObject obj = lobj.open(oid, LargeObjectManager.WRITE);

// Now open the file
File file = new File("myimage.gif");
FileInputStream fis = new FileInputStream(file);

// Copy the data from the file to the Large Object
byte[] buf = new byte[2048];
int s;
int tl = 0;   // total number of bytes written (informational only)
while ((s = fis.read(buf, 0, 2048)) > 0) {
    obj.write(buf, 0, s);
    tl += s;
}

// Close the large object
obj.close();

// Now insert the row into imagesLO
PreparedStatement ps = conn.prepareStatement("INSERT INTO imagesLO VALUES (?, ?)");
ps.setString(1, file.getName());
ps.setInt(2, oid);
ps.executeUpdate();
ps.close();
fis.close();

// Commit the transaction so the row and the Large Object are persisted
conn.commit();

Retrieving the image from the Large Object:

// All LargeObject API calls must be within a transaction
conn.setAutoCommit(false);

// Get the Large Object Manager to perform operations with
LargeObjectManager lobj = ((org.postgresql.PGConnection)conn).getLargeObjectAPI();

PreparedStatement ps = conn.prepareStatement("SELECT imgOID FROM imagesLO WHERE imgname=?");
ps.setString(1, "myimage.gif");
ResultSet rs = ps.executeQuery();
while (rs.next()) {
    // Open the Large Object for reading
    int oid = rs.getInt(1);
    LargeObject obj = lobj.open(oid, LargeObjectManager.READ);

    // Read the data
    byte[] buf = new byte[obj.size()];
    obj.read(buf, 0, obj.size());
    // do something with the data read here

    // Close the object
    obj.close();
}
rs.close();
ps.close();
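
Finally, as noted above, deleting a row from imagesLO does not remove the Large Object it refers to. Here is a minimal sketch of cleaning up both the row and the Large Object, assuming the LargeObjectManager's delete() method is used and the imgOID value is looked up before the row is removed:

// All LargeObject API calls must be within a transaction
conn.setAutoCommit(false);
LargeObjectManager lobj = ((org.postgresql.PGConnection)conn).getLargeObjectAPI();

// Find the Large Objects referenced by the rows we are about to delete
PreparedStatement sel = conn.prepareStatement("SELECT imgOID FROM imagesLO WHERE imgname=?");
sel.setString(1, "myimage.gif");
ResultSet rs = sel.executeQuery();
while (rs.next()) {
    // Delete the Large Object itself; deleting the row alone would not do this
    lobj.delete(rs.getInt(1));
}
rs.close();
sel.close();

// Now delete the row that referenced it
PreparedStatement del = conn.prepareStatement("DELETE FROM imagesLO WHERE imgname=?");
del.setString(1, "myimage.gif");
del.executeUpdate();
del.close();

conn.commit();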