
Java + JDBC

Generate Java code using standard JDBC APIs. Works with any JDBC-compatible database, including SQLite, DuckDB, PostgreSQL, and MySQL.

Property     Value
Generator    java/sqlite, java/duckdb, or java/postgres
Runtime      JVM (Java 17+)
API Style    Synchronous
Driver       Any JDBC driver

Add SQG to your build process and include the appropriate JDBC driver:

# Install SQG globally
pnpm add -g @sqg/sqg
build.gradle
// SQLite
implementation 'org.xerial:sqlite-jdbc:3.51.1.0'
// DuckDB
implementation 'org.duckdb:duckdb_jdbc:1.5.2.0'
// PostgreSQL
implementation 'org.postgresql:postgresql:42.7.0'
pom.xml
<!-- SQLite -->
<dependency>
  <groupId>org.xerial</groupId>
  <artifactId>sqlite-jdbc</artifactId>
  <version>3.51.1.0</version>
</dependency>

<!-- DuckDB -->
<dependency>
  <groupId>org.duckdb</groupId>
  <artifactId>duckdb_jdbc</artifactId>
  <version>1.5.2.0</version>
</dependency>

<!-- PostgreSQL -->
<dependency>
  <groupId>org.postgresql</groupId>
  <artifactId>postgresql</artifactId>
  <version>42.7.0</version>
</dependency>
sqg.yaml
version: 1
name: my-app
sql:
  - files:
      - queries.sql
gen:
  - generator: java/sqlite # or: java/duckdb, java/postgres
    output: ./src/main/java/com/myapp/db/
    config:
      package: com.myapp.db
      migrations: true # generates applyMigrations() method

Config options:

  • package - Java package name for generated classes
  • migrations - When true, generates an applyMigrations() method with built-in tracking
  1. Write your SQL

    Java JDBC Complete Example
    -- MIGRATE 1
    CREATE TABLE users (
      id INTEGER PRIMARY KEY,
      name TEXT NOT NULL,
      email TEXT UNIQUE,
      active INTEGER DEFAULT 1
    );

    -- QUERY all_users
    SELECT id, name, email, active FROM users ORDER BY name;

    -- QUERY get_user :one
    @set id = 1
    SELECT id, name, email, active FROM users WHERE id = ${id};

    -- QUERY find_active_users
    @set active = 1
    SELECT id, name, email FROM users WHERE active = ${active};

    -- EXEC create_user
    @set name = 'John Doe'
    @set email = 'john@example.com'
    INSERT INTO users (name, email) VALUES (${name}, ${email});

    -- EXEC update_user
    @set id = 1
    @set name = 'Jane Doe'
    @set email = 'jane@example.com'
    UPDATE users SET name = ${name}, email = ${email} WHERE id = ${id};

    -- EXEC delete_user
    @set id = 1
    DELETE FROM users WHERE id = ${id};
  2. Generate code

    sqg sqg.yaml
  3. Use the generated code

    import com.myapp.db.MyApp;
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class Main {
        public static void main(String[] args) throws Exception {
            // Connect to the database
            Connection conn = DriverManager.getConnection("jdbc:sqlite:app.db");

            // Apply migrations (with built-in tracking)
            MyApp.applyMigrations(conn);

            // Create a query instance
            MyApp queries = new MyApp(conn);

            // Insert data
            queries.createUser("Alice", "alice@example.com");
            queries.createUser("Bob", "bob@example.com");

            // Query data
            for (MyApp.AllUsersResult user : queries.allUsers()) {
                System.out.println(user.name() + ": " + user.email());
            }

            // Get a single row
            MyApp.GetUserResult user = queries.getUser(1);
            if (user != null) {
                System.out.println("Found: " + user.name());
            }

            // Update data
            queries.updateUser(1, "Alice Smith", "alice.smith@example.com");

            // Delete data
            queries.deleteUser(2);

            conn.close();
        }
    }

The generator creates a single Java file with:

  • Record types for each query result
  • A class with methods for each query/exec
  • Static getMigrations() method returning migration SQL strings
  • Static applyMigrations() method (when config.migrations: true) that tracks and applies migrations automatically
Generated Java
From: Java JDBC Complete Example
// ...
public List<AllUsersResult> allUsers() throws SQLException {
    try (
        var stmt = connection.prepareStatement(
            "SELECT id, name, email, active FROM users ORDER BY name;"
        )
    ) {
        try (var rs = stmt.executeQuery()) {
            var results = new ArrayList<AllUsersResult>();
            while (rs.next()) {
                results.add(
                    new AllUsersResult(
                        (Integer) rs.getObject(1),
                        (String) rs.getObject(2),
                        (String) rs.getObject(3),
                        (Integer) rs.getObject(4)
                    )
                );
            }
            return results;
        }
    }
}
// ...
SQL Type             Java Type
INTEGER, INT         Integer
BIGINT               Long
REAL, DOUBLE         Double
FLOAT                Float
TEXT, VARCHAR        String
BLOB                 byte[]
BOOLEAN              Boolean
DATE                 LocalDate
TIMESTAMP            LocalDateTime
TIMESTAMPTZ          OffsetDateTime
TIME                 LocalTime
DECIMAL, NUMERIC     BigDecimal
UUID                 UUID
TEXT[], INTEGER[]    List<String>, List<Integer>
ENUM                 Generated Java enum (see PostgreSQL ENUMs)

All types are nullable (using wrapper classes).
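Because every mapped type is a wrapper class, a SQL NULL surfaces as a Java null, so callers should null-check before unboxing. A minimal sketch (the `UserRow` record here is hypothetical, standing in for a generated result type):

```java
public class NullabilityDemo {
    // Hypothetical record standing in for a generated result type; every
    // field is a wrapper class, so a SQL NULL arrives as a Java null.
    record UserRow(Integer id, String name, Integer active) {}

    public static void main(String[] args) {
        UserRow row = new UserRow(1, "Alice", null); // active column was NULL

        // Null-check before unboxing to avoid a NullPointerException.
        boolean isActive = row.active() != null && row.active() == 1;
        System.out.println(row.name() + " active=" + isActive); // Alice active=false
    }
}
```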

Add the :batch modifier to any EXEC to also generate a JDBC batch method. The single-row method is kept alongside it, so callers can pick per use case. Works with any JDBC database (SQLite, DuckDB, PostgreSQL, …).

-- EXEC insert_user :batch
@set id = 'u1'
@set name = 'Alice'
@set email = 'alice@example.com'
INSERT INTO users (id, name, email) VALUES (${id}, ${name}, ${email});

Multi-parameter EXECs generate a params record so call sites stay readable:

public record InsertUserParams(String id, String name, String email) {}
public int insertUser(String id, String name, String email) throws SQLException { ... }
public int[] insertUserBatch(Iterable<InsertUserParams> params) throws SQLException { ... }
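A call site can then stage rows up front. A sketch under the assumption that `InsertUserParams` mirrors the generated record above (the actual `insertUserBatch` call is commented out because it needs a live `Connection`):

```java
import java.util.List;

public class BatchCallSite {
    // Mirrors the generated params record shown above (assumed shape).
    record InsertUserParams(String id, String name, String email) {}

    public static void main(String[] args) {
        List<InsertUserParams> batch = List.of(
            new InsertUserParams("u1", "Alice", "alice@example.com"),
            new InsertUserParams("u2", "Bob", "bob@example.com")
        );
        // int[] counts = queries.insertUserBatch(batch); // one JDBC batch round trip
        System.out.println(batch.size() + " rows staged"); // 2 rows staged
    }
}
```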

Single-parameter EXECs skip the record:

-- EXEC delete_user :batch
@set id = 'u1'
DELETE FROM users WHERE id = ${id};

public int[] deleteUserBatch(Iterable<String> params) throws SQLException { ... }

Parameters are bound with typed JDBC setters (setInt, setString, setLong, …) based on the inferred Java type, falling back to setObject for types without a typed setter (LocalDate, OffsetDateTime, BigDecimal, UUID, list/array types). Typed setters skip the driver’s per-row type inspection, which is noticeable on large batches.
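The distinction can be sketched with a recording stand-in for `PreparedStatement` (the `java.lang.reflect.Proxy` below is only a test double to show which setter gets invoked; a real statement comes from `connection.prepareStatement(...)`):

```java
import java.lang.reflect.Proxy;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

public class TypedSetterDemo {
    public static void main(String[] args) throws Exception {
        List<String> calls = new ArrayList<>();
        // Stand-in PreparedStatement that just records which setter was used.
        PreparedStatement stmt = (PreparedStatement) Proxy.newProxyInstance(
            PreparedStatement.class.getClassLoader(),
            new Class<?>[] { PreparedStatement.class },
            (proxy, method, methodArgs) -> {
                calls.add(method.getName());
                return null;
            });

        // Shape of the generated binding code: typed setters where one
        // exists, setObject for everything else.
        stmt.setString(1, "u1");                 // String  -> setString
        stmt.setInt(2, 42);                      // Integer -> setInt
        stmt.setObject(3, UUID.randomUUID());    // UUID    -> setObject

        System.out.println(calls); // [setString, setInt, setObject]
    }
}
```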

When to use:

  • :batch works for INSERT, UPDATE, and DELETE on any JDBC database.
  • For PostgreSQL INSERTs, add ?reWriteBatchedInserts=true to your JDBC URL for ~2.5× throughput — the driver automatically rewrites batches into multi-VALUES statements.
  • For the fastest PostgreSQL INSERTs, use TABLE ... :appender (COPY BINARY) instead — typically 5–20× faster than :batch. See the PostgreSQL insert benchmark for the numbers.
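For example, the rewrite flag is appended to the JDBC URL as an ordinary query parameter (host, port, and database name below are placeholders):

```
jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true
```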

SQG supports PostgreSQL via the java/postgres generator. The PostgreSQL-specific features are described in the sections below.

PostgreSQL requires a running server for type introspection. Provide the connection string via environment variable:

export SQG_POSTGRES_URL="postgresql://user:password@localhost:5432/mydb"

If SQG_POSTGRES_URL is not set, SQG will automatically start a PostgreSQL container using Testcontainers (requires Docker).

SQG introspects PostgreSQL’s pg_type catalog to resolve user-defined types like ENUMs and generates type-safe Java enum classes for each one:

-- MIGRATE 1
CREATE TYPE task_status AS ENUM ('pending', 'active', 'completed', 'cancelled');
CREATE TABLE tasks (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  status task_status DEFAULT 'pending'
);
-- QUERY get_tasks_by_status
@set status = 'active'
SELECT id, title, status FROM tasks WHERE status = ${status}::task_status;
-- EXEC insert_task
@set title = 'New Task'
@set status = 'pending'
INSERT INTO tasks (title, status) VALUES (${title}, ${status}::task_status);

SQG generates a Java enum with bidirectional conversion between enum values and database strings:

// Generated enum
public enum TaskStatus {
    PENDING("pending"),
    ACTIVE("active"),
    COMPLETED("completed"),
    CANCELLED("cancelled");

    private final String value;
    private static final java.util.Map<String, TaskStatus> BY_VALUE = ...;

    public String getValue() { return value; }

    public static TaskStatus fromValue(String value) {
        TaskStatus result = BY_VALUE.get(value);
        if (result == null) throw new IllegalArgumentException("Unknown value: " + value);
        return result;
    }
}

The generated query methods use the enum type directly — no raw strings needed:

// Parameters accept the enum type
queries.insertTask("Deploy v2", TaskStatus.ACTIVE);
// Results return the enum type
for (var task : queries.getTasksByStatus(TaskStatus.ACTIVE)) {
    TaskStatus status = task.status(); // TaskStatus enum, not String
    System.out.println(status.getValue()); // "active"
}

Under the hood, SQG uses getValue() when binding parameters and fromValue() when reading results, so the conversion is automatic and type-safe.
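As a self-contained illustration of that pattern (a sketch only; the generated class may differ in detail, e.g. in how the lookup map is built):

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

enum TaskStatus {
    PENDING("pending"), ACTIVE("active"), COMPLETED("completed"), CANCELLED("cancelled");

    private final String value;

    TaskStatus(String value) { this.value = value; }

    // Lookup table built once at class-load time for fromValue().
    private static final Map<String, TaskStatus> BY_VALUE =
        Stream.of(values()).collect(Collectors.toMap(TaskStatus::getValue, s -> s));

    public String getValue() { return value; }

    public static TaskStatus fromValue(String value) {
        TaskStatus result = BY_VALUE.get(value);
        if (result == null) throw new IllegalArgumentException("Unknown value: " + value);
        return result;
    }
}
```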

PostgreSQL arrays are mapped to Java List<T>:

CREATE TABLE tasks (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  tags TEXT[],
  priority_scores INTEGER[]
);

// Generated record
record GetAllTasksResult(Integer id, String title, List<String> tags, List<Integer> priorityScores) {}

// Usage
for (var task : queries.getAllTasks()) {
    System.out.println(task.tags()); // [urgent, backend]
    System.out.println(task.priorityScores()); // [10, 20, 30]
}

The TABLE :appender annotation generates high-performance bulk insert methods for PostgreSQL using PgBulkInsert, which uses PostgreSQL’s COPY BINARY protocol — the fastest way to insert data.

Additional dependency required:

// Gradle
implementation 'de.bytefish:pgbulkinsert:9.0.0'
<!-- Maven -->
<dependency>
  <groupId>de.bytefish</groupId>
  <artifactId>pgbulkinsert</artifactId>
  <version>9.0.0</version>
</dependency>

The dependency is only needed when you use -- TABLE ... :appender. If you don’t use appenders, there’s no extra dependency.

SQL:

-- MIGRATE 1
CREATE TABLE events (
  id BIGINT PRIMARY KEY,
  name TEXT NOT NULL,
  tags TEXT[],
  scores INTEGER[],
  created_at TIMESTAMPTZ
);
-- TABLE events :appender

Generated code:

// Type-safe row record
public record EventsRow(
    Long id, String name, List<String> tags,
    List<Integer> scores, OffsetDateTime createdAt
) {}

// Static mapper and writer (zero per-call allocation)
private static final PgBulkInsert.PgMapper<EventsRow> EVENTS_MAPPER = ...;
private static final PgBulkInsert.PgBulkWriter<EventsRow> EVENTS_WRITER = ...;

// Generated bulk insert method
public void bulkInsertEventsRow(Iterable<EventsRow> rows)
        throws SQLException, IOException { ... }

Usage:

var rows = List.of(
    new MyApp.EventsRow(1L, "signup", List.of("web"), List.of(10), OffsetDateTime.now()),
    new MyApp.EventsRow(2L, "login", null, null, OffsetDateTime.now()),
    new MyApp.EventsRow(3L, "purchase", List.of("mobile", "premium"), List.of(5, 8), null)
);
queries.bulkInsertEventsRow(rows);

Supported column types: All standard PostgreSQL types including BOOLEAN, SMALLINT, INTEGER, BIGINT, REAL, DOUBLE PRECISION, NUMERIC, TEXT, VARCHAR, DATE, TIMESTAMP, TIMESTAMPTZ, UUID, JSONB, and array types (TEXT[], INTEGER[], BIGINT[], etc.).

DuckDB also supports appenders via the TABLE :appender annotation — see the DuckDB-specific appender docs for details.

PostgreSQL TIMESTAMPTZ columns are mapped to OffsetDateTime (with UTC offset):

CREATE TABLE events (
  id SERIAL PRIMARY KEY,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

// Generated type uses OffsetDateTime
record EventResult(Integer id, OffsetDateTime createdAt) {}
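For display, such a value can be shifted to a local zone without changing the instant. A sketch using only java.time (the timestamp below is made up):

```java
import java.time.OffsetDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class TimestamptzDemo {
    public static void main(String[] args) {
        // Hypothetical value as it would come back from a TIMESTAMPTZ column.
        OffsetDateTime createdAt =
            OffsetDateTime.of(2024, 1, 15, 12, 30, 0, 0, ZoneOffset.UTC);

        // Convert to a display zone; the instant stays the same.
        ZonedDateTime local = createdAt.atZoneSameInstant(ZoneId.of("Europe/Berlin"));
        System.out.println(local); // 13:30 in Berlin (CET, UTC+1) in January
    }
}
```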