pg_incremental

Incremental Processing by Crunchy Data

Overview

| Package        | Version | Category | License    | Language |
|----------------|---------|----------|------------|----------|
| pg_incremental | 1.5.0   | FEAT     | PostgreSQL | C        |

| ID   | Extension      | Bin | Lib | Load | Create | Trust | Reloc | Schema     |
|------|----------------|-----|-----|------|--------|-------|-------|------------|
| 2850 | pg_incremental | No  | Yes | No   | Yes    | No    | No    | pg_catalog |

Related: age, hll, rum, pg_graphql, pg_jsonschema, jsquery, pg_hint_plan

pg_cron is optional since v1.3 and only required for scheduled pipelines.

Version

| Type | Repo   | Version | PG Ver             | Package                      | Deps |
|------|--------|---------|--------------------|------------------------------|------|
| EXT  | PIGSTY | 1.5.0   | 18, 17, 16, 15, 14 | pg_incremental               | -    |
| RPM  | PIGSTY | 1.5.0   | 18, 17, 16, 15, 14 | pg_incremental_$v            | -    |
| DEB  | PIGSTY | 1.5.0   | 18, 17, 16, 15, 14 | postgresql-$v-pg-incremental | -    |

| OS / PG       | PG18         | PG17         | PG16         | PG15        | PG14        |
|---------------|--------------|--------------|--------------|-------------|-------------|
| el8.x86_64    |              |              |              | PIGSTY MISS | PIGSTY MISS |
| el8.aarch64   |              |              |              | PIGSTY MISS | PIGSTY MISS |
| el9.x86_64    |              |              |              | PIGSTY MISS | PIGSTY MISS |
| el9.aarch64   |              |              |              | PIGSTY MISS | PIGSTY MISS |
| el10.x86_64   |              |              |              | PIGSTY MISS | PIGSTY MISS |
| el10.aarch64  |              |              |              | PIGSTY MISS | PIGSTY MISS |
| d12.x86_64    | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| d12.aarch64   | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| d13.x86_64    | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| d13.aarch64   | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| u22.x86_64    | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| u22.aarch64   | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| u24.x86_64    | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |
| u24.aarch64   | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY 1.5.0 | PIGSTY MISS | PIGSTY MISS |

Build

You can build the RPM / DEB packages for pg_incremental using pig build:

pig build pkg pg_incremental         # build RPM / DEB packages

Install

You can install pg_incremental directly. First, make sure the PGDG and PIGSTY repositories are added and enabled:

pig repo add pgsql -u          # Add repo and update cache

Install the extension using pig or apt/yum/dnf:

pig install pg_incremental           # Install for current active PG version
pig ext install -y pg_incremental -v 18  # PG 18
pig ext install -y pg_incremental -v 17  # PG 17
pig ext install -y pg_incremental -v 16  # PG 16
dnf install -y pg_incremental_18       # PG 18
dnf install -y pg_incremental_17       # PG 17
dnf install -y pg_incremental_16       # PG 16
apt install -y postgresql-18-pg-incremental   # PG 18
apt install -y postgresql-17-pg-incremental   # PG 17
apt install -y postgresql-16-pg-incremental   # PG 16

Create Extension:

CREATE EXTENSION pg_incremental;

Usage

pg_incremental defines exactly-once incremental pipelines for append-only tables and file feeds. Upstream documents three pipeline types: sequence, time-interval, and file-list.

Install And Scheduling Model

The upstream README documents pg_cron-backed scheduling and installs the extension together with its dependency via:

CREATE EXTENSION pg_incremental CASCADE;

Pipelines run immediately when created unless execute_immediately := false is passed, and then continue to run on a pg_cron schedule. The README notes that each scheduled execution appears in cron.job_run_details, even when no new data is available.
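As a sketch of that model: the snippet below defers the first run and hands execution over to pg_cron. The pipeline name and command are illustrative, and the `schedule` parameter name is an assumption based on the upstream README — verify it against your installed version.

```sql
-- Create a pipeline without running it synchronously at creation time.
-- pg_cron then executes it on the given schedule.
SELECT incremental.create_sequence_pipeline(
  'nightly-aggregation',               -- hypothetical pipeline name
  'events',                            -- source table with a sequence-backed ID
  $$ SELECT aggregate_events($1, $2) $$,  -- hypothetical helper receiving the range
  schedule := '0 3 * * *',             -- run daily at 03:00 (parameter name assumed)
  execute_immediately := false         -- skip the initial synchronous run
);
```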

Sequence Pipelines

Use sequence pipelines to process safe ranges of sequence values:

SELECT incremental.create_sequence_pipeline('event-aggregation', 'events', $$
  INSERT INTO events_agg
  SELECT date_trunc('day', event_time), count(*)
  FROM events
  WHERE event_id BETWEEN $1 AND $2
  GROUP BY 1
  ON CONFLICT (day) DO UPDATE
  SET event_count = events_agg.event_count + excluded.event_count
$$);

The README documents max_batch_size for limiting how many sequence IDs are processed per run.
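A hedged sketch of that parameter, reusing the aggregation command from above (the parameter name comes from the README; the value and named-argument style are illustrative):

```sql
-- Cap each execution at 1,000,000 sequence values so a backlogged
-- pipeline catches up in bounded batches rather than one huge scan.
SELECT incremental.create_sequence_pipeline(
  'event-aggregation',
  'events',
  $$
    INSERT INTO events_agg
    SELECT date_trunc('day', event_time), count(*)
    FROM events
    WHERE event_id BETWEEN $1 AND $2
    GROUP BY 1
    ON CONFLICT (day) DO UPDATE
    SET event_count = events_agg.event_count + excluded.event_count
  $$,
  max_batch_size := 1000000   -- documented upstream; value here is illustrative
);
```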

Time-Interval Pipelines

Use time-interval pipelines when the command should receive $1 and $2 as the inclusive start and exclusive end of a time range:

SELECT incremental.create_time_interval_pipeline('event-aggregation', '1 day', $$
  INSERT INTO events_agg
  SELECT event_time::date, count(DISTINCT event_id)
  FROM events
  WHERE event_time >= $1 AND event_time < $2
  GROUP BY 1
$$);

For export-style jobs, the README documents batched := false so each interval runs separately.
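For example, an unbatched export pipeline might look like the following sketch. `export_events` is a hypothetical helper that writes one file per day; `batched := false` is the documented knob that makes each interval run as its own execution instead of being merged into one large range.

```sql
-- Each elapsed 1-day interval triggers a separate execution, so every
-- day is exported individually even when the pipeline is catching up.
SELECT incremental.create_time_interval_pipeline(
  'event-export',
  '1 day',
  $$ SELECT export_events($1, $2) $$,  -- hypothetical per-interval export helper
  batched := false
);
```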

File-List Pipelines

Use file-list pipelines to process newly discovered files:

SELECT incremental.create_file_list_pipeline('event-import', 's3://mybucket/events/*.csv', $$
  SELECT import_events($1)
$$);

The v1.5.0 release adds max_batches_per_run to file-list pipelines. The README documents incremental.skip_file() for permanently marking a bad file as processed.
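A sketch combining both features (argument names and order are assumptions from this stub and the release note; `import_events` is the loader from the example above):

```sql
-- Cap how many file batches a single run processes (new in v1.5.0);
-- the value 10 is illustrative.
SELECT incremental.create_file_list_pipeline(
  'event-import',
  's3://mybucket/events/*.csv',
  $$ SELECT import_events($1) $$,
  max_batches_per_run := 10
);

-- Permanently mark a corrupt file as processed so the pipeline can
-- move past it (argument order assumed: pipeline name, then path).
SELECT incremental.skip_file('event-import', 's3://mybucket/events/bad-file.csv');
```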

Operations And Monitoring

The README documents:

  • CALL incremental.execute_pipeline(name): run once if new work exists.
  • SELECT incremental.reset_pipeline(name): reset progress.
  • SELECT incremental.drop_pipeline(name): remove a pipeline.
  • Views and tables such as incremental.sequence_pipelines, incremental.time_interval_pipelines, incremental.file_list_pipelines, and incremental.processed_files.
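Put together, a minimal operational check might look like the sketch below. The view column layouts are not shown in this stub, so `SELECT *` is used rather than guessing column names.

```sql
-- Run a pipeline once, on demand, if new work is available.
CALL incremental.execute_pipeline('event-aggregation');

-- Inspect how far each sequence pipeline has advanced.
SELECT * FROM incremental.sequence_pipelines;

-- See which files a file-list pipeline has already handled.
SELECT * FROM incremental.processed_files;
```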

The v1.5.0 release note also calls out a DROP EXTENSION fix for environments where pg_cron is not present.


Last Modified 2026-04-19: update extension stub docs (9f178c3)