- 🚀 Fastest full-featured node & deno client
- 🏷 ES6 Tagged Template Strings at the core
- 🏄♀️ Simple surface API
- 🖊️ Dynamic query support
- 💬 Chat and help on Gitter
```bash
$ npm install postgres
```
Create your `sql` database instance:
```js
// db.js
import postgres from 'postgres'

const sql = postgres({ /* options */ }) // will use psql environment variables

export default sql
```
Simply import for use elsewhere:
```js
// users.js
import sql from './db.js'

async function getUsersOver(age) {
  const users = await sql`
    select
      name,
      age
    from users
    where age > ${ age }
  `
  // users = Result [{ name: 'Walter', age: 80 }, { name: 'Murray', age: 68 }, ...]
  return users
}

async function insertUser({ name, age }) {
  const users = await sql`
    insert into users
      (name, age)
    values
      (${ name }, ${ age })
    returning name, age
  `
  // users = Result [{ name: 'Murray', age: 68 }]
  return users
}
```
- Connection
- Queries
- Building queries
- Advanced query methods
- Transactions
- Listen & notify
- Realtime subscribe
- Numbers, bigint, numeric
- Connection details
- Custom Types
- Teardown / Cleanup
- Error handling
- TypeScript support
- Changelog
## Connection

You can use either a `postgres://` url connection string or the options to define your database connection properties. Options in the object will override any present in the url. Options will fall back to the same environment variables as psql.
```js
const sql = postgres('postgres://username:password@host:port/database', {
  host     : '',   // Postgres ip address[s] or domain name[s]
  port     : 5432, // Postgres server port[s]
  database : '',   // Name of database to connect to
  username : '',   // Username of database user
  password : '',   // Password of database user
  // ... and more
})
```
More options can be found in the Connection details section.
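For instance, a minimal sketch of relying purely on the environment, assuming the standard psql variables (`PGHOST`, `PGPORT`, `PGDATABASE`, `PGUSER`, `PGPASSWORD`) are set:

```js
import postgres from 'postgres'

// With no url or options at all, connection details fall back to
// the psql environment variables mentioned above
const sql = postgres()
```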
## Queries

Postgres.js utilizes tagged template functions to process query parameters before interpolation. Using tagged template literals benefits developers by:

- Enforcing safe query generation
- Giving the `` sql`` `` function powerful utility and query building features
Any generic value will be serialized according to an inferred type, and replaced by a PostgreSQL protocol placeholder (`$1`, `$2`, ...). The parameters are then sent separately to the database, which handles escaping & casting.
All queries will return a `Result` array, with objects mapping column names to each row.
```js
const xs = await sql`
  insert into users (
    name, age
  ) values (
    'Murray', 68
  )
  returning *
`

// xs = [{ user_id: 1, name: 'Murray', age: 68 }]
```
Please note that queries are not executed until they are awaited, or until `.execute()` is called manually.
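As a minimal sketch of that laziness, reusing the `users` table from above:

```js
// Building the query sends nothing to the database yet
const query = sql`select name, age from users`

// Execution is triggered manually; .execute() returns a promise
// for the same Result array that awaiting the query would give
const users = await query.execute()
```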
Parameters are automatically extracted and handled by the database so that SQL injection isn't possible. No special handling is necessary; simply use tagged template literals as usual. Dynamic queries and query building can be seen in the next section.
```js
const name = 'Mur'
    , age = 60

const users = await sql`
  select
    name,
    age
  from users
  where
    name like ${ name + '%' }
    and age > ${ age }
`

// users = [{ name: 'Murray', age: 68 }]
```
Be careful with quotation marks here. Because Postgres infers column types, you do not need to wrap your interpolated parameters in quotes like `'${ name }'`. This will cause an error because the tagged template replaces `${ name }` with `$1` in the query string, leaving Postgres to do the interpolation. If you wrap that in a string, Postgres will see `'$1'` and interpret it as a string as opposed to a parameter.
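A short sketch of the difference, reusing the `name` variable from above:

```js
// Don't do this – the quotes make Postgres see the literal string '$1'
sql`select name, age from users where name like '${ name }%'`

// Do this – build the value in JS and let it be sent as a parameter
sql`select name, age from users where name like ${ name + '%' }`
```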
```js
const columns = ['name', 'age']

sql`
  select
    ${ sql(columns) }
  from users
`

// Which results in:
// select "name", "age" from users
```
```js
const user = {
  name: 'Murray',
  age: 68
}

sql`
  insert into users ${
    sql(user, 'name', 'age')
  }
`

// Which results in:
// insert into users ("name", "age") values ($1, $2)
```
You can omit column names and simply execute `sql(user)` to get all the fields from the object as columns. Be careful not to allow users to supply columns that you do not want to be inserted.
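For example, a sketch of omitting the column list entirely:

```js
const user = {
  name: 'Murray',
  age: 68
}

// With no column list, every key of the object becomes a column
sql`insert into users ${ sql(user) }`

// Which results in:
// insert into users ("name", "age") values ($1, $2)
```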
If you need to insert multiple rows at the same time it's also much faster to do it with a single `insert`. Simply pass an array of objects to `sql()`.
```js
const users = [{
  name: 'Murray',
  age: 68,
  garbage: 'ignore'
},
{
  name: 'Walter',
  age: 80
}]

sql`insert into users ${ sql(users, 'name', 'age') }`

// Is translated to:
// insert into users ("name", "age") values ($1, $2), ($3, $4)

// Here you can also omit column names which will use object keys as columns
sql`insert into users ${ sql(users) }`

// Which results in:
// insert into users ("name", "age") values ($1, $2), ($3, $4)
```
This is also useful for update queries:
```js
const user = {
  id: 1,
  name: 'Murray',
  age: 68
}

sql`
  update users set ${
    sql(user, 'name', 'age')
  }
  where user_id = ${ user.id }
`

// Which results in:
// update users set "name" = $1, "age" = $2 where user_id = $3
```
Value lists can also be created dynamically, making `where in` queries simple too.
```js
const users = await sql`
  select
    *
  from users
  where age in ${ sql([68, 75, 23]) }
`
```
or
```js
const [{ a, b, c }] = await sql`
  select
    *
  from (values ${ sql(['a', 'b', 'c']) }) as x(a, b, c)
`
```
## Building queries

Postgres.js features a simple dynamic query builder by conditionally appending/omitting query fragments. It works by nesting `` sql`` `` fragments within other `` sql`` `` calls or fragments. This allows you to build dynamic queries safely without risking sql injections through the usual string concatenation.
```js
const olderThan = x => sql`and age > ${ x }`

const filterAge = true

sql`
  select
    *
  from users
  where name is not null ${
    filterAge
      ? olderThan(50)
      : sql``
  }
`

// Which results in:
// select * from users where name is not null
// Or
// select * from users where name is not null and age > 50
```
```js
sql`
  select
    *
  from users ${
    id
      ? sql`where user_id = ${ id }`
      : sql``
  }
`

// Which results in:
// select * from users
// Or
// select * from users where user_id = $1
```
Using keywords or calling functions dynamically is also possible by using `` sql`` `` fragments.
```js
const date = null

sql`
  update users set updated_at = ${ date || sql`now()` }
`

// Which results in:
// update users set updated_at = now()
```
Dynamic identifiers like table names and column names are also supported like so:
```js
const table = 'users'
    , column = 'id'

sql`
  select ${ sql(column) } from ${ sql(table) }
`

// Which results in:
// select "id" from "users"
```
## Advanced query methods

Use cursors if you need to throttle the amount of rows being returned from a query. You can use a cursor either as an async iterable or with a callback function. When using a callback, new results won't be requested until the promise returned by the async callback function has resolved.
```js
await sql`
  select
    *
  from generate_series(1,4) as x
`.cursor(async ([row]) => {
  // row = { x: 1 }
  await http.request('https://example.com/wat', { row })
})
```
```js
// for await...of
const cursor = sql`select * from generate_series(1,4) as x`.cursor()

for await (const [row] of cursor) {
  // row = { x: 1 }
  await http.request('https://example.com/wat', { row })
}
```
A single row will be returned by default, but you can also request batches by setting the number of rows desired in each batch as the first argument to `.cursor`:
```js
await sql`
  select
    *
  from generate_series(1,1000) as x
`.cursor(10, async rows => {
  // rows = [{ x: 1 }, { x: 2 }, ... ]
  await Promise.all(rows.map(row =>
    http.request('https://example.com/wat', { row })
  ))
})
```
If an error is thrown inside the callback function, no more rows will be requested and the outer promise will reject with the thrown error.
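A minimal sketch of that rejection behavior:

```js
try {
  await sql`
    select * from generate_series(1,1000) as x
  `.cursor(async ([row]) => {
    // Throwing here stops the cursor – no further rows are requested
    throw new Error('abort the cursor')
  })
} catch (err) {
  // err is the error thrown inside the callback
}
```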
You can close the cursor early either by calling `break` in the `for await...of` loop, or by returning the token `sql.CLOSE` from the callback function.
```js
await sql`
  select * from generate_series(1,1000) as x
`.cursor(row => {
  return Math.random() > 0.9 && sql.CLOSE // or sql.END
})
```
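And the equivalent early close, sketched with `break` in a `for await...of` loop:

```js
for await (const [row] of sql`select * from generate_series(1,1000) as x`.cursor()) {
  if (Math.random() > 0.9)
    break // breaking out of the loop closes the cursor early
}
```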
If you want to handle rows returned by a query one by one, you can use `.forEach`, which returns a promise that resolves once there are no more rows.
```js
await sql`
  select created_at, name from events
`.forEach(row => {
  // row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})

// No more rows
```
Rather than executing a given query, `.describe` will return information utilized in the query process. This information can include the query identifier, column types, etc. This is useful for debugging and analyzing your Postgres queries. Furthermore, `.describe` will give you access to the final generated query string that would be executed.
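A minimal sketch of inspecting a query with `.describe` (the exact shape of the returned description is an assumption here, based on the description above):

```js
const description = await sql`
  select name, age from users where age > ${ 60 }
`.describe()

// The description includes the final generated query string that
// would be executed, with parameters replaced by placeholders, e.g.
// select name, age from users where age > $1
```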