Today I Learned

4 posts by diogobiazus

Use PostgreSQL table as queue skipping locked rows

Note: the SKIP LOCKED clause used below requires PostgreSQL 9.5 or newer.

First, are you sure you shouldn't be using some in-memory queue? Or some message broker? Or Redis?

If you are really convinced that you want this sort of behaviour on top of the good old PostgreSQL, you can achieve a queue-like access pattern by locking rows and skipping the rows that are already locked.

One way to ensure that only one database client can modify a set of rows is to select the rows FOR UPDATE, as in:

BEGIN;
SELECT * FROM queue ORDER BY priority DESC LIMIT 1 FOR UPDATE;
...

This will block any other UPDATE or SELECT ... FOR UPDATE on the same rows until the above transaction finishes. If you want concurrent database clients to each fetch the next available row in the priority list, just use:

BEGIN;
SELECT * FROM queue ORDER BY priority DESC LIMIT 1 FOR UPDATE SKIP LOCKED;
...

The above command can be read as: select the next row from the queue that nobody else has locked yet.
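As a rough sketch of a complete worker step, assuming a hypothetical queue table with id, priority and payload columns, a job can be claimed and removed in a single statement; the row stays locked until COMMIT, so concurrent workers simply skip it:

BEGIN;

-- claim and remove the highest-priority job that no other worker has locked
DELETE FROM queue
WHERE id = (
  SELECT id
  FROM queue
  ORDER BY priority DESC
  LIMIT 1
  FOR UPDATE SKIP LOCKED
)
RETURNING *;

-- process the returned payload, then
COMMIT;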

Encrypt data using psql + keybase

To export the result of any query as CSV and send it to stdout, one can use:

psql -c "\copy (select version()) to stdout csv header"

So you can just replace select version() with any query in the above command and the results will be printed to your terminal. If you have any sensitive data that is not already encrypted, you could pipe these results directly to keybase, as in:

psql -c "\copy (select version()) to stdout csv header" | keybase encrypt diogob

Where diogob is the recipient of your message (or your own username in case you want to store this file for future use).
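For the round trip, here is a minimal sketch (the report.csv.encrypted file name is just an illustration; keybase decrypt reads the encrypted message back as plaintext):

psql -c "\copy (select version()) to stdout csv header" | keybase encrypt diogob > report.csv.encrypted
keybase decrypt -i report.csv.encrypted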

Global gitignore

To enable a global .gitignore for a specific user you can use the git config setting core.excludesfile, as in:

git config --global core.excludesfile '~/.gitignore'

This will make the .gitignore in your home folder be used in every git project, in addition to local .gitignore files.
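As an illustration, such a global .gitignore usually holds editor and OS noise rather than project-specific rules, for example:

.DS_Store
*.swp
.idea/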

For more info, read the git documentation on gitignore.

Serializing many value objects to database columns

The IDDD book, in its discussion of value object serialization, describes an approach called ORM and Many Values Serialized into a Single Column. It's good to note that some of the main objections to this approach are technology-related and barely applicable in a world of Rails' ActiveRecord + PostgreSQL.

The objections presented by the book are:

  • Column width: the book mentions that serializing into varchar fields will run into length limits imposed by the Oracle and MySQL implementations. In PostgreSQL, besides having structured types (e.g. json or arrays) to serialize into, the limit on any column value is much higher (1 GB).
  • Must query: the book states that if the values must be queried this approach cannot be used. This is another limitation imposed by the underlying technology. Using PostgreSQL one can easily query composite values and even create indexes over them (see the sketch after this list).
  • Requires custom user type: this is not related to the database technology but is heavily biased towards Hibernate. In Rails' ActiveRecord, custom serializers require very little boilerplate, and ActiveRecord offers out-of-the-box support for json, array and range types.
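As a sketch of the "must query" point, assuming a hypothetical orders table whose items column stores many value objects serialized as jsonb, PostgreSQL can filter on and index the serialized data directly:

CREATE TABLE orders (
  id serial PRIMARY KEY,
  items jsonb NOT NULL  -- many value objects serialized into a single column
);

-- query inside the serialized values using the jsonb containment operator
SELECT * FROM orders WHERE items @> '[{"sku": "ABC-123"}]';

-- index the serialized column so such queries stay fast
CREATE INDEX orders_items_idx ON orders USING gin (items);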