
Postgres, JDBC, Time Zones

I've been banging my head against the wall, trying to get things to work correctly with JDBC and PostgreSQL ... ah, time zones, every programmer's nemesis.

Does this seem familiar? (A code sketch of this round trip follows the list.)

  • Parse "1984-03-02" to a date, say #inst "1984-03-02T00:00:00.000-00:00"
  • Insert that date into the database, as a PostgreSQL date column
  • Read the value back from the database
  • Print it out and get #inst "1984-03-01T08:00:00Z"
  • Curse!
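
Here's that round trip sketched at the REPL. This is just an illustration: the db spec and an events table with a starts_on date column are stand-ins, and the output is what a client running in US/Pacific would see.

(require '[clojure.java.jdbc :as jdbc])

;; #inst literals read as java.util.Date, pinned at midnight UTC:
(def day #inst "1984-03-02T00:00:00.000-00:00")

;; Insert into a date column, then read it back:
(jdbc/insert! db :events {:starts_on day})
(jdbc/query db ["select starts_on from events"])
;; => ({:starts_on #inst "1984-03-01T08:00:00.000-00:00"})  ; off by a day!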

A little bit of digging shows that this is a pretty common problem unless both the client and the server are running in the UTC time zone. Now, it goes without saying that you are eligible for institutionalization unless you are running your servers in UTC, and that goes triple for your data store ... but the client? That really shouldn't matter.

Except it does; unless you use the full version of PreparedStatement.setTimestamp(int, Timestamp, Calendar), the PostgreSQL driver uses ... whatever the default client time zone is. Really, that's in the JDBC specification. So much for repeatable behavior!
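In raw interop terms, the difference is just the third argument (a sketch; stmt, the parameter index, and a ts timestamp are assumed to be in scope):

(import '[java.util Calendar TimeZone])

;; Two-arg version: the driver interprets ts in the JVM's default time zone.
(.setTimestamp stmt 1 ts)

;; Three-arg version: the driver interprets ts in the Calendar's time zone.
(.setTimestamp stmt 1 ts (Calendar/getInstance (TimeZone/getTimeZone "UTC")))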

My solution uses a dash of Joda Time (via clj-time):

(ns setup
  (:import
    [java.util Date Calendar TimeZone]
    [java.sql PreparedStatement])
  (:require
    [clj-time.coerce :as coerce]
    [clojure.java.jdbc :as jdbc]))

;; Whenever a java.util.Date is bound as a statement parameter, pass an
;; explicit UTC Calendar so the driver doesn't fall back to the JVM's
;; default time zone.
(extend-type Date
  jdbc/ISQLParameter
  (set-parameter [val ^PreparedStatement stmt ix]
    (let [cal (Calendar/getInstance (TimeZone/getTimeZone "UTC"))]
      (.setTimestamp stmt ix (coerce/to-timestamp val) cal))))
This hooks into the clojure.java.jdbc library and extends the logic for how parameters are set on a PreparedStatement before execution. It ensures the date or timestamp is interpreted in UTC.
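
With that extension loaded, the round trip lands on the right day. One way to verify (again using the hypothetical db spec and events table) is to cast the column to text, which shows exactly what the server stored, free of any read-side conversion:

(jdbc/insert! db :events {:starts_on #inst "1984-03-02T00:00:00.000-00:00"})
(jdbc/query db ["select starts_on::text as starts_on from events"])
;; => ({:starts_on "1984-03-02"})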


Published at DZone with permission of Howard Lewis Ship, DZone MVB.
