From: Karsten Hilbert <Karsten(dot)Hilbert(at)gmx(dot)net>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Use arrays to store multilanguage texts
Date: 2004-05-29 22:10:42
Message-ID: 20040530001042.B559@hermes.hilbert.loc
Lists: pgsql-general
> > I am wondering, if it's effective to use text arrays to store
> > multilanguage information.
We solved it like this:
- i18n_curr_lang holds the (ISO) language string per user
- i18n_keys holds the strings to be translated
- i18n_translations holds the translations
- function i18n() inserts a text value into i18n_keys, thus
marking it for translation; we use this when inserting data
in the "primary language" (think gettext, think English)
- function _() returns a translated text value (or its
primary-language equivalent); we use that in queries and view
definitions
- v_missing_translations is a view of, well, ...
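A minimal sketch of what such a setup might look like. Only the table,
view, and function names come from the scheme above; the column layout
and function bodies are my guesses:

```sql
-- per-user language preference (ISO language code, e.g. 'de_DE')
create table i18n_curr_lang (
	owner name default CURRENT_USER primary key,
	lang text not null
);

-- strings marked for translation (the primary-language originals)
create table i18n_keys (
	orig text primary key
);

-- one row per (language, original) pair
create table i18n_translations (
	lang text not null,
	orig text not null references i18n_keys(orig),
	trans text not null,
	primary key (lang, orig)
);

-- mark a string for translation and hand it back unchanged;
-- used when inserting primary-language data
create function i18n(text) returns text as '
	insert into i18n_keys (orig)
		select $1
		where not exists (select 1 from i18n_keys where orig = $1);
	select $1;
' language sql;

-- return the translation for the current user''s language,
-- falling back to the primary-language string
create function _(text) returns text as '
	select coalesce(
		(select t.trans
		   from i18n_translations t, i18n_curr_lang l
		  where l.owner = CURRENT_USER
		    and t.lang = l.lang
		    and t.orig = $1),
		$1);
' language sql;

-- (language, string) pairs still awaiting a translation
create view v_missing_translations as
	select l.lang, k.orig
	from i18n_curr_lang l, i18n_keys k
	where not exists (
		select 1 from i18n_translations t
		 where t.lang = l.lang and t.orig = k.orig);
```

Inserts of primary-language data would then go through
i18n('some string'), and queries and view definitions would wrap
columns in _() to get the translated value.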
On top of that, we've got a Python tool that connects to your
database and creates schema files ready for re-insertion via psql
after you add the missing translations. Kind of like a gettext .po
file.
Karsten
--
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD 4537 78B9 A9F9 E407 1346