Django fixture fails, stating "DatabaseError: value too long for type character varying(50)"


Solution 1

Well, what made the difference was the encoding of the template databases: on the production server they were ASCII-encoded, while on the dev box they are UTF-8.
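
To confirm which encoding a database actually uses, here is one quick check from within Django (a minimal sketch, assuming a configured default Postgres connection):

from django.db import connection

# Print the server-side encoding of the default database connection,
# e.g. 'UTF8' or 'SQL_ASCII'.
with connection.cursor() as cursor:
    cursor.execute("SHOW server_encoding")
    print(cursor.fetchone()[0])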

By default, Postgres creates a database from template1. My understanding is that if template1's encoding is not UTF-8, then the database you create from it will have this issue, even if you create it with UTF-8 encoding.

Therefore I dropped it and recreated it with its encoding set to UTF8. The snippet below does it (taken from here):

psql -U postgres

-- Allow connections to template0 so we can connect to it below
UPDATE pg_database SET datallowconn = TRUE WHERE datname = 'template0';
\c template0
-- Unmark template1 as a template so it can be dropped
UPDATE pg_database SET datistemplate = FALSE WHERE datname = 'template1';
DROP DATABASE template1;
-- Recreate template1 from template0 with UTF-8 encoding
-- ('UNICODE' is an alias for 'UTF8')
CREATE DATABASE template1 WITH TEMPLATE = template0 ENCODING = 'UNICODE';
UPDATE pg_database SET datistemplate = TRUE WHERE datname = 'template1';
\c template1
-- Disallow connections to template0 again
UPDATE pg_database SET datallowconn = FALSE WHERE datname = 'template0';

Now the fixture loads smoothly.

Solution 2

Update: the 50-character limit was raised to 255 in Django 1.8.

--

Original answer:

I just encountered this this afternoon, too, and I have a fix (of sorts).

This post here implied it's a Django bug to do with the length of the value allowed for auth_permission. Further digging backs up that idea, as does this Django ticket (even though it's initially MySQL-related).

It's basically that a permission name is created from the verbose_name of a model plus a descriptive permission string, and that can overflow the 50 chars allowed in auth.models.Permission.name.

To quote a comment on the Django ticket:

The longest prefixes for the string value in the column auth_permission.name are "Can change " and "Can delete ", both with 11 characters. The column maximum length is 50 so the maximum length of Meta.verbose_name is 39.

One solution would be to hack that column to support more than 50 characters (ideally via a South migration, I say, so that it's easily repeatable), but the quickest, most reliable fix I could think of was simply to make my extra-long verbose_name definition a lot shorter (from 47 chars to around 20). All works fine now; a sketch of the kind of change is below.
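
For illustration, a minimal sketch of that kind of change (the model and names here are hypothetical, not my actual code):

from django.db import models

class SubscriptionPreference(models.Model):
    class Meta:
        # Too long: "Can change " (11 chars) + this 45-char name = 56 > 50.
        # verbose_name = "customer subscription notification preference"
        # Shortened to fit within the 39-character budget:
        verbose_name = "subscription preference"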

Solution 3

Get the real SQL query on both systems and see what is different.
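
One way to capture the real SQL on the Django side (a sketch; it assumes DEBUG = True, since the django.db.backends logger only emits queries in debug mode):

# settings.py: log every SQL statement Django executes to the console.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "loggers": {
        "django.db.backends": {
            "level": "DEBUG",
            "handlers": ["console"],
        },
    },
}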

Solution 4

Just for information: I also had this error:

DatabaseError: value too long for type character varying(10)

It turned out that I was writing data over the 10-character limit of a field. I fixed it by increasing the size of a CharField from 10 to 20.
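
Something along these lines, with Django's built-in migrations (the model and field names are hypothetical):

from django.db import models

class Order(models.Model):
    # Was: models.CharField(max_length=10) -- too short for the data.
    reference = models.CharField(max_length=20)

# Then regenerate and apply the schema change:
#   python manage.py makemigrations
#   python manage.py migrate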

I hope it helps.

Solution 5

As @stevejalim says, it's quite possible that the column auth_permission.name, with its length of 50, is the problem; you can verify this with \d+ auth_permission in Postgres's shell. In my case this was indeed the problem: when I loaded my Django model fixtures I got "DatabaseError: value too long for type character varying(50)". Changing django.contrib.auth's Permission model is complicated, so the simple solution was to alter the column directly. I ran the following in Postgres's shell, and it worked for me:

ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(100);
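
If you want that change to be repeatable, the same statement can be wrapped in a Django migration (a sketch using migrations.RunSQL; the app label and dependency below are placeholders):

from django.db import migrations

class Migration(migrations.Migration):
    # Placeholder: point this at your app's latest migration.
    dependencies = [
        ("myapp", "0001_initial"),
    ]

    operations = [
        migrations.RunSQL(
            "ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(100);",
            reverse_sql="ALTER TABLE auth_permission ALTER COLUMN name TYPE VARCHAR(50);",
        ),
    ]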

Credit goes to this comment.


Comments

  • shanyu almost 2 years

    I have a fixture (JSON) which loads in the development environment but fails to do so in the server environment. The error says: "DatabaseError: value too long for type character varying(50)"

    My development environment is Windows & Postgres 8.4. The server runs Debian and Postgres 8.3. Database encoding is UTF8 in both systems.

    It is as if Unicode markers in the fixture count as characters on the server, causing some strings to exceed their fields' max lengths. However, that does not happen in the dev environment.
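
    A quick illustration of that suspicion (the string here is just an example): in a UTF-8 database Postgres counts characters, while a SQL_ASCII database effectively counts bytes, so the same value can overflow varchar(50) on one server but not the other.

    # Sketch: the same 50-character string measured two ways.
    s = "ü" * 50
    print(len(s))                  # 50 characters -> fits varchar(50)
    print(len(s.encode("utf-8")))  # 100 bytes -> too long under SQL_ASCII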