
MySQL: column size limit

According to the documentation:

Although InnoDB supports row sizes larger than 65,535 bytes internally, MySQL itself imposes a row-size limit of 65,535 for the combined size of all columns:

mysql> CREATE TABLE t (a VARCHAR(8000), b VARCHAR(10000),
    -> c VARCHAR(10000), d VARCHAR(10000), e VARCHAR(10000),
    -> f VARCHAR(10000), g VARCHAR(10000)) ENGINE=InnoDB;

ERROR 1118 (42000): Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. You have to change some columns to TEXT or BLOBs

(Unfortunately, this example does not specify the character set, so we don't really know how large the columns are in bytes.)

The utf8 encoding uses 1, 2, or 3 bytes per character. So, the maximum number of characters that can safely fit in a row of 65,535 bytes (the MySQL row-size maximum) is 21,845 characters (21,845 * 3 = 65,535).
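As a sketch of that arithmetic (table and column names here are illustrative), the largest single utf8 VARCHAR that actually fits is 21,844 characters once the 2-byte length prefix is counted:

```sql
-- 21844 * 3 bytes + 2 length bytes = 65534 <= 65535, so this fits:
CREATE TABLE t_max (c VARCHAR(21844) CHARACTER SET utf8) ENGINE=InnoDB;

-- 21845 * 3 bytes + 2 length bytes = 65537 > 65535, so this is
-- rejected with ERROR 1118 (Row size too large):
CREATE TABLE t_over (c VARCHAR(21845) CHARACTER SET utf8) ENGINE=InnoDB;
```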

Despite the versions being similar, it would appear that Windows is being conservative in its space allocation, guaranteeing that you can store any characters in the field. Linux seems to have a more laissez-faire attitude: you can store some strings of over 21,845 characters, depending on which characters they are.

I have no idea why this difference would exist in the same version. Both methods are "right" in some sense. There are simple enough work-arounds:

  • Use TEXT.
  • Switch to a character set that uses fewer bytes per character (one that presumably still covers what you want to store).
  • Reduce the size of the field.

utf8 needs up to 3 bytes per character; utf8mb4: 4; latin1: 1; ascii: 1; etc. VARCHAR(N) is implemented as a 1- or 2-byte length prefix in front of the bytes of the text, and N counts characters, not bytes. So, if you ask for utf8, then 3*N (plus the length prefix) must fit within 65,535, the limit imposed by a 2-byte length.
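To illustrate (table names are made up), the very same VARCHAR(N) succeeds or fails depending only on the character set's bytes-per-character:

```sql
-- latin1: 30000 * 1 byte + 2 length bytes = 30002 bytes, well under 65535:
CREATE TABLE t_latin1 (v VARCHAR(30000) CHARACTER SET latin1) ENGINE=InnoDB;

-- utf8: 30000 * 3 bytes = 90000 bytes, over the 65535 limit; rejected:
CREATE TABLE t_utf8 (v VARCHAR(30000) CHARACTER SET utf8) ENGINE=InnoDB;
```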

Be glad you are not running in some old version, where VARCHAR had a limit of 255.

If your txt column does not need characters beyond ASCII/English, then use CHARACTER SET latin1.
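A sketch of that advice (the table name is assumed for illustration):

```sql
-- latin1 uses 1 byte per character, so 50000 characters + a 2-byte
-- length prefix fits comfortably within the 65,535-byte row limit:
CREATE TABLE notes (txt VARCHAR(50000) CHARACTER SET latin1) ENGINE=InnoDB;
```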

In InnoDB, when there are 'long' columns (big VARCHARs, TEXTs, BLOBs, etc.), some or all of the column is stored off-page in separate blocks. There is a limit of about 8,000 bytes for what is stored together in the record itself.

If you really need 50K of utf8, then MEDIUMTEXT is what you need. It uses a 3-byte length and can hold up to 16M bytes (5M characters, possibly more, since utf8 is a variable length encoding).
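For example (table name assumed), a MEDIUMTEXT column sidesteps the row-size limit entirely, since its contents are stored off-page:

```sql
-- MEDIUMTEXT holds up to 16,777,215 bytes -- far beyond 50K of utf8 text:
CREATE TABLE articles (txt MEDIUMTEXT CHARACTER SET utf8) ENGINE=InnoDB;
```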

Most applications can (should?) use either ascii (1 byte per character) or utf8mb4 (1-4 bytes per character). The latter allows for all languages, including Emoji and the 4-byte Chinese characters that utf8 cannot handle.
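As a hedged illustration (table name assumed), a 4-byte character such as an Emoji is storable under utf8mb4 but not under 3-byte utf8:

```sql
CREATE TABLE msgs (m VARCHAR(100) CHARACTER SET utf8mb4) ENGINE=InnoDB;

-- A 4-byte character; under CHARACTER SET utf8 (utf8mb3) this would be
-- rejected in strict mode or truncated otherwise:
INSERT INTO msgs VALUES ('🙂');
```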

As for why Windows and Linux work differently here, I don't know. Are you using the same version on both? Suggest you file a bug report. (And provide a link to it from this Question.)

Simply use TEXT to declare the txt column:

DROP TABLE IF EXISTS texts;
CREATE TABLE `texts` (
  `id` BINARY(16) NOT NULL DEFAULT '\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0',
  `txt` TEXT DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=INNODB DEFAULT CHARSET=utf8;