What exactly is UIFont's point size?



A font has an internal coordinate system (think of it as a unit square) within which each glyph's vector coordinates are specified, at whatever arbitrary size accommodates all the glyphs in the font, plus or minus any margin the font designer chooses.

At 72.0 points the font's unit square is one inch. Glyph x of font y has an arbitrary size in relation to this inch square. Thus a font designer can make a font that appears large or small in relation to other fonts at the same point size. This is part of the font's 'character'.
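If you want to see this internal coordinate system, Core Text exposes it directly. A minimal sketch (the face name and the 72 pt size are arbitrary example choices):

```swift
import CoreText

// Build a CTFont at a nominal 72 pt.
let ctFont = CTFontCreateWithName("HelveticaNeue" as CFString, 72, nil)

// The font's internal design grid, commonly 1000 or 2048 units per em.
print(CTFontGetUnitsPerEm(ctFont))

// The union of all glyph bounding boxes, already scaled to the 72 pt size;
// it is generally not a neat 72 x 72 square.
print(CTFontGetBoundingBox(ctFont))
```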

So, drawing an 'A' at 72 points tells you that it will be twice as high as an 'A' drawn at 36 points in the same font, and absolutely nothing else about what the actual bitmap size will be.

That is, for a given font, the only way to determine the relationship between point size and pixels is to measure it.
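Concretely, measuring means asking the text system for the rendered bounds. A minimal Swift sketch, assuming Helvetica Neue and an arbitrary test string:

```swift
import UIKit

let font = UIFont(name: "HelveticaNeue", size: 72)!

// The measured box is in points and generally won't be 72 x 72;
// it depends on the glyphs and on the font designer's choices.
let measured = ("A" as NSString).size(withAttributes: [.font: font])
print(measured)
```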


I am not sure how -[NSString sizeWithFont:] measures the height. Does it use the line height or the distance between the extremes of the Bézier curves? What text did you use?

I believe -[UIFont lineHeight] would be better to measure the height.
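(Note that -[NSString sizeWithFont:] has been deprecated since iOS 7 in favor of size(withAttributes:).) For reference, a quick sketch of the metrics UIFont exposes; the font choice is arbitrary:

```swift
import UIKit

let font = UIFont(name: "HelveticaNeue", size: 72)!
print(font.pointSize)  // 72.0, the nominal size you asked for
print(font.lineHeight) // recommended baseline-to-baseline distance
print(font.ascender)   // baseline up to the top of the tallest glyphs
print(font.descender)  // negative: baseline down to the lowest glyphs
print(font.capHeight)  // baseline to the top of flat capitals like "H"
print(font.xHeight)    // baseline to the top of a lowercase "x"
```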

Edit: Also, note that none of the measurement methods returns the size in pixels; they return sizes in points. You have to multiply the result by [UIScreen mainScreen].scale.
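A sketch of that conversion (the font and string are arbitrary):

```swift
import UIKit

let font = UIFont(name: "HelveticaNeue", size: 72)!
let sizeInPoints = ("A" as NSString).size(withAttributes: [.font: font])

// 1.0 on non-Retina, 2.0 on @2x, 3.0 on @3x devices.
let scale = UIScreen.main.scale
let sizeInPixels = CGSize(width: sizeInPoints.width * scale,
                          height: sizeInPoints.height * scale)
print(sizeInPixels)
```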

Note the difference between the typographic points used when constructing the font and the points of iOS's default logical coordinate space. Unfortunately, the difference is not explained very clearly in the documentation.


I agree this is very confusing. I'm trying to give you some basic explanation here to make things clearer.

First, DPI (dots per inch) comes from printing on physical paper, and so do fonts. The point was invented to describe the physical printed size of text, because the inch is too large a unit for typical text sizes; a point is 1/72 of an inch (the exact value evolved over history). So yes, if you are writing a document in Word or other word-processing software for printing, you will get text exactly one inch tall if you use a 72 pt font.
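The arithmetic is simple enough to write down; this helper function is just my own illustration, not a system API:

```swift
// 1 pt = 1/72 in, so a 72 pt font occupies exactly one inch on paper.
func inches(fromPoints points: Double) -> Double {
    return points / 72.0
}

print(inches(fromPoints: 72.0)) // 1.0
print(inches(fromPoints: 12.0)) // 0.1666..., a typical body-text size
```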

Second, the theoretical text height is usually different from the rendered strokes you can actually see with your eyes. The original idea of text height came from the metal type used for printing: all letters are cast on blocks that share the same height, and that height is what the font's point size measures. However, depending on the letters and the font design, the visible part of the text may be a little shorter than this theoretical height. Helvetica Neue is actually very standard: if you measure from the top of a letter "k" to the bottom of a letter "p", it will match the font height.
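You can sanity-check that claim against UIFont's own metrics. A sketch, assuming Helvetica Neue behaves as described above:

```swift
import UIKit

let font = UIFont(name: "HelveticaNeue", size: 100)!

// Top of an ascender ("k") down to the bottom of a descender ("p").
// descender is negative, so subtracting it adds its magnitude.
let visibleExtent = font.ascender - font.descender
print(visibleExtent) // should come out close to 100 if the claim holds
```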

Third, computer displays screwed up DPI, and the definition of the point along with it. The resolution of a computer display is described in its native pixels, such as 1024 x 768 or 1920 x 1080. Software largely doesn't care about the physical size of your monitor, because scaling screen content to physical size, like printing on paper, would make everything fuzzy: physical resolutions just aren't high enough to keep everything smooth and legible. Instead, software uses a very simple, blunt rule: a fixed DPI for whatever monitor you use. For Windows, it's 96 DPI; for the Mac, it's 72 DPI. That is to say, no matter how many pixels actually make up an inch on your monitor, software ignores it. When the operating system renders text at 72 pt, it is always 96 px tall on Windows and 72 px tall on the Mac. (That's why Microsoft Word documents always look smaller on a Mac and you usually need to zoom to 125%.)
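In other words, the conversion is a fixed ratio. The function below is a hypothetical illustration, not a real OS API:

```swift
// Fixed logical DPI: text at `points` maps to this many pixels,
// regardless of the monitor's physical pixel density.
func pixels(forPoints points: Double, logicalDPI: Double) -> Double {
    return points * logicalDPI / 72.0
}

print(pixels(forPoints: 72, logicalDPI: 96)) // 96 px on Windows
print(pixels(forPoints: 72, logicalDPI: 72)) // 72 px on the Mac
```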

Finally, on iOS it's very similar: whether it's an iPhone, iPod touch, iPad, or Apple Watch, iOS uses a fixed 72 DPI for non-Retina screens, 144 DPI for @2x Retina displays, and 216 DPI for the @3x Retina display used on the iPhone 6 Plus.
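On iOS that ratio surfaces as UIScreen's scale factor. A sketch; note that on the iPhone 6 Plus the physical panel doesn't match the @3x render buffer, which is what nativeScale reflects:

```swift
import UIKit

// Points-to-pixels ratio: 1.0 (@1x), 2.0 (@2x), or 3.0 (@3x).
let scale = UIScreen.main.scale

// On the iPhone 6 Plus, nativeScale (about 2.6) is lower than scale (3.0)
// because the @3x buffer is downsampled to the physical panel.
let nativeScale = UIScreen.main.nativeScale

print(scale, nativeScale)
```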

Forget about real inches; they only exist in actual printing, not on displays. For software displaying text on your screen, the point is just an artificial ratio to physical pixels.