On Sat, 10 Sep 2005 10:01:04 +0200, "Edi Weitz" edi@agharta.de said:
On Fri, 09 Sep 2005 14:32:22 +0300, era+regex=coach@iki.fi wrote:

Somehow the answer from Martin Simmons of LispWorks doesn't show up on gmane.org, so instead of providing a URL I include it here.
<...>
Since 1 point is 1/72 inch, we should have the identity: point-size = 72 x pixel-size / resolution
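The identity above can be sketched as a small conversion helper (the function name is mine, just for illustration):

```python
def pixels_to_points(pixel_size, dpi):
    """Convert a font's pixel size to typographic points (1 pt = 1/72 inch),
    given the resolution the font scaler assumes, in dots per inch."""
    return 72.0 * pixel_size / dpi

# At 120 dpi, a 20-pixel-tall font is a 12-point font:
print(pixels_to_points(20, 120))  # -> 12.0
```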
My understanding of this is imperfect, but I think I have an explanation.
While the unit "typographical point" is 1/72 inch (in the American system -- European didot points are a slightly different size) this is not what the font scaler is using. Instead, it uses the display DPI, which can commonly be 72 dpi or 96 dpi, but in this case is something like 120 dpi (another fairly common size). In fact I hand-tweaked the dpi setting in the X server so it's exactly right for my display, and not any particular standard size.
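To see why the display DPI matters, here is the inverse conversion: the same nominal point size maps to different pixel sizes depending on what resolution the scaler assumes (helper name is mine):

```python
def points_to_pixels(point_size, dpi):
    # One 1/72-inch point covers dpi/72 pixels at the given resolution.
    return point_size * dpi / 72.0

for dpi in (72, 96, 120):
    print(dpi, points_to_pixels(12, dpi))
# A 12 pt font is 12 px at 72 dpi, 16 px at 96 dpi, and 20 px at 120 dpi.
```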
vnix$ xdpyinfo | fgrep -e screen -e dimensions -e resolution
default screen number:    0
number of screens:    1

screen #0:
  print screen:    no
  dimensions:    1920x1440 pixels (410x305 millimeters)
  resolution:    119x120 dots per inch
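The reported resolution follows directly from the pixel dimensions and the physical size (25.4 mm to the inch); a quick check:

```python
def dpi(pixels, millimeters):
    """Dots per inch, from a pixel count and a physical length in mm."""
    return pixels / (millimeters / 25.4)

print(round(dpi(1920, 410)))  # horizontal -> 119
print(round(dpi(1440, 305)))  # vertical   -> 120
```

This matches the "119x120 dots per inch" line above.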
In my /etc/X11/xorg.conf file (like XF86Config with XFree86), I have added the following by hand under Section "Monitor":
    # http://www.ubuntuforums.org/showthread.php?t=20976
    DisplaySize 409 304 # millimeters
(As you can see, the X server rounds the size I passed in -- I haven't figured out how to make it not do that. Tweaking the numbers ever so slightly just causes it to round them to some other random number ...) The link in the comment is to an article which explores some font issues -- it's not particularly well-structured, I'm afraid, but that's where I picked this up. (I haven't done most of the other tweaks suggested on that page.)
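One guess at where the rounding comes from (this is only a hypothesis about the server's behavior, not something confirmed from the Xorg source): if the server reduces the requested size to an integer DPI and then recomputes the millimeter size from that DPI, the numbers come out exactly as xdpyinfo reports them:

```python
MM_PER_INCH = 25.4

def roundtrip_mm(pixels, mm):
    # Hypothetical model: requested mm -> nearest integer dpi -> mm again.
    dpi = round(pixels * MM_PER_INCH / mm)
    return round(pixels * MM_PER_INCH / dpi)

print(roundtrip_mm(1920, 409))  # -> 410, as xdpyinfo reported
print(roundtrip_mm(1440, 304))  # -> 305
```

If that model is right, it also explains why nudging the DisplaySize values slightly just snaps them to a different nearby number.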
/* era */