Welcome to the MacNN Forums.



Does Leopard sport a resolution-independent interface?
solofx7
Mac Elite
Join Date: Dec 2006
Status: Offline
Reply With Quote
Jun 12, 2007, 04:14 PM
 
I thought there was going to be resolution independence.
I have had a similar topic here before regarding the fact that Windows looks sharper in some aspects, but OS X looks better overall.
When you look at OS X and Windows side by side, Windows looks sharper. I have no problem with OS X, but I do not think it is reaching the highest level of sharpness the hardware allows. Any thoughts?
I just did not see anything about this at WWDC.
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Jun 12, 2007, 04:40 PM
 
Leopard will introduce resolution independence, yes.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
larkost
Mac Elite
Join Date: Oct 1999
Location: San Jose, Ca
Status: Offline
Reply With Quote
Jun 12, 2007, 07:01 PM
 
Resolution independence was probably not stressed because there are not really many displays that can take much advantage of it. When displays start coming with 200+ dpi, it is going to be important that you already have a core of applications (from Apple and others) that can take advantage of them. I imagine it was played up much more at the "State of the Union" keynote held Monday afternoon (which is much more developer-focused).
     
solofx7  (op)
Mac Elite
Join Date: Dec 2006
Status: Offline
Reply With Quote
Jun 13, 2007, 10:50 AM
 
Gotcha, thanks for the info. It is just an old topic that keeps coming up...
     
USNA91
Dedicated MacNNer
Join Date: Nov 2004
Status: Offline
Reply With Quote
Jun 13, 2007, 12:56 PM
 
Would someone be kind enough to explain what "resolution independence" is? I haven't the foggiest clue.

Thanks.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Jun 13, 2007, 02:08 PM
 
Right now any interface element, such as the menu bar, is always a set number of pixels high. The smaller the pixels, the smaller the elements. Resolution independence means that you can zoom the entire interface freely. This will let Apple move to newer displays with higher resolution without forcing its users to squint.
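P's point can be made concrete with a little arithmetic: an element that is a fixed number of pixels tall shrinks physically as pixel density rises. A minimal sketch (22 px is the familiar Mac OS X menu bar height; the DPI figures are illustrative):

```python
# Physical height of a fixed-pixel UI element at different pixel densities.
# 22 px is the classic Mac OS X menu bar height; DPI values are illustrative.

def physical_height_mm(pixels, dpi):
    """Convert a pixel height to millimetres at a given pixel density."""
    return pixels / dpi * 25.4

for dpi in (72, 100, 133, 200):
    print(f"{dpi:3d} dpi: 22 px menu bar = {physical_height_mm(22, dpi):.1f} mm tall")
```

At 200 dpi the same menu bar is less than 3 mm tall, which is exactly the squinting problem P describes.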
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 13, 2007, 03:01 PM
 
I've finally fixed the title. The old one tempted me every time to add some smart comment … 
I don't suffer from insanity, I enjoy every minute of it.
     
USNA91
Dedicated MacNNer
Join Date: Nov 2004
Status: Offline
Reply With Quote
Jun 13, 2007, 05:16 PM
 
Originally Posted by P View Post
Right now any interface element, such as the menu bar, is always a set number of pixels high. The smaller the pixels, the smaller the elements. Resolution independence means that you can zoom the entire interface freely. This will let Apple move to newer displays with higher resolution without forcing its users to squint.
AH! I see! Thanks!
     
Brass
Professional Poster
Join Date: Nov 2000
Location: Tasmania, Australia
Status: Offline
Reply With Quote
Jun 13, 2007, 06:19 PM
 
It also means that zooming in on existing or new displays will not result in a fuzzy, pixelated-looking screen (e.g. CTRL + mouse-scrolling).
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Jun 13, 2007, 06:39 PM
 
Originally Posted by Brass View Post
It also means that zooming in on existing or new displays will not result in a fuzzy, pixelated-looking screen (e.g. CTRL + mouse-scrolling).
I doubt that.
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Jun 13, 2007, 06:41 PM
 
Originally Posted by TETENAL View Post
I doubt that.
I think he means that you can get bigger elements on your screen without actually changing the resolution, not that you can actually zoom in on things without it getting pixelated.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Brass
Professional Poster
Join Date: Nov 2000
Location: Tasmania, Australia
Status: Offline
Reply With Quote
Jun 13, 2007, 08:46 PM
 
Originally Posted by TETENAL View Post
I doubt that.
Why would you doubt it? Why would anyone create a resolution-independent system that became all fuzzy and pixelated when you zoomed in?

In any case, people at WWDC who have tried the beta have confirmed that CTRL-scroll-wheel does in fact produce nice, sharp zoomed-in screen images.
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Jun 13, 2007, 09:08 PM
 
Originally Posted by Brass View Post
Why would you doubt it?
Because windows are buffered in a pixel buffer. If you zoom in you would either have to increase the buffer size and redraw, which would be slow, or you would have to draw into very large buffers to begin with, which would waste memory.
Originally Posted by Brass View Post
In any case, people at WWDC who have tried the beta have confirmed that CTRL-scroll-wheel does in fact produce nice sharp zoomed in screen images.
No, one person who claimed to be there posted this. It hasn't been confirmed to be like this by anybody else. And I doubt it's in Leopard.
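TETENAL's memory argument is easy to quantify. A rough sketch, assuming 4-byte ARGB window buffers and a made-up full-screen window size: pre-rendering at 4x linear resolution (so even a 4x zoom stays sharp) costs 16x the memory per window.

```python
# Rough cost of pre-rendering window back buffers at higher resolution so
# that zooming stays sharp. Assumes 4 bytes per pixel (ARGB); the window
# size is an illustrative example, not anything Apple documented.

BYTES_PER_PIXEL = 4

def buffer_bytes(width, height, scale=1):
    """Memory for one window buffer rendered at `scale`x linear resolution."""
    return width * scale * height * scale * BYTES_PER_PIXEL

base = buffer_bytes(1440, 900)            # one full-screen window at 1x
oversampled = buffer_bytes(1440, 900, 4)  # same window pre-rendered at 4x
print(f"1x buffer: {base / 2**20:.1f} MiB")
print(f"4x buffer: {oversampled / 2**20:.1f} MiB ({oversampled // base}x the memory)")
```

That quadratic blow-up is why "draw into very large buffers to begin with" wastes memory, just as the post says.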
     
JLL
Professional Poster
Join Date: Apr 1999
Location: Copenhagen, Denmark
Status: Offline
Reply With Quote
Jun 13, 2007, 11:54 PM
 
CTRL-scroll does not use RI in Leopard.

RI isn't even user adjustable at the moment and probably won't be for a while, according to Apple's comments last year.
JLL

- My opinions may have changed, but not the fact that I am right.
     
- - e r i k - -
Posting Junkie
Join Date: May 2001
Location: Brisbane, Australia
Status: Offline
Reply With Quote
Jun 14, 2007, 06:22 AM
 
Indeed. We have two conflicting reports on the RI zoom debacle: the first from inkhead, who said it definitely did scale up without pixelation, and JLL, who now claims it doesn't. Apparently inkhead has refused to comment any further (they are both at WWDC), so maybe he just misinterpreted what he saw and is too ashamed to fess up or something. Who knows?

All we know so far is that:
1) Leopard definitely supports resolution independence (technically, so did Tiger).
2) The new unified interface has been redone to support RI, with vectors and higher-resolution bitmaps (i.e. 512x512 icons).
3) Apple is telling developers to start supporting RI in their apps.

Whether we will be able to take advantage of RI at Leopard's launch, whether through an adjustable PPI setting or through zooming with Universal Access, remains to be seen.

     
kilechki
Forum Regular
Join Date: Feb 2005
Location: Paris, Fr
Status: Offline
Reply With Quote
Jun 14, 2007, 06:41 AM
 
Vista does unpixelated zoom, doesn't it?
So it has to be possible.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Jun 14, 2007, 08:30 AM
 
Unpixelated zoom only works if all the graphics are available in some sort of vector format. You can fake it a bit by taking a bigger bitmap and scaling it down by varying amounts (the Dock does this); then the pixelation is not obvious. An RI interface just means that there is a method to scale every standard interface element by some factor without it being obvious: each button is scaled according to its rules (it's probably vectors), each icon according to its rules (icons are usually scaled the other way, by taking a bigger bitmap and scaling down), and so on.

Zooming currently works in a different way. It renders the screen into a bitmap and then zooms the bitmap up. To achieve "unpixelated" zooms, you can either render everything at a much higher resolution to begin with and then zoom in, always staying beneath the "actual" 100% zoom level, or keep re-drawing everything. I very much doubt that Apple will implement the second; it would be very slow and jerky. The first is a nifty idea, but it would force you to render everything at a higher resolution than you could display all the time, on the off chance that the user wants to zoom in.
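The Dock trick P describes (ship one big bitmap and only ever scale it down) amounts to picking the smallest pre-rendered size that is at least as large as the target, since upscaling is what pixelates. A sketch, with an illustrative size list:

```python
# Pick a pre-rendered icon size to downscale from, so the icon is never
# scaled *up* (upscaling is what looks pixelated). The size list is
# illustrative, in the spirit of the multiple sizes an icon file carries.

ICON_SIZES = [16, 32, 128, 256, 512]  # edge length in pixels

def source_size_for(target_px):
    """Smallest pre-rendered size >= target, falling back to the largest."""
    for size in ICON_SIZES:
        if size >= target_px:
            return size
    return ICON_SIZES[-1]

print(source_size_for(48))   # 128: downscale the 128 px version
print(source_size_for(200))  # 256: downscale the 256 px version
print(source_size_for(600))  # 512: nothing bigger, so upscaling (and fuzz) begins
```

This is also why larger source art (like 512x512 icons) matters for RI: it raises the ceiling below which scaling stays sharp.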
     
inkhead
Senior User
Join Date: Mar 2004
Status: Offline
Reply With Quote
Jun 16, 2007, 10:23 PM
 
Yes Leopard has resolution independence. Just because Apple didn't announce it to non-developers does NOT mean it doesn't exist!

Apple has made it clear that not updating your apps properly for this would basically cut you off from good support and favor.

There were 12 sessions on this at WWDC this year.
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 17, 2007, 06:16 PM
 
So the GUI has resolution independence. I presume that means that it has to know the physical size of a monitor? Is there a new dialog box in which you can input that information?

Also, and MOST importantly, does it apply to *fonts*??? Like, in Leopard, if I set my font to 12-point, in say BBEdit or TextEdit or Word, and then I measure it on the screen with a ruler, will it actually be the same size it will be as when it prints out? Or will it continue to be the case that a 9-point font is actually displayed on the screen as a 6-point font, and a 12-point as a 9-point, and so on???

-=DG=-, who pines for the days of WYSIWYG to return, when monitors were 72 dpi damnit
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Jun 17, 2007, 06:23 PM
 
Originally Posted by Dark Goob View Post
So the GUI has resolution independence. I presume that means that it has to know the physical size of a monitor? Is there a new dialog box in which you can input that information?
The actual interface for controlling the UI scale reportedly isn't in yet, but the interface that developers can use to test with is a sliding scale that controls how large things are drawn. Resolution independence doesn't necessarily mean things are always the same size; it can also mean you don't have to switch to a lower resolution to make things large enough to be visible. If Apple goes through with that kind of scheme, you can set it so your screen is truly 72 DPI if you want, but you can also set it higher.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Jun 17, 2007, 06:26 PM
 
Originally Posted by Dark Goob View Post
So the GUI has resolution independence. I presume that means that it has to know the physical size of a monitor? Is there a new dialog box in which you can input that information?
You don't have to input this information. Your computer knows the size of the monitor.
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 17, 2007, 06:41 PM
 
Originally Posted by Chuckit View Post
The actual interface for controlling the UI scale reportedly isn't in yet, but the interface that developers can use to test with is a sliding scale that controls how large things are drawn. Resolution independence doesn't necessarily mean things are always the same size; it can also mean you don't have to switch to a lower resolution to make things large enough to be visible. If Apple goes through with that kind of scheme, you can set it so your screen is truly 72 DPI if you want, but you can also set it higher.
Well, in order for the computer to be able to display a font at the size it's supposed to be, the computer needs to know the size of the monitor. The whole point of resolution independence, in terms of fonts, is that you would not have to be at 72 DPI to get WYSIWYG.

So again I ask -- does the resolution independence apply to fonts in Leopard? Like, on my MacBookPro, will my 12-point fonts finally actually be 12-point fonts, or will they still only be drawn at 9-point on the screen?

-=DG=-
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Jun 17, 2007, 09:17 PM
 
Yes, resolution independence does apply to fonts. It applies to everything on the screen. But again, this doesn't mean that characters will actually be 12 physical points in size if you don't set it that way.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
JKT
Professional Poster
Join Date: Jan 2002
Location: London, UK
Status: Offline
Reply With Quote
Jun 18, 2007, 09:49 AM
 
Originally Posted by Dark Goob View Post
Well, in order for the computer to be able to display a font at the size it's supposed to be, the computer needs to know the size of the monitor. The whole point of resolution independence, in terms of fonts, is that you would not have to be at 72 DPI to get WYSIWYG.

So again I ask -- does the resolution independence apply to fonts in Leopard? Like, on my MacBookPro, will my 12-point fonts finally actually be 12-point fonts, or will they still only be drawn at 9-point on the screen?

-=DG=-
Um, if the screen was truly WYSIWYG, it would be e.g. 300 dpi minimum and preferably 600dpi or higher, not 72 dpi!

DG, the whole point of RI is that you can scale everything so that it isn't constrained by the native resolution of the monitor. If you want your 12 point fonts to be the same size as 12 point fonts on a piece of paper, you will be able to scale the UI so that this is the case. Whether or not Apple will include a UI that makes this easy to achieve is another matter; it shouldn't be too hard on LCD monitors, where the physical dimensions of the actual image are fixed and known, but CRTs might be trickier (as the dimensions on the screen vary from monitor to monitor depending on how pinched or expanded the vertical and horizontal display is).
     
besson3c
Clinically Insane
Join Date: Mar 2001
Location: yes
Status: Offline
Reply With Quote
Jun 18, 2007, 02:03 PM
 
Will this resolution independence be independent?
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Jun 18, 2007, 02:28 PM
 
For word processors and graphic programs you can already get this kind of WYSIWYG. All you need to do is calculate the true resolution of your screen, divide by 72 and use that as the zoom factor. If you have a 106 ppi screen for example, 147% zoom is your "true" size. Not automatic, but possible. What currently is not possible is scaling the UI. At least not properly in Tiger.
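TETENAL's recipe can be written down directly. The 106 ppi case from the post reproduces the 147% figure; the second function derives a screen's true ppi from its pixel grid and diagonal (the 15.4-inch panel numbers are illustrative):

```python
# WYSIWYG zoom factor for apps that have a zoom control:
# true screen ppi divided by the assumed 72 ppi.

def screen_ppi(width_px, height_px, diagonal_inches):
    """Pixel density from the pixel grid and the physical diagonal."""
    diagonal_px = (width_px ** 2 + height_px ** 2) ** 0.5
    return diagonal_px / diagonal_inches

def true_size_zoom(ppi):
    """Zoom percentage at which 1 document point spans 1/72 inch on screen."""
    return ppi / 72 * 100

print(f"{true_size_zoom(106):.0f}%")  # the 106 ppi example from the post: 147%
print(f"{true_size_zoom(screen_ppi(1440, 900, 15.4)):.0f}%")  # an example 15.4" 1440x900 panel
```

As the post says, this is manual: the app must have a zoom control, and you must know or measure your panel's ppi yourself.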
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 18, 2007, 06:49 PM
 
Originally Posted by TETENAL View Post
For word processors and graphic programs you can already get this kind of WYSIWYG. All you need to do is calculate the true resolution of your screen, divide by 72 and use that as the zoom factor. If you have a 106 ppi screen for example, 147% zoom is your "true" size. Not automatic, but possible. What currently is not possible is scaling the UI. At least not properly in Tiger.
Look. Not all programs have a "zoom percentage" that can be set.

Take a web browser for example. I can go in my preferences right now in Camino, and tell it I want my display font to be 12-point. But it's not going to be 12-point, it's going to show up as 9-point if you measure it with a ruler. THE COMPUTER IS LYING, IT'S NOT 12-POINT! Computers should not lie.

If I set it to 12-point, it should show up on the SCREEN at 12-point, since that is what we are setting: the preference for the display size of the font.

Or take TextEdit. On my MacBook Pro, the ruler at the top of the window in TextEdit shows inches. But if you measure it with a ruler, they are not inches! They are smaller than inches (though when it prints, they are inches). And the so-called 12-point font does not show up at 12-point. This is not WYSIWYG. But it should be, and a resolution-independent UI with smart scaling should be able to make sure that the computer, even in the simplest applications, does not lie about font sizes.

Right now, WYSIWYG depends on screen resolution being 72 DPI to work, since each pixel of a display bitmap font is therefore 1-point in size. Therefore, WYSIWYG, a main feature of the classic Mac UI, is specifically RESOLUTION DEPENDENT. If the UI is to become fully resolution INDEPENDENT, then font size should not remain in the stone age of dependency.

Originally Posted by JKT View Post
Um, if the screen was truly WYSIWYG, it would be e.g. 300 dpi minimum and preferably 600dpi or higher, not 72 dpi!

Look, man, very funny, but WYSIWYG is not (and has never been) about matching the actual pixels-per-inch, just matching the actual physical SIZE between monitor and printer (along with matching color, typeface, etc.). Besides, many printers are less than 300 DPI, for example a 133-line-screen web press. Hehe.

Just like there is ColorSync for colors, there ought to be SizeSync for font sizes! And it should be system-wide. It should never be the case that if you set a font at 12-point, it shows up as a 9-point font!! Unless you "zoom out" to something other than 100%. And the OS should handle it for you, automatically, BECAUSE IT'S A MAC.

Originally Posted by JKT View Post
If you want to have your 12 point fonts be the same size as 12 point fonts on a piece of paper, you will be able to scale the UI so that this is the case. Whether or not Apple will include an UI that makes this easy to achieve is another matter - it shouldn't be too hard to achieve on LCD monitors where the physical dimensions of the actual image are fixed and known, but CRTs might be trickier (as the dimensions on the screen vary from monitor to monitor depending on how pinched or expanded the vertical and horizontal display is).

Well, if they don't make it "easy to achieve" then they are doing a disservice to the soul of the Macintosh. It should not only be easy to achieve on LCDs, it should be flawless and happen automatically by default (at least on Apple-branded and specifically Mac-compatible ones). Further, I agree with you that on CRTs, due to the various controls they have for resizing the picture, we can only hope for a "pretty close" approximation. But there should be a UI scaling preference where you can take a tape measure, measure your screen's dimensions, type them in a box, and the computer makes everything WYSIWYG and properly scaled. Of course, since the vast majority of Macs now have built-in LCD displays, from a "default OS user experience" standpoint this would be phenomenal.

Originally Posted by Chuckit View Post
Yes, resolution independence does apply to fonts. It applies to everything on the screen. But again, this doesn't mean that characters will actually be 12 physical points in size if you don't set it that way.

Well, it should be the default setting that it shows up at 12 physical points on the screen. If I set my Finder display font to 12-point and measure it on the screen with a ruler, then a capital "I" should be about 4.23 mm tall (exactly 1/6 inch), since a "point" is an exact unit of measurement: 1/72 inch, or about 0.3528 mm. (See Wikipedia: Point (typography).)
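Dark Goob's figure is just a unit conversion, with the caveat that a capital letter occupies somewhat less than the full 12-point em square in real typefaces. A quick check:

```python
# Convert type sizes in points to physical units: 1 pt = 1/72 inch.
# Note: a 12 pt font's *em square* is 12 pt; actual glyphs are a bit smaller.

MM_PER_INCH = 25.4

def points_to_mm(points):
    return points / 72 * MM_PER_INCH

print(f"1 pt  = {points_to_mm(1):.4f} mm")   # ~0.3528 mm
print(f"12 pt = {points_to_mm(12):.4f} mm")  # ~4.2333 mm, i.e. 1/6 inch
```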

Otherwise, please tell me what in the heck is the point of calling it a "point"? Why not just call it an "arbitrary display unit"? If it's not going to show up at the size the user specifies, then why specify a size at all?

You should have to "zoom out" or "zoom in" for it to be otherwise.

Damn, I feel like the George Carlin of Mac punditry.

-=DG=-
( Last edited by Dark Goob; Jun 18, 2007 at 06:59 PM. )
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 18, 2007, 07:27 PM
 
I think you misunderstand a few things here, DG: `point' doesn't mean pixels; the unit is much older than computer screens. The ruler in TextEdit isn't supposed to measure an inch on your screen, but an inch on your printout! The unit point was essentially defined in 1959 (a long, long time before DTP). The computer isn't lying to you; I think you have the wrong concept in mind. Since point is a unit of measurement in the real world, it's the printout and not the screen which counts, and the computer sticks to this. The reason why 1/72 of an inch = 1 pt is simply that this was close enough to alternative definitions of the point. If displays had been 100 dpi back then, I'm sure they would have used 1/100th or so. But again, point has nothing to do with pixels here.

It isn't a feature of the `classic Mac UI', since the dpi of a screen is independent of the OS (granted that the OS can display the resolution you need). The 12" dual USB iBooks were using 110 dpi. Even the clamshell iBooks and my PowerBook G3 Kanga (aka 3500) had 86 dpi (on OS 8 as well as 9.2).

WYSIWYG also doesn't mean the document is at the same length scale, but that you have a representation of your page with all the proportions the way you will see them on a printout. The ruler connects the length scale of your screen with the length scale of your printout. This is expected behavior, really.
( Last edited by OreoCookie; Jun 18, 2007 at 07:36 PM. )
I don't suffer from insanity, I enjoy every minute of it.
     
TETENAL
Addicted to MacNN
Join Date: Aug 2004
Location: FFM
Status: Offline
Reply With Quote
Jun 18, 2007, 07:59 PM
 
Actually, from reading both of your posts, it seems pretty obvious that DG understands things better than you. There is nothing (other than the historical assumption of a display with a fixed resolution of 72 ppi) that prevents a computer from showing a document at the exact same size as it would be when printed out. A computer screen is "real world" too, Oreo.
     
Chuckit
Clinically Insane
Join Date: Oct 2001
Location: San Diego, CA, USA
Status: Offline
Reply With Quote
Jun 18, 2007, 08:03 PM
 
Originally Posted by TETENAL View Post
Actually, from reading both of your posts, it seems pretty obvious that DG understands things better than you. There is nothing (other than the historical assumption of a display with a fixed resolution of 72 ppi) that prevents a computer from showing a document at the exact same size as it would be when printed out. A computer screen is "real world" too, Oreo.
Actually, Oreo's last point is relevant. WYSIWYG editors don't have to show things at the exact physical size they would be as long as they are to scale.
Chuck
___
"Instead of either 'multi-talented' or 'multitalented' use 'bisexual'."
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 18, 2007, 08:11 PM
 
Originally Posted by OreoCookie View Post
I think you misunderstand a few things here, DG: `point' doesn't mean pixels; the unit is much older than computer screens. The ruler in TextEdit isn't supposed to measure an inch on your screen, but an inch on your printout! The unit point was essentially defined in 1959 (a long, long time before DTP). The computer isn't lying to you; I think you have the wrong concept in mind. Since point is a unit of measurement in the real world, it's the printout and not the screen which counts, and the computer sticks to this. The reason why 1/72 of an inch = 1 pt is simply that this was close enough to alternative definitions of the point. If displays had been 100 dpi back then, I'm sure they would have used 1/100th or so. But again, point has nothing to do with pixels here.

It isn't a feature of the `classic Mac UI', since the dpi of a screen is independent of the OS (granted that the OS can display the resolution you need). The 12" dual USB iBooks were using 110 dpi. Even the clamshell iBooks and my PowerBook G3 Kanga (aka 3500) had 86 dpi (on OS 8 as well as 9.2).

WYSIWYG also doesn't mean the document is at the same length scale, but that you have a representation of your page with all the proportions the way you will see them on a printout. The ruler connects the length scale of your screen with the length scale of your printout. This is expected behavior, really.
Dude, did you read my post? I don't misunderstand what a "point" is. I've been a newspaper editor for many years. A "point" is a unit of measurement. Screens originally achieved WYSIWYG by making 1 pixel equal 1 point, i.e. 72 DPI (since a point is 1/72nd of an inch). The ruler in TextEdit should be the same size on the screen and on the printout if displayed at 100% (WYSIWYG).

The computer is lying, if for example in the Finder, you set the display font size to 12-point and it shows up at 9-point. If you disagree with that very logical statement, then not to flame, but you're smoking crack.

The classic Mac UI had to do with a combination of hardware and software (it always has; that's why Apple stands out). They specifically chose 72 DPI because it enabled them to make sure that fonts displayed at the proper sizes on the screen. By classic I do not mean iBooks! I mean the Mac Plus, Mac II series, etc. WYSIWYG display fonts have been gone for some time, since well before the PowerBook G3s and iBooks.

I understand that it can be considered a form of WYSIWYG if the display font is proportional to the printout. However, having a preference which allows you to set a font that is never intended for printing (such as user interface elements) at a particular point size, and then having it actually display at a DIFFERENT point size, is pointless -- literally.

-=DG=-
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 18, 2007, 08:16 PM
 
Originally Posted by Chuckit View Post
Actually, Oreo's last point is relevant. WYSIWYG editors don't have to show things at the exact physical size they would be as long as they are to scale.
Yes... BUT, the DEFAULT should be to show it at the same size as the printout, unless you "zoom in" or "zoom out". For example, if I am displaying something at "100%" then it should be the same physical size on the screen as on the printout.

Programs like TextEdit, which do not have a "zoom" control, should default to displaying things at 100% real-world size, since otherwise a 9-point font gets shrunk down to an unreadable 6-point on the screen, forcing the user to raise it to a larger size in order to edit the document without going blind, and then forcing them to change it back to the desired font size for printing.

I mean come on this isn't rocket science. Why is this such a hard concept to understand? Probably because you have never actually used a computer that had true WYSIWYG, so it is an utterly foreign concept to you. Well I remember the days of true WYSIWYG, and I am just hoping that Leopard brings them back!!!

-=DG=-
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 18, 2007, 08:18 PM
 
It sure is. However, a screen is a resolution-dependent device. The dpi of the TFT is fixed (if you use the native resolution, that is), and the only mistake is that it is way more than 72 dpi these days. It has nothing to do with `classic MacOS' or something like that. The reason that 1 pt corresponded to 1 px on these ancient displays is a `coincidence'.

I also think that I don't misunderstand the nature of the unit point: it's not connected to pixels; the definition is 1/72nd of an inch. So the only way to have 1 point on your printout = 1 point on your screen is with a resolution-independent UI (in `real' units, not pixels). To my knowledge, only Leopard sports that (perhaps also Vista, I'm not sure). For sure, it wasn't a paradigm of pre-OS X MacOS.

The reason for the discrepancy is IMHO two-fold: (i) Computer designers got used to the unit point and associated a certain size with it. 1 point became synonymous with a certain amount of pixels, if you wish. (ii) Every OS to date still uses the `incorrect conversion' point-to-pixel, in essence, the OS measures with pixels and not points.
( Last edited by OreoCookie; Jun 18, 2007 at 08:52 PM. )
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 18, 2007, 08:27 PM
 
Originally Posted by Dark Goob View Post
Dude, did you read my post? I don't misunderstand what a "point" is. I've been a newspaper editor for many years. A "point" is a unit of measurement. Screens obtained WYSIWYG originally by making 1 pixel to be 1 point by doing 72DPI (since a point is 1/72nd of an inch). The ruler in TextEdit should be the same size on the screen and on the print-out if displayed at 100% (WYSIWYG).
You do misunderstand what the problem is, and it's two-fold: (i) for computer companies (long before OS X), point has become synonymous with pixel, and you make the same mistake: you insist that the display's resolution should be 72 dpi so that 12 pt Times is as large on your printout as it is on your screen. (ii) WYSIWYG doesn't mean that what you see on your screen is an unscaled image of what your printout is going to look like; it's a general interface concept. You may argue that in certain situations a good WYSIWYG app should offer such a choice, but it's really just an interface concept and not a peculiarity of the Mac.

IMHO the ruler in TextEdit should indicate the scale of the text I've typed. Other than that, I want to see as much text on my screen as possible while still remaining legible. I don't care about dpi while writing. It's different for other applications (e.g. layout), but that is a different kind of application. 100% has become customary for people and computer companies alike. Don't get me wrong, I understand that 100% doesn't correspond to life-size (blame Microsoft), but that's what the rulers are for. I would say everybody has developed a feel for what 12 pt should look like (arguably the standard size used in most letters these days).

I do agree that 100% zoom in certain apps (essentially non-pixel-based apps) should give you a life-size rendering of your document on screen. However, this is a problem of the app, not of the OS (given the assumption that OS GUI elements are measured in pixels and pixels = points).

IMHO a resolution-independent interface is really a good way out of this debacle: tell the OS what the resolution of your screen is and help people with bad eyes to scale their interface to their liking. And help people like you so that 1 cm of your print-out = 1 cm of your screen -- not by lowering the resolution so that your display has 72 dpi.
( Last edited by OreoCookie; Jun 18, 2007 at 08:50 PM. )
I don't suffer from insanity, I enjoy every minute of it.
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 18, 2007, 11:38 PM
 
Originally Posted by OreoCookie View Post
It sure is. However, a screen is a resolution-dependent device. The dpi of the TFT is fixed (if you use the native resolution, that is), and the only mistake is that it is way more than 72 dpi these days. It has nothing to do with `classic MacOS' or something like that. The reason that 1 pt corresponded to 1 px on these ancient displays is a `coincidence'.

It's not a coincidence. Apple deliberately established 72 DPI as the standard screen pixel density for their platform in the early days of the Mac, so that WYSIWYG could be possible. For many years all their monitors had a default setting that was 72 DPI. Likewise, all bitmap fonts (aka screen fonts) were designed so that if they were displayed at 72 DPI, they would show up at the proper point-size on the screen.

Also, it was not a "coincidence" that they chose 72 DPI as the resolution to use. I am quite well aware that the length standard "point" predates computers. But what you may not know is that the reason the "point" was defined as 1/72nd of an inch was that this size was about the minimal distance difference perceivable by humans. That is to say, two things next to each other which have only a 1/300th of an inch difference look to be the same size, whereas it is possible to tell the difference when one of the objects or type characters is 1/72nd of an inch smaller. This is similar to how the movie industry settled on 24 FPS as the standard, since anything slower would look choppy and the "persistence of vision" effect would be compromised.

It was convenient and logical to simply make the standard pixel size exactly one point, so that screen fonts could easily be designed to show up at the proper size on the screen.

Of course, as technology advanced, in the late 90s many people started to use much higher pixel densities on computer displays. Therefore the old 72 DPI standard became irrelevant, yet the use of bitmapped "screen fonts" still remained standard practice -- thus the tendency of operating systems to render fonts at smaller than the size they are supposed to be rendered at.
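The shrinkage described above is plain unit arithmetic. A minimal sketch (the 110 dpi panel is an arbitrary example, not a reference to any particular display):

```python
def point_to_pixels(points: float, dpi: float) -> float:
    """Convert a type size in points (1/72 inch) to device pixels."""
    return points * dpi / 72.0

# On a legacy 72 dpi screen, 12 pt maps to exactly 12 px.
print(point_to_pixels(12, 72))               # 12.0

# On a 110 dpi panel the same 12 pt needs ~18.3 px; an OS that still
# draws "12 pt" as 12 px shows it physically smaller than 12 pt.
print(round(point_to_pixels(12, 110), 1))    # 18.3
```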

The idea hopefully is that a resolution independent OS like 10.5 would finally do away with bitmapped "screen fonts" in favor of things always being rendered at an accurate size, with the OS knowing the physical dimensions of the screen and its pixel density.

> > >
I also think that I don't misunderstand the nature of the unit point: it's not connected to pixels; the definition is 1/72nd of an inch. So the only way to have 1 point on your printout = 1 point on your screen is with a resolution-independent UI (in `real' units, not pixels). To my knowledge, only Leopard sports that (perhaps also Vista, I'm not sure). For sure, it wasn't a paradigm of pre-OS X MacOS.

The reason for the discrepancy is IMHO two-fold: (i) Computer designers got used to the unit point and associated a certain size with it. 1 point became synonymous with a certain amount of pixels, if you wish. (ii) Every OS to date still uses the `incorrect conversion' point-to-pixel, in essence, the OS measures with pixels and not points.
< < <

Yes you are correct here.

> > >
You do misunderstand what the problem is, and it's two-fold: (i) for computer companies (long before OS X) point has become synonymous to pixel, you make the same mistake: you insist that the display's resolution should be 72 dpi so that 12 pt Times is as large on your print out as it is on your screen. (ii) WYSIWYG doesn't mean that what you see on your screen is an unscaled image of what your printout is going to look like, it's a general interface concept. You may argue that in certain situations, a good WYSIWYG app should offer such a choice, but it's really just an interface concept and is not a peculiarity of the Mac.
< < <

No, I am not insisting that we still use 72 DPI. You misunderstand me. I simply am saying that we should display a 12-point font at 12-point on the screen (1/6th of an inch). It would be rendered at 102 DPI/PPI, or 130 DPI/PPI, or whatever pixel density your current monitor is set to. But it would still show up as 12 points in size, since 12 points is a specific unit of measurement that predates computers, and is equivalent to 1/6th of an inch.

To address your point (ii), WYSIWYG DOES mean that what you see on your screen is an unscaled image of what your printout is going to look like. What You See Is What You Get. That's what WYSIWYG stands for. That's the point of it.

Also, you should not be so fixated on "printing." Not all text gets printed!!! If I'm in the Finder preferences and set my window to display filenames at 16-point, they should show up at 16-point on the screen, not a shrunken version of 16-point. It has nothing to do with whether or not I'm going to ever print it out.

If it is an application for word processing or desktop publishing, when set to 100%, fonts should render at 100% of the size that they are going to print at.

> > >
IMHO the ruler in TextEdit should indicate the scale of the text I've typed.
< < <

It's a RULER. The size of a ruler at 100% (default size) should be the same size as the ruler in real life! How is this so hard to understand?

All currently made Mac screens are at least as wide as a standard sheet of paper, so I don't see why it would be necessary to, by default, shrink every ruler down to smaller than 100%.

> > >
Other than that, I want to see as much text on my screen as possible while still remaining legible. I don't care about dpi while writing.
< < <

Well then WYSIWYG is not for you. What you want is a shrunken down, zoomed-out version of what the printout will look like (i.e. What You See Is SMALLER than What You Get).

> > >
It's different for other applications (e. g. page layout), but that's a different use case. 100 % has become customary for people and computer companies alike. Don't get me wrong, I understand that 100 % doesn't correspond to life-size (blame Microsoft), but that's what the rulers are for.
< < <

Well, just because the status quo has gone away from WYSIWYG, that does not mean that the Mac should not return to being a WYSIWYG platform. The ruler on the screen should be the same size as the units it's telling you when displayed at 100%. Otherwise, what is the meaning of "100%" if it's really only 80% of the real size? Come on, that's just absurd.

> > >
I do agree that 100 % zoom in certain apps (essentially non-pixel-based apps) should give you a 1-to-1 rendering on your screen. However, this is a problem of the app, not of the OS (when you make the assumption that you measure OS GUI elements in pixels and pixels = points).
< < <

Well when it comes to rendering point-sizes of fonts, which is a service of the OS, then it should be handled in WYSIWYG fashion as I am suggesting. Other than that, it should be up to the application developer to ensure WYSIWYG, but the OS should offer aid in that regard as well (such as providing information to the app on what the current screen DPI is, etc.).

> > >
IMHO a resolution-independent interface is really a good way out of this debacle: tell the OS what the resolution of your screen is and help people with bad eyes to scale their interface to their liking. And help people like you so that 1 cm on your print-out = 1 cm on your screen -- not by lowering the resolution so that your display has 72 dpi.
< < <

Again, I am not suggesting that the display be at 72 DPI. That would be absurd.

I am just saying that since a point = 1/72nd of an inch, a 72-point font at 100% on the screen should be 1" in size. Not smaller, not larger. And, 100% should be the default. If you want to have as much text as possible on the screen, then you would just set your font to 9 point or something like that which is really small, or you zoom out to 75% or whatever.

The whole point is that I DON'T WANT to have to set my screen to 72 DPI in order to get TextEdit to show me my fonts and ruler at a non-shrunken size! I WANT to have my screen be 110 DPI or whatever it is, and have things render properly! Which is the point of resolution independence.
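What's being asked for here amounts to a single scale factor. A hedged sketch (the 72 dpi baseline is the historical Mac assumption discussed in this thread; `ui_scale_factor` is an illustrative name, not an Apple API):

```python
def ui_scale_factor(screen_dpi: float, base_dpi: float = 72.0) -> float:
    """Factor a resolution-independent UI would apply so that lengths
    specified in points keep their physical size on any screen."""
    return screen_dpi / base_dpi

# On a 72 dpi screen nothing changes; on the 110 dpi screen from the
# post, everything drawn in points is scaled up by roughly 1.53x
# instead of shrinking.
print(ui_scale_factor(72))                  # 1.0
print(round(ui_scale_factor(110), 2))       # 1.53
```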

And I want to know if Leopard does this, or if TextEdit and fonts in the Finder, browsers, etc. are still all shrunken down from the point-sizes that the preferences boxes say they are. That's all.

-=DG=-
( Last edited by Dark Goob; Jun 18, 2007 at 11:46 PM. )
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 19, 2007, 02:45 AM
 
I don't think there's much disagreement between us. I'm well aware that older Apple Displays tended to adhere to the 72 dpi idea (correct me if I'm wrong, but I think the last of such displays was the BW 21" Apple Display). Of course many people chose (past tense) higher resolutions to cram more PS/Illustrator palettes on their computer or whatnot.

I still think your idea of WYSIWYG is a bit off and too narrow. WYSIWYG is not about 1-to-1 reproductions, but about a representation without additional layers of abstraction. In the old days, word processors ran on DOS/Novell/whatever, and you would set margins in a menu using the F keys, for example. The WYSIWYG paradigm says that you should see the 2.5 cm left margin on screen, e. g. with a ruler (if you choose to enable it). WYSIWYG doesn't necessarily mean the left margin must measure 2.5 cm on your screen, but 2.5 cm with respect to the on-screen ruler, for instance. This representation is entirely independent of the zoom factor. Now, you get no argument from me that it would be more accurate if 2.5 cm at a 100 % zoom factor corresponded to 2.5 cm on your screen; it doesn't (for reasons we all agree on). But it's still 2.5 cm with respect to the ruler on the screen and hence consistent with the WYSIWYG paradigm.
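The zoom arithmetic behind this can be made concrete. A sketch, assuming the renderer knows the true screen dpi (110 dpi is an arbitrary example):

```python
def doc_cm_to_screen_px(cm: float, screen_dpi: float, zoom: float = 1.0) -> float:
    """Map a document length in cm to on-screen pixels at a zoom factor."""
    CM_PER_INCH = 2.54
    return cm / CM_PER_INCH * screen_dpi * zoom

# A 2.5 cm margin at 100 % zoom on a 110 dpi screen needs ~108 px to be
# physically 2.5 cm; a renderer that assumes 72 dpi draws only ~71 px,
# which is why the margin looks smaller than life-size.
print(round(doc_cm_to_screen_px(2.5, 110)))    # 108
print(round(doc_cm_to_screen_px(2.5, 72)))     # 71
```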
( Last edited by OreoCookie; Jun 19, 2007 at 03:07 AM. )
I don't suffer from insanity, I enjoy every minute of it.
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 19, 2007, 04:37 AM
 
Originally Posted by OreoCookie View Post
I still think your idea of WYSIWYG is a bit off and too narrow. WYSIWYG is not about 1-to-1 reproductions, but about a representation without additional layers of abstraction. In the old days, word processors ran on DOS/Novell/whatever, and you would set margins in a menu using the F keys, for example. The WYSIWYG paradigm says that you should see the 2.5 cm left margin on screen, e. g. with a ruler (if you choose to enable it). WYSIWYG doesn't necessarily mean the left margin must measure 2.5 cm on your screen, but 2.5 cm with respect to the on-screen ruler, for instance. This representation is entirely independent of the zoom factor. Now, you get no argument from me that it would be more accurate if 2.5 cm at a 100 % zoom factor corresponded to 2.5 cm on your screen; it doesn't (for reasons we all agree on). But it's still 2.5 cm with respect to the ruler on the screen and hence consistent with the WYSIWYG paradigm.
Listen. Forget about rulers for a second.

Go to the Finder. Click on your desktop, or in any window. Hit command-J, or go to View > Show View Options. The view options box will appear. You will see a pop-up menu which says Text Size. Click it, and it will come up with the following options: 10 pt, 11 pt, 12 pt, 13 pt, 14 pt, 15 pt, 16 pt. Now, set your text to 12 pt. Rename a file "I". Measure the length of the "I" on your screen with a ruler. Is it 12 points in size?

In Leopard, I want the answer to this question to always be yes, because the resolution-independent GUI will know my screen size, aspect ratio, and resolution, and it will render a 12-point font at the proper size on my screen (Since otherwise what's the point of calling it 12 points? See my point?)

Now back to the question about rulers. I agree that, hey, it's better than nothing, and is loosely WYSIWYG, if your displayed document (rulers and all) is proportional to the printed document. However, I also feel that by default, a program should display rulers at 100%, so that their actual size on the screen matches the units they claim to be. That is what we call "user interface predictability", and it is part of what makes computers "user friendly". Someone who is new to computers would have a much easier time understanding that than understanding the concept that "hey, your rulers (and everything else) are smaller than what they will print out as, because someone else decided for you that would be the best way to do it, and haha, if you want it otherwise, you will have to go through some complex steps to fix it."

See, I'm a Mac guy. I think it should be simple. Things should just work. They should be the same size, color, font, and everything else on the screen as in the print, as close as technically possible, by default. If someone wants to "zoom out" and shrink it down to fit more pages on the screen, or what-not, then by all means enable that as a feature. But the user should not have to monkey with stuff or perform arcane mathematical equations in order to get 1" on the screen ruler to be 1" in real life, because 1" is 1", and should be 1" unless otherwise requested. Get me?

-=DG=-
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 19, 2007, 04:50 AM
 
We've already established that points have essentially become pixels on the computer -- or what's the point of using 13 pt Lucida Grande for menus when pixels are the more logical choice?
I don't suffer from insanity, I enjoy every minute of it.
     
red rocket
Mac Elite
Join Date: Mar 2002
Status: Offline
Reply With Quote
Jun 19, 2007, 07:38 AM
 
I agree with Dark Goob.

If I have a page of printed text or anything else in my hand, I should be able to slap it up against the screen, and it should overlay its electronic equivalent with 100% perfection, by default.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Jun 19, 2007, 09:53 AM
 
I am with DG on this one - you're absolutely correct. The OS should know - by manual setting if required, preferably by reading out EDID data - how big the display is, so that a 12 point I is always 12/72 of an inch high, or as close as can be achieved within the native resolution of the display. A 10 cm ruler should be exactly 10 cm long, if I put a ruler on the display. The OS can provide zoom functionality if required - similar to decreasing the resolution these days - but that should not be the default.
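The EDID-based calculation P describes is straightforward. A sketch (the panel dimensions below are hypothetical, and real EDID parsing with its actual field layout is not shown):

```python
def dpi_from_edid(h_pixels: int, width_mm: float) -> float:
    """Physical pixel density from the native horizontal resolution and
    the screen width a display reports in its EDID block (millimetres)."""
    MM_PER_INCH = 25.4
    return h_pixels / (width_mm / MM_PER_INCH)

# A hypothetical 1680x1050 panel about 433 mm wide comes out near 99 dpi,
# so a 12 pt glyph would need 12 * 99 / 72 ≈ 16.5 px to be true to size.
print(round(dpi_from_edid(1680, 433)))    # 99
```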
     
kman42
Professional Poster
Join Date: Sep 2000
Location: San Francisco
Status: Offline
Reply With Quote
Jun 19, 2007, 12:53 PM
 
I agree with everyone. WYSIWYG in practical terms became WYSIWYG x Scaling Factor to make your whole doc fit on the screen. This was convenient when we didn't all have 24" monitors. Nowadays, we should get back to WYSIWYG being WYSIWYG x 100% Scaling Factor or just what it is in the real world. This should be the default for all fonts and applications. It would require some getting used to, but would be great in the long run. Resolution independence should make it possible.

kman
     
JKT
Professional Poster
Join Date: Jan 2002
Location: London, UK
Status: Offline
Reply With Quote
Jun 19, 2007, 01:59 PM
 
Originally Posted by P View Post
I am with DG on this one - you're absolutely correct. The OS should know - by manual setting if required, preferably by reading out EDID data - how big the display is, so that a 12 point I is always 12/72 of an inch high, or as close as can be achieved within the native resolution of the display. A 10 cm ruler should be exactly 10 cm long, if I put a ruler on the display. The OS can provide zoom functionality if required - similar to decreasing the resolution these days - but that should not be the default.
I'm not going to argue against this concept (though there are issues of usability in terms of being able to see e.g. your whole document and all your palettes etc on screen at the same time), but what I am curious to know is if this has actually ever been the case in the past. DG appears to be saying that it has, but for the life of me, I can't recall it being so (however, I didn't start using GUI computers regularly until the early nineties, and Macs until the early to mid-nineties, by which time screen resolutions and physical document sizes bore no relationship to each other).

If it hasn't been the case in the past, then RI is actually the first time it will be possible for the OS to do this entirely by default, no matter (CRTs notwithstanding) what monitor you have attached and what resolution you use.
     
Don Pickett
Professional Poster
Join Date: Mar 2000
Location: New York, NY, USA
Status: Offline
Reply With Quote
Jun 19, 2007, 04:35 PM
 
Originally Posted by kman42 View Post
I agree with everyone. WYSIWYG in practical terms became WYSIWYG x Scaling Factor to make your whole doc fit on the screen. This was convenient when we didn't all have 24" monitors. Nowadays, we should get back to WYSIWYG being WYSIWYG x 100% Scaling Factor or just what it is in the real world. This should be the default for all fonts and applications. It would require some getting used to, but would be great in the long run. Resolution independence should make it possible.

kman
No it should not, and I set type for a living. There are times, especially when I'm reading a long document, that I want to cram as much text onto the screen as possible. Having 12 point type always be 12 point type limits the amount of text I can cram onto my screen. I don't want that limitation. It would be an absolute killer on a laptop screen. OS X is hungry for desktop real estate as it is.

All of you need to pop a couple Xanax and go outside for a while. It's the place with the sun and the trees and other people.
The era of anthropomorphizing hardware is over.
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 19, 2007, 04:58 PM
 
Originally Posted by OreoCookie View Post
We've already established that points have essentially become pixels on the computer -- or what's the point of using 13 pt Lucida Grande for menus when pixels are the more logical choice?
No, we haven't agreed to that. Pixels are NOT a logical choice because pixels are a lot smaller on some screens than on others.

The whole POINT of a resolution-independent GUI would be that you DON'T tie interface elements to being a particular number of pixels in size. Rather, the POINT is that if you set the interface font to display at 12-point size, then it displays at that physical size of 12 points (no matter whether it takes 12 pixels or 18 pixels or 25 pixels to represent those 12 points, as it would on a 72 DPI screen, 108 DPI screen, or 150 DPI screen respectively). If, as you suggest, we tie it to pixels, that is called RESOLUTION DEPENDENCE and would result in text being shrunken down due to being tied to pixels (resolution)!

Why is this so hard to understand for you?

-=DG=-
     
Dark Goob
Forum Regular
Join Date: Aug 2001
Location: Portland, OR
Status: Offline
Reply With Quote
Jun 19, 2007, 05:26 PM
 
red rocket wrote:
> > >
If I have a page of printed text or anything else in my hand, I should be able to slap it up against the screen, and it should overlay its electronic equivalent with 100% perfection, by default.
< < <

P wrote:
> > >
I am with DG on this one - you're absolutely correct. The OS should know - by manual setting if required, preferably by reading out EDID data - how big the display is, so that a 12 point I is always 12/72 of an inch high, or as close as can be achieved within the native resolution of the display. A 10 cm ruler should be exactly 10 cm long, if I put a ruler on the display. The OS can provide zoom functionality if required - similar to decreasing the resolution these days - but that should not be the default.
< < <

kman42 wrote:
> > >
I agree with everyone. WYSIWYG in practical terms became WYSIWYG x Scaling Factor to make your whole doc fit on the screen. This was convenient when we didn't all have 24" monitors. Nowadays, we should get back to WYSIWYG being WYSIWYG x 100% Scaling Factor or just what it is in the real world. This should be the default for all fonts and applications. It would require some getting used to, but would be great in the long run. Resolution independence should make it possible.
< < <

Thanks for the vindication, guys. At least someone understands what the heck I'm saying.

JKT wrote:
> > >
I'm not going to argue against this concept (though there are issues of usability in terms of being able to see e.g. your whole document and all your palettes etc on screen at the same time), but what I am curious to know is if this has actually ever been the case in the past. DG appears to be saying that it has, but for the life of me, I can't recall it being so (however, I didn't start using GUI computers regularly until the early nineties, and Macs until the early to mid-nineties, by which time screen resolutions and physical document sizes bore no relationship to each other).

If it hasn't been the case in the past, then RI is actually the first time it will be possible for the OS to do this entirely by default, no matter (CRTs notwithstanding) what monitor you have attached and what resolution you use.
< < <

Yeah, back in the day, all monitors such as the Apple Portrait Display, Apple Two-Page Monochrome Display, etc. had default resolutions as close to 72 DPI as possible, ranging from 69-80. The Portrait Display, meant to simulate an 8.5 x 11" sheet of paper, actually offered 8" x 10.875" of screen real estate, but it was as close as the technology at the time allowed for. Similar situation with the Two-Page display, etc. Most screens with multiple-scan technology would have a resolution that was either 72 DPI or very close, like 69 or 75 DPI, for working with WYSIWYG. Obviously, Apple did not engineer the tubes themselves; they had to go with the closest thing available out there.
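Taking the 8" x 10.875" figure above together with the Portrait Display's commonly cited 640x870 native resolution (an assumption here, not stated in the post), the density works out to 80 dpi on both axes, matching the 69-80 range mentioned:

```python
def display_dpi(pixels: int, inches: float) -> float:
    """Pixel density along one axis of a display."""
    return pixels / inches

# Apple Portrait Display, per the figures in the post plus an assumed
# 640x870 native resolution: 80 dpi horizontally and vertically.
print(display_dpi(640, 8.0))       # 80.0
print(display_dpi(870, 10.875))    # 80.0
```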

But yeah, 10.5 would be the first time it's possible to really get it 100% right.

Don Pickett wrote:
> > >
There are times, especially when I'm reading a long document, that I want to cram as much text onto the screen as possible. Having 12 point type always be 12 point type limits the amount of text I can cram onto my screen. I don't want that limitation. It would be an absolute killer on a laptop screen. OS X is hungry for desktop real estate as it is.

All of you need to pop a couple Xanax and go outside for a while. It's the place with the sun and the trees and other people.
< < <

Whoah there nelly. I have not suggested that the computer should FORCE you to view everything at 100%! I'm just suggesting that it should be the default setting. And also that if you set your browser or Finder to show you things at 12-point, then it should do so (if you want to cram more text on the screen in those non-printing types of apps, then set the display font to 9-point!).

For your DTP and typesetting applications, you would just set your zoom to 75% or 50% or whatever enables you to get the amount of text crammed on the screen that you desire. But if you set it to the 100% view (default view), then it would show you the real-life size on the screen that it would print out at.

What's so bad about that? ... And man, drugs aren't the answer.

-=DG=-
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 19, 2007, 05:43 PM
 
Originally Posted by Dark Goob View Post
No, we haven't agreed to that. Pixels are NOT a logical choice because pixels are a lot smaller on some screens than on others.
I'm aware of that, gee. But up until recently, we didn't have the technology to do a resolution-independent interface. Hence, calling 1 pixel 1 pixel and not 1 point is accurate; identifying 1 pixel with 1 point (because that used to work for 72 dpi displays) is not.
Originally Posted by Dark Goob View Post
The whole POINT of a resolution-independent GUI would be that you DON'T tie interface elements to being a particular number of pixels in size.
You don't have to convince me of the fact that resolution-independent UIs are clearly superior; there is no disagreement here. Again, the technology for resolution-independent displays wasn't there! You need a vector-based (as opposed to pixel-based) UI for that. If the UI is pixel-based, then the pixel is the natural unit, not the point, not cm, not inch. Just try to change the font size of a pixel-based UI for all the menus and dialogs (there is/used to be an option for that under Windows). It's a nightmare for an interface designer (who measures in pixels; I've dealt with that problem extensively).

Now, the technology is there and we can measure independently of pixels. But that's because we have a vector-based model of UI elements.

Note that I'm not saying Word and Illustrator, for instance, should measure font sizes in pixel, because that is an ill-defined unit of measurement; here you should use point, millimeter or inch.
For pixel-based applications such as UI elements and websites, using pixels to specify font sizes makes sense; indeed, web designers are used to specifying things in pixels (if they wish to), and you can specify font sizes in pixels.
Originally Posted by Dark Goob View Post
Why is this so hard to understand for you?
It's not hard to understand, you've just misinterpreted what I've said. I'm a big fan of resolution-independent UIs and I see it as the only way to solve some of the issues consistently!
( Last edited by OreoCookie; Jun 19, 2007 at 06:27 PM. )
I don't suffer from insanity, I enjoy every minute of it.
     
teszeract
Dedicated MacNNer
Join Date: Oct 2002
Location: the end of the world
Status: Offline
Reply With Quote
Jun 19, 2007, 06:11 PM
 
Also, if there is no consensus that 12pt on screen==12pt on paper, then at the very least 100% should mean something, no? BTW, I'm with the 12pt==12pt camp.
     
teszeract
Dedicated MacNNer
Join Date: Oct 2002
Location: the end of the world
Status: Offline
Reply With Quote
Jun 19, 2007, 06:14 PM
 
But on the whole, I am so looking forward to resolution independence because, as I get older (happy 40th to me), all those tiny tiny pull-out arrows on palettes are making my arm ache with hits and misses. Not to mention reduced productivity. Adobe, put your hand up.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Jun 19, 2007, 06:17 PM
 
I don't think there are any camps here. 12 points should be 12 points, and 12 pixels shouldn't be confused with 12 points. Hence the paradigm shift from a pixel-based to a vector-based UI is a good thing.
I don't suffer from insanity, I enjoy every minute of it.
     
 
All contents of these forums © 1995-2017 MacNN. All rights reserved.