
College Programming Computer


I think saying the "effective resolution" is anything other than the real resolution is not really appropriate. You're complaining that the text and UI elements take up too much space/are too large, and that's a valid complaint, but it has nothing to do with resolution, "effective" or otherwise. I guess it's slightly better than just saying the "maximum desktop resolution is 1920x1200," but it's still vastly misleading.



Effective resolution is an accurate description of the problem. Prior to the MBP retina display, a laptop's screen resolution conveyed a certain amount of real-estate with which to organize applications. You can fit more application windows on a 1920x1080 screen than on a 1440x900 screen. When someone claims that the MBP has a revolutionary 2880x1800 screen, that implies an increase in screen real-estate compared to other systems. In reality, when using the recommended "retina" mode, you only have mediocre screen real-estate. You are effectively limited to 1440x900 pixels with which to organize your workspace.

That being said, 1920x1200 is great, but not a huge improvement over the many 1920x1080 laptops which have been available for years.
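
To make the arithmetic concrete, here's a rough sketch (assuming the default 2x "retina" backing scale factor; the AppKit calls are standard, everything else is just illustration):

[code]
import AppKit

// Rough sketch: how a 2880x1800 panel ends up as a 1440x900 workspace.
// In the default "retina" mode, window layout happens in points, and the
// backing scale factor (2.0 here) maps points to physical pixels.
let physicalPixels = (width: 2880.0, height: 1800.0) // actual panel pixels
let scale = 2.0                                      // default retina scale factor

let effective = (width: physicalPixels.width / scale,
                 height: physicalPixels.height / scale)
print(effective) // (width: 1440.0, height: 900.0) -- the "effective resolution"

// On a live system the same numbers can be queried from the main screen:
if let screen = NSScreen.main {
    let points = screen.frame.size // workspace size in points
    let pixels = CGSize(width: points.width * screen.backingScaleFactor,
                        height: points.height * screen.backingScaleFactor)
    print("points: \(points), pixels: \(pixels)")
}
[/code]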

[quote name='cowsarenotevil' timestamp='1341551204' post='4956213']
~stuff~
[/quote]

The problem is that the numbers of "pixels" you're throwing around have nothing to do with pixels. You're talking as if pixels were a measure of size relative to UI elements, and this simply isn't the case. If you were really limited to 1440x900 or 1920x1200 pixels, or even if those numbers represented, say, the amount of detail preserved in things like images (i.e. what everyone else who uses the term "effective resolution" means), it would be a completely different problem.

Think about it this way: you're only thinking of this "problem" in terms of resolution because you have a prior frame of reference for UI elements in pixels. If not for that, the numbers 1920x1200 and 1440x900 wouldn't just mean something obliquely related to how you're using them; they would mean literally nothing. If, like me and other non-Mac users, you didn't have any frame of reference for how big the UI is "supposed to be" relative to the number of pixels, you would be just as confused as I was when you started saying that the maximum resolution is actually much lower than it should be.

The UI elements are too big for your liking, and you can't make them smaller. This is potentially a great annoyance, and probably deserves proper recognition, so I don't see why you're obscuring the real problem by saying things that simply aren't true like "the maximum desktop resolution is 1920x1200."
-~-The Cow of Darkness-~-

The problem is that the numbers of "pixels" you're throwing around have nothing to do with pixels. You're talking as if pixels were a measure of size relative to UI elements, and this simply isn't the case. If you were really limited to 1440x900 or 1920x1200 pixels, or even if those numbers represented, say, the amount of detail preserved in things like images (i.e. what everyone else who uses the term "effective resolution" means), it would be a completely different problem.

Why even be able to set the resolution if that's the case?

I think all he was saying is that the high resolution is more about making the standard experience sharper/clearer rather than giving you more real estate. It's nice, but I think the impact would be hugely dependent on the user.

ANYWAY. Something I don't think anybody brought up is that you should check with your university to see if they have any student discounts. They're usually quite significant.

~stuff~


As I said before, resolution on monitors and laptops has always indicated a difference in workable screen real-estate. That has changed with the "retina" display. There are now two important numbers to know: actual resolution and effective resolution. One says something about how many physical pixels there are, and the other says something about how much usable screen real-estate there is. What happens if you create a native OS X application and tell it to render at 1440x900? How much of the screen will it take up? What does a website designed for widths up to 1600px look like in "retina" mode? For how people build and use applications, the effective resolution is the more important number, because that is what they are working with, whether they are designing a UI layout (yes, that is a 10 pixel gap, not 40) or organizing their desktop.
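
To illustrate (a minimal AppKit sketch of my own, using only the standard window APIs): a window created with a 1440x900 content rect fills the whole retina screen, because geometry is specified in points, not panel pixels.

[code]
import AppKit

// Minimal sketch: window sizes are specified in points. On the retina MBP's
// default mode the screen is 1440x900 points, so this content rect covers it
// completely, even though the panel is physically 2880x1800 pixels.
let rect = NSRect(x: 0, y: 0, width: 1440, height: 900)
let window = NSWindow(contentRect: rect,
                      styleMask: [.titled, .resizable],
                      backing: .buffered,
                      defer: false)

// Likewise, a "10 pixel" gap in a layout is really 10 points; the system
// renders it with 20 device pixels at a 2x backing scale factor.
let gapInPoints: CGFloat = 10
let gapInDevicePixels = gapInPoints * window.backingScaleFactor // 20 at 2x
[/code]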

ANYWAY. Something I don't think anybody brought up is that you should check with your university to see if they have any student discounts. They're usually quite significant.


Depends on if you consider $100 off to be significant.

As I said before, resolution on monitors and laptops has always indicated a difference in workable screen real-estate. That has changed with the "retina" display. There are now two important numbers to know: actual resolution and effective resolution. One says something about how many physical pixels there are, and the other says something about how much usable screen real-estate there is.


This might be a relatively convincing argument if "effective resolution" didn't already mean something completely different, and if you hadn't started talking about "maximum resolution" without any of this context.

For how people build and use applications, the effective resolution is the more important number, because that is what they are working with, whether they are designing a UI layout (yes, that is a 10 pixel gap, not 40) or organizing their desktop.
[/quote]

Only people who use some applications. For image editors and video editors, the number of pixels is what people are interested in; that's why programs like GIMP let you display images according to the monitor size/DPI as well as at some kind of "actual pixels" setting. The same goes for vector graphics and design programs.

As for building applications, don't you think, given the wide variety of screen sizes on the market, that it's a good thing that it's starting to be possible to specify distances in a resolution-independent way versus always specifying things in pixels? It sounds like Apple hasn't quite caught up with itself in this regard, and that's a problem, but again, it's a different problem than one of the monitor having a lower resolution than it claims to.
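
Something like this is what I have in mind (a sketch, assuming AppKit's point-based drawing model; convertToBacking is the actual API for the rare cases that genuinely need device pixels):

[code]
import AppKit

// Sketch: layout and drawing happen in points, which are resolution-
// independent. Only code that genuinely needs device pixels (pixel-exact
// image display, say) should convert explicitly via the backing store.
final class RuledView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        // A 1-point rule: drawn 1 pixel tall on a normal display and
        // 2 pixels tall on a 2x retina display, so it keeps the same
        // physical size on both.
        let rule = NSRect(x: 10, y: 10, width: bounds.width - 20, height: 1)
        NSColor.black.setFill()
        rule.fill()

        // When real pixels matter, the conversion is explicit:
        let inPixels = convertToBacking(rule)
        Swift.print("1 point -> \(inPixels.height) device pixels tall")
    }
}
[/code]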

I'd go so far as to say that, for many users, higher resolution -> smaller UI elements is considered a bad thing, because the UI elements do not increase in "resolution" (that is, detail does not increase, aliasing does not decrease); they just get smaller while otherwise staying the same. Of course, for you it's the opposite: resolution means area in which to fit things. This is perfectly valid, but when you start throwing around terms like "maximum resolution" and "effective resolution" contrary to their traditional meaning, you shouldn't be surprised when people don't understand.
-~-The Cow of Darkness-~-

[quote name='tstrimple' timestamp='1341603581' post='4956440']
~stuff~
[/quote]

I admit that maximum resolution isn't a great term, but if you look around, "effective resolution" is what pretty much every review I've read of the MacBook Pro screen uses when talking about the limited screen real-estate. Please show me references to the traditional definitions of effective and maximum resolution in the context of displays. Hell, if we want to be pedantic, resolution technically refers to pixel density, not absolute width/height in pixels. Common usage, however, is that the number of pixels wide by the number of pixels high represents the resolution.

Keep in mind, I'm far from the only one using "effective resolution" when referring to the MBP retina display. Good luck fighting against common usage.

https://www.google.com/search?btnG=1&pws=0&q=macbook+pro+%22effective+resolution%22

For image editors and video editors, the number of pixels is what people are interested in; that's why programs like GIMP let you display images according to the monitor size/DPI as well as at some kind of "actual pixels" setting. The same goes for vector graphics and design programs.


Just tested this in Seashore, an image editing app for OS X. When you select the "actual size" option, it fits the image to the effective pixels, not the actual pixels, meaning a 1440x900 image is full screen in retina mode and a fraction of the screen in 1920x1200 mode. Once again, this is when selecting the actual size zoom level. I don't have Photoshop so I cannot verify that it behaves the same way, but once again, effective > actual pixels.
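
For what it's worth, an app that wanted a true "actual pixels" mode could compensate for the scale factor explicitly. A hypothetical sketch (actualPixelZoom is my own name, not anything Seashore actually does):

[code]
import AppKit

// Hypothetical sketch: to show an image at 1:1 *device* pixels on a retina
// screen, scale the view down by the backing scale factor so one image pixel
// lands on one physical pixel instead of one point (= 4 pixels at 2x).
func actualPixelZoom(for view: NSView) -> CGFloat {
    let scale = view.window?.backingScaleFactor ?? 1.0 // 2.0 in retina mode
    return 1.0 / scale // a zoom of 0.5 maps image pixels 1:1 to device pixels
}
[/code]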

I admit that maximum resolution isn't a great term, but if you look around, "effective resolution" is what pretty much every review I've read of the MacBook Pro screen uses when talking about the limited screen real-estate.


If so, that's very disappointing, but that doesn't make them correct. Just because most articles about that face-eating attack said that "bath salts" were involved and that "bath salts are a form of LSD" doesn't make either of those things anything other than completely incorrect. If you look up "effective resolution" in any context other than reviews of the Macbook Pro (contexts which have been around much longer than those reviews), you'll find out that it actually means something different.

Just because there's a niche group that uses the term one way (in this case, it's strictly limited to reviews of the Macbook Pro with Retina Display), doesn't change the fact that a) the other usage has been around longer and is still more widely used and b) the other usage makes more sense with respect to the definitions of the word "effective" and "resolution."

Like I said from the beginning, you didn't sufficiently describe the situation, and relying on the fact that a lot of reviews of the product also fail to describe the situation in detail or even correctly doesn't really make it better.

Please show me references to the traditional definitions of effective and maximum resolution in the context of displays.
[/quote]

"Maximum resolution" in the context of displays always means either a) the native resolution of an LCD-type screen or b) the, well, maximum resolution of a CRT-type screen.

As for effective resolution, the traditional definition doesn't even apply to displays, because as far as I know there has never been (and still isn't) a display that has x pixels but renders them in such a way that fewer than x pixels of information can actually be extracted.

Even if I grant you everything you're saying about effective/maximum resolution, it's still not a problem with the display (so saying "in the context of displays" doesn't make sense), it's a problem with what's being put on the display by software.

Hell, if we want to be pedantic, resolution technically refers to pixel density, not absolute width/height in pixels. Common usage, however, is that the number of pixels wide by the number of pixels high represents the resolution.
[/quote]

Right, definitely, and according to both of these definitions, your claims are still inaccurate.


Just tested this in Seashore, an image editing app for OS X. When you select the "actual size" option, it fits the image to the effective pixels, not the actual pixels, meaning a 1440x900 image is full screen in retina mode and a fraction of the screen in 1920x1200 mode. Once again, this is when selecting the actual size zoom level. I don't have Photoshop so I cannot verify that it behaves the same way, but once again, effective > actual pixels.


To be fair, I did say "it sounds like Apple hasn't quite caught up with itself in this regard." But you're right, that's a problem. If you display the image at smaller sizes, does it actually throw away pixels? That would be even more of a problem, and I would be sad.

Also, do any of the numerous methods that supposedly render UI elements at their old sizes (in pixels) actually work? Because if they do then I'd think all of this is even less relevant.
-~-The Cow of Darkness-~-

