What happens if you span the desktop across the internal retina display and an external non-retina display and then move a window so that parts of it are on both displays? Is it just double size on the non-retina (which would be bad)? Or does it get drawn twice, in both resolutions, so that each display gets its appropriate resolution (which would be preferable, but might be overly complex and taxing on the hardware)?
It is a typical strategy of Apple to slightly underspec desirable aspects of a product like hard drive capacity or RAM (...or ports - sigh). That way, they can release an updated version where those specs are bumped, resulting in a product that is perceived as significantly better instead of just slightly better in comparison to the previous model.
I think it's also about using components that give you the largest gross margin - A slightly outdated model of a component will always be cheaper and more readily available than the cutting edge model - which further carries the risk of being less broadly tested.
An in-between step is often to offer that option (like a larger SSD), but at significant additional cost. Particularly with things like RAM, Apple sometimes charges you more than twice the regular market price: getting a Mac Pro, for instance, you pay ~1k € extra when jumping from 6GB RAM to 32GB. Buying 32GB of RAM yourself (yes, ECC and everything) sets you back 400€.
Their SSD prices have gotten slightly more reasonable lately (buying the regular HDD option, selling that drive and buying your own SSD often saved you hundreds), although they're still a good bit off - especially considering that Apple buys OEM from Toshiba and in bulk.
My personal conclusion is always to buy the bare minimum configuration and upgrade myself - I purchased a MacBook Pro 17" in the smallest configuration and added an SSD and an extra regular 1TB drive. Will probably add some RAM later this year. Had I gotten all that from Apple right away, I would probably have had to pay 50% extra (although they don't even offer the option of exchanging the DVD drive for an HDD).
If this is like the Air (I'm still waiting for the teardown), then I'm not sure how normal people are supposed to upgrade the memory, as it's soldered, while the hard drive is upgradeable (though probably not feasible for non-techies unaccustomed to opening up their laptop).
Ouch, you're absolutely right. This was also what I was thinking as I watched the ads - "Great, so my old strategy is useless now?". How quick one forgets... ;-)
I was hoping for the 512 upgrade option too. In the keynote though he mentioned that a lot of pro users like to use external disks and with thunderbolt transfer rates will be very fast. That's probably their reasoning.
My guess is the SSD upgrade would have cost something like $500 and at that point it doesn't make sense to differentiate it from the higher-end model which also includes the new processor.
Indeed. I think most pros who would need a computer like this would want to get the 512GB option on the laptop just to secure themselves over the next few years, specifically because you can't upgrade the internal hard drive yourself like on the old MacBook Pro.
Though it's possible Apple figured that if they offered such an option on the 'lower' model, most people wouldn't bother with the highest end, because the processor bump is marginal.
Agreed - I'm in the same boat. A 500 dollar premium for a larger SSD is steep when I'm not particularly interested in the faster proc. Yes, I could use an external HD, but I really don't like having to carry around more than I absolutely have to.
The smart move now will be to wait 2-3 months before buying the Retina Display machine. I'm convinced that by the launch of Mountain Lion the OS will let you use the full resolution of the machine for the UI, and it will also give other apps some time to update their UI to support the new resolution.
While the upgrade to Mountain Lion will be free, I have a funny feeling that Adobe will charge (or at least will try) you some extra money for an updated version of Photoshop for Retina Display MacBook Pro :).
Any self-respecting company or individual developer will probably provide a free update to the UI of their applications soon in order to support the new resolution.
Why? Apple products have always been such that to gain maximum "value" they need to be bought at launch. And isn't an OS upgrade going to be 'just' $30?
The keynote and Apple website point out that all purchases of Macs from now on will come with a free upgrade to Mountain Lion once it's released.
I wonder if the free upgrade will be the App Store version that allows you to install it on all your authorized Macs. If so, this could be an advantage to buying after the Mountain Lion release, since Macs that come with newer OSes generally don't allow you to upgrade your other Macs to the new OS.
The way it works now when you get a new Mac, the iLife apps (and presumably Lion) are "granted" to the Apple ID you use to register the computer. Thus you can use that ID to install them on other computers.
Instead of giving you the real pixels with a DPI setting, they hide them and create the retina/non-retina modes for applications, so you have to pay for "retina-enabled" apps. A whole new way of sucking money out of apple users.
You won't pay anything more. Pro Apple apps just got a free update to Retina-enabled, regular apps will get updated with Mountain Lion, and third party apps will also be updated for free.
I don't think you understand what this high-DPI thing is meant to be and how it is supposed to work. Giving you the real pixels is not an option, because every asset of every app would have to be updated or look tiny and be unusable.
It doesn't even make sense what you propose, because retina screens are meant to give extra detail for same sized elements, not merely extra pixels -- nobody needs 5MP in a 15", whereas the extra detail is easily appreciated.
There's no guarantee that 3rd party developers will provide a free update. They could release Foo HD alongside their Foo app and make you buy it again.
This is an advantage of hiDPI screens that hadn't quite clicked for me before - a throwback to one of the advantages of CRTs, allowing multiple resolutions to look good on a single display.
While true, to an extent, it's also the case that resolutions other than native will require some kind of supersampling (eg. AA), unless they divide cleanly into native resolution - a problem CRTs never faced.
This results in both blur (negligible with a high enough resolution, but that doesn't seem to be the case here, judging by the Safari vs Chrome comparison) and a performance hit.
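To make the divide-cleanly point concrete, here's a minimal Swift sketch (my own illustration with the panel's numbers, not how OS X actually decides anything): a requested mode maps onto the panel without resampling only when each native dimension is an exact integer multiple of the requested one.

    // Sketch only: clean mapping needs an integer ratio per axis; anything else
    // requires filtering, which is where the slight blur and the cost come from.
    let nativeWidth = 2880, nativeHeight = 1800

    func needsResampling(width: Int, height: Int) -> Bool {
        return nativeWidth % width != 0 || nativeHeight % height != 0
    }

    print(needsResampling(width: 1440, height: 900))   // false: exact 2x, simple pixel doubling
    print(needsResampling(width: 1680, height: 1050))  // true: fractional ratio, filtering required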
True, and even CRTs had the issue that if you set a resolution that created interference in the shadow mask it would look like crap. There was always a 'best' resolution for a CRT, even if it was forgiving in terms of different settings.
Given the 'free' GPU we get with machines these days (and by free I mean its always included) it seems reasonable to use its oversampling anti-aliasing feature for desktop display. I'm not sure why you wouldn't do that unless it was some sort of power issue.
The power issue is hugely significant. Using a GPU kills battery life, which is why laptops now have two GPUs: a low-power basic GPU and a switchable high-power one.
I get the impression that at every "resolution", we're really at the native retina resolution, but with UI widgets scaled to different levels, and different present-day resolutions reported to applications. Can anyone with one of these beauties confirm?
Yes, but this just amounts to scaling via the compositor and graphics stack (hardware-accelerated, of course), as opposed to being left to the 'screen' (post-GPU hardware/firmware) as would be the case when actually driving a screen at non-native resolution.
See the Safari/Chrome comparison in the article to see what this means in real terms...
I really don't like the approach of hiding the display's real pixel size. By now, we should have solved the problem of displaying GUIs with correctly sized elements at various resolutions. This approach only makes it more complicated to do it right in the future.
Apple's is based on the typographic convention of 72 points per inch, thus text on a Mac screen would be drawn the same size as it would print.
According to Wikipedia, Microsoft's 96 is apparently based on screen text being viewed from a different distance than printed text, so they use 96 ppi to account for the difference, which also gave them a few more pixels with which to draw characters. Apple had to draw a 10 point character 10 pixels high, but Microsoft could draw a 10 point character 13 pixels high, giving them more to work with.
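For what it's worth, the arithmetic behind those figures can be spelled out directly, assuming the straightforward conversion pixels = points * ppi / 72:

    // Assumed conversion: pixel height = point size * (screen ppi / 72 points per inch)
    func pixelHeight(points: Double, ppi: Double) -> Double {
        return points * ppi / 72.0
    }

    print(pixelHeight(points: 10, ppi: 72))  // 10.0  -> classic Mac: one point per pixel
    print(pixelHeight(points: 10, ppi: 96))  // ~13.3 -> Windows: roughly 13 pixels for a 10 pt glyph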
> Apple's is based on the typographic convention of 72 points per inch, thus text on a Mac screen would be drawn the same size as it would print.
This would be true if the displays were also 72 dpi, which they are not. DPI in Windows can be set manually. Mine is set to 84, based on the handy on-screen ruler in the settings page. If I view a document in Word's print preview at 100% zoom, and stick a printed page on my display (via static electricity, try it) next to it, they match exactly. Very handy for printing over a page with existing content (paper with preprinted corporate borders, headers).
I think the Mac displays used to be, at least at the maximum size. I vaguely recall Mac monitors being somewhat "lower-resolution" than same-size Windows monitors, just because of this. Maybe it was Mac laptops? Not sure. Certainly the original Mac 128 was fixed resolution.
"built-in display was a one-bit black-and-white, 9 in (23 cm) CRT with a resolution of 512×342 pixels, establishing the desktop publishing standard of 72 PPI"
ImageWriter prints were 144 dpi, and you could verify that the system was WYSIWYG by holding a print in front of the screen. Also, Mac OS graphics used 1/72" pixels for years (with various hacks added soon after to support the LaserWriter's 300 dpi).
Windows (initially?) had separate notions of device pixels and logical pixels (dialog units?) that allowed for some resolution independence. Its most obvious disadvantage was that it was not simple (nigh impossible?) to know whether two parallel lines you drew looked equally wide.
ISTR that displays also squawk their DPI over EDID. I say that based on experience fiddling with point sizes to get Glass Tty VT220 working right on several different displays; but of course you can also change the dpi with a magic xorg.conf setting, command line option, or xrandr.
The same reason why Apple has not changed iOS devices' aspects and "virtual pixel count" since day one. Android did allow random resolutions and screen aspects, and look at the mess it has caused.
What would be coming in the future, though, that we'd have to do it "right"? If this is good enough that people can't see pixels, do we ever need to go higher? Resolution seems like one of the few features that has a limit, so this "hack" may in fact be the most elegant and simplest solution for now and in the future.
Hopefully, it's the first step towards truly resolution-independent interfaces. Imagine differences in screen resolution being differences in quality, as opposed to scale. Of course there will need to be a lot of research into UI/UX to make the most of the opportunities.
In a way, when referring to monitors, the word 'resolution' is in itself a misnomer, as we're not actually talking about density but instead about absolute pixel measurements.
> Imagine differences in screen resolution being differences in quality, as opposed to scale.
That's exactly what this is, though — the window decorations, for example, in OS X on the new MBP are identical in size to the ones on the "old" MBP, they're just more detailed.
And since they're so detailed (hence the "retina display" moniker) that most people won't be able to see any pixels at all, and since in just a few years all displays will obviously be "retina displays", I think there's a strong argument to be made that resolution-independent interfaces aren't ever gonna happen and that that's not a bad thing at all. (They've been tried for years and they always end up a burden for the developer and/or not working well for the user.) The limitations of human biology make it so that this simple "hack" is actually the easiest for developers, works the best for users, and won't ever need to be replaced. … OK, yeah, one day we'll all have bionic eyes and we'll have to do the same doubling trick again, but presumably "Bionic-Retina Displays" will be cheap to produce by then. :)
Without completely eliminating all parameters to Cocoa and CoreFoundation drawing functions that take measurements in pixels and instead using pixel-less measures (sort of like how we specify font size in pts), this is always going to be a "hack".
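For illustration, here's roughly what a points-only measurement might look like; this is a made-up sketch, not a real Cocoa or CoreFoundation API, just to show where pixels would enter the picture under that approach:

    // Hypothetical points-only layout type: callers never mention pixels directly.
    struct PointSize {
        var width: Double   // in points
        var height: Double  // in points
    }

    // Pixels appear only when the backing store is allocated, via the display's scale factor.
    func backingPixels(for size: PointSize, scaleFactor: Double) -> (w: Int, h: Int) {
        return (w: Int(size.width * scaleFactor), h: Int(size.height * scaleFactor))
    }

    let button = PointSize(width: 30, height: 30)
    print(backingPixels(for: button, scaleFactor: 1.0))  // (w: 30, h: 30) on a standard display
    print(backingPixels(for: button, scaleFactor: 2.0))  // (w: 60, h: 60) on a Retina display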
Every time someone makes such a comment, my mind is blown at how we already live in the future. Just a few weeks ago, if someone had told me that they're planning to _downscale_ to 1920x1200 on a _15"_ screen on a portable computer, I would've thought they were crazy.
Er no, it upscales. When you put a 640x480 image through an old LCD that only supports 1024x768, it needs to scale the original image up, hence it upscales. Same if you put a 1080p image onto a retina display, which has a much higher resolution: it needs to upscale.
The resolution presented by the OS is ostensibly 1920x1200. This is what apps see and what they render bitmaps for.
This is then upscaled 2x in each direction to 3840x2400.
Then, this finally somehow gets put on 2880x1800. As I understand it, it gets downscaled from 3840x2400 directly to 2880x1800.
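Here's that pipeline as plain arithmetic (my numbers simply follow the description above; the actual compositing is obviously done on the GPU):

    // Step 1: apps see a 1920x1200 logical desktop.
    let logical = (w: 1920, h: 1200)
    // Step 2: everything is rendered at 2x per axis into a 3840x2400 backing buffer.
    let backing = (w: logical.w * 2, h: logical.h * 2)
    // Step 3: that buffer is filtered down to the panel's 2880x1800 physical pixels.
    let panel = (w: 2880, h: 1800)

    let downscale = Double(panel.w) / Double(backing.w)
    print(backing)    // (w: 3840, h: 2400)
    print(downscale)  // 0.75 -> each panel pixel averages roughly 4/3 backing pixels per axis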
LOL, it just dawned on me. You guys have been taken for such a fucking ride. I've just checked the numbers, I hadn't noticed it before.
The physical resolution of the monitor is 2880x1800. This is exactly 1.5x the original 1920x1200 input in each direction.
Now you must understand two things:
1. scaling is a DSP process.
2. You never use floating point operations in digital signal processing if you can get away with integers, because integer operations are so much cheaper and because the result is always more precise. (The only reason to use floats is to get more headroom, but that's a detail you don't need to worry about because it is in no way applicable to resolution scaling. You could use floats for pixel colour, and many systems do that already.)
So what does this mean? How do you multiply by 1.5 if you can only use integers?
--> YOU MULTIPLY BY THREE AND DIVIDE BY TWO. <--
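In code, the trick being described is just rational scaling with integer arithmetic (an illustration of the general technique, not a claim about what OS X actually does):

    // Scale a coordinate by 3/2 using only integer operations.
    func scaleByThreeHalves(_ x: Int) -> Int {
        return (x * 3) / 2
    }

    print(scaleByThreeHalves(1920))  // 2880
    print(scaleByThreeHalves(1200))  // 1800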
The intermediate resolution of 3840x2400 exists solely as a tiny step in the rescaling algorithm, a step which in itself does not add any information that wasn't there in the original 1920x1200 image. Similarly, some of you audio people may realize that most of your VST or AU plugins have 64x oversampling. Does this mean that your project is working at a 12 MHz sampling rate? NO!!! Your recorded material is still at 192k, and your project still does not contain any useful information above 96 kHz. Don't be silly. Being tricked when you don't know any better is one thing, but given that a majority of the people commenting on Mac hardware here seem to have some idea of audio processing, this is just fundamentalist hypocrisy.
You've fallen prey to sensationalist marketing and measurement of e-penis. Except your e-penis is completely made up and never actually happened.
Wow, some people in this thread are so, extremely, gullible. I had really come to expect more from HN, and am truly disappointed and dismayed.
>> You never use floating point operations in digital signal processing if you can get away with integers, because integer operations are so much cheaper and because the result is always more precise.
On Nvidia GPUs (the GPU in the machine), integer ops are second class citizens. The GPU architecture is optimized for floating point operations, and they do it extremely well. Using integers instead of floating point is not very advantageous. AMD GPUs are a different story however.
> The intermediate resolution of 3840x2400 exists solely as a tiny step in the rescaling algorithm, a step which in itself does not add any information that wasn't there in the original 1920x1200 image.
The thing you're missing is that this step does add information. OS X will render retina graphics for the UI instead of the traditional graphics, then those higher-resolution graphics are downsized. So, instead of an 18x18 icon, it will use a more detailed 36x36 icon, then downscale that. In the case of apps that haven't been updated for retina displays you are correct — certain assets will be upscaled. But the operating system and many core apps are ready for retina display.
Nobody is saying it's a perfect solution, but it's much better than just upscaling. The point is that at these densities downscaling can look pretty good — supposedly better than the native resolutions emulated by the downscaling.
Let's say you have image assets in two sizes - 1x and 2x. If you want to show an image at 1.5x, you don't upscale from 1x, you downscale from 2x so that you have a crisper image.
If you're looking for a practical application of this, watch a YouTube video fullscreen that has 720p and 1080p options on a display that's 1680x1050 or anything in between the two video sizes. The 1080p version is better looking because it's scaled down, not up.
This is the concept behind running a 3840x2400 framebuffer and downscaling it to 2880x1800 for display - you're getting a crisper image because of the use of 2x drawing by the system for text and standard UI elements. Of course, third-party bitmaps aren't going to benefit from this until they're updated, but Apple developers have been through this before with retina displays on iOS devices.
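A toy way to see why downscaling from a 2x source beats upscaling from a 1x source at a fractional target (purely illustrative numbers, not anyone's actual scaler):

    // For a 1.5x effective target, compare how many real source pixels feed each output pixel.
    let targetScale = 1.5

    let samplesFrom1x = 1.0 / targetScale  // ~0.67: less than one source pixel per output pixel,
                                           // so the scaler has to interpolate (soft, blurry edges)
    let samplesFrom2x = 2.0 / targetScale  // ~1.33: more than one real sample per output pixel,
                                           // so detail is averaged away rather than invented

    print(samplesFrom1x, samplesFrom2x)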
I buy a new MacBook Pro every year, and the dilemma here is similar to Anand's. I bought the high-resolution display on my current 15.4" MBP at 1680x1050. The resolution can be scaled to 1680x1050 or 1920x1200, but I'd like to know how blurry that is as opposed to native resolution. I've flipped my current MBP to 1440x852 to get a feel for the same effective real estate as the Retina Display's native resolution. Will try it out for a few days before pulling the trigger on upgrading.
Per the article, the new MBP actually renders at double the scaled resolution (so 3360x2100 for 1680x1050) and then downscales that to fit 2880x1800. (See the screenshots.) It shouldn't look blurry at all. In fact, it should look considerably better than a MBP with a 1680x1050 display.
But using a scaled resolution "may affect performance" (understandably, of course, because each refresh now has to draw an enormous desktop and downscale it).
I would probably look at apps other than Chrome. There's "unaware of the retina display", and there's "we eschewed native APIs to some extent and rolled our own implementations of stuff".
I don't know if this is the case, exactly, with Chrome. I'm just saying it might not be wise to generalize from Chrome's appearance to retina-naive 3rd party apps in general.
> I don't know if this is the case, exactly, with Chrome
As part of its security model, Chrome does something akin to rendering everything offscreen in an unprivileged process then passing that to the user-facing process.
For creators, one of the reasons to own these devices is to create content for retina devices.
I'm curious whether Apple and Adobe have worked together to allow image documents to display at true resolution within a scaled output.
Does that make sense? I suppose what I'm saying is: working on a canvas at 100% zoom (instead of, say, a 50% zoom) while all system elements are scaled.
EDIT: This might not even matter, from a practical perspective.
Why are they faking the resolution? Is it that the graphics card has problems, or is it to make their own apps look nicer than competitors, or what? I hope they are not rescaling stuff like photoshop to something other than actual screen res, otherwise they will severely piss off graphics professionals even more than they managed with the Mac Pro workstation non-upgrade.
[edit] reading through the article, it seems that they've either gone and broken the word 'resolution', or Anand is very confused.
> At 1440 x 900 you don't get any increase in desktop resolution compared to a standard 15-inch MacBook Pro, but everything is ridiculously crisp.
I have read this sentence three times and it still makes no sense unless the word 'resolution' has been completely mauled by marketing idiots.
Because bitmap-based applications (which is all of them) designed for 110 DPI are unusable at 220 DPI: you can barely click the buttons.
> I hope they are not rescaling stuff like photoshop to something other than actual screen res
From the keynote, I would guess they are, unless the application "opts in" to HiDPI support, as with iOS. HiDPI applications would be rendered at "true resolution" (2880x1800); non-HiDPI ones are automatically scaled to the specified global setting (for non-native controls; native controls probably use native rendering in all cases).
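As a toy model of that guess (pure speculation on my part, not Apple's implementation):

    // Speculative sketch: HiDPI-aware apps draw into a full-resolution backing store,
    // everything else is rendered at 1x and scaled by the window server.
    struct App {
        let name: String
        let supportsHiDPI: Bool
    }

    func backingScale(for app: App, displayScale: Double) -> Double {
        return app.supportsHiDPI ? displayScale : 1.0
    }

    print(backingScale(for: App(name: "UpdatedEditor", supportsHiDPI: true), displayScale: 2.0))  // 2.0
    print(backingScale(for: App(name: "LegacyEditor", supportsHiDPI: false), displayScale: 2.0))  // 1.0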
> I have read this sentence three times and it still makes no sense unless the word 'resolution' has been completely mauled by marketing idiots.
It scales bitmaps, but text is rendered at native resolution (or so I'd guess), so the text and "native" UI elements are going to be extremely crisp, because they'll be rendered with 4 times the usual amount of pixels for the same physical size.
> So the current existing version of photoshop can't access all the pixels then?
It can if you set the global resolution to 2880x1800, but because most UI elements are likely bitmapped it'll be unusable: a 30px button will remain 30px, but will cover a quarter of the physical surface.
Ah well, that would be sad. Though maybe not surprising either (after reading the article, it looks like the "emulated" resolutions are limited but games get access to the native resolution).
As a bitmap editor, it can access 100 times those pixels.
As a GUI, it renders as it always did on the Retina display, i.e. it doesn't show its buttons and labels in any more detail, or any smaller. This is how every non-Retina-ready app behaves.
That doesn't affect how big an image it can edit at all.
Windows 8 Metro, supposedly. They designed it to use either vector graphics or, for bitmaps, multiple supplied sizes. The largest size is chosen so that higher resolutions won't be necessary because the human eye can't perceive the difference, so it will be future-proof (at least that's what their marketing says).
FWIW, that's exactly the approach Apple has taken with both iOS and OS X. Some elements are rendered as vector graphics, most are rendered as bitmaps and the app provides a 1x and 2x bitmap.
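The lookup side of that is about as simple as it sounds; a rough sketch of the 1x/2x naming convention ("image.png" vs "image@2x.png") — the real resolution rules are Apple's, this just shows the idea:

    // Pick the "@2x" bitmap when drawing into a 2x backing store, the plain one otherwise.
    func assetFileName(base: String, scaleFactor: Double) -> String {
        return scaleFactor >= 2.0 ? "\(base)@2x.png" : "\(base).png"
    }

    print(assetFileName(base: "toolbar-icon", scaleFactor: 1.0))  // toolbar-icon.png
    print(assetFileName(base: "toolbar-icon", scaleFactor: 2.0))  // toolbar-icon@2x.png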
They are scaling them, it's just that they are scaling everything rather than allowing user control. I like scaling of stuff, as long as any window that I choose to push to native resolution goes to native resolution, especially on a machine this expensive. And I don't expect to have to go out and rebuy lots of software to enable that.