
Apple Releases New Crap (Page 2)
mindwaves
Registered User
Join Date: Sep 2000
Location: Irvine, CA
Status: Offline
Oct 18, 2014, 09:47 AM
 
Yes, the iPhone only having 1GB of RAM and 16GB of base storage (the iPad has the same base storage) is unacceptable. For $100 more you have to jump straight to 64GB on both the iPad and iPhone, instead of the more typical 32GB-to-64GB progression.
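The complaint in numbers, as a quick Python sketch (the 2014 iPhone tiers and the $100 step are as described above):

```python
# Apple's 2014 storage tiers skip 32 GB entirely, so the first
# paid upgrade step is 16 -> 64 rather than the typical 16 -> 32.
tiers_gb = [16, 64, 128]   # iPhone 6 tiers; no 32 GB option
step_usd = 100             # price difference per tier, per the post
extra_gb = tiers_gb[1] - tiers_gb[0]
print(f"${step_usd} for +{extra_gb} GB")  # $100 for +48 GB
```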
     
Shaddim
Clinically Insane
Join Date: Apr 2003
Location: 46 & 2
Status: Offline
Oct 18, 2014, 11:25 AM
 
FYI, I monkeyed around with a new retina iMac, and it's damned nice, but don't even consider buying one with only 2GB of VRAM. 4GB should be the bare minimum for 5K, and really, 6GB+ should be the target for any semi-serious gaming.
"Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it."
- Thomas Paine
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Oct 18, 2014, 01:38 PM
 
Originally Posted by EstaNightshift View Post
Shouldn't be disposable?

(I'd really rather have socketed RAM slots. I think I'll stick with my quad-core mini for a bit)
I must have made a typo and got caught by autocorrect. I changed it.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Oct 19, 2014, 06:53 AM
 
Originally Posted by Shaddim View Post
FYI, I monkeyed around with a new retina iMac, and it's damned nice, but don't even consider buying one with only 2GB of VRAM. 4GB should be the bare minimum for 5K, and really, 6GB+ should be the target for any semi-serious gaming.
Not really. There is no GPU on the market that lets you game at 5K, and even 4K requires a multi-GPU setup in most cases. To make it work, you have to drop to half the resolution, 1440p, or lower. For that, and for settings that make the game run on an M290X anyway, 2GB of VRAM is OK today, and will likely be usable for some time, as many cards have that.
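For scale, the framebuffers themselves are only a small slice of VRAM even at 5K. A rough sketch in Python (RGBA8 and triple buffering are assumptions; real engines add depth and G-buffers on top, and textures dominate):

```python
# Swap-chain memory at a given resolution: width x height x
# bytes-per-pixel x buffer count. Even at 5K this is well under
# 200 MB, which is why resolution alone isn't what eats VRAM.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

print(round(framebuffer_mb(5120, 2880)))  # 5K: ~169 MB
print(round(framebuffer_mb(2560, 1440)))  # 1440p: ~42 MB
```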

Best prediction is that 4GB will be enough for the lifetime of the current consoles. A game on them currently gets 5.5GB RAM + VRAM, and since a decent game will need at least 1.5GB RAM, 4GB VRAM should be enough until the PS5 and its competition arrives.
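The budget arithmetic above, spelled out (the 5.5GB allotment and the 1.5GB CPU-side estimate are the figures from the post):

```python
# Current consoles give a game ~5.5 GB of unified RAM + VRAM.
# Whatever isn't needed CPU-side is the most a straight port
# could want as VRAM on a PC.
game_budget_gb = 5.5   # unified allotment per the post
cpu_side_gb = 1.5      # minimum a decent game needs as plain RAM
vram_ceiling_gb = game_budget_gb - cpu_side_gb
print(vram_ceiling_gb)  # 4.0
```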
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Cold Warrior
Moderator
Join Date: Jan 2001
Location: Polwaristan
Status: Offline
Oct 19, 2014, 08:05 AM
 
Didn't someone make an external GPU rig for Thunderbolt? I don't know why that came to mind, and it's too early to research.
     
BadKosh
Professional Poster
Join Date: Aug 2007
Location: Just west of DC.
Status: Offline
Oct 19, 2014, 09:28 AM
 
The new iTunes REALLY SUCKS! Now, some of the internet radio streams are producing loud noise instead of the actual stream. My un-updated version (two versions back) doesn't have this "feature". Jobs would kill these guys.
     
Shaddim
Clinically Insane
Join Date: Apr 2003
Location: 46 & 2
Status: Offline
Oct 19, 2014, 11:32 AM
 
Originally Posted by P View Post
Not really. There is no GPU on the market that lets you game at 5K, and even 4K requires a multi-GPU setup in most cases.
The new GeForce 980 will produce decent-enough frames for 4K (50-80 fps in most demanding titles), especially when generously overclocked, which is what it's designed for, and SLI 980s will run 5K at about the same rate. What the 4GB option for the Retina iMac will do, however, is allow you to play @1440p, which will scale much better than 1080p.

To make it work, you have to drop to half the resolution, 1440p, or lower. For that, and for settings that make the game run on an M290X anyway, 2GB of VRAM is OK today, and will likely be usable for some time, as many cards have that.
1440p is too much for 2GB of VRAM, games like Shadow of Mordor require ~2.5GB of VRAM*, according to the MSI Afterburner HUD, just to run at medium textures @1440p (and a whopping 6GB to run high textures @4K).

Best prediction is that 4GB will be enough for the lifetime of the current consoles. A game on them currently gets 5.5GB RAM + VRAM, and since a decent game will need at least 1.5GB RAM, 4GB VRAM should be enough until the PS5 and its competition arrives.
With what is going down right now (Ubisoft and Activision finally threw the console makers under the bus), I think studios will stand up to Sony and MS and will open up the benefits of having a high-end gaming PC. This generation of consoles just plain sucks, with the XBone being the worst of the lot, and motivated PC gamers aren't allowing themselves to be held back by them; they have taken to actively hacking and modding games that try. I love what Carmack said recently (paraphrasing): "Today's player base for PC games is more tech-savvy than the developers themselves, and if we don't give them what they want, they'll just break our shit and rework it."



*Addendum:
One of the issues with Shadow of Mordor for PC (and Watchdogs), and the reason it requires so much VRAM, is extremely lazy porting. Consoles have unified memory, and the coding APIs are all written to take full advantage of that. Most developers address that and at least re-code the memory calls. Not so with SoM: when Monolith released it for PC, they simply put it in a new wrapper (essentially an emulator), tested it for a month to see if it was stable, and then threw it out to the public. By the end of the year some sufficiently nerdy gamers will rectify that and give the game the development attention it deserves, but they shouldn't have to. Did you know that the console manufacturers get final approval on all PC ports from the major publishers? To me, that's as bad as MS having control of every game written for the PS4 and being allowed to cripple them at will. That's some crooked s***.
"Those who expect to reap the blessings of freedom must, like men, undergo the fatigue of supporting it."
- Thomas Paine
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Oct 19, 2014, 03:28 PM
 
Originally Posted by Waragainstsleep View Post
And the new Mac Mini has soldered RAM.
Thanks for the info - I wanted to get the cheapest mini for my home cinema and upgrade the RAM myself... that seems not to be a very good idea!
Maybe I should go for the middle model instead...
I hope it's still easy to install a second HD (SSD)??
***
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Status: Offline
Oct 19, 2014, 04:12 PM
 
Originally Posted by badidea View Post
Thanks for the info - I wanted to get the cheapest mini for my home cinema and upgrade the RAM myself... that seems not to be a very good idea!
Maybe I should go for the middle model instead...
I hope it's still easy to install a second HD (SSD)??
Last year's model is what you're looking for.

Honestly, though, if all you're doing is media playback and light serving to other computers in the house the $499 new one will be fine, 4GB RAM and all.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Oct 19, 2014, 05:04 PM
 
Originally Posted by Shaddim View Post
The new GeForce 980 will produce decent-enough frames for 4K (50-80 fps in most demanding titles), especially when generously overclocked, which is what it's designed for, and SLI 980s will run 5K at about the same rate. What the 4GB option for the Retina iMac will do, however, is allow you to play @1440p, which will scale much better than 1080p.
You can do the same with a pair of 290s, they have better scaling at high resolutions. That's my point, though - one card is nowhere near enough for 5K.

The upgraded GPU will be an improvement at 1440p, but not particularly because it has more VRAM. It has more shaders, which will help when there are more pixels to calculate effects for, but that doesn't use more RAM. In fact, if it is the card I think it is (Tonga), it has the same number of ROPs as the M290X, which means that you can't run any heavy AA on it - meaning that you hardly need any more video RAM at all.

In fact, I ran through FC3 and the recent Batman games at 1440p on a 7850/2GB. That worked fine, as long as you didn't set the idiot settings (Metro: LL was too much, though).

Originally Posted by Shaddim View Post
1440p is too much for 2GB of VRAM, games like Shadow of Mordor require ~2.5GB of VRAM*, according to the MSI Afterburner HUD, just to run at medium textures @1440p (and a whopping 6GB to run high textures @4K).
Nope, that's not how it works. A card will keep textures compressed in RAM, and expand the most used ones as space allows. If you have 6 GB of video RAM, it will expand to use that if it can. Doesn't mean that you get any measurable performance improvement from having more - a texture might only be used once, but because there is no RAM pressure in your test, it doesn't get recompressed ever.
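A toy model of the caching behaviour described above (all names and sizes hypothetical): decompressed textures act as a cache over the compressed copies, so a bigger card simply evicts later, and high reported VRAM usage doesn't imply the game needs that much.

```python
from collections import OrderedDict

class TextureCache:
    """Toy sketch: decompressed textures are a cache over compressed
    ones. With ample VRAM nothing is ever evicted, so reported usage
    grows to fill the card without implying a real requirement."""
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self.expanded = OrderedDict()  # texture name -> size, LRU order

    def touch(self, name, size_mb):
        if name in self.expanded:
            self.expanded.move_to_end(name)  # mark as recently used
            return
        while self.used_mb + size_mb > self.capacity_mb:
            _, freed = self.expanded.popitem(last=False)  # evict LRU
            self.used_mb -= freed
        self.expanded[name] = size_mb
        self.used_mb += size_mb

big = TextureCache(6144)    # 6 GB card: nothing gets recompressed
small = TextureCache(2048)  # 2 GB card: same game, evictions instead
for i in range(100):
    big.touch(f"tex{i}", 50)
    small.touch(f"tex{i}", 50)
print(big.used_mb, small.used_mb)  # 5000 2000
```

Both cards render the same frames; the 6GB card just reports more VRAM "in use", which is the measurement trap the Afterburner HUD falls into.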

Originally Posted by Shaddim View Post
With what is going down right now (Ubisoft and Activision threw the console makers under the bus, finally)
Link on that? Ubisoft is the worst of the big publishers. They purposefully messed up Watch dogs to run worse on PC to avoid making the consoles look bad. Their PC ports are always the worst. What exactly did they do to throw console makers under the bus?

Originally Posted by Shaddim View Post
I think studios will stand up to Sony and MS, and will open up the benefits of having a high-end gaming PC. This generation of consoles just plain sucks, with the XBone being the worst of the lot, and motivated PC gamers aren't allowing themselves to be held back by them, and have taken to actively hacking and modding games that try. I love what Carmack said recently (paraphrasing), "Today's player base for PC games are more tech savvy than the developers themselves, and if we don't give them what they want, they'll just break our shit and rework it."
The PS4 is not a bad machine for running a game at 1080p60. It was designed to do that, and it does exactly that. The Xbone is the underpowered one; that is what is holding everyone back. Memory-wise the situation is identical, though.

Originally Posted by Shaddim View Post
*Addendum:
One of the issues with Shadow of Mordor for PC (and Watchdogs), and the reason it requires so much VRAM, is extremely lazy porting. Consoles have unified memory, and the coding APIs are all written to take full advantage of that. Most developers address that and at least re-code the memory calls. Not so with SoM: when Monolith released it for PC, they simply put it in a new wrapper (essentially an emulator), tested it for a month to see if it was stable, and then threw it out to the public. By the end of the year some sufficiently nerdy gamers will rectify that and give the game the development attention it deserves, but they shouldn't have to. Did you know that the console manufacturers get final approval on all PC ports from the major publishers? To me, that's as bad as MS having control of every game written for the PS4 and being allowed to cripple them at will. That's some crooked s***.
Shadow of Mordor had a high-res texture pack that was designed to make use of very high amounts of video RAM; other than that, it wasn't so bad. 1440p is still a very high resolution to game at. Making use of the unified memory space for anything fancy is very tricky even on the new consoles. There might be something coming, but it is not what is making SoM hard to run today - no engine currently in use does that, partially because everything still has to run on the 360 and PS3. (The 360 has a unified memory pool, but the GPU cannot access the pageable memory, making it essentially impossible to do something fancy with it in a game. The PS3 has separate memory pools.)

Most console ports use a wrapper. That doesn't mean they're using an emulator. An Xbox game can be written for DirectX directly, so the graphics calls don't have to be ported, and the low-level API used to get more performance is nothing more than Mantle, which can also be used on an AMD card and has very limited benefit on a powerful chip.

Watch dogs was, as I mentioned, sabotaged on purpose. There are mods to correct the damage.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
subego  (op)
Clinically Insane
Join Date: Jun 2001
Location: Chicago, Bang! Bang!
Status: Offline
Oct 19, 2014, 05:16 PM
 
I could use a Mini with 32GB of RAM.

I'm not joking.
     
Cap'n Tightpants
Addicted to MacNN
Join Date: Oct 2014
Location: Shaddim's sock drawer
Status: Offline
Oct 19, 2014, 07:20 PM
 
Originally Posted by P View Post
You can do the same with a pair of 290s, they have better scaling at high resolutions. That's my point, though - one card is nowhere near enough for 5K.

The upgraded GPU will be an improvement at 1440p, but not particularly because it has more VRAM. It has more shaders, which will help when there are more pixels to calculate effects for, but that doesn't use more RAM. In fact, if it is the card I think it is (Tonga), it has the same number of ROPs as the M290X, which means that you can't run any heavy AA on it - meaning that you hardly need any more video RAM at all.

In fact, I ran through FC3 and the recent Batman games at 1440p on a 7850/2GB. That worked fine, as long as you didn't set the idiot settings (Metro: LL was too much, though).
I don't think we're actually disagreeing. However, the M290X in the iMac is essentially a downclocked Pitcairn with supposedly a third of its TMUs disabled, and for desktop gaming at anything above 1080p it would be woefully underpowered.

Nope, that's not how it works. A card will keep textures compressed in RAM, and expand the most used ones as space allows. If you have 6 GB of video RAM, it will expand to use that if it can. Doesn't mean that you get any measurable performance improvement from having more - a texture might only be used once, but because there is no RAM pressure in your test, it doesn't get recompressed ever.
Textures in modern games are huge, and since console ports love more VRAM, I can't see how more VRAM wouldn't be a great benefit. If anything, it would save the card from having to flush textures from memory (the game wouldn't have to go back to system RAM, or even worse, the hard drive).

Link on that? Ubisoft is the worst of the big publishers. They purposefully messed up Watch dogs to run worse on PC to avoid making the consoles look bad. Their PC ports are always the worst. What exactly did they do to throw console makers under the bus?
Ubisoft Engineer Supposedly Suggests 30fps Parity Is Set By Console Manufacturers

The PS4 is not a bad machine for running a game at 1080p60. It was designed to do that, and it does exactly that. The Xbone is the underpowered one; that is what is holding everyone back. Memory-wise the situation is identical, though.
That's what I said.

Shadow of Mordor had a high-res texture pack that was designed to make use of very high amounts of video RAM; other than that, it wasn't so bad. 1440p is still a very high resolution to game at. Making use of the unified memory space for anything fancy is very tricky even on the new consoles. There might be something coming, but it is not what is making SoM hard to run today - no engine currently in use does that, partially because everything still has to run on the 360 and PS3. (The 360 has a unified memory pool, but the GPU cannot access the pageable memory, making it essentially impossible to do something fancy with it in a game. The PS3 has separate memory pools.)
SoM isn't 100% smooth on my GeForce Blacks w/ 6GB of VRAM @1440p with the ultra texture pack, however, I expect that will change when they finally release the SLI patch. 1440p is now the 2nd highest gaming res recorded on Steam, so it's very much mainstream now.

Most console ports use a wrapper. That doesn't mean they're using an emulator. An Xbox game can be written for DirectX directly, so the graphics calls don't have to be ported, and the low-level API used to get more performance is nothing more than Mantle, which can also be used on an AMD card and has very limited benefit on a powerful chip.
That's what I was saying, however, it wasn't re-coded and because of that it overloads VRAM. BTW, SoM by most accounts slogs on the PS3 and Xbox360, to the point where it's nearly unplayable in areas that are thick with orcs.
( Last edited by Cap'n Tightpants; Oct 20, 2014 at 02:49 AM. )
"I have a dream, that my four little children will one day live in a
nation where they will not be judged by the color of their skin,
but by the content of their character." - M.L.King Jr
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Oct 20, 2014, 02:44 AM
 
Originally Posted by EstaNightshift View Post
Last year's model is what you're looking for.

Honestly, though, if all you're doing is media playback and light serving to other computers in the house the $499 new one will be fine, 4GB RAM and all.
Yes, you're right, except there's no last year's model!

I have a 2009 Mac mini which is doing the job quite well. I want one with HDMI video+audio out now, and I just don't like buying an already outdated model... and it should be powerful enough for anything I'd want it to do in the near future!
***
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Oct 20, 2014, 04:23 AM
 
Originally Posted by Cap'n Tightpants View Post
I don't think we're actually disagreeing. However, the M290X in the iMac is essentially a downclocked Pitcairn with supposedly a third of its TMUs disabled, and for desktop gaming at anything above 1080p it would be woefully underpowered.
A third of its TMUs disabled? TMUs sit with the shaders in the GCN setup, so disabling a third of the TMUs means disabling a third of the shaders. That would bring it down to 14 CUs, the same as Bonaire (admittedly, with more ROPs and bandwidth), which is a massive cutdown. I haven't seen anything to indicate that.

Originally Posted by Cap'n Tightpants View Post
Textures in modern games are huge, and since console ports love more VRAM, I can't see how more VRAM wouldn't be a great benefit. If anything, it would keep from having to flush any textures from memory (the game wouldn't have to go back to system RAM, or even worse, the hard drive).
The evidence just doesn't agree with that assumption. Not too long ago, 1GB VRAM was enough for 1080p gaming (there is a 1 GB 7850 that was, at launch in March 2013, judged to be "OK for now, but not future proof"). Flushing textures to main memory is not a big deal with PCIe 3.0 x16.
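The interface math backs that up. A quick sketch in Python (the link rate and encoding are the published PCIe 3.0 figures; the 64MB texture size is an arbitrary example):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, 16 lanes.
# This is the theoretical peak; real transfers land lower, but
# even so a large texture crosses the bus in a few milliseconds.
gb_per_s = 8 / 8 * (128 / 130) * 16   # ~15.75 GB/s for an x16 slot
texture_mb = 64                       # hypothetical texture size
ms = texture_mb / 1024 / gb_per_s * 1000
print(f"{gb_per_s:.2f} GB/s, {ms:.1f} ms per {texture_mb} MB texture")
```

At roughly 4 ms per 64MB, re-fetching an evicted texture from system RAM costs a fraction of a frame, not a stall.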

That piece essentially says that console manufacturers (read: Microsoft) are trying to get Ubi to nerf the PC games even more than they have before. This would be an unprecedented level of interference, and Ubi not bending to that is not "throwing console makers under the bus". If anything, setting the same limit for Xbone and PS4 is bending knee to MS.

Originally Posted by Cap'n Tightpants View Post
SoM isn't 100% smooth on my GeForce Blacks w/ 6GB of VRAM @1440p with the ultra texture pack, however, I expect that will change when they finally release the SLI patch. 1440p is now the 2nd highest gaming res recorded on Steam, so it's very much mainstream now.
Most likely the problem comes from you running SLI in the first place. On my setup, single 290 at 1440p, it is smooth as butter.

Originally Posted by Cap'n Tightpants View Post
That's what I was saying, however, it wasn't re-coded and because of that it overloads VRAM. BTW, SoM by most accounts slogs on the PS3 and Xbox360, to the point where it's nearly unplayable in areas that are thick with orcs.
How would this recoding use less VRAM? It can use the same texture compression formats, because the GCN 1.1 in the Xbone/PS4 is the same arch as that in the 290 and similar enough to the Radeon 7000 series, which I believe matches nVidia's Kepler arch quite closely. Did nVidia make some new texture compression format that you would like them to use?

VRAM pressure just isn't a major limiting factor at this point. If you make bigger textures, you run into memory bandwidth issues long before you run out of space. Future console games, when we get multiplat games that do not support the 360 and PS3 (of which there is currently exactly one, all other games are either multiplat that includes the last gen, or exclusives), might make better use of the additional VRAM space that the PS4 and Xbone give them, but even then 3GB VRAM is likely to be enough.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Oct 20, 2014, 05:19 AM
 
Originally Posted by subego View Post
I could use a Mini with 32GB of RAM.

I'm not joking.
A Quad mini with 32 GB and an SSD would be more than enough for the vast majority of studios.
     
ShortcutToMoncton
Addicted to MacNN
Join Date: Sep 2000
Location: The Rock
Status: Offline
Oct 20, 2014, 07:08 AM
 
I have the quad i7 with 16GB, and for my purposes - light serving and media playback/HTPC centre - it should be a good few years before it needs an upgrade.

But those had the built-in Intel graphics card. Pretty shoddy.
Mankind's only chance is to harness the power of stupid.
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Status: Offline
Oct 20, 2014, 07:26 AM
 
Originally Posted by Spheric Harlot View Post
A Quad mini with 32 GB and an SSD would be more than enough for the vast majority of studios.
And that fact is PRECISELY why we won't see one for a while. Apple wants you to buy the more pro hardware for that!
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Oct 20, 2014, 07:55 AM
 
Wow, I just had a small chat with someone at the Apple online store (there's a chat option), and they seem to have no clue what they are selling!

I wanted to know whether it is possible to add an extra HD (SSD) to one of the new Mac minis, and got the answer that I can have the HD swapped by a certified technician... but that I can add extra RAM myself! (??)
So I had to tell them that this is a) not what I wanted to know and b) not correct...

It seems to be quite complicated to find reliable information about the new minis!
***
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Status: Offline
Oct 20, 2014, 07:57 AM
 
Primate Labs has some estimates about performance of the new Mac minis.

Estimating Mac mini Performance

Generally speaking, in multi-core performance the new low-end mini is just a hair slower overall than the original Mac Pro.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Oct 20, 2014, 08:28 AM
 
Originally Posted by EstaNightshift View Post
And that fact is PRECISELY why we won't see one for a while. Apple wants you to buy the more pro hardware for that!
To be fair, I doubt that Apple sold more than a handful to musicians. A fifteen-inch MacBook Pro has too many obvious benefits.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Oct 20, 2014, 09:55 AM
 
Originally Posted by EstaNightshift View Post
And that fact is PRECISELY why we won't see one for a while. Apple wants you to buy the more pro hardware for that!
Actually, the reason why there isn't a quad-core mini this time around is pretty mundane: unlike Ivy Bridge, the 2-core and 4-core variants of Haswell use different sockets, so Apple would have had to design two boards for the same machine. With Ivy Bridge, both used the same socket. Could Apple have designed a separate motherboard? Sure, but the Mac mini is a niche product as it is, so I don't think they've had a big incentive to do so. Does anyone know whether Broadwell's 2c and 4c versions will use the same pin layout again?
I don't suffer from insanity, I enjoy every minute of it.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Oct 20, 2014, 01:01 PM
 
Broadwell uses the same socket layout as Haswell (the shrinks always do), but it will be a footnote anyway. Skylake will use LGA1151 for all the socketed versions, but that doesn't say anything about the BGA versions - Haswell used LGA1150 for all versions except the Crystalwell ones.

I don't know, honestly.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
ort888
Addicted to MacNN
Join Date: Feb 2001
Location: Your Anus
Status: Offline
Oct 20, 2014, 01:59 PM
 
The problem is that there is a hole big enough to drive a truck through in Apple's desktop lineup. I know the iMac is supposed to fill that hole, but the iMac is something different. Not everyone wants a computer glued to the back of their super-expensive monitor. You used to be able to get a decent pro-quality desktop machine for $1,600. Now you can either get a Mini or a Mac Pro. You're getting a scooter or a Formula One car. They don't sell anything in between.

My sig is 1 pixel too big.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Oct 20, 2014, 02:14 PM
 
Why is the Mac mini the equivalent of a scooter?
I don't suffer from insanity, I enjoy every minute of it.
     
ort888
Addicted to MacNN
Join Date: Feb 2001
Location: Your Anus
Status: Offline
Oct 20, 2014, 02:21 PM
 
Scooter might be harsh. Let's call it a Honda Civic.

My sig is 1 pixel too big.
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Status: Offline
Oct 20, 2014, 03:33 PM
 
Originally Posted by OreoCookie View Post
Actually, the reason why there isn't a quad-core mini this time around is pretty mundane: unlike Ivy Bridge, the 2-core and 4-core variants of Haswell use different sockets, so Apple would have had to design two boards for the same machine. With Ivy Bridge, both used the same socket. Could Apple have designed a separate motherboard? Sure, but the Mac mini is a niche product as it is, so I don't think they've had a big incentive to do so. Does anyone know whether Broadwell's 2c and 4c versions will use the same pin layout again?
Yeah, I went over this in a different thread
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Oct 20, 2014, 04:57 PM
 
Originally Posted by ort888 View Post
Scooter might be harsh. Let's call it a Honda Civic.
I still don't understand the comparison: back in the day you needed to pay $1600 to get a decent Mac one way or another. My first iBook cost me ~$4000. Since typical workloads for most people don't scale well beyond 4 cores, I'll focus on the quad-core case: a Mac mini has about the same CPU horsepower as a 4-core Mac Pro from 2010. In terms of single-core performance, a 2011 quad-core Mac mini is about as fast as a 2010 Mac Pro (W3565, 4 cores @ 3.2 GHz). Even in the multi-core benchmark, the 4-core Mac Pro only has a 10% performance advantage. That's still true for newer quad-core Macs: the Mac Pro is just in the middle unless you scale up the core count.

However, the Mac Pro's advantage is not single-core performance; you only need one if you have very specific workloads (or reliability demands). So I don't think it's accurate to compare a Mac mini to a Honda Civic - they're significantly faster than you think.

I think there is a lot of psychology involved here: in the past, graphics professionals were pushing hardware to its limits (big and many CPUs, powerful graphics cards), but now (with exceptions) they simply don't. Even the integrated graphics in modern Intel CPUs are, compared to discrete mobile GPUs, quite good (roughly on par with the GeForce 650M from the 2012 Retina MacBook Pro).
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Oct 20, 2014, 04:59 PM
 
Originally Posted by EstaNightshift View Post
Yeah, I went over this in a different thread
I noticed after I posted … sorry.
I don't suffer from insanity, I enjoy every minute of it.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Oct 20, 2014, 05:51 PM
 
Originally Posted by ort888 View Post
The problem is that there is a hole big enough to drive a truck through in Apple's desktop lineup. I know the iMac is supposed to fill that hole, but the iMac is something different. Not everyone wants a computer glued to the back of their super-expensive monitor. You used to be able to get a decent pro-quality desktop machine for $1,600. Now you can either get a Mini or a Mac Pro. You're getting a scooter or a Formula One car. They don't sell anything in between.
You're missing the whole portable market, which, in this age of external expansion, is the de facto desktop machine.
     
jmiddel
Grizzled Veteran
Join Date: Dec 2001
Location: Land of Enchantment
Status: Offline
Oct 21, 2014, 01:21 AM
 
Early last year I bought a Mini Server, upgraded the RAM to 16GB for $190, and swapped in an SSD for another $200, so for roughly $1,400 I got a very capable machine: quad-core i7, lots of RAM, SSD. This is not a scooter, and it's not a Civic. It's a CRV or a Forester, capable of handling a lot of jobs well. It's attached to a $450 27" 1440p display and a $100 23" 1080p display, and I'm very happy.

I am not happy at all about the degradation of the new Minis. The base model is like a Fiat 500: it'll get you around, as long as you're smallish and don't have kids, dogs, and groceries. Even the top-level model does not have the functionality of my Server at the same price. Losing quad core cripples the machine for serious work. The updated graphics card is nice, but the one in my Mini handles movies just fine - I don't game or edit videos, though.
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Oct 21, 2014, 03:03 AM
 
Originally Posted by jmiddel View Post
I am not happy at all about the degradation of the new Minis. The base model is like a Fiat 500: it'll get you around, as long as you're smallish and don't have kids, dogs, and groceries. Even the top-level model does not have the functionality of my Server at the same price. Losing quad core cripples the machine for serious work. The updated graphics card is nice, but the one in my Mini handles movies just fine - I don't game or edit videos, though.
While I don't actually want to disagree, I am quite surprised every time I read something like this! A professional reviewer on the internet even said that the base model would be good enough for email and internet...!
Well, the base model has 1000x more RAM than my first Mac. It has 12500x more HD space. I am not able to compare processor and graphics speeds.
My first Mac was good enough for Email and Internet!!
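Those multiples line up if the first Mac in question had 4MB of RAM and a 40MB hard disk (an assumption; the post doesn't say which model) against the base mini's 4GB/500GB, using decimal units:

```python
# Checking the "1000x RAM / 12500x disk" multiples. The old
# machine's specs are assumed (4 MB RAM, 40 MB HD); the new base
# mini ships with 4 GB of RAM and a 500 GB drive.
ram_ratio = (4 * 1000) / 4        # 4 GB vs 4 MB, decimal units
disk_ratio = (500 * 1000) / 40    # 500 GB vs 40 MB
print(ram_ratio, disk_ratio)  # 1000.0 12500.0
```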
***
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Oct 21, 2014, 05:21 AM
 
Originally Posted by jmiddel View Post
It's attached to a $450 27" 1440p display and a $100 23" 1080p display, and I'm very happy.
That seems like a lot for what is a fairly low resolution on a 27".
I have plenty of more important things to do, if only I could bring myself to do them....
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Oct 21, 2014, 05:25 AM
 
Originally Posted by badidea View Post
My first Mac was good enough for Email and Internet!!
It's less the case for email, but 'the internet' has not stood still.
Once upon a time it meant viewing (garish, neon-coloured, poorly designed) static pages with text and a couple of horrible-quality images thrown in. Now it means Facebook and YouTube, HD video (try that on a decent G4), Flash and HTML5 effects, animations, and fancy menu systems.
I have plenty of more important things to do, if only I could bring myself to do them....
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Reply With Quote
Oct 21, 2014, 05:42 AM
 
Originally Posted by Waragainstsleep View Post
It's less the case for email, but 'the internet' has not stood still.
Once upon a time it meant viewing (garish, neon-coloured, poorly designed) static pages with text and a couple of horrible-quality images thrown in. Now it means Facebook and YouTube, HD video (try that on a decent G4), Flash and HTML5 effects, animations, and fancy menu systems.
Yes, of course, I'm very well aware of that. It actually just proves my point that even the base Mac mini is capable of doing a lot more than just "internet", since that is not a very good description of a computer's performance!
I also used to own a Quadra 840AV in the '90s, and it was probably the video editing machine of its time. Not good enough for the internet nowadays!
***
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Reply With Quote
Oct 21, 2014, 05:50 AM
 
Originally Posted by Spheric Harlot View Post
You're missing the whole portable market, which, in this age of external expansion is the de facto desktop machine.
Very well put! And that's why Apple needs to offer a 17" MacBook Pro again!
Since 2009 I haven't used a desktop as my main computer. I want my computer to be where I am, not the other way round ... and I want the screen to be as large as possible (Retina if possible) ... and I don't have a desk anymore. That's why the new iMac is nice but not really an option for me!
I guess I am not the only one?
***
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Reply With Quote
Oct 21, 2014, 06:47 AM
 
Unless they find a new way to sell it, they don't need to build a 17" MacBook ever again.

Laptops are the de facto desktops because they're powerful and expandable enough these days, and portable, to boot.

That one was *a* de facto desktop machine because it wasn't portable enough. Hardly anybody bought them.
     
ShortcutToMoncton
Addicted to MacNN
Join Date: Sep 2000
Location: The Rock
Status: Offline
Reply With Quote
Oct 21, 2014, 07:08 AM
 
Yup. I lusted after the 17" and, after a friend got one, realized it really was a pain in the ass to carry around. A far better solution was a smaller, easily portable model you could take anywhere without worrying about where you'd put it, with a nice external monitor/keyboard station for desktop work (which you could buy and still come in at the same price as the more expensive 17").
Mankind's only chance is to harness the power of stupid.
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Reply With Quote
Oct 21, 2014, 07:11 AM
 
Originally Posted by Spheric Harlot View Post
That one was *a* de facto desktop machine because it wasn't portable enough. Hardly anybody bought them.
I would like to disagree, but I guess you know that from your days working at Gravis, right?

I also have an HP EliteBook workstation for work and have to carry it around quite a lot ... and I hate it, since it is about double the thickness and weight of the MBP! That's why I consider the MBP very portable!
***
     
Phileas
Mac Elite
Join Date: Jul 2002
Location: Toronto, Canada
Status: Offline
Reply With Quote
Oct 21, 2014, 09:25 AM
 
I played with the 17" MBP and decided against it for all the reasons already mentioned. It was too large, and at the same time not large enough: too large to carry, not large enough when you wanted a bigger screen.

To this day, my setup is a 15" MBP and a 27" Apple display.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Oct 21, 2014, 09:33 AM
 
I never understood the appeal of the 17": I have a 13" Retina MacBook Pro (a »downgrade« from a 2010 15" MacBook Pro), but I simply put an external monitor and an IBM Model M keyboard on my desk. Problem solved.
I don't suffer from insanity, I enjoy every minute of it.
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Reply With Quote
Oct 21, 2014, 10:43 AM
 
Originally Posted by OreoCookie View Post
I never understood the appeal of the 17"
Well, when I bought mine it was the only MacBook with a Full HD (actually 1920x1200) resolution and therefore best for movies!
The built-in speakers were also better than in any other MacBook!
I still like to watch movies on it in bed because it's really good for that!

I understand the argument for a 15" (more portable) plus external screen (bigger) though.
***
     
Cap'n Tightpants
Addicted to MacNN
Join Date: Oct 2014
Location: Shaddim's sock drawer
Status: Offline
Reply With Quote
Oct 21, 2014, 11:20 AM
 
I'm sure Apple could build a slim and trim 17" MacBook. I have a new Aorus 17" notebook at home w/ a 1440p IPS display and SLI Nvidia 980Ms, and it's "only" 2 cm thick. If a tiny company like Aorus can produce such a graphics powerhouse that doesn't feel like toting around a manhole cover, I can only imagine what a company with Apple's resources could develop. Personally, I think they just don't care about notebooks anymore; they haven't introduced anything compelling, much less innovative, in that area in over 6 years.
"I have a dream, that my four little children will one day live in a
nation where they will not be judged by the color of their skin,
but by the content of their character." - M.L.King Jr
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Oct 21, 2014, 11:29 AM
 
Apple cares about notebooks alright. Retina got there first, Thunderbolt was made with at least one eye on the MBP market, they made the default desktop keyboards the same size as the MBP to make that the standard all apps were developed to, and they're updated more frequently. There has also been a massive focus on battery life from the OS team in recent years.

Yes, Apple could make a 17" MBP again, but the market for it is limited. 17" laptops are referred to as "desktop replacement" computers - essentially laptops for people who want a desktop but don't want a big ugly box. Apple has a "desktop replacement replacement" - the iMac (the wording comes from Apple and the iMac G5 launch). This means that the market for them is tiny from Apple's point of view.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Oct 21, 2014, 11:40 AM
 
Originally Posted by Cap'n Tightpants View Post
Personally, I think they just don't care about notebooks anymore; they haven't introduced anything compelling, much less innovative, in that area in over 6 years.
Huh? Apple has introduced notebooks with Retina screen (which is a huge, huge boon to me) and literally reshaped the notebook market with its MacBook Air. Yes, they've moved away from bigger machines to smaller ones (13-17" down to 11-15"), but I don't see how you can claim that they haven't been innovative or have stopped caring about the notebook market. Alongside the Air, they've pushed battery life to 12+ hours, which means that even on my 13" Retina I can work through a whole transcontinental flight. There is a good reason Apple's Mac sales grew 21% year-over-year: most of the Macs Apple sells are notebooks.
I don't suffer from insanity, I enjoy every minute of it.
     
badidea
Professional Poster
Join Date: Nov 2003
Location: Hamburg
Status: Offline
Reply With Quote
Oct 21, 2014, 01:27 PM
 
Originally Posted by Cap'n Tightpants View Post
I have a new Aorus 17" notebook at home w/ a 1440p IPS display, SLI Nvidia 980Ms, and it's "only" 2cm thick.
Are you sure about the specs? I tried to find this notebook because it sounded quite interesting, but I can't find one anywhere with SLI 980Ms! Aorus does list one with SLI 970Ms, though, which seems to be available in Taiwan!? Newegg also has some models, but "only" with 8xxM graphics!
Oh... and none of them has a 1440p IPS display!
***
     
Cap'n Tightpants
Addicted to MacNN
Join Date: Oct 2014
Location: Shaddim's sock drawer
Status: Offline
Reply With Quote
Oct 21, 2014, 03:14 PM
 
Originally Posted by OreoCookie View Post
Huh? Apple has introduced notebooks with Retina screen (which is a huge, huge boon to me)
That's more evolutionary than revolutionary. They weren't the first notebook manufacturer to offer a QHD+ display.

and literally reshaped the notebook market with its MacBook Air.
That's what I was alluding to when I said they haven't introduced anything compelling in 6 years; that was back in 2008. What have they done for us lately? Not a lot.
"I have a dream, that my four little children will one day live in a
nation where they will not be judged by the color of their skin,
but by the content of their character." - M.L.King Jr
     
Cap'n Tightpants
Addicted to MacNN
Join Date: Oct 2014
Location: Shaddim's sock drawer
Status: Offline
Reply With Quote
Oct 21, 2014, 03:19 PM
 
Originally Posted by badidea View Post
Are you sure about the specs? I tried to find this notebook because it sounded quite interesting, but I can't find one anywhere with SLI 980Ms! Aorus does list one with SLI 970Ms, though, which seems to be available in Taiwan!? Newegg also has some models, but "only" with 8xxM graphics!
Oh... and none of them has a 1440p IPS display!
It'll probably hit retail store shelves by Xmas, so I've heard. Not sure what its MSRP will be.
"I have a dream, that my four little children will one day live in a
nation where they will not be judged by the color of their skin,
but by the content of their character." - M.L.King Jr
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Oct 21, 2014, 03:27 PM
 
Originally Posted by Cap'n Tightpants View Post
That's more evolutionary than revolutionary. They weren't the first notebook manufacturer to offer a QHD+ display.
It's the first time a high-res display has been usable with a desktop OS. Many Windows apps still don't play nice with high-pixel-density displays.
Originally Posted by Cap'n Tightpants View Post
That's what I was alluding to when I said they haven't introduced anything compelling in 6 years; that was back in 2008. What have they done for us lately? Not a lot.
Battery life has gotten longer and longer over the last few generations. The first-gen MacBook Air had a very slow hard drive or a very slow SSD; it was an expensive yet slow machine with 2.5-3 hours of real-world battery life. The MacBook Airs you can buy today are lighter, have replaced the MacBook as the cheapest Mac, were the first mass-market notebooks to feature fast SSDs and leave out the optical drive, and run 14+ hours in real life. Honestly, if that combo isn't compelling to you, I don't know what is.
I don't suffer from insanity, I enjoy every minute of it.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Oct 21, 2014, 05:32 PM
 
Originally Posted by Cap'n Tightpants View Post
That's what I was alluding to when I said they haven't introduced anything compelling in 6 years; that was back in 2008. What have they done for us lately? Not a lot.
It's easy for us to expect Apple to keep revolutionising things because they do it so much more often than anyone else, but what else could they really do to a notebook? Any thinner than the MBA and it might as well be an iPad.

No doubt some crazy haptic holographic thing will come along sooner or later but until the tech is ready I think we are stuck with iterations of the current line ups.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Oct 21, 2014, 05:56 PM
 
Originally Posted by Cap'n Tightpants View Post
that's more in line with evolutionary, not revolutionary. They weren't the first notebook manufacturer to offer a QHD+ display.
OK, so I'll bite: even if we don't bother with the OS support, what manufacturer showed a 2880x1800 (or higher) 15" display before the Retina MBP in June 2012? A bigger screen at that resolution doesn't count - anyone can make that (you literally just don't cut the panel) - I'm after that ppi or better.
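For concreteness, here's a quick sketch of the density in question (the 15.4" diagonal is my assumption for the Retina MBP; the figures are mine, not from the thread):

```python
import math

# Pixel density of a 2880x1800 panel on an assumed 15.4" diagonal.
w_px, h_px, diag_in = 2880, 1800, 15.4
diag_px = math.hypot(w_px, h_px)  # panel diagonal in pixels
ppi = diag_px / diag_in

print(f"{ppi:.1f} ppi")  # prints "220.5 ppi"
```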

Originally Posted by Cap'n Tightpants View Post
That's what I was alluding to when I said they haven't introduced anything compelling in 6 years; that was back in 2008. What have they done for us lately? Not a lot.
The aqueduct?

Battery life, then. The 13" MBA has a 12 hour battery life. It was 7 hours as late as two years ago.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
 
All contents of these forums © 1995-2017 MacNN. All rights reserved.