Fallout 4 on the Retina MacBook Pro

THIS ARTICLE HAS BEEN CORRECTED IN A BIG WAY. PLEASE VISIT http://techsupportrich.com/a-correction/ TO SEE JUST HOW BIG IT IS.

If you’re looking for a review of Fallout 4, I’m afraid this isn’t it. I’m here to talk about my experience running the game on my particular hardware setup, specifically a mid-2014 15″ Retina MacBook Pro. It has the 2.5GHz Core i7 processor, 16GB of RAM, a 500GB SSD, and an Nvidia GT 750M graphics chip.

It seems PC gamers are very eager to let you know that the MacBook isn’t great for gaming, and that the GT 750M graphics chip is unreasonably underpowered, and they’d be right. If you were to spend MacBook money on a Windows laptop, you’d get a GTX 970M or even a GTX 980M. You don’t need to know the Nvidia graphics chip range to know that there are a lot of numbers between 750 and 980.

Fallout 4 turned out to be unplayable. Not just bad, unplayable. With the graphics set to the lowest possible settings, at the lowest possible resolution, the MacBook could only manage a few frames per second. I didn’t expect to be running ultra graphics settings at full retina resolution, but what I was getting just wasn’t good enough.

Fallout 4 isn’t available for OS X, so I’m running it on Windows 10 via Boot Camp, and I think that’s where the problem lies. You see, the 15″ MacBook Pro doesn’t just have the GT 750M, it also has the Intel Iris Pro on-board graphics. That’s the low-powered graphics processor that lives inside the Intel processor. It’s terrible at 3D graphics but it uses very little power. For a laptop, that can be the difference between getting your work finished and staring at a blank screen. OS X can decide whether it needs to use the Intel or Nvidia graphics depending on what you’re doing, only using the more power-hungry chip when it’s absolutely necessary. As far as I can figure, Windows can’t do that. I’ve even read reports from Windows laptop owners with similar graphics arrangements complaining of the same problem. Windows isn’t using the big graphics card to run games. It’s just using the Iris graphics.
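If you’re curious what Windows actually reports for the two chips, you can ask DXGI to list the graphics adapters it can see. This is just a rough sketch, nothing official, assuming you’ve got Visual Studio and the Windows SDK installed on the Boot Camp side:

```cpp
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    // Ask DXGI which graphics adapters Windows can see.
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Print the adapter name and its dedicated video memory.
        wprintf(L"Adapter %u: %s (dedicated VRAM: %llu MB)\n",
                i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

On a setup like mine you’d expect to see both the Nvidia chip and the Intel Iris Pro in that list; the interesting question is which one the game actually ends up rendering on.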

The workaround I’ve found is to use an external monitor. With the laptop plugged into my TV via HDMI at 720p, I’m able to play using ultra graphics settings with a super smooth frame rate. I can even play the game nicely on a 1080p monitor via Thunderbolt and DVI.
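You can also check which adapter is driving each display. Again, this is only a quick sketch under the same Visual Studio and Windows SDK assumptions as above; it just walks the display devices Windows knows about:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    DISPLAY_DEVICEW dd;
    dd.cb = sizeof(dd);
    // Walk the display devices and report which adapter each one hangs off.
    for (DWORD i = 0; EnumDisplayDevicesW(nullptr, i, &dd, 0); ++i) {
        wprintf(L"%s -> %s%s%s\n",
                dd.DeviceName,    // e.g. \\.\DISPLAY1
                dd.DeviceString,  // the adapter driving it
                (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? L" [active]" : L"",
                (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? L" [primary]" : L"");
        dd.cb = sizeof(dd);  // reset the struct size before the next call
    }
    return 0;
}
```

Comparing that output with the internal screen only versus the TV plugged in would show whether the external display really is attached to the Nvidia adapter.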

So my guess is that Windows won’t use the dedicated graphics card when running on the internal monitor, and using an external screen forces it to use the right card. I’ve had similar problems with other games, so I’ll give them a go as soon as I get bored with Fallout 4, which will be sometime in May.