Coding Horror - Jeff Atwood

programming and human factors

Thunderbolting Your Video Card

Fri, 03/24/2017 - 10:08

When I wrote about The Golden Age of x86 Gaming, I implied that, in the future, it might be an interesting, albeit expensive, idea to upgrade your video card via an external Thunderbolt 3 enclosure.

I'm here to report that the future is now.

Yes, that's right, I paid $500 for an external Thunderbolt 3 enclosure to fit a $600 video card, all to enable a plug-in upgrade of a GPU on a Skull Canyon NUC that itself cost around $1000 fully built. I know, it sounds crazy, and … OK fine, I won't argue with you. It's crazy.

This matters mostly because of 4k, aka 2160p, aka 3840 × 2160, aka Ultra HD.

4k compared to 1080p

Plain old regular HD, aka 1080p, aka 1920 × 1080, is one quarter the size of 4k – and one quarter the rendering work. By today's GPU standards, HD is easy mode. It's not even interesting. No offense to console fans, or anything.
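
A quick back-of-the-envelope check of that one-quarter figure, in Python (a minimal sketch; the resolutions are the ones quoted above):

    # Pixel counts per frame for 1080p vs 4k.
    hd = 1920 * 1080       # 2,073,600 pixels
    uhd = 3840 * 2160      # 8,294,400 pixels

    print(f"1080p: {hd:,} pixels")
    print(f"4k:    {uhd:,} pixels")
    print(f"ratio: {uhd / hd:.0f}x")   # exactly 4x the pixels, hence 4x the work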

Late in 2016, I got a 4k OLED display and it … kind of blew my mind. I have never seen blacks so black, colors so vivid, on a display so thin. It made my previous 2008-era Panasonic plasma set look lame. It's so good that I'm now a little angry that every display my eyes touch isn't OLED already. I even got into nerd fights over it, and to be honest, I'd still throw down for OLED. It is legitimately that good. Come at me, bro.

Don't believe me? Well, guess which display in the below picture is OLED? Go on, guess:

Guess which screen is OLED?

@andrewbstiles if it was physically possible to have sex with this TV I.. uh.. I'd take it on long, romantic walks

— Jeff Atwood (@codinghorror) August 13, 2016

There's a reason every site that reviews TVs had to recalibrate their results when they reviewed the 2016 OLED sets.

In my extended review at Reference Home Theater, I call it “the best looking TV I’ve ever reviewed.” But we aren’t alone in loving the E6. Vincent Teoh at HDTVtest writes, “We’re not even going to qualify the following endorsement: if you can afford it, this is the TV to buy.” Rtings.com gave the E6 OLED the highest score of any TV the site has ever tested. Reviewed.com awarded it a 9.9 out of 10, with only the LG G6 OLED (which offers the same image but better styling and sound for $2,000 more) coming out ahead.

But I digress.

Playing games at 1080p in my living room was already possible. But now that I have an incredible 4k display in the living room, it's a whole other level of difficulty. Not just twice as hard – and remember current consoles barely manage to eke out 1080p at 30fps in most games – but four times as hard. That's where external GPU power comes in.

The cool technology underpinning all of this is Thunderbolt 3. The Thunderbolt cable bundled with the Razer Core is rather … diminutive. There's a reason for this.

Is there a maximum cable length for Thunderbolt 3 technology?

Thunderbolt 3 passive cables have maximum lengths.

  • 0.5m TB 3 (40Gbps)
  • 1.0m TB 3 (20Gbps)
  • 2.0m TB 3 (20Gbps)

In the future we will offer active cables which will provide 40Gbps of bandwidth at longer lengths.

40Gbps is, for the record, an insane amount of bandwidth. Apply the rule of thumb from ultra-common gigabit ethernet – 1 gigabit per second is roughly 120 megabytes per second – and we arrive at 4.8 gigabytes per second. Zow.

That's more than enough bandwidth to run even the highest of high end video cards, but it is not without overhead. There's a mild performance hit for running the card externally, on the order of 15%. There's also a further performance hit of 10% if you are in "loopback" mode on a laptop where you don't have an external display, so the video frames have to be shuttled back from the GPU to the internal laptop display.
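
Here's that arithmetic worked through in Python (a sketch using the figures quoted above; treating the two hits as multiplicative is my assumption):

    # Thunderbolt 3 numbers from the text: 40 Gbps link, ~15% external hit,
    # ~10% more in laptop "loopback" mode (frames return to the internal panel).
    gbps = 40
    mb_per_gigabit = 120                  # rule of thumb: 1 Gbps ≈ 120 MB/s
    print(f"{gbps} Gbps ≈ {gbps * mb_per_gigabit / 1000:.1f} GB/s")  # 4.8 GB/s

    external = 1 - 0.15                   # GPU running externally
    loopback = external * (1 - 0.10)      # ... while driving the internal display
    print(f"external display: {external:.0%} of native performance")  # 85%
    print(f"internal display: {loopback:.0%} of native performance")  # ~76%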

This may look like a gamer-only thing, but surprisingly, it isn't. What you get is the general-purpose ability to attach any PCI Express card to any computer with a Thunderbolt 3 port and, for the most part, it just works!

Linus breaks it down and answers all your most difficult questions:

Please watch the above video closely if you're actually interested in this stuff; it is essential. I'll add some caveats of my own after working with the Razer Core for a while:

  • Make sure the video card you plan to put into the Razer Core is not too tall, or too wide. You can tell if a card is going to be too tall by looking at pictures of the mounting rear bracket. If the card extends significantly above the standard rear mounting bracket, it won't fit. If the card takes more than 2 slots in width, it also won't fit, but this is more rare. Depth (length) is rarely an issue.

  • There are four fans in the Razer Core and although it is reasonably quiet, it's not super silent or anything. You may want to mod the fans. The Razer Core is a remarkably simple device internally: it's really just a power supply, some Thunderbolt 3 bridge logic, and a PCI Express slot. I agree with Linus that the #1 area Razer could improve in the future, beyond generally getting the price down, is to use fewer, larger fans that run quieter.

  • If you're putting a heavy hitter GPU in the Razer Core, I'd try to avoid blower-style cards (the ones that exhaust heat from the rear) in favor of those that cool with large fans blowing down and around the card. Dissipating 150W+ is no mean feat and you'll definitely need to keep the enclosure in open air … and of course within 0.5 meters of the computer it's connected to.

  • There is no visible external power switch on the Razer Core. It doesn't power on until you connect a TB3 cable to it. I was totally not expecting that. But once connected, it powers up and the Windows 10 Thunderbolt 3 drivers kick in and ask you to authorize the device, which I did (always authorize). Then it spun a bit, detected the new GPU, and suddenly I had multiple graphics cards active on the same computer. I also installed the latest Nvidia drivers just to make sure everything was shipshape.

  • It's kinda … weird having multiple GPUs simultaneously active. I wanted to make the Razer Core display the only display, but you can't really turn off the built-in GPU – you can select "only use display 2", and that's all. I got into several weird states where windows were opening on the other display and I had to mess around a fair bit to get things locked down to just one display. You may want to consider whether you keep both "displays" connected for troubleshooting, or not.

And then, there I am, playing Lego Marvel in splitscreen co-op at glorious 3840 × 2160 UltraHD resolution on an amazing OLED display with my son. It is incredible.

Beyond the technical "because I could", I am wildly optimistic about the future of external Thunderbolt 3 expansion boxes, and here's why:

  • The main expense and bottleneck in any stonking gaming rig is, by far, the GPU. It's also the item you are most likely to need to replace a year or two from now.

  • The CPU and memory available today are so comically fast that even a low-end $120 i3-7100 makes zero difference in real world gaming at 1080p or higher … if you're OK with a 30fps minimum. If you bump up to $200, you can get a quad-core i5-7500 that guarantees you a 60fps minimum everywhere.

  • If you prefer a small system or a laptop, an external GPU makes it so much more flexible. Because CPU and memory speeds are already so fast, 99.9% of the time your bottleneck is the GPU, and almost any small device you can buy with a Thunderbolt 3 port can now magically transform into a potent gaming rig with a single plug. Thunderbolt 3 may be a bit cutting edge today, but more and more devices are shipping with Thunderbolt 3. Within a few years, I predict TB3 ports will be as common as USB3 ports.

  • A general purpose external PCI express enclosure will be usable for a very long time. My last seven video card upgrades were plug and play PCI Express cards that would have worked fine in any computer I've built in the last ten years.

  • External GPUs are not meaningfully bottlenecked by Thunderbolt 3 bandwidth; the impact is 15% to 25%, and perhaps even less over time as drivers and implementations mature. While Thunderbolt 3 has "only" PCI Express x4 bandwidth, many benchmarkers have noted that moving a GPU from PCI Express x16 to x8 has almost no effect on performance (the raw lane math is sketched after this list). And there's always Thunderbolt 4 on the horizon.
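
For context on that "only x4" point, here's the raw PCI Express 3.0 lane math (a sketch; ~985 MB/s per lane is the standard figure after 128b/130b encoding):

    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding ≈ 985 MB/s usable per lane.
    lane_mb_s = 8_000 * 128 / 130 / 8     # ≈ 984.6 MB/s

    for lanes in (4, 8, 16):
        print(f"x{lanes}: {lanes * lane_mb_s / 1000:.1f} GB/s")
    # x4 ≈ 3.9 GB/s -- the slice an eGPU gets over Thunderbolt 3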

The future, as they say, is already here – it's just not evenly distributed.

I am painfully aware that costs need to come down. Way, way down. The $499 Razer Core is well made, on the vanguard of what's possible, a harbinger of the future, and fantastically enough, it does even more than what it says on the tin. But it's not exactly affordable.

I would absolutely love to see a modest, dedicated $200 external Thunderbolt 3 box that included an inexpensive current-gen GPU. This would clobber any onboard GPU on the planet. Let's compare my Skull Canyon NUC, which has Intel's fastest ever, PS4 class embedded GPU, with the modest $150 GeForce GTX 1050 Ti:

1920 × 1080, high detail (embedded GPU → GTX 1050 Ti):

  • Bioshock Infinite: 15 → 79 fps
  • Rise of the Tomb Raider: 12 → 49 fps
  • Overwatch: 43 → 114 fps
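
A quick check of those ratios (numbers copied from the list above):

    # fps on the Skull Canyon embedded GPU vs. the GTX 1050 Ti over TB3
    results = {
        "Bioshock Infinite": (15, 79),
        "Rise of the Tomb Raider": (12, 49),
        "Overwatch": (43, 114),
    }
    for game, (igpu, egpu) in results.items():
        print(f"{game}: {egpu / igpu:.1f}x faster")
    # 5.3x, 4.1x, 2.7x -- the 3x-5x stompdown claimed below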

As predicted, that's a 3x-5x stompdown. Mac users lamenting their general lack of upgradeability, hear me: this sort of box is exactly what you want and need. Imagine if Apple were to embrace upgrading their laptops and all-in-one systems via Thunderbolt 3.

I know, I know. It's a stretch. But a man can dream … of externally upgradeable GPUs. That are too expensive, sure, but they are here, right now, today. They'll only get cheaper over time.


Password Rules Are Bullshit

Fri, 03/10/2017 - 12:16

Of the many, many, many bad things about passwords, you know what the worst is? Password rules.

If we don't solve the password problem for users in my lifetime I am gonna haunt you from beyond the grave as a ghost pic.twitter.com/Tf9EnwgoZv

— Jeff Atwood (@codinghorror) August 11, 2015

Let this pledge be duly noted on the permanent record of the Internet. I don't know if there's an afterlife, but I'll be finding out soon enough, and I plan to go out mad as hell.

The world is absolutely awash in terrible password rules:

But I don't need to tell you this. The more likely you are to use a truly random password generation tool, as us über-geeks are supposed to, the more likely you are to have suffered mightily – and daily – under this regime.

Have you seen the classic XKCD about passwords?

To anyone who understands information theory and security and is in an infuriating argument with someone who does not (possibly involving mixed case), I sincerely apologize.

We can certainly debate whether "correct horse battery staple" is a viable password strategy or not, but the argument here is mostly that length matters.

That's What She Said

No, seriously, it does. I'll go so far as to say your password is too damn short. These days, given the state of cloud computing and GPU password hash cracking, any password of 8 characters or fewer is perilously close to no password at all.
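
Some rough numbers behind that claim (a sketch; the 10 billion guesses per second rate is an assumed ballpark for a GPU rig attacking a fast, unsalted hash – real rates vary enormously by algorithm):

    # Worst-case brute-force time for an 8-character password
    # drawn from the 95 printable ASCII characters.
    charset, length = 95, 8
    guesses_per_sec = 10e9            # assumed: GPU rig vs. a fast unsalted hash

    keyspace = charset ** length      # ≈ 6.6e15 combinations
    days = keyspace / guesses_per_sec / 86400
    print(f"keyspace: {keyspace:.2e}, cracked in at most {days:.1f} days")  # ≈7.7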

So then perhaps we have one rule, that passwords must not be short. A long password is much more likely to be secure than a short one … right?

What about this four-character password?


I'm Loyal to Nothing Except the Dream

Mon, 01/30/2017 - 10:19

There is much I take for granted in my life, and the normal functioning of American government is one of those things. In my 46 years, I've lived under nine different presidents. The first I remember is Carter. I've voted in every presidential election since 1992, but I do not consider myself a Democrat, or a Republican. I vote based on leadership – above all, leadership – and issues.

In my 14 years of blogging, I've never written a political blog post. I haven't needed to.

Until now.

It is quite clear something has become deeply unglued in the state of American politics.

As of 2017, the United States, through a sequence of highly improbable events, managed to elect an extremely controversial president.

A president with historically low approval ratings, elected on a platform many considered too extreme to even be taken literally:

Asked about Trump’s statements proposing the construction of a wall on the US-Mexico border and a ban on all Muslims entering the country, Thiel suggested that Trump supporters do not actually endorse those policies.

“I don’t support a religious test. I certainly don’t support the specific language that Trump has used in every instance,” he said. “But I think one thing that should be distinguished here is that the media is always taking Trump literally. It never takes him seriously, but it always takes him literally.”

The billionaire went on to define how he believes the average Trump supporter interprets the candidate’s statements. “I think a lot of voters who vote for Trump take Trump seriously but not literally, so when they hear things like the Muslim comment or the wall comment their question is not, ‘Are you going to build a wall like the Great Wall of China?’ or, you know, ‘How exactly are you going to enforce these tests?’ What they hear is we’re going to have a saner, more sensible immigration policy.”

A little over a week into the new presidency, it is obvious that Trump meant every word of what he said. He will build a US-Mexico wall. And he signed an executive order that literally, not figuratively, banned Muslims from entering the US — even if they held valid green cards.

As I said, I vote on policies, and as an American, I reject these two policies. Our Mexican neighbors are not an evil to be kept out with a wall, but an ally to be cherished. One of my favorite people is a Mexican immigrant. Mexican culture is ingrained deeply into America and we are all better for it. The history of America is the history of immigrants seeking religious freedom from persecution, finding a new life in the land of opportunity. Imagine the bravery it takes to leave everything behind, your relatives, your home, your whole life as you know it, to take your entire family on a five thousand mile journey to another country on nothing more than the promise of a dream. I've never done that, though my great-great grandparents did. Muslim immigrants are more American than I will ever be, and I am incredibly proud to have them here, as fellow Americans.

Help Keep Your School All American!

Trump is the first president in 40 years to refuse to release his tax returns in office. He has also refused to divest himself from his dizzying array of businesses across the globe, which present financial conflicts of interest. All of this, plus the hasty way he is ramrodding his campaign plans through on executive orders, with little or no forethought to how it would work – or if it would work at all – speaks to how negligent and dangerous Trump is as the leader of the free world. I want to reiterate that I don't care about party; I'd be absolutely over the moon with President Romney or President McCain, or any other rational form of leadership at this point.

It is unclear to me how we got where we are today. But echoes of this appeal to nationalism in Poland, and in Venezuela, offer clues. We brought fact checkers to a culture war … and we lost. During the election campaign, I was strongly reminded of Frank Miller's 1986 Nuke story arc, which I read in Daredevil as a teenager — the seductive appeal of unbridled nationalism bleeding across the page in stark primary colors.

Daredevil issue 233, page excerpt

Nuke is a self-destructive form of America First nationalism that, for whatever reasons, won the presidency through dark subvocalized whispers, and is now playing out in horrifying policy form. But we are not now a different country; we remain the very same country that elected Reagan and Obama. We lead the free world. And we do it by taking the higher moral ground, choosing to do what is right before doing what is expedient.

I exercised my rights as an American citizen and I voted, yes. But I mostly ignored government beyond voting. I assumed that the wheels of American government would turn, and reasonable decisions would be made by reasonable people. Some I would agree with, others I would not, but I could generally trust that the arc of American history inexorably bends toward justice, toward freedom, toward equality. Toward the things that make up the underlying American dream this country is based on.

This is no longer the case.

I truly believe we are at an unprecedented time in American history, in uncharted territory. I have benefited from democracy passively, without trying at all, for 46 years. I now understand that the next four years is perhaps the most important time to be an activist in the United States since the civil rights movement. I am ready to do the work.

  • I have never once in my life called my representatives in Congress. That will change. I will be calling and writing my representatives regularly, using tools like 5 Calls to do so.

  • I will strongly support, advocate for, and advertise any technical tools on web or smartphone that help Americans have their voices heard by their representatives, even if it takes faxing to do so. Build these tools. Make them amazing.

  • I am subscribing to support essential investigative journalism such as the New York Times, Los Angeles Times, and Washington Post.

  • I have set up large monthly donations to the ACLU which is doing critical work in fighting governmental abuse under the current regime.

  • I have set up monthly donations to independent journalism such as ProPublica and NPR.

  • I have set up monthly donations to agencies that fight for vulnerable groups, such as Planned Parenthood, Center for Reproductive Rights, Refugee Rights, NAACP, MALDEF, the Trevor Project, and so on.

  • I wish to see the formation of a third political party in the United States, led by those who are willing to speak truth to power like Evan McMullin. It is shameful how many elected representatives will not speak out. Those who do: trust me, we're watching and taking notes. And we will be bringing all our friends and audiences to bear to help you win.

  • I will be watching closely to see which representatives rubber-stamp harmful policies and appointees, and I will vote against them across the ticket, on every single ticket I can vote on.

  • I will actively support all efforts to make the National Popular Vote Interstate Compact happen, to reform the electoral college.

  • To the extent that my schedule allows, I will participate in protests to combat policies that I believe are harmful to Americans.

  • I am not quite at a place in my life where I'd consider running for office, but I will be, eventually. To the extent that any Stack Overflow user can be elected a moderator, I could be elected into office, locally, in the house, even the senate. Has anyone asked Joel Spolsky if he'd be willing to run for office? Because I'd be hard pressed to come up with someone I trust more than my old business partner Joel to do the right thing. I would vote for him so hard I'd break the damn voting machine.

I want to pay back this great country for everything it has done for me in my life, and carry the dream forward, not just selfishly for myself and my children, but for everyone's children, and our children's children. I do not mean the hollow promises of American nationalism

We would do well to renounce nationalism and all its symbols: its flags, its pledges of allegiance, its anthems, its insistence in song that God must single out America to be blessed.

Is not nationalism—that devotion to a flag, an anthem, a boundary so fierce it engenders mass murder—one of the great evils of our time, along with racism, along with religious hatred?

These ways of thinking—cultivated, nurtured, indoctrinated from childhood on— have been useful to those in power, and deadly for those out of power.

… but the enduring values of freedom, justice, and equality that this nation was founded on. I pledge my allegiance to the American dream, and the American people – not to the nation, never to the nation.

Daredevil issue 233, page excerpt

I apologize that it's taken me 46 years to wake up and realize that some things, like the American dream, aren't guaranteed. There will come a time when you have to stand up and fight for them, for democracy to work. I will.

Will you?


An Inferno on the Head of a Pin

Tue, 01/17/2017 - 12:37

Today's processors contain billions of heat-generating transistors in an ever-shrinking space. The power budget might range from:

  • 1000 watts on a specialized server
  • 100 watts on desktops
  • 30 watts on laptops
  • 5 watts on tablets
  • 1 or 2 watts on a phone
  • 100 milliwatts on an embedded system

That's four orders of magnitude. Modern CPU design is the delicate art of placing an inferno on the head of a pin.

Look at the original 1993 Pentium compared to the 20th anniversary Pentium:

  • Pentium (1993): 66 MHz, 16 KB L1, 3.2 million transistors
  • Pentium G3258 20th Anniversary Edition (2014): 3.2 GHz × 2 cores, 128 KB L1 / 512 KB L2 / 3 MB L3, 1.4 billion transistors

I remember cooling the early CPUs with simple heatsinks; no fan. Those days are long gone.

A roomy desktop computer affords cooling opportunities (and thus a watt budget) that a laptop or tablet could only dream of. How often will you be at peak load? For most computers, the answer is "rarely". The smaller the space, the higher the required performance, the more … challenging your situation gets.

Sometimes, I build servers.

Inspired by Google and their use of cheap, commodity x86 hardware to scale on top of the open source Linux OS, I also built our own servers. When I get stressed out, when I feel the world weighing heavy on my shoulders and I don't know where to turn … I build servers. It's therapeutic.

Servers are one of those situations where you may be at full CPU load more often than not. I prefer to build 1U servers, the smallest rack-mountable unit, at 1.75" total height.

You get plenty of cores on a die these days, so I build single-CPU servers. One reason is price; the other is that clock speed declines as the number of cores on a die goes up (this is for the Broadwell Xeon v4 series):

CPU        Cores   GHz   Price
E5-1630    4       3.7   $406
E5-1650    6       3.6   $617
E5-1680    8       3.4   $1723
E5-2680    12      2.4   $1745
E5-2690    14      2.6   $2090
E5-2697    18      2.3   $2702
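
One way to read that table is aggregate clock (cores × GHz) against price – a quick sketch using the numbers above:

    # (cores, base GHz, price $) for the Broadwell Xeon v4 parts listed above
    xeons = {
        "E5-1630": (4, 3.7, 406),
        "E5-1650": (6, 3.6, 617),
        "E5-1680": (8, 3.4, 1723),
        "E5-2680": (12, 2.4, 1745),
        "E5-2690": (14, 2.6, 2090),
        "E5-2697": (18, 2.3, 2702),
    }
    for name, (cores, ghz, price) in xeons.items():
        agg = cores * ghz
        print(f"{name}: {agg:5.1f} aggregate GHz, ${price / agg:.0f} per GHz")
    # Single-core clock falls as cores rise; $ per aggregate GHz
    # roughly doubles once you go past the 8-core parts.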

Yes, there are server CPUs with even more cores, but if you have to ask how much they cost, you definitely can't afford them … and they're clocked even slower. What we do is served better by a smaller number of super fast cores than a larger number of slow cores, anyway.

With that in mind, consider these two Intel Xeon server CPUs: the quad-core E5-1630 and the 6-core E5-1650.

As you can see from the official Intel product pages for each processor, they both have a TDP heat budget of 140 watts. I'm scanning the specs, thinking maybe this is an OK tradeoff.

Unfortunately, here's what I actually measured with my trusty Kill-a-Watt for each server build as I performed my standard stability testing, with completely identical parts except for the CPU:

  • E5-1630: 40W idle, 170W mprime
  • E5-1650: 55W idle, 250W mprime

I am here to tell you that Intel's TDP figure of 140 watts for the 6 core version of this CPU is a terrible, scurrilous lie!

This caused a bit of a problem for me as our standard 1U server build now overheats, alarms, and throttles with the 6 core CPU — whereas the 4 core CPU was just fine. Hey Intel! From my home in California, I stab at thee!

But, you know …

Better Heatsink

The 1.75" maximum height of the 1U server form factor doesn't leave a lot of room for creative cooling of a CPU. But you can switch from an aluminum cooler to a copper one.

Copper is significantly more expensive, plus heavier and harder to work with, so it's generally easier to throw an ever-larger mass of aluminum at the cooling problem when you can. But when space is a constraint, as it is with a 1U server, copper dissipates more heat in the same form factor.

The famous "Ninja" CPU cooler came in identical copper and aluminum versions so we can compare apples to apples:

  • Aluminum Ninja — 24°C rise over ambient
  • Copper Ninja — 17°C rise over ambient

You can scale the load, and the resulting watts of heat, by spinning up MPrime threads for the exact number of cores you want to "activate" (see the sketch after this list), so that's how I tested:

  • Aluminum heatsink — stable at 170W (mprime threads=4), but heat warnings at 190W (mprime threads=5)
  • Copper heatsink — stable at 190W (mprime threads=5), but heat warnings at 230W (mprime threads=6)
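
MPrime's torture test lets you pick the thread count interactively; as a rough stand-in for that kind of dial-a-load testing, here's a Python sketch that pegs exactly N cores (this is just the concept – mprime's actual math is far more punishing):

    # Peg exactly N cores at 100% to step a machine's heat load up in increments.
    import multiprocessing as mp

    def burn():
        x = 0.0001
        while True:                        # spin forever on floating-point work
            x = x * 1.0000001 + 1e-9

    def load_n_cores(n):
        procs = [mp.Process(target=burn, daemon=True) for _ in range(n)]
        for p in procs:
            p.start()
        return procs

    if __name__ == "__main__":
        load_n_cores(5)                    # e.g. the "mprime threads=5" case
        input("Loading 5 cores; press Enter to stop.")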

Each run has to be overnight to be considered successful. This helped, noticeably. But we need more.

Better Thermal Interface

When it comes to server builds, I stick with the pre-applied grey thermal interface pad that comes on the heatsinks. But out of boredom and a desire to experiment, I …

  • Removed the copper heatsink.
  • Used isopropyl alcohol to clean both CPU and heatsink.
  • Applied fancy "Ceramique" thermal compound I have on hand, using an X shape pattern.

I wasn't expecting any change at all, but to my surprise, with the new TIM applied it took 5x longer to reach throttle temps with mprime threads=6. Before, it would thermally throttle within a minute of launching the test; after, it took ~10 minutes to reach that same throttle temp. The difference was noticeable.

That's a surprisingly good outcome, and it tells us the default grey goop that comes pre-installed on heatsinks is ... not great. Per this 2011 test, the difference between worst and best thermal compounds is 4.3°C.

But as Dan once bravely noted while testing Vegemite as a thermal interface material:

If your PC's so marginal that a CPU running three or four degrees Celsius warmer will crash it [or, for modern CPUs, cause the processor to auto-throttle itself and substantially reduce system performance], the solution is not to try to edge away from the precipice with better thermal compound. It's to make a big change to the cooling system, or just lower the darn clock speed.

An improved thermal interface just gets you there faster (or slower); it doesn't address the underlying problem. So we're not done here.

Ducted Airflow

Most, but not all, of the SuperMicro cases I've used have included a basic fan duct / shroud that lies across the central fans and the system. Given that the case fans are pretty much directly in front of the CPU anyway, I've included the shroud in the builds out of a sense of completeness more than any conviction that it was doing anything for the cooling performance.

This particular server case, though, did not include a fan duct. I didn't think much about it at the time, but considering the heat stress this 6-core CPU and its 250 watt heat generation was putting on our 1U build, I decided I should build a quick duct out of card stock and test it out.

(I know, I know, it's a super janky duct! But I was prototyping!)

Sure enough, this duct, combined with the previous heatsink and TIM changes, enabled the server to remain stable overnight with a full MPrime run of 12 threads.

I think we've certainly demonstrated the surprising (to me, at least) value of fan shrouds. But before we get too excited, let's consider one last thing.

Define "CPU Load"

Sometimes you get so involved with solving the problem at hand that you forget to consider whether you are, in fact, solving the right problem.

In these tests, we defined 100% CPU load using MPrime. Some people claim MPrime is more of a power virus than a real load test, because it exerts so much heat pressure on the CPUs. I initially dismissed these claims since I've used MPrime (and its Windows cousin, Prime95) for almost 20 years to test CPU stability, and it's never let me down.

But I did more research and I found that MPrime has, since 2011, used AVX (and later AVX2) instructions extensively on newer Intel CPUs:

The newer versions of Prime load in a way that they are only safe to run at near stock settings. The server processors actually downclock when AVX2 is detected to retain their TDP rating. On the desktop we're free to play and the thing most people don't know is how much current these routines can generate. It can be lethal for a CPU to see that level of current for prolonged periods.

That's why most stress test programs alternate between different data pattern types. Depending on how effective the rotation is, and how well that pattern causes issues for the system timing margin, it will, or will not, catch potential for instability. So it's wise not to hang one's hat on a single test type.

This explains why I saw such a large discrepancy between MPrime and other CPU load programs like BurnP6.

MPrime does an amazing job of generating the type of CPU load that causes maximum heat pressure. But unless your servers regularly chew through zillions of especially power-hungry AVX2 instructions this may be completely unrepresentative of any real world load your server would actually see.
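
To see that "instruction mix matters" point in miniature, compare a scalar busy-loop against dense linear algebra. numpy's matmul dispatches to SIMD-heavy BLAS kernels (AVX2/FMA on most modern x86 builds); both register as "100% CPU" in a process monitor while drawing very different wall power (a sketch – the wattage itself has to be measured externally, e.g. with a Kill-a-Watt):

    # Two "100% CPU" loads with very different power draw.
    import numpy as np

    def scalar_spin():
        x = 1.0
        while True:                    # light scalar work: fewer watts per core
            x = x * 1.0000001 + 1e-9

    def simd_crunch():
        a = np.random.rand(2048, 2048)
        b = np.random.rand(2048, 2048)
        while True:                    # dense matmul: SIMD/FMA-heavy, more watts
            a = (a @ b) % 1.0

    # Run one, then the other, and watch the power meter, not the CPU% gauge.
    simd_crunch()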

Your Own Personal Inferno

Was this overkill? Probably. Even with the aluminum heatsink, no change to thermal interface material, and zero ducting, we'd probably see no throttling under normal use in our server rack. But I wanted to be sure. Completely sure.

Is this extreme? Putting 140 watts of CPU TDP in a 1U server? Not really. Nick at Stack Overflow told me they just put two 22-core, 145W TDP Xeon E5-2699 v4 CPUs and four 300W TDP GPUs in a single Dell C4130 1U server. I'd sure hate to be in the room when those fans spin up. I'm also a little afraid to find out what happens if you run MPrime plus full GPU load on that box.

Servers are an admittedly rare example of big CPU performance heat and size tradeoffs, one of the few left. It is fun to play at the extremes, but the SoC inside your phone makes the same tradeoffs on a smaller scale. Tiny infernos in our pockets, each and every one.
