Where is Thunderbolt Headed? (anandtech.com)
74 points by srikar on Sept 14, 2014 | 44 comments


To illustrate how bad a job Intel has done marketing this technology: for a long time I thought DisplayPort was a closed, Mac-only thing, and until today I thought the same of Thunderbolt.

If they want to increase adoption, maybe they should consider marketing it more, and outside of San Francisco's MacBook-based hipster blogger circles.


Don't conflate DisplayPort and Thunderbolt: one is a royalty-free standard; the other is Intel-only and patent-encumbered. DockPort is a VESA standard that adds USB and DC power to DP to compete with Thunderbolt. I have much more interest in using that than in some profiteering move on Intel's part to achieve vendor lock-in to its chipsets by getting end users stuck with Thunderbolt devices.


Sounds like the major issue is MSFT. I suspect Microsoft sees Thunderbolt as an opportunity to recreate the FireWire experience for most people: a poor experience will kill the technology and drive would-be adopters away from Mac, back into the arms of Microsoft.


> and outside of San Francisco's MacBook-based hipster blogger circles.

Man, this is a really tired trope. At the last conference I attended, ~9/10 engineers were running MacBooks (and pretty much 10/10 of the engineers under 40).


It all depends on which conferences one goes to.

If you attended the Game Developers Conference, then it would be Windows everywhere, even on the MacBooks, except maybe for the engineers who only do iOS games.

On the other hand, it saddened me to attend FOSDEM and see people bashing Microsoft over its OSS practices while walking around with a MacBook running Mac OS X. Talk about dysfunctional behavior.


Don't make them think too hard about it or their head might explode.


> At the last conference I attended, ~9/10 engineers were running MacBooks

They're both tired tropes. There is a pretty reasonable mix of hardware and software run by 'engineers', and nowhere near 90% are Macs.

edit: https://www.statwing.com/demos/dev-survey#workspaces/2496

That site has some pretty good statistics on OS usage among programmers, and some on hardware like iPod/iPhone ownership, which I think is a decent indicator of whether these people own Macs.


I went to a security conference about a year ago, and not being a security guy, I was surprised to see about half the people using Windows.

That said, I am a little sad to see the prevalence of OS X-based laptops in 'hacker spaces', partly because I don't like Apple's practices, but mostly because diversity is one of the things that makes a hacker to me: the ability to make anything turn tricks. To see a monoculture start to emerge is a little sad.


I tend to assume this has more to do with the piss-poor state of Windows laptop manufacturers than with Apple. I'm sure those guys run Linux or Windows on their desktops, but seriously, do you expect them to put up with a heavy, crapware-riddled HP? A fragile Lenovo? A monstrously ugly MSI? A cheap Dell that falls apart the day the warranty runs out?


'Fragile' Lenovo? Using a cheap Dell as a comparator? Not trying to stack the deck there?


At the last conference I attended, ~9/10 of the engineers were using ThinkPads and Latitudes.

Personal examples are worthless because of the ridiculously small sample size.


Thunderbolt is still closed and proprietary, just not by Apple in this case.


As the article comments note, Thunderbolt is also not taking off because Intel demands that all Thunderbolt peripherals have OS X drivers.

Companies like Asus and Silverstone (and more that don't spring to mind right now) have created Thunderbolt GPU enclosures to let laptops do 3D work. However, Intel won't let them go to market because they don't have OS X drivers, and Apple has no interest in supporting Thunderbolt GPUs when it could be selling people additional Macs instead.


[citation needed]

The GPU enclosures have been held up by Intel refusing to license them, but not because of OS X drivers; there are licensed Thunderbolt peripherals that do not have OS X drivers.


I read about it from a bunch of sources, which eventually pointed to https://www.change.org/p/intel-allow-silverstone-and-asus-to....

But there hasn't been an official statement from Intel about why they've been refused, so I can't link to one. You mention you still believe Intel has refused to license the hardware; what reason do you think applies?


I'm confused too, because they are already for sale: http://www.villageinstruments.com/tiki-index.php?page=Store


The final ViDock 4 is actually an ExpressCard-to-PCIe enclosure. You have to buy a separate Thunderbolt-to-ExpressCard adapter.

Having ExpressCard in the middle kills your bandwidth; ExpressCard is far narrower than PCIe or Thunderbolt.
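The bottleneck is easy to quantify. A rough sketch, assuming ExpressCard 2.0 carries a single PCIe 2.0 lane (5 GT/s with 8b/10b link encoding) and the usual Thunderbolt channel rates:

```python
# ExpressCard 2.0 exposes one PCIe 2.0 lane: 5 GT/s raw, 8b/10b encoded,
# so only 8 of every 10 bits on the wire are payload.
expresscard_gbps = 5 * 8 / 10   # ~4 Gbit/s usable

# Thunderbolt channel rates, per direction.
tb1_gbps = 10
tb2_gbps = 20

# The adapter chain throttles a TB2 link to a fifth of its bandwidth.
print(tb2_gbps / expresscard_gbps)  # 5.0
```

Whatever the GPU in the enclosure can do, the ExpressCard hop caps the host link well below even first-generation Thunderbolt.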


> Apple has no interest in supporting Thunderbolt GPUs when they could be selling people additional Macs instead.

One of the reasons I don't buy Macs, even though I like Mac OS X, is that only high-end Macs have proper GPUs for serious 3D programming.


As a developer on Mac, I think game programming is one of the last areas where Windows PCs are clearly better and/or needed, except maybe for Unity-based development. VS is quite dominant in that area, isn't it?


VS dominates for Unity development too, at least in my little circle of the world.

The developers I work with all love their Macs, but they hop onto Windows for serious Unity work, just so they can use Visual Studio with ReSharper instead of MonoDevelop.


Yes, because the whole game industry was built on the home-computer market.

The gaming community has a different mindset than the UNIX or FOSS communities, and the Mac was never good at games (at least in Europe).

The rise of consoles happened in parallel with the downfall of Atari ST and Amiga.

So eventually the devkits for consoles happened to exist only on PCs.

The demoscene groups started to focus on the PC as well.

It doesn't help that Apple, except for the Pippin and now iOS, never had a gaming culture.

So you will never see a Mac gaming rig with a subzero-temperature cooling system and a pair of SLI graphics cards powered by their own power plant. :)


You are vastly underestimating the amount of work that would be required to support Thunderbolt GPUs on OS X. The lack of support has nothing to do with any nefarious plot to make people buy more computers. Hot-pluggable GPUs are a Hard Problem, and the alternative (requiring anything with a GPU to remain connected while the OS is up, and requiring a reboot to connect such a device) isn't worth doing, at least not for them.


I understand supporting eGPUs on OS X is a lot of work, but you misunderstand my position: I don't care whether that work happens or not; I'm quite happy using Windows / SteamOS for games.


Source for your claim that it's not worth it for "them" to do?

I'd counter by arguing that the computer brand that laymen still consider "better for graphics" would find value in supporting a spec whose very unveiling discussed the possibility of external GPUs.


Searching YouTube for "egpu" pulls up a ton of Macs...


Damn, a Thunderbolt 2 PCIe card with two ports costs ~55€. That would make one hell of a replacement for 10G Ethernet: double the speed and cheaper. Does anyone know what the Linux support for this is like?


10GbE cost is dominated by expensive switches, not by the cost of ethernet cards.

That aside, Thunderbolt is not switched, so the comparison is meaningless. There are very few cases where you would use either Thunderbolt or Ethernet for the same application; the topologies are very different.


If you're willing to leave the Cisco/Arista/etc. cocoon, NIC costs are actually the big-ticket items.

http://whiteboxswitch.com/collections/10-gigabit-ethernet-sw...

$5595 for the switch, $699 for the software. You can split the 4 40G ports into 16 10G ports using splitter cables; together with the switch's 48 native 10G ports, that gives a total of 64 10G ports: (5595+699)/64 = $98.34 per 10G port.

A 1M cable is $30 from here: http://www.fiberyes.com/sfp-cable-cab-10gsfp-p1m-30

So $128.34 for switch and cable.

I've never seen a 10G NIC that cheap per port.

(Disclosure: I co-founded Cumulus, which is behind the software in this price comparison.)
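A quick sanity check of the arithmetic above, using the prices as quoted:

```python
switch_usd = 5595    # whitebox 10G/40G switch
software_usd = 699   # switch OS license
ports = 64           # total 10G ports after splitting the 40G uplinks

per_port = (switch_usd + software_usd) / ports
cable_usd = 30       # 1 m SFP+ direct-attach cable

print(round(per_port, 2))              # 98.34 per bare port
print(round(per_port + cable_usd, 2))  # 128.34 per connected port
```

Note this amortizes switch and software across fully populated ports; a half-empty switch doubles the effective per-port cost.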


The problem for hobbyists is that they usually want to buy more like 8 ports, not 64.


Still, for simple high-bandwidth networking needs it should suffice.


Look at used InfiniBand gear; you will get much better support (hardware, software, cabling) at the same price or lower.


All the add-on cards I've seen are PCIe 2.0 x4 cards; that's 2 GB/s in each direction. A single Thunderbolt 2 port would need ~2.5 GB/s each direction to 'max out'; a pair would need 5 GB/s. They seem to be assuming that a significant amount of the TB bandwidth will be used by the video passthrough capability. That makes these cards a not-so-great option if your goal is only to connect high-speed external devices, not monitors.
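The mismatch can be sketched with rough numbers, assuming PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding and Thunderbolt 2 at 20 Gbit/s per port:

```python
# Usable PCIe 2.0 bandwidth, per direction: 5 GT/s with 8b/10b encoding
# leaves 4 Gbit/s of payload per lane.
pcie2_lane_gbps = 5 * 8 / 10
host_link_gbps = 4 * pcie2_lane_gbps   # x4 card: 16 Gbit/s (~2 GB/s)

tb2_port_gbps = 20                     # one Thunderbolt 2 port (~2.5 GB/s)

# The x4 host link can't saturate even one port, let alone two.
print(host_link_gbps < tb2_port_gbps)      # True
print(host_link_gbps < 2 * tb2_port_gbps)  # True
```

A PCIe 3.0 x4 link (~32 Gbit/s usable) would cover one port but still not two, which supports the comment's reading that the cards budget part of each port for video passthrough.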


Windows users are not going to like Thunderbolt until Microsoft fixes the Windows drivers. (Hint: Thunderbolt is five years old and the drivers still don't work. Don't hold your breath.) If you can't plug in a device reliably on an external connection, it is pretty much useless. Maybe sometime after that, motherboard vendors will stop hobbling the interface chips.

Thunderbolt pretty well rocks on OS X.

I wonder what the state of Linux support is. And I mean really is. I still have corrupted MP3s in my library from Linux's FireWire support: it almost always put the blocks in the right spot on the disk. (This was in the '90s or early '00s; it must be better now, probably.)


Until Thunderbolt stops being a proprietary technology, I hope it stays where it is right now: ready to disappear into irrelevance when OCuLink (finally) comes out.


I can't see that happening.

For one, Thunderbolt sharing the DisplayPort connector is critical for many laptop makers who simply don't have the space to fit any other ports. And secondly, Apple has a strong presence in the content-creation markets, which will be hard to overcome.

But the biggest issue of all is that there really isn't any benefit for the mass consumer in any of these technologies. USB is more than capable for storage and many other needs.


It depends on the connector, to be honest. My laptop from a couple of years back has an eSATA port that's shared with a USB port; similar could be done with a full-size DisplayPort or USB-C connector, for example. Also, one of the big problems I have with Thunderbolt as a technology is that you have to play mother-may-I with Intel just to get a look at the chip datasheets so you can think about creating a Thunderbolt device.

You hit the nail on the head with the last paragraph, though. Thunderbolt creates more problems than it solves. USB 3.1, with the new Type-C connector, gives a reversible cable, a lot of bandwidth, and a low price, while Thunderbolt is stuck with its needlessly expensive active cable, which further discourages people from looking at the technology. Were Intel to revamp the spec and get more people involved, Thunderbolt could have a chance, but as it stands, it's a dead protocol walking.


Apple's strong presence in content creation didn't help FireWire. It's tough to say whether their newer, stronger position might help, but I think we'll see history repeat itself and find that USB 3.x eventually supports high-performance graphics. Alternatively, you'll just see PC makers shipping 4K-capable HDMI ports.


I always see comments like this and wonder what they expected from FireWire. I've never had trouble finding FireWire hard drives and peripherals. Best Buy and CompUSA always carried them (Staples and Office Depot never really did, but FireWire is/was only useful where you needed the large, fast throughput of something like video editing). FireWire was never as ubiquitous as USB, and I don't believe Apple ever intended it to be, but it was far from a failure.


I doubt it's even possible for USB 3 to support high-end graphics. The bandwidth for intensive applications just isn't there, and not having direct access to memory (as with PCIe) could be another obstacle. Everything is possible, sure, but...

I don't see Thunderbolt-like technologies as a USB replacement, though. More like a definitive standardization of docks for laptops, with interesting side effects.


I'm not sure if you're talking about connecting a monitor or connecting a graphics card. A graphics card is easy: USB is only a small multiple slower than Thunderbolt, and as long as you have room for the textures, 4 Gbit/s is fine.

Uncompressed 4K is trickier. It looks like Thunderbolt 2 has barely enough bandwidth, and USB 3.1 barely not enough. No reason to expect there won't be a 3.x that supports it.
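The raw numbers bear that out. A rough calculation for 4K at 60 Hz and 24 bits per pixel, ignoring blanking intervals (which only push the real link rate higher):

```python
# Raw pixel rate for uncompressed 4K60 video.
width, height, hz, bpp = 3840, 2160, 60, 24
pixel_rate_gbps = width * height * hz * bpp / 1e9  # ~11.94 Gbit/s

tb2_gbps = 20    # Thunderbolt 2 channel
usb31_gbps = 10  # USB 3.1 Gen 2 (SuperSpeed+)

print(pixel_rate_gbps < tb2_gbps)    # True: fits, with some headroom
print(pixel_rate_gbps > usb31_gbps)  # True: just over USB 3.1's line rate
```

So even before protocol overhead, 4K60 exceeds USB 3.1's 10 Gbit/s but sits comfortably under Thunderbolt 2's 20 Gbit/s.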


Is it still the case that USB causes heavy CPU usage? Even if there were enough bandwidth for 4K video, it'd be a drag if it meant pegging a CPU or two at 100%...


The only thing I would use Thunderbolt for is to add an eGPU to something like a MacBook Air or something else with an integrated graphics chipset.


There are some who claim Intel is trying to kill eGPUs (external GPUs) and dGPUs (discrete GPUs) [1].

You're actually spot-on about the one killer use case for a Thunderbolt connector. It's the same question dock connectors have answered: what's the need for a dock connector? A better GPU.

Thunderbolt is simply Intel's way of killing a good idea before it gains enough traction to threaten their hegemony with USB.

[1] http://semiaccurate.com/2012/12/17/intel-slams-the-door-on-d...


Nowhere, very fast.



