To illustrate what a bad job Intel has done marketing this technology: for a long time I thought DisplayPort was a closed, Mac-only thing, and until today I thought the same of Thunderbolt.
If they want to increase adoption, maybe they should consider marketing it more, and outside of San Francisco's MacBook-based hipster blogger circles.
Don't conflate DisplayPort and Thunderbolt - one is a royalty-free standard, the other is crippled, Intel-only, and patent-encumbered. DockPort is a VESA standard that adds USB and DC power to DisplayPort to compete with Thunderbolt. I have much more interest in using that than some profiteering move on Intel's part to achieve vendor lock-in to its chipsets by getting end users stuck with Thunderbolt devices.
Sounds like the major issue is MSFT. I suspect Microsoft sees Thunderbolt as an opportunity to recreate the FireWire experience for most people: a poor experience will kill the technology and drive would-be adopters away from the Mac, back into the arms of Microsoft.
> and outside of San Francisco's MacBook-based hipster blogger circles.
Man, this is a really tired trope. At the last conference I attended, ~9/10 engineers were running MacBooks (and pretty much 10/10 of the engineers under 40).
If you attended the Game Developers Conference, then it would be Windows everywhere, even on the MacBooks. Except maybe for the engineers who only do iOS games.
On the other hand, it saddened me to attend FOSDEM and see people bashing Microsoft over its OSS practices while walking around with a MacBook running Mac OS X. Talk about dysfunctional behavior.
That site has some pretty good statistics on OS usage among programmers, and some on hardware usage (like iPod/iPhone), which I think is a decent indicator of whether or not these people own Macs.
I went to a security conference about a year ago, and not being a security guy, I was surprised to see about half the people using Windows.
This being said, I am a little sad to see the prevalence of OS X-based laptops in 'hacker spaces' - partly because I don't like Apple's practices, but mostly because, to me, diversity is one of the things that makes a hacker: the ability to make anything turn tricks. Seeing a monoculture start to emerge is a little sad.
I tend to assume this has more to do with the piss-poor state of Windows laptop manufacturers than with Apple. I'm sure those guys run Linux or Windows on their desktops, but seriously, do you expect them to put up with a heavy, crapware-riddled HP? A fragile Lenovo? A monstrously ugly MSI? A cheap Dell that falls apart the day the warranty runs out?
As the article comments note, Thunderbolt is also not taking off because Intel demands that all Thunderbolt peripherals have OS X drivers.
Companies like Asus and Silverstone (and more who don't spring to mind right now) have created Thunderbolt GPU enclosures to allow laptops to do 3D work. However, Intel won't let them go to market because they don't have OS X drivers, and Apple has no interest in supporting Thunderbolt GPUs when it could be selling people additional Macs instead.
The GPU enclosures have been held up by Intel refusing to license them, but not because of OS X drivers; there are licensed Thunderbolt peripherals that do not have OS X drivers.
But there hasn't been an official statement from Intel about why they've been refused, so I can't link to it. You mention you still believe Intel has refused to license the hardware - what reason do you think applies?
As a developer on Mac, I think game programming is one of the last areas where Windows PCs are clearly better and/or needed - except maybe for Unity-based programming. VS is quite dominant in that area, isn't it?
VS dominates for Unity development too, at least in my little circle of the world.
The developers I work with all love their Macs, but they hop onto Windows for serious Unity work, just so they can use Visual Studio with ReSharper instead of MonoDevelop.
You are vastly underestimating the amount of work that would be required to support Thunderbolt GPUs on OS X. The lack of support has nothing to do with any nefarious plot to make people buy more computers. Hot-pluggable GPUs are a Hard Problem, and the alternative (requiring anything with a GPU to remain connected while the OS is up, and requiring a reboot if somebody wants to connect such a device) isn't worth doing, at least not for Apple.
I understand that supporting eGPUs on OS X is a lot of work, but you misunderstand my position: I don't care whether that work happens or not. I'm quite happy using Windows / SteamOS for games.
Source for your claim that it's not worth it for "them" to do?
I'd counter by arguing that the computer brand that laymen to this day consider "better for graphics" would find value in supporting a spec whose very unveiling discussed the possibility of external GPUs.
Damn, a Thunderbolt 2 PCIe card with two ports costs ~55€. That would make for one hell of a replacement for 10G Ethernet: double the speed, and cheaper. Does anyone know what the Linux support for this is like?
10GbE cost is dominated by expensive switches, not by the cost of Ethernet cards.
This aside, Thunderbolt is not switched, so the comparison is meaningless. There are very few cases where you would use either Thunderbolt or Ethernet for the same application; the topology is very different.
$5595 for the switch, $699 for the software. You can split the 4 40G ports into 16 10G ports using splitter cables, giving a total of 64 10G ports (48 native 10G plus the 16 from the splitters):
(5595+699)/64 = $98.34 per 10G port.
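To make the assumptions explicit, here's that port math as a quick Python sketch (the 48 native 10G ports are my inference from the 64-port total; the prices are the ones quoted above):

    # Assumed: 48 native 10G ports (inferred from the 64-port total),
    # plus 4 x 40G ports that each split into 4 x 10G via breakout cables.
    native_10g = 48
    split_10g = 4 * 4                       # 16 extra 10G ports
    total_10g = native_10g + split_10g      # 64 ports total

    cost_usd = 5595 + 699                   # switch + software
    print(f"${cost_usd / total_10g:.2f} per 10G port")  # $98.34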
All the add-on cards I've seen are PCIe 2.0 x4 cards; that's 2GB/sec in each direction. A single Thunderbolt 2 port would need ~2.5GB/sec each direction to 'max out', and a pair would need 5GB/sec. They seem to be assuming that a significant amount of the TB bandwidth will be used by the video passthrough capability. That makes these cards a not-so-great option if your goal is only to connect high-speed external devices, not monitors.
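For reference, the arithmetic behind that, as a rough Python sketch (assuming the standard PCIe 2.0 figures of 5 GT/s per lane with 8b/10b encoding, and Thunderbolt 2's rated 20 Gbit/s per port):

    # PCIe 2.0: 5 GT/s per lane with 8b/10b encoding = 4 Gbit/s usable per lane
    lanes = 4
    pcie_gbits = 5.0 * (8 / 10) * lanes          # 16 Gbit/s usable
    print(f"PCIe 2.0 x4:   {pcie_gbits / 8:.1f} GB/sec per direction")     # 2.0

    # Thunderbolt 2: 20 Gbit/s per port
    tb2_gbits = 20.0
    print(f"One TB2 port:  {tb2_gbits / 8:.1f} GB/sec per direction")      # 2.5
    print(f"Two TB2 ports: {2 * tb2_gbits / 8:.1f} GB/sec per direction")  # 5.0

So a single x4 card can't quite feed even one port flat out, let alone two.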
Windows users are not going to like Thunderbolt until Microsoft fixes the Windows drivers. (Hint: Thunderbolt is 5 years old and the drivers still don't work. Don't hold your breath.) If you can't plug in a device reliably on an external connection, it is pretty much useless. Maybe sometime after that, motherboard vendors will stop hobbling the interface chips.
Thunderbolt pretty well rocks on OS X.
I wonder what the state of Linux support is. And I mean really is. I still have corrupted MP3s in my library from Linux's FireWire support. It almost always put the blocks in the right spot on the disk. (This was in the '90s or early '00s; it must be better now, probably.)
Until Thunderbolt stops being a proprietary technology, I hope it will stay where it is right now: ready to disappear into irrelevance when OCuLink (finally) comes out.
For one, Thunderbolt sharing the DisplayPort connector is critical for many laptop makers who simply don't have the space to fit any other ports. And second, Apple has a strong presence in the content creation markets, which will be hard to overcome.
But the biggest issue of all is that there really isn't any benefit for the mass consumer in any of these technologies. USB is more than capable for storage and many other needs.
Depends on the connector, to be honest. My laptop from a couple of years back has an eSATA port that's shared with a USB port. Something similar could be done with a full-size DisplayPort or USB-C connector, for example. Also, one of the big problems I have with Thunderbolt as a technology is that you have to play mother-may-I with Intel just to get a look at the chip datasheets so you can think about creating a Thunderbolt device.
You hit the nail on the head with the last paragraph, though. Thunderbolt creates more problems than it solves. USB 3.1, with the new Type-C connector, gives a reversible cable, a lot of bandwidth, and a low price, while Thunderbolt is stuck with its needlessly expensive active cable, which further discourages people from looking at the technology. Had Intel revamped the spec and gotten more people involved, Thunderbolt could have had a chance, but as it stands, it's a dead protocol walking.
Apple's strong presence in content creation didn't help FireWire. It's tough to say whether their newer, stronger position might help, but I think we'll see history repeat itself and find that USB 3.x will eventually support high-performance graphics. Alternatively, you'll just see PC makers shipping 4K+-capable HDMI ports.
I always see comments like this and wonder what people expected from FireWire. I've never had trouble finding FireWire hard drives and peripherals. Best Buy and CompUSA always carried them (Staples and Office Depot never really did, but FireWire is/was only useful when you need large/fast throughput, like for video editing). FireWire was never as ubiquitous as USB, and I don't believe Apple ever intended it to be, but it was far from a failure.
I doubt it's even possible for USB 3 to support high-end graphics. The bandwidth for intensive applications is just not there, and not having direct memory access (like you get with PCIe) could be another obstacle.
Everything is possible, sure, but...
I don't see Thunderbolt-like technologies as a USB replacement, though. More like a definitive standardization of docks for laptops, with interesting side effects.
I'm not sure if you're talking about connecting a monitor or connecting a graphics card. A graphics card is easy: USB is only a small multiple slower than Thunderbolt, and as long as you have room for the textures, 4Gbit/s is fine.
Uncompressed 4K is trickier. It looks like Thunderbolt 2 has barely enough bandwidth, and USB 3.1 barely not enough. There's no reason to expect there won't be a 3.x that supports it.
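A back-of-the-envelope check of those numbers in Python (my assumptions: 3840x2160 at 24 bits per pixel and 60 Hz, ignoring blanking and protocol overhead, which only push the requirement higher):

    # Raw pixel data rate for uncompressed 4K60
    width, height, bpp, hz = 3840, 2160, 24, 60
    gbits = width * height * bpp * hz / 1e9
    print(f"Uncompressed 4K60: ~{gbits:.1f} Gbit/s")  # ~11.9 Gbit/s
    # vs. USB 3.1 at 10 Gbit/s (not enough) and
    # Thunderbolt 2 at 20 Gbit/s (enough, with room for overhead)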
Is it still the case that USB causes heavy CPU usage? Even if there were enough bandwidth for 4K video, it'd be a drag if it meant pegging a CPU or two at 100%...
There are some who claim Intel is trying to kill eGPUs (external GPUs) and dGPUs (discrete GPUs) [1].
You're actually spot on about the one killer use case for a Thunderbolt connector. It's the same question dock connectors have also answered -- what's the need for a dock connector? A better GPU.
Thunderbolt is simply Intel's way of killing a good idea before it gains enough traction to threaten their hegemony with USB.