Now when you've got a Type-C port, you just don't know what it can do. It could be anything from a plain USB 2 port up to a USB 3.1 port with 100W power delivery and Thunderbolt.
You can't tell this from looking at it, and spec sheets today already have a problem listing the version numbers of HDMI and DisplayPort ports; I don't see them suddenly becoming very precise about this.
The promise of USB has always been one of standardization. If I know my mother's notebook is from sometime after 2008 and it has a USB port, it will be USB 2 compatible.
In 2020 I won't be able to know whether the monitor I recommend to my mother will connect to her laptop over the USB Type-C port, whether it will run at full resolution, whether it will charge the notebook over it, etc.
It's just a mess. If you make a standard, please make it a standard, not a pick-and-choose affair where the result is just confusion.
I'm very much looking forward to the day when a USB-C port means I have a very strong chance of getting 15 watts for my tablet/phone from any vendor, and the sooner Apple adopts USB-C for its iPhones, the better.
I think it's great how a USB 3.0 drive will still work on a USB 1.1 computer from 1999. Sure, it'll run slower, but it'll work, and even without an adapter! With Apple's previous various high-speed ports (FireWire 400, FireWire 800, Thunderbolt), there was no backwards compatibility: from FW 400 to 800 you could use an adapter, but for Thunderbolt you're out of luck completely.
Of course other devices will follow that come without external power supplies or other features that would allow for graceful degradation. That's my point.
If it all were backwards compatible as it has been until now, things would be great, but that door has been shut now.
> from FW 400 -> 800 you could use an adapter, but for Thunderbolt you're out of luck completely
No you aren't: http://www.amazon.com/Apple-Thunderbolt-to-FireWire-Adapter/...
If you're trying to plug in a USB device, just plug it in. It'll work, regardless of whether the port is TB 3 or just plain USB 3. (Update: this probably covers 99% of cases, since USB is much more common than TB.)
If you're trying to plug in a thunderbolt 3 device, it's very likely you know if you have thunderbolt.
This will cause confusion (especially with added confusion around DisplayPort standard support over the USB connector) and it's unfortunate that a new connectivity standard is a regression.
- Type C with Thunderbolt gets the Thunderbolt zig-zag logo
- Type C with DisplayPort alternate mode gets the "screen" rectangle
- Type C with USB only is unmarked
People will soon wonder why their device is charging so slowly/having such low data transfer rates.
Now there are two additional Thunderbolt cables (powered and unpowered) with Type-C on both ends. People will think it's a USB cable and wonder why their connection isn't working at all, or try to replace it with a USB cable.
2020? Even in 2015 you can just look up the laptop model on Google and get results about whether the monitor will work with it or not, the resolution etc.
It just doesn't really make sense to worry about this kind of stuff from a practical point of view. Sure, maybe it'll be a pain for historians and conservationists. But for everyone else, we'll just be constantly buying new computers anyways, right?
Or will all Thunderbolt 3 devices be able to scale back and communicate over USB if the Thunderbolt controller is not available? How will that work with displays over USB-C connectors? Is there a possibility that now, instead of just checking for a USB port, we'll have to read the list of controllers/protocols available in devices connecting over USB-C ports?
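For what it's worth, something like that capability list already exists on Linux, where the kernel exposes Type-C ports and their alternate modes through sysfs. A rough sketch of what reading it could look like (paths and attribute names are assumptions based on the kernel's typec class ABI; details vary by kernel version):

    # Sketch: enumerate Type-C ports and their advertised alternate modes
    # via the Linux typec sysfs class. Not a tested tool.
    from pathlib import Path

    TYPEC = Path("/sys/class/typec")

    def list_port_capabilities():
        for port in sorted(TYPEC.glob("port*")):
            if "-" in port.name:
                continue  # skip partner/cable entries, keep the ports
            print(f"{port.name}:")
            for attr in ("data_role", "power_role", "usb_power_delivery_revision"):
                f = port / attr
                if f.exists():
                    print(f"  {attr}: {f.read_text().strip()}")
            # Alternate modes (DisplayPort, Thunderbolt, ...) show up as
            # child devices identified by their Standard/Vendor ID (SVID).
            for alt in sorted(port.glob(f"{port.name}.*")):
                svid = alt / "svid"
                if svid.exists():
                    print(f"  alt mode SVID: {svid.read_text().strip()}")

    list_port_capabilities()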
1) Apple was the first to use USB even though it wasn't an Apple tech
2) Apple was the first to use Thunderbolt even though it wasn't an Apple tech
3) Apple is now the first to use USB 3 Type C (aka "USB-C") even though it wasn't an Apple tech (although rumor has it that Apple engineers had a lot of input on it)
The technology you're probably confusing it with is FireWire, which WAS Apple tech.
Not actually true. USB was shipping on PC motherboards for about a year before Apple went whole hog on USB. At the time, however, almost no one actually used USB (I think the first primary use of USB was keyboards and mice, and the first generation of USB keyboards and mice were notably inferior to their PS/2 equivalents). Once Apple backed USB, other companies started to do the same.
Didn't the Chromebook Pixel 2 come out slightly before the MacBook?
It's already used in Type C to enable plugging in USB, chargers (PD spec), HDMI or DP adapters (https://www.chromium.org/chromium-os/dingdong) and I think even to route PCIe over Type C. Adding Thunderbolt to the mix shouldn't be too hard.
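The glue underneath all of those is the same USB-PD structured VDM handshake: discover the partner's supported SVIDs, then the modes under each SVID, then enter one. A simplified, illustrative model of that selection logic (the DisplayPort and Intel SVIDs are the registered values; the classes are made up for the sketch):

    # Toy model of alternate-mode selection over Type-C. Real hardware
    # does this with structured VDMs (Discover Identity/SVIDs/Modes,
    # Enter Mode); only the SVID constants here are real.
    DP_ALT_MODE_SVID = 0xFF01    # VESA DisplayPort alternate mode
    TBT3_ALT_MODE_SVID = 0x8087  # Intel Thunderbolt 3 alternate mode

    class PortPartner:
        """Stand-in for the device on the far end of the cable."""
        def __init__(self, svids):
            self.svids = set(svids)  # alternate modes the device supports

        def discover_svids(self):
            return self.svids

    def negotiate(partner, preferred=(TBT3_ALT_MODE_SVID, DP_ALT_MODE_SVID)):
        """Enter the best mutually supported mode, else stay plain USB."""
        supported = partner.discover_svids()
        for svid in preferred:
            if svid in supported:
                return f"enter alternate mode 0x{svid:04X}"
        return "fall back to plain USB 3.1"

    print(negotiate(PortPartner({DP_ALT_MODE_SVID})))  # DP monitor -> 0xFF01
    print(negotiate(PortPartner(set())))               # USB stick -> plain USB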
It's something that could possibly be solved through markings on the devices, etc., and I still think running Thunderbolt over the USB-C port is great. Perhaps the display example could fall back to a display-over-USB standard; it wouldn't work as well, but it could help.
If you have a rare TB device, you're probably a professional, and yes, you'll need to ensure the TB logo is on the port. If it's an extremely common USB device, it'll work on any port, without exception.
The Apple Thunderbolt monitors are Apple's fault. There was no reason they couldn't have accepted a raw DisplayPort signal, but they still didn't.
This is why we can't have nice things.
Put in regular terms: you can now use your USB 3.1 device in any port it plugs into.
Since TB devices are so uncommon, I don't see a problem.
That's why you can't buy a MacBook Air and plug in a little $300 box with a GeForce 970 and play high-end games on it right now. There's no technical reason this won't work; in fact, people have hacked together solutions that work great [0][1].
Intel doesn't want to let you do it.
From: http://www.anandtech.com/show/9331/intel-announces-thunderbo...
>>> Meanwhile gamers will be happy to hear that Intel is finally moving forward on external graphics via Thunderbolt, and after more than a few false starts, external GPUs now have the company’s blessing and support. While Thunderbolt has in theory always been capable of supporting external graphics (it’s just a PCIe bus), the biggest hold-up has always been handling what to do about GPU hot-plugging and the so-called “surprise removal” scenario. Intel tells us that they have since solved that problem, and are now able to move forward with external graphics. The company is initially partnering with AMD on this endeavor – though nothing excludes NVIDIA in the long-run – with concepts being floated for both a full power external Thunderbolt card chassis, and a smaller “graphics dock” which contains a smaller, cooler (but still more powerful than an iGPU) mobile discrete GPU.
[0] http://www.anandtech.com/show/7987/running-an-nvidia-gtx-780...
[1] http://forum.techinferno.com/implementation-guides-apple/427...
This port is going to be very expensive for manufacturers, and if it does everything, I'm going to need a bunch of them. Does anything stop OEMs from making a row of identical-looking ports where only one charges my laptop, only a couple can use the fastest cable, and all sorts of other potential shenanigans?
We'll just have to get used to ports having a load of obscure symbols next to them.
And yes the lightning bolt will charge things without powering your laptop on. Very handy.
The alt-mode capability cannot traverse a hub.
More likely you'd have dual cables running to dual monitors. Each monitor could provide internal USB 3.1 hubs and associated devices plus feed 100W back up the line if needed.
What mechanism can stop the £0.99 charge-only cables that people will expect to plug their portable hard drives in through?
(Conveniently this also saves Apple on the new MBPs; they can now have both USB A and USB C ports without it being weird.)
I’m expecting them to kill off USB type A ports entirely on future laptops. We’ll see what happens to Magsafe, HDMI, and SD card slots. I wouldn’t be too surprised to see a Macbook Pro with 4–6 USB type C (Thunderbolt 3) ports, a headphone jack, and nothing else.
> Imagine screens, laptops, TVs, phones, mp3 players, docks, hard drives and toasters all using the same plug (well, maybe not the last one)
I don’t think toasters are a good fit, but USB Type C with its 100W DC could be great for powering other small appliances (LED desk lamps, small fans, printers, scanners, video cameras, small TVs, routers, modems, electric toothbrushes, ...), if USB Type C starts showing up in outlets in homes/cars/airplanes/airports/classrooms/...
I agree on SD cards though, just because most SD card slots can recess the card into the frame of the machine for portability.
For example, when can/can't you plug your USB-C webcam into your USB-C monitor and expect your computer to detect it? If I plug my USB-C toaster into a USB-C computer, will it power up? Does a USB-C hub (like the ones we use with USB-A when you don't have enough ports) necessarily support every possible protocol, or can you only plug a subset of computer peripherals into it? Will all USB-C cables have the amperage ratings to safely power my laptop from the wall, or will there have to be different cable types for different uses?
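On the cable question at least, the spec has an answer: cables rated above 3A must contain an electronic marker chip the ports can interrogate, and unmarked cables are treated as 3A parts. A toy model of the source-side decision that falls out of that (the 3A/5A tiers are from the Type-C spec; the classes are invented for illustration):

    # Toy model of how a Type-C source limits its offer to what the
    # cable can carry. Unmarked cables are capped at 3A; only e-marked
    # cables may carry 5A (20V * 5A = the headline 100W).
    from dataclasses import dataclass

    @dataclass
    class Cable:
        has_emarker: bool
        rated_amps: float = 3.0  # unmarked cables default to the 3A cap

    @dataclass
    class Source:
        max_volts: float = 20.0
        max_amps: float = 5.0

    def max_offer_watts(source: Source, cable: Cable) -> float:
        amps = cable.rated_amps if cable.has_emarker else 3.0
        return source.max_volts * min(amps, source.max_amps)

    print(max_offer_watts(Source(), Cable(has_emarker=True, rated_amps=5.0)))  # 100.0
    print(max_offer_watts(Source(), Cable(has_emarker=False)))                 # 60.0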
What do you get when you plug your USB phone into your USB computer? What about the tablet? Why can I read files from my phone via USB, but my phone can't read files from my USB HDD?
Why are my Bluetooth headphones mono with the computer but stereo with the phone? Why can't my NFC YubiKey work with my NFC phone?
Right now, people mostly ignore this because there is generally broad support across devices, provided by a manageable number of generic protocols. So you mostly only have to care about the few cases where stuff doesn't work, rather than thinking about what works.
There is no reason to be overly pessimistic about USB-C, it is an incremental improvement over technologies already suffering from the same issue.
Well, the connector will have a universal shape. But the cables won't work universally (you need special ones for high-speed Thunderbolt), and two things sharing a connector won't mean they'll work together due to different protocols (USB 3, DisplayPort, Thunderbolt, others).
I have no doubts they will continue to have VGA ports until 2025...
I actually use the VGA port on my ThinkPad somewhat regularly—it's still the most common display connector in many offices. However, I do carry dongles for HDMI and DVI (adapting from the built-in DisplayPort) just in case. I'm looking forward to having USB-C replace all of them.
Apple is clearly pushing heavily behind the scenes in both the USB-IF and with Intel/Thunderbolt to drive USB Type C ports. Reportedly their engineers did much of the work designing the port. Looking at their one-port Macbook, they’re obviously heavily invested in USB Type C’s success. I assume every future Mac is going to be mostly USB Type C ports. Can you clarify what you mean with your comment?
I’m guessing iPhones will stick with Lightning on one end for at least the near future though.
A couple of months ago on HN I asked about PCs getting the new USB-C connector, and no one seemed in a hurry. It won't be a standard until PCs ship with at least one port.
(Aside: but I wonder if magnetic connectors are even appropriate for a cable that can transmit data as well as power. With MagSafe, if the cable was disconnected, the laptop battery would take over and you'd be fine, but with the new MacBook you'd potentially have data loss if you had a hard drive daisy-chained in. So perhaps that's a factor in why they got rid of magnetic connectors?)
I don't think DMA attacks have been fully solved yet via software, or am I not up to date here? I guess you could blacklist the driver.
And yeah, we really need device firewalls that isolate everything via IOMMU and don't allow drivers to do any memory mappings until the user has confirmed the device.
Do you also think people who use screenlocks are tinfoil hatters?
I still wish the industry would have standardized on the 2.5mm jack. It can be plugged in any direction. You could pull a cord out of a mess of cables and it would not snag as there is nothing to snag on.
The Type-C port is interesting in that it provides both USB pins and a set of pins that can be used for a number of functions (DisplayPort being its out-of-the-gate use).
If this leaves the USB pins alone and makes use of the supplemental pins for Thunderbolt, there will be no port confusion, especially as Thunderbolt is already set up to carry DisplayPort data anyway.
Then it just comes down to the chipset to negotiate the right protocols.
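Roughly how the pin budget works out, as I understand it: alternate modes may reassign the four SuperSpeed pairs and the SBU sideband pins, but the legacy USB 2.0 D+/D- pair is never touched, which is what keeps a basic USB fallback available. A sketch of that mapping (pin groupings paraphrased from the Type-C alternate-mode design, for illustration only):

    # Which high-speed Type-C pin groups each mode claims. The USB 2.0
    # D+/D- pair is never reassigned, so basic USB keeps working in
    # every mode listed here.
    PIN_USAGE = {
        "USB 3.1":              ["2x SuperSpeed pairs"],
        "DP alt mode, 2 lanes": ["2x SuperSpeed pairs", "SBU1/SBU2 (AUX)"],
        "DP alt mode, 4 lanes": ["4x SuperSpeed pairs", "SBU1/SBU2 (AUX)"],
        "Thunderbolt 3":        ["4x SuperSpeed pairs"],
    }

    for mode, pins in PIN_USAGE.items():
        print(f"{mode}: {', '.join(pins)} + untouched USB 2.0 D+/D-")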
I got an old one I want to sell before it gets obsolete. I guess the time is now.
Edit: "Thunderbolt 3 integrates USB 3.1, optional 100W power delivery, 5K @ 60Hz display."
Optional power delivery? "Optional" nowadays just means "will be removed before even reaching production".
I think Roritharr is right, this is the beginning of the end for the great era of standardization that USB brought. If there was a port and a cable, you pretty much knew it would work for everything the port should do.
DisplayPort and HDMI port/cable versioning have been a colossal pain for me. GPU spec lists on shopping sites don't even list what version they support, so you have to constantly cross reference things to manufacturer datasheets.
http://i.imgur.com/ffMR5gj.png
What's sad is that we had the same sorts of differentiations in the past, like with DVI-I (digital and analog) and DVI-D (digital only) and Dual-link DVI (for big screens). So what did we do? We called them different things and then actually advertised what it was capable of.
But apparently saying "DisplayPort 1.2" is too complicated now or something? Maybe there are too many different optional features that may or may not be supported by a GPU or the cable or the display, so they just assume none of them will work? I don't even know.
That screenshot isn't an old graphics card either, it's a GTX 970.
Not looking forward to USB turning into the same mess.
EDIT: This isn't all new; I recall seeing some laptops with strange things like yellow USB ports that were still powered for charging while the computer was sleeping. But that's a minor feature that most people wouldn't even notice. Major capabilities missing from some ports is different, and can get more confusing than "the blue ports are faster but all of them will work".
USB is already "that mess" depending on what it is you need the port to do. Let me know how that USB-powered device that needs 90W does when you plug it into a port that's only 1.0 capable.
Your point is taken, but we also have to consider whether that would even be possible. 100W is not a power level that a laptop is going to be able to output, so it clearly can't be mandatory. It's good to have a standardised option, however.
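For context, 100W is the ceiling of USB PD, not the baseline: power is negotiated in fixed voltage/current tiers, and a source only advertises the tiers it can actually supply. A quick sketch of those tiers (values from the PD power rules; the helper is simplified arithmetic, not the full rule set):

    # USB Power Delivery fixed-voltage tiers. A thin laptop can be
    # compliant offering only the lower tiers, while a wall charger or
    # big display can offer all five.
    PD_TIERS = [
        (5.0, 3.0),   # 15W
        (9.0, 3.0),   # 27W
        (15.0, 3.0),  # 45W
        (20.0, 3.0),  # 60W
        (20.0, 5.0),  # 100W (needs a 5A e-marked cable)
    ]

    def offers(source_max_watts):
        """Tiers a source of the given wattage could advertise (simplified)."""
        return [(v, a) for v, a in PD_TIERS if v * a <= source_max_watts]

    print(offers(60))   # up through 20V/3A
    print(offers(100))  # all five tiers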
And I guess the sticky part will be all those 12" MacBooks with Type-C connectors that wouldn't work with such a display...
Anyone think they'll pre-announce it at WWDC (like they did the Mac Pro)?
At least until we have some sort of IOMMU-based hotplug device firewalls in our operating systems.