USB 3.1 can do 100 Watt power according to specs

>USB 3.1 can do 100 Watt power according to specs
>USB 3.1 can do DisplayPort
>Displayport can do anything that DVI can do, even has more bandwidth with newer standards
>Graphics cards already handle multiple hundreds of Watts easily
>3.1-powered monitors could have included USB-hubs at almost no extra effort

When do we get monitors that are fully USB-powered and graphics cards that do a PSU passthrough, so the monitor is powered off the same cable it uses to connect to the GPU?
Has any company even implied any idea in this direction? The only thing I'm finding is a USB-C connected portable monitor for notebooks.

Also, what would be better, USB-A or USB-C? I would assume A has a more stable seating in the socket.

>not WiDi

Wouldn't the lag be disastrous and the radiation from a 100 Watt wireless power transmission be kinda dangerous and inefficient?

>speed of light
>lag

>EM induction
>dangerous

>speed of light
>no decoding time involved
u wat

I can't imagine the response times being anywhere near the 1ms that's currently standard.

>EM induction
>dangerous
Tell that to the guy with the pacemaker.

>USB 3.1 can do 100 Watt power according to specs
>100 Watt
>5V
>20A
I want to see that connector.

it's 100W at 20V

oh, nevermind then. I expected it to be the usual 5V.

See Wikipedia
>USB Power Delivery revision 2.0 (version 1.2)
>2016-03-25
>20 V, 5 A
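For anyone still confused by the 5V vs 20V thing, it's just P = IV. Quick sanity check in Python (nothing USB-specific, just the arithmetic):

```python
# P = I * V, so the same 100 W needs far less current at 20 V than at 5 V.

def required_current(power_w: float, voltage_v: float) -> float:
    """Current in amps needed to deliver power_w at voltage_v."""
    return power_w / voltage_v

print(required_current(100, 5))   # 20.0 A - the "I want to see that connector" case
print(required_current(100, 20))  # 5.0 A - what USB PD 2.0 actually specifies
```

Quadrupling the voltage cuts the current to a quarter for the same power, which is the whole point of PD negotiating up to 20V.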

It would likely be slightly thicker than a current monitor cable. The monitor power cables tend to be thinner than the video cables, in my experience.

Wait, wouldn't that require a 20V rail from the PSU, which the GPU would need to support?

DC-DC converter

good luck getting 100W without a 4AWG wire.

>the electrical knowledge of a typical Sup Forums user

have fun with your housefire

Yeah, I guess it wouldn't be problematic for a PCI-E cable to give that additional power. At most, all GPUs would need two connectors at all times.

>random Sup Forums user knows more about power delivery than the people who made the standard in the first place
I wonder why none of the USB 3.1 marketing included information about possible house fires.
I'm sure they will soon be sued into oblivion for all of those devices that burn down PCs while staying inside the specified power draw.

see
it's just 5A
20 AWG will do

good lord, take a fucking electrical engineering 101 course or at least watch a youtube video, you guys are absolute fucking morons

>5A

Enlighten us, sitter on horses.

Not him, but the rule of thumb is 0.1 mm² per amp.
It's 5A, so 0.5 mm² will do. And in American Willie Grit that's 20
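That rule of thumb plus the standard AWG diameter formula (d = 0.127 mm × 92^((36−n)/39)) checks out in a few lines of Python:

```python
import math

# Rule of thumb from the thread: ~0.1 mm^2 of copper cross-section per amp.
# AWG diameter formula: d(n) = 0.127 mm * 92^((36 - n) / 39)

def awg_area_mm2(n: int) -> float:
    """Cross-sectional area in mm^2 of AWG gauge n."""
    d = 0.127 * 92 ** ((36 - n) / 39)
    return math.pi / 4 * d ** 2

def thinnest_awg_for(current_a: float, mm2_per_amp: float = 0.1) -> int:
    """Highest AWG number (thinnest wire) satisfying the rule of thumb."""
    needed = current_a * mm2_per_amp
    # Higher AWG number = thinner wire, so scan from thin to thick
    for n in range(40, -1, -1):
        if awg_area_mm2(n) >= needed:
            return n
    raise ValueError("no gauge thick enough")

print(thinnest_awg_for(5))  # 20 -> matches "20 AWG will do"
```

AWG 20 is about 0.52 mm², so 5A fits with a little margin. Nowhere near the 4 AWG someone suggested.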

P=IV

cables don't care about P, they care about I (up until an extremely high V when you can start arcing). If you keep I the same and increase V (so you get more P) the cable doesn't give a shit
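Same point with numbers on it; the 0.07 Ω cable resistance here is just an assumed round figure for a short cable:

```python
# Same current, different voltage: power delivered scales with V,
# but the conductor heating (I^2 * R) stays put.

def power_w(current_a: float, voltage_v: float) -> float:
    return current_a * voltage_v

def conductor_heat_w(current_a: float, resistance_ohm: float) -> float:
    return current_a ** 2 * resistance_ohm

R_CABLE = 0.07  # ohm, assumed round-trip resistance of a short cable

for v in (5, 12, 20):
    p = power_w(5, v)
    heat = conductor_heat_w(5, R_CABLE)
    print(f"{v:>2} V, 5 A -> {p:>3.0f} W delivered, {heat:.2f} W lost in cable")
```

The loss column is identical on every line because the voltage never appears in it; only the current does.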

Light moves really fucking slow

it is physically impossible to beat ~40 ms round trip across the Atlantic (straight line, speed of light in vacuum), and real fiber routes land closer to 60-70 ms, unless we figure out some weird wormhole higher dimension tunnelling
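Back of the envelope, assuming a ~5,570 km NY-London great-circle distance and a fiber refractive index around 1.47:

```python
# Physical floor on transatlantic latency: distance / signal speed.

C_VACUUM = 299_792_458     # m/s, speed of light in vacuum
C_FIBER = C_VACUUM / 1.47  # m/s, light slows down in glass (n ~= 1.47)
DISTANCE_M = 5_570_000     # NY to London great circle, roughly (assumed figure)

one_way_vacuum_ms = DISTANCE_M / C_VACUUM * 1000
one_way_fiber_ms = DISTANCE_M / C_FIBER * 1000

print(f"vacuum, one way: {one_way_vacuum_ms:.1f} ms")  # ~18.6 ms
print(f"fiber, one way:  {one_way_fiber_ms:.1f} ms")   # ~27.3 ms
# Double for round trip, add routing/switching overhead, and real
# transatlantic pings bottom out around 60-70 ms.
```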

>Light moves really fucking slow

>someone whose job relies on travelling faster than light is laughing at someone saying light is slow

>thatsthejoke.jpg

>cables don't care about P
you are fucking retarded
both the cross section AND the insulation determine the ampacity and voltage rating of the cable
the cross section will determine the power dissipation per unit length at a given current and the insulation will determine the maximum temperature it can withstand, as well as how easily that heat will dissipate
the insulation also determines the voltage rating of the cable
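To put numbers on the cross-section part (copper resistivity assumed, insulation and bundling effects ignored):

```python
# Resistive loss per metre of a copper conductor: P/L = I^2 * rho / A.

RHO_COPPER = 1.68e-8  # ohm * m, resistivity of copper at ~20 C

def watts_per_metre(current_a: float, area_mm2: float) -> float:
    """Heat dissipated per metre of a single copper conductor."""
    area_m2 = area_mm2 * 1e-6
    resistance_per_m = RHO_COPPER / area_m2  # ohm/m
    return current_a ** 2 * resistance_per_m

# 5 A through 0.5 mm^2 (the USB PD case discussed above):
print(round(watts_per_metre(5, 0.5), 2))   # ~0.84 W/m - warm, but fine
# The same 100 W at 5 V would mean 20 A through the same wire:
print(round(watts_per_metre(20, 0.5), 2))  # ~13.44 W/m - housefire territory
```

Which is both sides of the argument at once: the dissipation really is set by current and cross-section, and that's exactly why PD runs 20V/5A instead of 5V/20A.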

fucking lel

what do you think P is a function of?

VOLTAGE AND CURRENT YOU FUCKTWAT

Quit trying to lecture an electrical engineer

>the cross section will determine the power dissipation per unit length at a given current and the insulation will determine the maximum temperature it can withstand,
>given current

>insulation also determines the voltage rating of the cable
>up until an extremely high V when you can start arcing

lrn2read you moron

if you increase voltage, you are gonna have to put thicker insulation on the cable, which means the heat dissipation will reduce, therefore reducing your ampacity
you can't just increase voltage without changing anything else, you dimwit

must have been some shit university you attended, bud

holy shit

a typical cable is rated for 600V

the range we are talking about here isn't even remotely close to pushing the insulation limits of a standard cable

anyone with any sense whatsoever can see you're just being a fuckwit moron for the hell of it

This.
Even enameled wire would be fine for 20V,
and with standard insulation you're on the safe side anyway.

why don't they just put a graphics card inside the monitor

AMD ASYNC COMPOOT 27" 1440P GAYMUR 144hz 1ms DELAY NANO FURY RED CURRY RAGE DISPLAY
now available at your local best buy

>USB-A or USB-C? I would assume A has a more stable seating in the socket.
USB-A is GOAT.
Fuck all these "new USBs", USB micro is a piece of shit and now everything has it because "it's so good XD".
All it does is fall out of the connector and interrupt the charging cycle.
IIRC lion batteries only have a limited number of charges, and if you interrupt the charging and start it again it counts as two charging sessions.

I even had a USB micro cable break from normal usage. It was shorting out inside the sleeve and burned the shit out of my finger when I touched it.
USB micro is literally the nvidia of USB cables, it's probably started so many fires while cheap wall adapters took the blame

WiDi sucks soooooo much ass. I used it and the response time is unusable.
>move mouse
>3 seconds later
>see mouse move

I can't imagine how bad the audio sync would be trying to watch a movie: people talking without their lips moving, hearing a huge explosion while the scene is still all calm.

Try to play a game with that, you'll be crashing planes into buildings or just straight getting knifed all day long.

what if we tunnel to the other countries in a straight line?

I only watched dubbed anime so i guess you could say it

>werks4me

>mfw WiDi was literally developed by, tested by, and made for weebs

>IIRC lion batteries only have a limited number of charges, and if you interrupt the charging and start it again it counts as two charging sessions.
lolno

Bandwidth between the GPU and the rest of the computer is higher than the bandwidth between the GPU and the monitor. Nobody wants to run a PCIe x16 cable between their computer and monitor...

Lol wow. Yes, as you charge and discharge the battery, you will change the internal chemistry of the battery, and it'll degrade over time. However that's just a natural process that will leave you with a lower-capacity and eventually useless battery. The number of times you plug it in is irrelevant. Maybe your phone has a shitty counter on it, but that doesn't matter. Battery degradation is a result of charging and discharging cycles, not sessions. Charge it to full and use till empty, that's a cycle. Also heat will degrade the battery more, so when possible avoid high-amperage charging since it will heat the device more.
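The cycle-vs-session distinction in a nutshell, if it helps (toy model; real battery controllers do proper coulomb counting):

```python
# A "cycle" is cumulative charge throughput equal to one full capacity.
# Two 50% top-ups = one cycle, no matter how many times you plug in.

def cycles_used(charge_sessions: list[float]) -> float:
    """charge_sessions: charge amounts as fractions of full capacity."""
    return sum(charge_sessions)

print(cycles_used([0.5, 0.5]))             # 1.0 - one cycle over two sessions
print(round(cycles_used([0.1] * 10), 2))   # 1.0 - ten tiny top-ups, still one cycle
```

So interrupting a charge and restarting it doesn't "count as two": the plug-in count is irrelevant, only the total energy moved through the cell matters.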

>Battery degradation is a result of charging and discharging cycles, not sessions. Charge it to full and use till empty, that's a cycle.
It's an obscure number.
What I read basically said starting the charging process is a cycle.
>lol wow
faggot. You basically repeated what I said.

>100 Watt
UP TO 100 Watt. meaning 75 Watts 80% of the time

Dubs means he's on to something ;)

up to means 75 watts is pushing it and shit's gonna be hot.
more like 40 watts to be comfy

it seems you're also quite slow

they already do that where it's suitable

I genuinely don't understand the damage someone must have to do what they did in that pic, if it's legit.

>20V
LOL

i see no reason why it wouldn't be, i would probably get into a similar situation if i suddenly had enough money to make contact with the world avoidable

I don't get it, personally. Like, I don't mind not having contact with people, but I'd still take out rubbish and shit like that. Not just ignore it for months on end.

is that screen on the left held up by magnets?

It's held up by one of those stiff-bendable cables that the wires for the usb connection are sleeved in.

what makes them bendable?..

That's actually an interesting idea for a Thunderbolt display. Apple should get on that.

i don't think i'd get that bad with toilet stuff, i'd call a plumber over or something
but i can understand how one can become that neglectful

you mean what makes them stiff. Normal cables are bendy because they're made of soft plastic, but soft plastic can't hold something like that in place.
Hard plastic is too rigid for this, so it probably has a soft metal or something in it

but metal is not soft, metal is hard.

Imagine a Surface phone with an Atom SoC running Wangblows 10 Botnet Edition in phone mode
>plug USB-C into monitor
>phone charges up
>monitor gets video, audio, and USB
>displays Windows 10 desktop running off the phone
>mouse and keyboard hooked up to USB hub on monitor
>all of this through a single cable

some metals are softer than others
a thin enough rod of lead for example can be easily bent into any shape, and it will hold that shape as long as there's little force applied

plastic on the other hand, tends to either spring back into its original shape, or crack/snap

my dick is sometimes soft and sometimes hard so there's no reason metal couldn't be both if it wanted to

Post soft dick

Lies.

Aren't they already doing this or rumored to do it?

>muh rigidityfluid identity
fuck off sjw