00:00CheetahPixie: i wonder why that happens
00:01CheetahPixie: i mean
00:01CheetahPixie: i have a big ass suspicion as to why exactly
00:01karolherbst: I think there is some issue with the offloading code in general
00:01CheetahPixie: this thing is running off an x1 bus
00:01CheetahPixie: that works too
00:01karolherbst: why is that?
00:02CheetahPixie: it's in an external pcie bridge
00:02CheetahPixie: i did it purely for fun
00:02karolherbst: I see
00:02CheetahPixie: and now that i have *this* working
00:02CheetahPixie: i learnt something
00:02karolherbst: well yeah.. with x1 you won't get much out of it
00:02karolherbst: and I expect there are some serious perf issues
00:02karolherbst: or I know there are
00:02karolherbst: I just don't know where exactly
00:02CheetahPixie: and thanks to you folk, i at least have a vga display adapter, and a testbench for other cards
00:03CheetahPixie: i know X will absolutely go batshit if I so much as unplug/hotplug the USB3 cable that connects the thing to the card in the slot
00:04CheetahPixie: at least youtube works with video decode
00:06RSpliet: Ok, next challenge: Get one of those high end NVIDIA RTX 2080Ti, plug in a USB-C external PCIe bridge, and plug an NVIDIA card into that bridge.
00:06RSpliet: Yo dawg
00:07CheetahPixie: give me some cash and that's challenge fucken' accepted
00:07RSpliet: Hahahaha, oh if only I had that magical money thing!
00:07CheetahPixie: about that
00:08CheetahPixie: "what do you mean you can't sli with a turing?"
00:08CheetahPixie: now if only oldschool SLI ever worked like that...
00:08RSpliet: "Ever since I had that triple-GPU set-up, my heating bill has gone to €0, it's amazing!"
00:09CheetahPixie: hey look, that's why I have an 8150
00:23HdkR: RSpliet: You know that eGPU bridge over the virtuallink connector wouldn't work :)
00:24CheetahPixie: why are you so sure?
00:24HdkR: Because USB != Thunderbolt even running over type-c
00:25HdkR: Although a thunderbolt controller running on a next-gen GPU would be pretty cool
00:26HdkR: The Asmedia controller on the 2080ti is definitely not Thunderbolt
00:42RSpliet: HdkR: I never looked into that USB-C port much... but I already hate it now. If it can be incompatible, it shouldn't be able to plug in IMHO.
00:43HdkR: Ah, you hate the physical connection versus protocol bit? It's definitely confusing for end users
00:45HdkR: That's already confusing with USB features, time to throw TB in to the mess
00:45RSpliet: HdkR: yep. And in many ways I think like an end user, if not just because it's a sanity check for wacky ideas ;-)
00:46CheetahPixie: USB C is an unfiltered mess
00:47CheetahPixie: I mean, someone proposed something like mode flags be printed on the cable or packaging to indicate what it can do, but have those be in the form of numbers
00:47CheetahPixie: so if you need a cable that can do everything, you have 1023 on the packaging
00:47CheetahPixie: and then you can instantly call someone out if you query this cable and figure out it comes out to a different number altogether
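The flag-number scheme floated above can be sketched as a plain bitmask. The ten capability names below are hypothetical, chosen only to show how a ten-bit "does everything" cable comes out to 2^10 - 1 = 1023, and how querying a cable and comparing against the printed number would catch a mislabel:

```python
# Hypothetical USB-C cable capability flags, one bit each. These ten names
# are illustrative only; they do not come from any USB specification.
CAPABILITIES = [
    "usb2",         # bit 0
    "usb3_gen1",    # bit 1
    "usb3_gen2",    # bit 2
    "usb4",         # bit 3
    "alt_mode_dp",  # bit 4
    "alt_mode_tb",  # bit 5
    "pd_60w",       # bit 6
    "pd_100w",      # bit 7
    "emarker",      # bit 8
    "active",       # bit 9
]

def encode(caps):
    """Fold a set of capability names into the single printed number."""
    value = 0
    for name in caps:
        value |= 1 << CAPABILITIES.index(name)
    return value

def decode(value):
    """Recover the capability set from a cable's advertised number."""
    return {name for i, name in enumerate(CAPABILITIES) if value & (1 << i)}

# A cable that can do everything advertises all ten bits set: 1023.
assert encode(CAPABILITIES) == 1023
# Query the cable, compare to the packaging; any mismatch is a mislabel.
assert decode(encode(["usb2", "active"])) == {"usb2", "active"}
```

With this encoding, "instantly calling someone out" is just `decode(queried) != decode(printed)`.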
00:48HdkR: RSpliet: Oh also, active Thunderbolt 3 cables don't work as USB cables. So you can mess up there
00:49HdkR: Found that out when trying to get 3 meter type-c cables for some things
00:58CheetahPixie: alright, so
00:58RSpliet: I'm going to go and blame Intel
00:59CheetahPixie: far as I can see, my main screens will run as fast as they can when they're doing stuff by themselves
00:59CheetahPixie: but if the monitor on nouveau slows down in anything, *everything* then lags
00:59CheetahPixie: as if it's syncing or something
01:02RSpliet: I don't think X.org was designed with multi-monitor, multi-GPU set-ups featuring one severely bandwidth-starved GPU in mind 16 years ago...
01:02CheetahPixie: my other card should still run fine despite this
01:03RSpliet: In an ideal world yes. But this is the kind of corner case that nobody bothered to optimise for
01:03RSpliet: Wayland might be slightly better at this, being designed around the era of Optimus
01:04RSpliet: But that's just speculation
01:04CheetahPixie: it's also continuing to be designed
01:07CheetahPixie: although when I kill the compositor, performance becomes not bad
01:08imirkin: CheetahPixie: multi-gpu in X is a bit of a hack. the screen-on-remote-gpu bit is definitely a huge hack.
01:09imirkin: a useful one, to be sure, but ... far from ideal
01:09imirkin: remote displays must tear, the way they're designed
01:09imirkin: no amount of compositor will fix that
01:09imirkin: with wayland, that's not the case
01:10imirkin: but with X, you have one big happy giant root pixmap, which has to be distributed across GPUs
01:16CheetahPixie: and that causes desyncs and so?
01:16CheetahPixie: unless we had access to some fancy mosaic
01:16imirkin: i dunno what a desync is
01:17imirkin: most GPUs don't run displays in sync even when they're off the same board
01:17imirkin: framelocking was an advanced feature of Quadro GPUs, but i haven't seen it in recent times.
01:21HdkR: imirkin: Today's Quadros still have the framelocking
01:23imirkin: ah, didn't know that
01:23imirkin: you have to do something to enable it, presumably
01:23imirkin: and no external time sync anymore?
01:23HdkR: You still need the external sync card
01:24HdkR: Plugs in to a header on to the cards still
01:24HdkR: Quadro Sync II, costs $900
01:25HdkR: "Up to 50 Nodes Up to 200 GPUs" woof
01:25HdkR: Nothing like buying 80 Quadro Sync cards to sync up a video wall
01:25RSpliet: "For high vibration environments, self-latching/locking cable connectors are utilized, which hold all cables securely in place on both the Quadro Sync II card and Quadro GPU."
01:25RSpliet: I really want to know what the use-cases are now
01:33RSpliet: Also: it seems to have an Altera Cyclone IV FPGA on the card rather than an ASIC
01:36imirkin: they probably don't sell a whole ton of those..
01:36imirkin: ASICs ain't cheap
01:36imirkin: modern FPGAs are crazy-fast
01:44HdkR: Low volume FPGAs are cheaper than low volume ASICs yea
01:44RSpliet: Yep. FPGA is £14 on Farnell
01:45imirkin: i haven't used altera since the 10k70, but i assume cyclone iv goes for more than that
01:45RSpliet: That's the exact FPGA
01:45imirkin: wow ok
09:24uis: Any good news?
18:48lanodan: Given the track record of NVidia and nouveau, probably not much :D