05:48VanackSabbadium: good morning karolherbst imirkin
05:48VanackSabbadium: i tried what you suggested (nouveau.config=NvGrUseFW=1) but it gives me this error:
05:48VanackSabbadium: [ 2.759047] nouveau 0000:05:00.0: Direct firmware load for nouveau/nvcf_fuc409c failed with error -2 [ 2.759061] nouveau 0000:05:00.0: Direct firmware load for nouveau/fuc409c failed with error -2 [ 2.759063] nouveau 0000:05:00.0: gr: failed to load fuc409c
05:49VanackSabbadium: i "downloaded" (just because it bothers me to extract the firmware from my card) the right firmware and put it into /lib/firmware/nouveau, but i still have that issue
05:50VanackSabbadium: i also tried nouveau.config=NvMSI=0 but still freezes
10:00karolherbst: VanackSabbadium: yeah, you need to extract the firmware first
10:01karolherbst: VanackSabbadium: so you mean, after placing the firmware the loading errors were gone but you still saw the issue, right?
10:06VanackSabbadium: nope, i see the loading error AND see the issue
10:13karolherbst: yeah.. then you are still using the nouveau firmware
10:20VanackSabbadium: i think so
10:37VanackSabbadium: that's my /lib/firmware/nouveau -> https://cloud.disroot.org/s/mfo9bnwgFm6PbzW
10:38VanackSabbadium: i put all these firmware inside the folder -> https://slackware.pkgs.org/current/slackonly-x86_64/nvidia-firmware-325.15-noarch-2_slonly.txz.html
10:39VanackSabbadium: in this thread imirkin_ helped this girl Rebecca who had my same issue, but i don't know if she solved it -> https://bugs.freedesktop.org/show_bug.cgi?id=71659
10:41VanackSabbadium: also, if needed, here's my /proc/interrupts file -> https://cloud.disroot.org/s/ncy9yJSC3w2zMbH
10:50VanackSabbadium: how do i check if i'm actually using nvidia firmware or nouveau firmware?
10:51VanackSabbadium: here's my "dmesg | grep nouveau" output -> https://cloud.disroot.org/s/BJzJpywyfT94D2A
10:58VanackSabbadium: gotta go to work, you have something to work on, i'll connect later this afternoon, have a nice day :)
16:28VanackSabbadium: hi there again karolherbst
16:31karolherbst: VanackSabbadium: I think you might have to regenerate initramfs for the firmware to be available
16:33VanackSabbadium: oh.... sudo update-initramfs?
16:36karolherbst: VanackSabbadium: probably
16:39VanackSabbadium: i'll try
17:14VanackSabbadium: do you think that it's reliable to have a downloaded nvidia firmware instead of extracting it on my own?
17:15karolherbst: VanackSabbadium: you mean from the slackware package?
17:15karolherbst: should be fine
17:15karolherbst: they just run the same script probably
17:15VanackSabbadium: yeah, maybe they run ilia's script
17:15karolherbst: well.. there are no other scripts :p
17:15VanackSabbadium: i see the version of nvidia drivers is the same
17:15VanackSabbadium: well, you can also do the "normal" procedure
17:15karolherbst: you mean the mmiotrace stuff?
17:15VanackSabbadium: but i understand that the script is much easier
17:16karolherbst: well.. "normal"...
17:16karolherbst: but yeah
17:16VanackSabbadium: fine, the one RECOMMENDED by nouveau main site XD
17:17karolherbst: I wouldn't recommend it :p I think we should rewrite the wiki page :d
17:18VanackSabbadium: yeah, you keep giving advice for downloading and installing nouveau on Ubuntu Oneiric XD
17:18VanackSabbadium: a century ago xd
17:40macc24: how well is 9600gt supported?
17:48RSpliet: macc24: What are you trying to do with it?
17:48macc24: RSpliet: draw triangles
17:49RSpliet: Oh, nouveau can make the 9600gt do that, last time I checked.
17:50RSpliet: Not an awful lot of them, but a fair amount
17:50macc24: what nouveau can't do on 9600gt?
17:50RSpliet: Draw lots of triangles (it's stuck at boot clocks I believe)
17:51HdkR: It's also stuck in GL 3.3 land
17:51RSpliet: It also can't do Cuda or OpenCL, if the GPU was able to do that in the first place (not sure tbh, it's a long time ago)
17:52HdkR: My first Macbook had a 9xxx series in it which is where I first played with OpenCL :P
17:53HdkR: That fun OpenCL 1.0
17:53macc24: HdkR: it's for getting display to test x3x coreboot
17:53macc24: and for some light gaming
17:54RSpliet: HdkR: I recall there being something fishy about that Macbook 9xxx thing. Like they shipped with an NV9x _and_ an NVAC or something like that
17:54HdkR: Yea, it was...9400m and 9600m?
17:54HdkR: Nightmare machine
17:55macc24: hehe, mcp chipset
17:56HdkR: 9400m was worthless and the 9600m was a battery drain.
17:57macc24: that geforce was faster than gpu that i played games on
17:58RSpliet: The 9400m was useful for h.264 video decoding and rendering your desktop, not much more.
17:59karolherbst: RSpliet: was this with the "I need to logout" system or was it more advanced later on?
17:59macc24: RSpliet: i was playing kerbal space program on worse gpu
17:59HdkR: Yea, you needed to logout to switch GPUs
17:59RSpliet: karolherbst: I never owned it. I have an NVIDIA ION machine, same GPU.
17:59macc24: i went to moon on potato gpu
18:00karolherbst: HdkR: yeah well :D
18:00macc24: HdkR: better than reboot :)
18:00karolherbst: I am convinced we would have laptops these days where you'd need to reboot to switch the GPU if apple hadn't pushed for making it work just like that later on :p
18:00VanackSabbadium: potato gpu? potato pc? HERE I AM
18:01HdkR: Potentially yes
18:01macc24: karolherbst: i'm sure that some people would have been too impatient to wait for reboot and made it work
18:01macc24: VanackSabbadium: yes, i was playing games on i3 thinkpad x201
18:03VanackSabbadium: i tried to play Cities Skylines with 4 gb ram, I'M A HERO!
18:03RSpliet: karolherbst: you sure it was Apple pushing? I get the impression it's mainly been Intel pushing IGPs to both drive down slim-and-light laptop costs as well as driving up their revenue
18:03karolherbst: macc24: I don't think so :p
18:03macc24: VanackSabbadium: i was playing minecraft with 100 mods on 4gb of ram
18:03VanackSabbadium: (well, i failed miserably, to be honest)
18:03karolherbst: RSpliet: apple was the first vendor having proper laptops supporting it
18:03karolherbst: maybe there would be others
18:03macc24: VanackSabbadium: and i was playing cities skylines on 4gb of ram
18:03karolherbst: but they would have implemented it in a sucky way
18:04VanackSabbadium: but my SWAP IS BIGGER THAN YOURS XD
18:04macc24: with barebones X server
18:04macc24: and almost no services in background
18:04RSpliet: karolherbst: yeah, I still wonder why on earth Apple wanted that MCP chipset in their machine. Their incentives were different I guess :-)
18:04VanackSabbadium: hell, with a very lightweight system it did not load any fucking map :(
18:04karolherbst: RSpliet: think about it, even on linux today it sucks, because nobody cares enough to make it not suck
18:04karolherbst: or well
18:04karolherbst: nobody cares enough to spend money on it
18:05macc24: RSpliet: on gm45, pm45 and ironlake northbridges intel didn't follow dram specs and you couldn't use 8gb ram sticks
18:05karolherbst: RSpliet: battery lifetime :p
18:05karolherbst: pre HDA intel GPUs just sucked
18:05macc24: nvidia did do that and they could use 8gb dimms on c2d laptops
18:05karolherbst: GMA945 was just pure shit
18:05karolherbst: not because the hw sucked, but the windows driver sucked so much
18:06RSpliet: karolherbst: well "nobody cares", NVIDIA sided with Intel better than AMD, so afaik most "hybrid" laptops have an Intel/NVIDIA combo. And NVIDIA never cared about the Linux desktop quite at the same level as AMD
18:07karolherbst: RSpliet: I blame all vendors equally on hybrid graphics. None of them cares
18:07macc24: RSpliet: thinkpad t400 had radeon gpu
18:08karolherbst: at least if it comes to linux
18:08karolherbst: linux literally has the worst support for hybrid graphics
18:08macc24: karolherbst: is PRIME that bad?
18:08karolherbst: it's not prime
18:08karolherbst: userspace sucks
18:08karolherbst: we have to rewrite all X and all wayland compositors to make it not suck
18:09karolherbst: display offloading
18:09karolherbst: right now if you attach a 4k display to the dedicated GPU
18:09karolherbst: the full desktop gets rendered on the intel one and gets transferred to the dedicated GPU for scanout
18:09macc24: i'm glad to have laptop with only one gpu now
18:09karolherbst: if you play a game on the external display, you use the dedicated gpu, copy the content to the intel one, do compositing there, copy it back
18:10karolherbst: pcie is not fast enough
18:10karolherbst: so it sucks
18:10macc24: i was gaming on that setup too :D
18:10karolherbst: with fullhd it's good enough
18:10karolherbst: not with 4k
18:10karolherbst: not with 5k
18:10karolherbst: especially not if you do 144fps
18:10karolherbst: macc24: that's.. nothing :p
18:10macc24: on pcie2.0x8
18:11karolherbst: that's low "low end spec"
18:11macc24: including opengl load :D
18:11RSpliet: karolherbst: only thing you can blame the vendors for is not getting involved with Wayland design from the get-go to get this right. Well, and NVIDIA for not supporting Wayland properly ofc :')
18:11karolherbst: RSpliet: yeah, "caring" is the proper term here in my world :p
18:11karolherbst: we are like where apple was 15 years ago
18:12macc24: karolherbst: i will do same thing on 1x pcie :DD
18:12karolherbst: just that the hw doesn't have the support for this use case anymore
18:12karolherbst: muxed designs are really the only thing which is "perfectly fine" on linux
18:13macc24: karolherbst: what if there is no compositing?
18:13karolherbst: then you can't render offload
18:13karolherbst: and with dri2 you can't do proper vsync
18:13karolherbst: seriously.. this is all just terrible
18:13RSpliet: karolherbst: well, the regular PRIME use-case works alright doesn't it? It's mostly reverse prime that's borked
18:13karolherbst: RSpliet: no
18:14karolherbst: it works on low spec displays
18:14karolherbst: but that's it
18:14karolherbst: it's fine as long as you don't attach any displays
18:14karolherbst: but.. uff
18:14RSpliet: that kind of _is_ the regular PRIME use-case
18:14karolherbst: or if displays are all connected to the igpu
18:15RSpliet: render on the discrete GPU, compose on the IGP, display on the laptop screen
18:15karolherbst: RSpliet: yeah.. but well.. if you want to play games, you normally prefer doing it on proper screens :p
18:15karolherbst: but even that sucks... but for different reasons
18:15karolherbst: laptops come with 4k displays these days
18:15karolherbst: and because we don't increase the pcie link speed, we suffer a lot for fullscreen applications
18:15karolherbst: and amdgpu has broken pcie relinking...
18:16macc24: karolherbst: why does regular prime suck?
18:16karolherbst: which... is only broken for egpu though
18:16karolherbst: macc24: 4k displays.. but it also sucks a bit on windows.. so I wouldn't blame linux here
18:17karolherbst: it's "fine"
18:17karolherbst: it's just limiting and gives you like 10% of the actual possibilities
17:17HdkR: Rendering in 4k on a thunderbolt eGPU and pushing it back to the 4k internal display means the framerate runs at ~48FPS even in Windows. Have fun with that :P
18:18karolherbst: but.. at least we have individuals caring enough to fix it
18:18karolherbst: so.. maybe next year it will be fixed
18:18karolherbst: HdkR: yeah.. but without external displays it's "fine"
18:18karolherbst: just.. uff
18:18karolherbst: buying 2k in hardware to game on the internal laptop display
18:18RSpliet: karolherbst: fair. I'm not much of a gamer myself, I just enjoy staring at Unigine benchmarks for entire train journeys or flights.
18:18karolherbst: although you are probably closer to 3
18:19karolherbst: and 500 for a proper display
18:19linkmauve: karolherbst, why does that suck? That’s my current setup, and aside from the lack of physical connectors on the motherboard (but that’s easily fixable with Thunderbolt adapters providing DP ports) that works fine.
18:19karolherbst: linkmauve: it's fine on low spec systems
18:19linkmauve: I don’t think it’s low-spec. ^^
18:19karolherbst: linkmauve: it's low spec :p
18:19linkmauve: Is it?
18:19karolherbst: or do you have a 4k external display?
18:19karolherbst: with 144hz?
18:19linkmauve: Ah, you mean in resolution.
18:20karolherbst: I wasn't limiting it to CPU/GPU :p
18:20linkmauve: I haven’t upgraded any screen yet, the two I have work fine.
18:20karolherbst: yeah.. full hd is fine
18:20karolherbst: 4k is where the fun begins
18:20karolherbst: and 4k is what I consider "middle range"
18:20HdkR: Anything that gets close to saturating DP 1.4 is where madness lies
18:20karolherbst: 4k@144 is mid-high :p
18:20linkmauve: Do you mean the UHD630 wouldn’t be able to drive it?
18:20karolherbst: linkmauve: not enough pcie bandwidth for reverse prime and prime offloading
18:21linkmauve: What is reverse prime again?
18:21karolherbst: external display on discrete gpu
18:21macc24: karolherbst: is pcie2.0x1 enough to do 1600x1200 prime?
18:21karolherbst: macc24: ufff...
18:21karolherbst: could be close
18:21linkmauve: Ah, that’s exactly what I’m *not* doing, since I want it to power off when I don’t use it. :p
18:21macc24: karolherbst: at 85hz :p
18:21HdkR: 500MB/s with some overheads...That's quite close
18:22linkmauve: Although, with a broken one and a Nouveau one, the UHD630 is still the most usable GPU I have out of the three. :(
18:22karolherbst: 1600*1200*85*4 / 1024 / 1024 = 622 MB/s?
18:22karolherbst: or is there a mistake in my calculation?
18:22linkmauve: I have to try putting the Radeon one in the oven, apparently that’s likely to fix it.
18:22macc24: 4 bytes per pixel?
18:22karolherbst: well.. maybe you could use 3, but uff
18:23karolherbst: not sure any hw does 24 bit scanout for real
18:23karolherbst: but they might
18:23HdkR: Probably need to drop down to 60Hz and it is still very close
18:23karolherbst: HdkR: do you know if one has to *3 or *4 per pixel for scanout?
18:23macc24: HdkR: i don't want a seizure from blinking :p
18:24karolherbst: macc24: anyway.. it's close enough for getting into trouble :p
18:24RSpliet: There's also other traffic going over that bus.
18:24karolherbst: yeah.. and encoding
18:24karolherbst: it just sucks
18:25HdkR: karolherbst: I don't know the alignment requirements for scanout formats :)
18:25macc24: ugh, i guess ill use internal display, 1600x900@60hz
18:25karolherbst: HdkR: me neither
18:25karolherbst: but I guess the gpu copies over a 32 bit buffer
18:25karolherbst: and the gpu might convert it for scanout
18:25HdkR: Both 3bytes and 4bytes will still be close over such a small connection
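The back-of-envelope math being debated above can be written out as a short Python sketch (the ~500 MB/s usable figure for a PCIe 2.0 x1 link is the rough number from the conversation, not a measurement):

```python
# Rough scanout-bandwidth estimate for PRIME display offload at
# 1600x1200@85Hz, comparing 32-bit and packed 24-bit framebuffers.
# Assumption (from the chat, not measured): a PCIe 2.0 x1 link
# delivers roughly 500 MB/s of usable payload after overheads.

def scanout_bandwidth_mib(width, height, refresh_hz, bytes_per_pixel):
    """Bytes that must cross the bus per second, in MiB/s."""
    return width * height * refresh_hz * bytes_per_pixel / (1024 * 1024)

PCIE2_X1_USABLE_MIB = 500  # assumed effective throughput

for bpp in (4, 3):
    need = scanout_bandwidth_mib(1600, 1200, 85, bpp)
    verdict = "fits" if need <= PCIE2_X1_USABLE_MIB else "does not fit"
    print(f"{bpp} B/px: {need:.0f} MiB/s -> {verdict}")
```

With 4 bytes per pixel this lands just above the assumed link budget and with 3 bytes just below it, matching HdkR's "both will still be close" remark.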
18:26RSpliet: does it transfer the whole buffer before scan out (e.g. does it introduce a 1-frame delay)?
18:26karolherbst: RSpliet: prime has some syncing stuff going on
18:26karolherbst: so the vsync can get delayed until the other gpu is done rendering
18:26karolherbst: and you can request a buffer both gpus can render to/scanout from
18:26karolherbst: or so
18:26karolherbst: so I think you don't have to have a delay
18:27karolherbst: at least when using DRI3 offloading
18:27RSpliet: karolherbst: sure, but if you occupy your bus 100% that doesn't work.
18:27karolherbst: ohh, your desktop fps drops :p
18:28VanackSabbadium: i'm learning arabic and it's more clear to me
18:28karolherbst: it was better back when we didn't block on the kernel side
18:28karolherbst: so you got tearing, but immediate scanout
18:28karolherbst: now we block and delay scanout
18:29karolherbst: uhm.. compositing
18:29karolherbst: not scanout
18:29RSpliet: And if you try to not introduce a 1-frame latency (in a sort of triple-buffering thing), you have to take vblank into account for your scanout bandwidth. E.g. a frame must be transferred in 16ms rather than 16.6666(repeating)ms :-)
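RSpliet's point about the tighter transfer window is itself a small calculation (an illustrative sketch; the 16 ms window, 60 Hz refresh, and a 1600x1200 frame at 4 bytes per pixel are all figures taken from the discussion, not a derived timing budget):

```python
# How much the "no added frame of latency" requirement inflates the
# needed transfer bandwidth: the whole frame must arrive in ~16 ms
# instead of the full 1/60 s ≈ 16.67 ms frame period.

def required_mib_per_s(width, height, bytes_per_pixel, window_s):
    """Bandwidth needed to move one frame within window_s, in MiB/s."""
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes / window_s / (1024 * 1024)

full_period = 1 / 60    # ≈ 16.67 ms
tight_window = 0.016    # 16 ms, leaving headroom before scanout

relaxed = required_mib_per_s(1600, 1200, 4, full_period)
tight = required_mib_per_s(1600, 1200, 4, tight_window)
print(f"full period: {relaxed:.0f} MiB/s, "
      f"16 ms window: {tight:.0f} MiB/s "
      f"(+{100 * (tight / relaxed - 1):.1f}%)")
```

A few percent of extra headroom either way, which matters exactly when the link is already near saturation as in the 1600x1200@85 case above.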
18:29HdkR: Just ensure that the cursor plane keeps moving at 60hz and no one will be able to tell that the desktop itself is running at less than 60 ;)
18:29karolherbst: RSpliet: that's what already happens with dri3
18:29karolherbst: the igpu waits on a fence until the dgpu is done
18:30RSpliet: ... I hope that for desktops we can do damage rendering?
18:30karolherbst: and just continues
18:30karolherbst: HdkR: ufff..
18:30karolherbst: HdkR: that's like broken with wayland anyway
18:30karolherbst: at least with plasma
18:30karolherbst: no clue about gnome
18:30karolherbst: but probably there as well
18:31karolherbst: so the cursor plane's fps can also drop
18:31karolherbst: but I didn't actually verify this behaviour yet
18:31karolherbst: just noticed that it can suck
18:31karolherbst: yeah.. it's terrible
18:31HdkR: I'll have to test that in Sway when I try it again
18:32karolherbst: I might be wrong though and it could have had a different reason
18:32karolherbst: or misremembering
18:32HdkR: Also to see how well Sway handles tiled monitors
18:32karolherbst: but I think you can stress the intel gpu to suck at the cursor stuff
18:32karolherbst: .. mh wait
18:32karolherbst: easy way to test it
18:35karolherbst: with pixmark_piano I see small stutters on the cursor