00:00 Horizon_Brave: imirkin_: so you and nvidia *do* agree on something lol
00:00 gnarface: Horizon_Brave: the state of the market support for other hardware in Linux is not such that you can reasonably expect everyone to switch vendors away from Nvidia just to gain support for Wayland, a window server that has no commercial game support.
00:01 gnarface: Horizon_Brave: in fact, until you can compile wayland on a stock debian install i don't see how they expect anyone to even help them test it
00:01 gnarface: it didn't seem to me like the type of project that was actually even making a reasonable effort to PRETEND they wanted to actually succeed at their stated goals
00:01 gnarface: i've been assuming it's just a boondoggle someone has been tricked into paying salaries for
00:01 phoenixz: So again, I have a lot of hopes put on wayland to fix those issues.. It's been 10 years; IMHO we should be at the point where I can just easily specify the monitors how I want them, and it works. Whether I have 1 card or 5, 1 monitor per card or 5, for all I care.. I know I might be oversimplifying things, but at the same time, multimonitor and 3D effects etc are not top-of-the-line things anymore, it's commonplace and it should just work out of the box..
00:02 airlied: wayland doesn't fix any of that
00:03 airlied: multicard is still messy
00:03 gnarface: phoenixz: guys like you should make their own distros to try to prove this type of thing. along the way you'll realize it's not that easy, but if you can get there without forgetting your current opinion, you might succeed at something rare...
00:04 gnarface: phoenixz: (making a linux distro that actually steals windows users)
00:04 Horizon_Brave: I think Ubuntu is as close to that as humanly possible at this point
00:04 imirkin_: like slackware? i used windows, and then moved to slackware...
00:05 imirkin_: [hm, did i? might have been RH 5 or 6...]
00:05 gnarface: Horizon_Brave: i don't think they're as close as *humanly* possible. in fact i think they serve their distro up with the same "it's barely good enough but so what, nobody else even tried" contempt that nvidia serves their linux drivers up with
00:05 gnarface: Horizon_Brave: but i agree that they're the last (maybe only) example of this in the wild
00:05 gnarface: a distro that regular *casual* users like
00:06 gnarface: it could still be done better. they cut too many corners on QA
00:06 gnarface: no excuse for a binary distro to have a higher failure rate on upgrades than gentoo
00:06 Horizon_Brave: speaking of upgrades...I need to upgrade to stretch...
00:07 Horizon_Brave: curious to see the new packages and support they've mixed in..and finally debian stable gets a kernel in the 4.0 range!
00:08 gnarface: the one place i see it happening is mobile devices, which may adopt wayland much more aggressively, for performance enhancements only relevant to framebuffer drivers
00:09 airlied: gnurou: is that the only fix patch you have? I've applied it to -next as skeggsb is away
00:16 phoenixz: gnarface: I think there are already way way too many distros, spreading the available resources too thin.. If only people would focus on.. I dunno, 5 core distros, with a few around there.. I do think wayland is the future, because it's been 10 years and I still haven't seen basic improvements on X.. Again, absolutely not trying to say that X devs are doing a bad job, just trying to be practical; at some point basic problems should be resolved.. There are so so many problems that still haven't been resolved, I really hope something happens there soon, be it X or wayland :)
00:16 Horizon_Brave: make X great again
00:16 phoenixz: Lol
00:17 Horizon_Brave: Hey..so... I have two questions for tonight... of course dealing with drivers and firmware..my eternal struggle to 'get it'...
00:17 Horizon_Brave: when it comes to the firmware and software driver for a device..like an ethernet card or graphics card... do both have to be written in the same language?
00:17 dboyan: No
00:18 Horizon_Brave: lol okay, easy enough...in general, what language is firmware originally written in before it's compiled into binary?
00:18 Horizon_Brave: Assembly? or..C?
00:19 dboyan: Any language as long as you can compile it into binary. Most probably assembly or c
00:19 gnarface: phoenixz: most of the complaints i hear about xorg these days amount to parsing failures in the config file, which is squarely in the jurisdiction of the drivers themselves. unfortunately switching to Wayland won't obviate the need for drivers.
00:20 gnarface: phoenixz: (and i have no reason to expect Wayland to be capable of somehow magically improving 3rd party vendors' driver quality by simple osmosis)
00:21 Horizon_Brave: gotcha..my last question then is...can the firmware and software driver be compiled to binary together in one blob? Like from the manufacturer, let's use AMD for a change... Can they write a driver that is both the driver and firmware?
00:21 Horizon_Brave: or would that be non-sensical?
00:22 phoenixz: gnarface: oh for sure, we need better drivers too, but guys like imirkin_ won't be able to do too much about that until nvidia and ati start open sourcing stuff.. Well, then again, if they were to do that, imirkin_ would be out of a job lol.. But beyond that, I have a lot of problems with X.. multi monitor support, for example, has been, and still is, horrible at best.. I've been fighting for days now to get a 3 monitor solution up.. on 2 other
00:22 phoenixz: work stations with 2 monitors, well, sometimes it works, sometimes it doesn and you need to reboot.. That is just not something that should be happening @2017, IMHO.. AGain, not blaming anybody, just trying to point out that the state of affairs in linux video still is rather shoddy..
00:22 gnarface: Horizon_Brave: non-sensical? yes, quite. but raspberry pi's (and probably many other ARM devices) basically work that way
00:22 Horizon_Brave: I just fail to see the point in having two needed parts be completely separate... nouveau for example... if you guys had and knew the firmware source code for some of nvidia's cards...would you just bundle it all together?
00:23 gnarface: in the nvidia firmware situation, the ONLY reason they do that is for obfuscation
00:23 airlied: phoenixz: if you plug multiple monitors into one gpu, it all works fine
00:24 gnarface: Horizon_Brave: you're probably having trouble wrapping your head around it because you're looking for some technical justification other than greed and malice
00:24 airlied: multi-gpu isn't trivial multi monitor
00:24 dboyan: Horizon_Brave: I think for most blob drivers, firmware is compiled into the binary as a data array of some kind. But it's considered not a clean way in the open-source world imho, because of potential licensing issues.
00:24 phoenixz: airlied: in which case it would be nice to have one card with 3(+) HDMI outputs, but they're always some weird combo of HDMI, DVI and VGA..
00:25 phoenixz: airlied: but even if there are multiple GPU's, should it still be such a problem? I know, I'm not near a driver dev, but what would be the problem to support multiple GPU's?
00:25 airlied: phoenixz: just get one with one hdmi and two displayport and use adapters :-)
00:25 nyef: phoenixz: You can get a four-DPort card if you go looking.
00:25 gnarface: phoenixz: oh oh! i know the answer to this one. i just saw an asus gtx 1060 with like 8 HDMI ports and 1 DVI port. i think the search buzzword bingo you need is "VR Ready"
00:25 airlied: phoenixz: someone who cares about the problem
00:26 airlied: phoenixz: supporting multiple gpus with a desktop on each gpu is one problem, but that isn't what most people want
00:26 airlied: they want one desktop spanning the gpus
00:26 airlied: that's where it gets complicated, and the number of developers with time to expend on the niche drops to 0
00:27 phoenixz: airlied: That might work, but it's still a "patch and mess around and hey it works" thing.. Why can't we be at the point where stuff like that simply works? Linux can run on machines with 10,000 CPUs, but a second GPU makes a mess of things?
00:27 phoenixz: nyef: yeah, but now I need some specialized card.. Why can't I simply use off-the-shelf hardware?
00:27 airlied: phoenixz: linux isn't magic, someone has to do the work
00:27 airlied: nobody has done the work, since nobody cares enough to do the work
00:27 airlied: since the work is for a niche case
00:27 phoenixz: gnarface: same what I wrote with nyef, I doubt that card is what I'd call "affordable" :)
00:28 airlied: and the niche isn't paying anyone enough to justify doing it
00:28 airlied: nvidia binary drivers have actually the best support in that area
00:28 airlied: but even there it's messy
00:28 gnarface: phoenixz: i dunno your scale of economies, but it was on sale for ~300USD
00:28 phoenixz: airlied: I get that.. If I had a million lying around, we'd have multi GPU support by tomorrow :D
00:28 nyef: http://www.ebay.com/itm/Nvidia-Quadro-NVS450-512mb-PCI-e-DisplayPort-Video-Graphics-Card-4-monitor-/292045712519?hash=item43ff47ec87:g:yh8AAOSwSlBYu3uw as an early hit on eBay.
00:28 gnarface: phoenixz: (and it was a 6GB model)
00:29 gnarface: phoenixz: a good deal, really
00:29 nyef: Not the cheapest NVS450 on eBay, either.
00:29 airlied: I wrote most of the code that allows multi-gpu laptops to work at all, and I had nothing left to deal with multi-gpu generically
00:30 phoenixz: airlied: but, for example, KDE.. I've seen the same software pieces being developed, 6 months later scrapped, and redeveloped, and that times 10 or something.. I feel (and this is just my poor feelies) that this happens in too many places in Linux, coupled with so many distros reinventing the wheel over and over.. I think things could be a lot more efficient
00:31 phoenixz: airlied: Also, it might be a niche market, but what about gaming? If there would ever be good drivers for Linux, we'd get real gaming on linux, and hell.. You want users? gaming and office work is the way to go
00:31 airlied: of course they are, but you have to pay people to do what *you* want, otherwise they have a bad habit of doing what *they* want
00:31 phoenixz: nyef: I'll take a look at that.. only issue is that (haven't seen it yet) I bet that it has fans, and I really was hoping for a quiet solution :D
00:31 airlied: phoenixz: there are other niches more important for gaming
00:32 airlied: multi-gpu isn't SLI
00:32 phoenixz: airlied: I'm not a gamer myself, just saying that I've seen a lot of "gaming rigs" with multiple monitors set up.. But I guess they all use one GPU?
00:32 phoenixz: nyef: hey, no vents!
00:33 nyef: "I see no fan here".
00:33 phoenixz: dammit.. had I known this before..
00:33 phoenixz: yeah
00:33 airlied: phoenixz: for gaming, monitors on one gpu is the standard
00:33 nyef: So, roughly $100, give or take about $20.
00:33 airlied: hence why amd can plug 6 monitors into some gpus
00:34 imirkin_: nyef: note that NVS 450 is actually 2 GPUs
00:34 phoenixz: nyef: next problem, I'm in Mexico, I gotta see how I can get that over here... but def. worth looking into
00:34 nyef: Really? Damn.
00:34 airlied: imirkin_: only one with a monitor
00:34 nyef: So much for that. There are other quad-DP cards, or tri-DP+HDMI cards, but they're more expensive.
00:34 imirkin_: airlied: it's an old GPU no? pre-kepler can only do 2 heads per gpu...
00:34 phoenixz: imirkin_: so NVS450 would still give the same problem
00:35 nyef: Also newer.
00:35 airlied: imirkin_: ah yes those are mainly for trading floors not gamers I suspect :)
00:35 imirkin_: probably
00:35 phoenixz: Anyway, I'm here now at this point.. I've seen this stuff working, it has to work..
00:35 nyef: G98? Yeah, that's probably a dual.
00:35 phoenixz: imirkin_: what if.. I pull out the 2nd card and use the internal Intel video + 2 outputs from only 1 card? Would that perhaps work?
00:35 imirkin_: well, definitely only 2 heads. i suppose all 4 could be connected to one of them, but then you'd only be able to use 2 at a time :)
00:36 phoenixz: imirkin_: Or would that be worse, mixing 2 different GPUs?
00:36 imirkin_: phoenixz: perhaps
00:36 imirkin_: nyef: but you can get quad-output kepler GPUs
00:36 imirkin_: not fanless though.
00:36 phoenixz: I'll look into that
00:36 imirkin_: and as someone mentioned, AMD chips go up to 6
00:37 phoenixz: imirkin_: But I specifically went for nvidia because (supposedly) AMD was horrible on linux...
00:37 imirkin_: ok
00:37 imirkin_: i'd definitely recommend amd over nvidia :)
00:37 imirkin_: bbl
00:37 phoenixz: Ah?
00:37 phoenixz: k
00:38 gnarface: i must assume he's not a gamer
00:38 gnarface: or just making a principled statement (which is fine too)
00:39 gnarface: there are serious compatibility and performance issues with many games in Linux with regards to AMD's drivers
00:39 phoenixz: gnarface: who is "he"?
00:39 gnarface: imirkin_
00:40 gnarface: the problem is across the board with all their closed-source and open-source drivers, but it's hit&miss depending on the games and what mesa version you have
00:40 gnarface: their driver behavior with regards to the rest of linux is much better than nvidia though
00:40 gnarface: the same for nouveau in that regard, actually
00:41 gnarface: unfortunately because of nvidia's contempt for not putting square pegs in round holes, we're still in a situation where you have to choose between compatibility with [most games] or compatibility with [everything else]
00:43 gnarface: so like, on one hand, Nvidia's drivers can't even parse simple screen position settings in xorg.conf like "LeftOf" and if you get tricked into using their installer script it irrevocably hoses your distro's package dependency tree
00:44 gnarface: but on the other hand 60% of the games available for Linux on Steam won't even *start* if you're using AMD (open or closed-source) drivers
00:44 gnarface: it's a really pitiable situation actually
00:44 airlied: gnarface: seems a bit high
00:45 gnarface: agreed,
00:45 airlied: but I suppose there are lots of games nobody cares about \
00:45 gnarface: but competence taunts us with its absence
00:45 gnarface: *pointed* absence
00:46 gnarface: airlied: more than that can probably be massaged to get working, by switching to/from open source drivers, but then the changes you make to get your drivers working for one game end up breaking it in others... and consistently you have to deal with significantly larger windows:linux framerate gaps than with nvidia
00:47 gnarface: airlied: and god help you if you're trying to do it through Wine
00:48 gnarface: for out-of-the-box compatibility with games on Linux, Nvidia is even behind their own Windows drivers, but not as far as AMD
00:48 gnarface: but i'm REALLY hoping for that to change one day soon
00:49 gnarface: i was hoping for it to change THIS year, but i'm not holding my breath
00:53 gnarface: airlied: i should clarify *won't start, or starts then immediately crashes/blackscreens/locks up etc
01:06 imirkin: gnarface: there are even more issues in nouveau. so amd gpu's are really the only choice for perf. intel is fine for regular stuff.
01:07 imirkin: [and i do try to make nouveau better, but i'm but one man, and only doing it in spare time]
01:07 gnarface: imirkin: but at least nouveau parses xorg.conf files as documented! you're doing that part much better than nvidia at least...
01:08 gnarface: imirkin: i asked about it in #nvidia and they actually told me it didn't matter because you could still configure the screen layout after startup with nvidia-settings or xrandr >:(
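For context, the xorg.conf monitor-position options being discussed look like the fragment below. The output identifiers are examples only; actual names depend on the driver and hardware (see xorg.conf(5)):

```
Section "Monitor"
    Identifier "DVI-0"
EndSection

Section "Monitor"
    Identifier "HDMI-0"
    Option     "LeftOf" "DVI-0"
EndSection
```

The post-startup workaround the #nvidia folks suggested would be something along the lines of `xrandr --output HDMI-0 --left-of DVI-0`, run after X is already up.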
01:08 * gnarface is so angry about that type of thinking
01:11 imirkin: [and in case it's not apparent, i don't consider closed-source drivers to be an option for anyone interested in running linux]
01:11 gnarface: yea we gathered that
01:12 gnarface: or at least, i gathered it, i was just trying to explain the unstated assumption to Horizon_Brave earlier
01:13 gnarface: i'd REALLY like my next card to be AMD and i almost bought one of those RX 480s, but they're just not moving quite fast enough on their open source drivers to inspire confidence yet
01:16 * gnarface has bought only AMD CPUs for 4 generations now
01:16 Lyude: imirkin: I'm very close, but this keeps coming out to a triangle. if you're not busy, mind taking a quick skim at https://paste.fedoraproject.org/paste/8NgPJDiCQ1cOosG3pR1TSl5M1UNdIGYhyRLivL9gydE= ? the polygon mode thing there is something I added into shader_runner.c to call glPolygonMode()
01:17 imirkin: Lyude: point me at your tree again?
01:18 Lyude: imirkin: sure, gimme a sec to update it
01:18 gnarface: imirkin: (and yes, i can admit to myself that bending to Nvidia to play games on Linux is hypocritical and exposes a fundamental character flaw)
01:18 phoenixz: gnarface: Just out of curiosity, would you perhaps know a thing or two about this Xorg.0.log error? (EE) /dev/dri/card1: failed to set DRM interface version 1.4: Permission denied
01:19 gnarface: phoenixz: weird... maybe you forgot to install Xorg setuid root, then launched it as a regular user?
01:19 airlied: that generally means something already has the card open
01:19 gnarface: oh, another running Xorg instance seems possible too
01:19 imirkin: Lyude: #extension GL_NV_fill_rectangle : enable
01:20 imirkin: might want to nuke that line :)
01:20 gnarface: phoenixz, airlied in theory couldn't it also be a kernel/driver version mismatch?
01:20 airlied: nope
01:20 imirkin: Lyude: also might want to do GL_TRIANGLES instead of polygon
01:21 imirkin: Lyude: the #extension stuff is in GLSL to enable features that aren't specified by the relevant #version
01:21 imirkin: Lyude: also can i see your shader_runner.c diff
01:21 gnarface: oh
01:22 Lyude: https://github.com/Lyude/mesa/tree/wip/NV_fill_rectangle for mesa and https://github.com/Lyude/piglit/tree/wip/nv_fill_rectangle for piglit
01:22 Lyude: imirkin: should all be right there
01:22 gnarface: phoenixz: a quick google shows me people in 2015 having this error were all using hybrid graphics laptops. same with you?
01:25 phoenixz: gnarface: Nope, desktop work station with 3 nvidia 210 cards... well, it was working with 3 but horribly slow, which appeared to be caused by the last card, which was on a slower PCI slot, so I removed that one and plugged 2 monitors into one card, but then I got this incompatibility issue between xorg and nouveau, which I could fix with a PPA upgrade to xserver nouveau, which I did, but that again caused THIS error (which I started with yesterday, to make it even better)
01:25 phoenixz: So now I have 2 nvidia 210 cards, one with 1 monitor and the other with two
01:25 gnarface: phoenixz: interesting... and there isn't even like, an intel video card onboard there?
01:26 phoenixz: I'm thinking about pulling another and trying to do the integrated intel with one card that has the other two monitors
01:26 phoenixz: gnarface: yeah, there is
01:26 gnarface: phoenixz: ah, can you disable it in the BIOS to see if all 3 nvidia cards behave if it's not there?
01:27 phoenixz: Yesterday @ #systemd channel, we *thought* it might be caused by plymouth starting before the ACLs were applied.. We more or less confirmed this by ctrl-alt-backspace killing X, and then at restart it would be able to access the files correctly
01:27 phoenixz: gnarface: 2 cards.. I pulled the 3rd because it caused everything to drop to 5 FPS
01:28 phoenixz: I'm pretty sure it already ran without it.. Had a rough day, so not 100% sure but I do think it was disabled before
01:28 phoenixz: Ill try that again
01:28 phoenixz: rebooting
01:35 Lyude: imirkin: ahhh, I figured out a little bit of what's going on
01:35 Lyude: the reg isn't getting set properly cause it looks like cso->fill_front is 0 for some reason, I'll go from there and let you know if I need help again :)
01:36 imirkin: ok cool
01:36 imirkin: https://github.com/Lyude/mesa/commit/2ae44fbda202e039197c8be9466924566b994a3c
01:36 imirkin: this commit is missing a change in st_atom_rasterizer.c at least
01:37 Lyude: ahhh, that might actually be what's causing this to not get set
01:52 Lyude: imirkin: oh snap, I think I might have gotten it this time! changing from GL_FILL to GL_FILL_RECTANGLE_NV causes the whole window the shader generated to become white instead of just the triangle. enlarging the window indeed reveals what appears to be a triangle
01:53 Lyude: *appears to be a rectangle
01:53 Lyude: does that sound right? i can take a pic if you need
02:01 gnurou: airlied: I had two fixes actually - both about bad pointer usage, one reported by Julia, the other by Dan. I think you missed the one by Julia which I sent on 3/10
02:06 airlied: gnurou: indeed I did, will grab that one
02:08 nyef: The first thing that comes to mind in terms of possible issues with a multi-GPU desktop is that each GPU would have its own memory space and rendering contexts, so you'd be looking at some interesting synchronization to use a single opengl context across the whole desktop, or need to have every opengl using program include code to support multi-GPU.
02:08 nyef: On the one side: Hard problem. On the other: Intractable problem.
02:08 nyef: Sound about right?
02:10 airlied: nyef: yeah getting 3D in one window spanning all cards is where it starts getting difficult
02:10 airlied: and is sort of tractable until you get to compositing
02:11 nyef: No, the intractable problem is updating every opengl-using program.
02:13 airlied: oh you could do it with a single context, just would be horrid
02:14 nyef: Mmm. Especially when the GPUs have different features available... Or are simply different types altogether.
02:14 airlied: nyef: the other option is to render the app all on one gpu and split out the result across the display
02:15 nyef: Plausible, I suppose. The logical extension of DRI PRIME, right?
02:15 airlied: pretty much
02:15 nyef: Which would also mean that you could assign different apps to specific GPUs.
02:16 airlied: currently we mostly do that using providers
02:16 airlied: one gpu is the renderer, everyone else just displays
02:17 nyef: Changing the subject a bit, do you have any advice as far as adding stereo 3d support to X goes, presuming that the DRM driver supports it?
02:18 nyef: (Or, in the case of the 3D Vision kit, that the emitter is available?)
02:19 airlied: people have looked at it at various times, but I've no real idea how the hw interfaces
02:19 airlied: mostly depends on that
02:20 nyef: I've got the hardware mostly figured on the HDMI 3D side, but it amounts to running a combined framebuffer that has both the left and right eye views at the userspace level.
02:21 airlied: so you have one single framebuffer object, with two images side by side in it?
02:22 nyef: The 3D Vision kit has some vblank synchronization requirement, and I have yet to work out the overall timing chain, but it doesn't need to have separate framebuffers as long as you can redraw the bits that change between eyes in time.
02:22 nyef: Side by side or top and bottom, typically.
02:25 airlied: nyef: so how does GL expose it? using GL_FRONT_LEFT/RIGHT?
02:25 airlied: you'd just translate them into some offset into a single massive framebuffer most likely
02:26 airlied: I assume you'd also need to be fullscreen
02:27 nyef: Ideally, I'd like it to work windowed. But at the very least, there's a matter of exposing 3D-capable modes and setting up the GLX visuals for them.
02:29 airlied: yeah just not sure how you'd deal with the rest of the desktop in windowed
02:29 nyef: There's also a small matter of DRI2 having appropriate-looking APIs for 3D work, but DRI3 not, unless it interacts well with... there's some old-school multi-something extension from the SGI days.
02:29 airlied: unless the compositor is 3D enabled
02:30 airlied: well DRI3 uses Present
02:30 nyef: ... Or you arrange to do all non-OpenGL rendering to both framebuffer images at the same time?
02:30 airlied: nyef: yeah that's where things get messy
02:30 airlied: esp with pageflip
02:34 airlied: though maybe pageflip won't matter for that
02:35 nyef: So, yeah. I'm looking to get the v2 of my patch series for getting the basic stereo support out the door "soon", but once that's done my only real option for using it is to leave X for a terminal and do something in terms of a fullscreen hack using the DRM interfaces directly.
02:37 airlied: that's always a good place to start :)
02:37 nyef: And I'm substantially at a loss for what to do for X support.
02:37 nyef: Sure, a place to start. I don't know about *good*, but a place to start.
02:37 gnarface: nyef: no need for X with SDL?
02:37 airlied: nyef: step two would be fullscreen X app then
02:38 airlied: or that could be step one
02:38 nyef: Fullscreen X would be a good starting point, yes.
02:38 airlied: nyef: still not sure how the output shows up to the user though, is it just one screen size
02:38 airlied: how do you tell the crtc to scanout 3D from it
02:39 nyef: There's a couple of examples, including testdisplay in intel-gpu-tools.
02:39 nyef: But basically the 3D mode flags tell the userland to set the framebuffer up in one of $n$ layouts, and the overall framebuffer dimensions (and effective size) are based on that.
02:40 airlied: yeah sounds like X would just need to reconfigure the framebuffer, then the 3D app could present to it
02:40 nyef: So a 1920x1080 mode in SBSH would be two 960x1080 images, in TB would be two 1920x540 images, and in FP would be two 1920x1080 images stacked vertically with a gap in between.
02:41 nyef: And then there can be lower resolutions, display type and quality affecting which pixels actually get displayed where, and so on.
02:43 nyef: AFAICT, xrandr doesn't have any hint of support for 3D mode information, and I'm fairly sure that xf86vidmode doesn't either.
02:44 nyef: ... How does an opengl application even request fullscreen operation at a specific resolution?
02:46 airlied: nyef: yes we don't expose 3D modes to all clients
02:46 airlied: and X is one of those
02:46 airlied: GL apps don't generally
02:46 airlied: they use SDL or whatever to setup the screen
02:47 gnarface: i thought one of the fundamental things about OpenGL initially was that it was rendered in such a way as to be resolution agnostic. it's not supposed to care what resolution the viewport is.
02:47 gnarface: that's up to the OS
02:48 gnarface: "OS" (SDL in most cases these days)
02:48 nyef: I found one tutorial that basically creates a screen-sized window with substructure override redirect, and used that.
02:48 * gnarface remembers GLUT and shudders
02:48 nyef: s/used/uses/.
02:48 nyef: Doesn't do a resolution change, though.
02:50 gnarface: in https://wiki.libsdl.org/SDL_SetWindowFullscreen ?
02:50 gnarface: nyef: ^
02:51 nyef: gnarface: Must I dig through the SDL sources to figure out how that's implemented?
02:51 gnarface: nyef: well i sure as hell can't tell you
02:51 * gnarface would probably not even understand the sources
02:54 nyef: ... Lovely, no web-facing source repository browser.
02:55 gnarface: nyef: apt-get source libsdl2-2.0-0
02:57 nyef: Already unpacked the tarball from /usr/portage/distfiles/ and chucking it into a git repository so that I can use git grep.
02:59 gnarface: libsdl1.2 might still be in the repos too
03:00 nyef: Down to SDL_SetDisplayModeForDisplay()...
03:02 nyef: There's an SDL_GetClosestDisplayModeForDisplay() and then _this->SetDisplayMode().
03:03 nyef: ... And X11_SetDisplayMode() goes via xrandr or xf86vidmode.
03:04 nyef: Thus putting us right back to the windowed desktop case.
03:08 nyef: Okay, what if the X server represents 3D-capable modes as their "visible" resolution, so a 1920x1080 SBSH mode appears as a 960x1080 mode, but actually switches to something 1920x1080 with a horizontal pixel doubler so the effective framebuffer is 960x1080?
03:08 nyef: Can we even do that?
03:09 airlied: nyef: not really, X isn't that complicated
03:09 nyef: From there, once a fullscreen OpenGL context is activated, we do the mode switch to the SBSH mode.
03:10 nyef: Which bit couldn't we do?
03:12 airlied: nyef: randr sizes the framebuffer you want to display into from the mode you give it
03:12 airlied: so there would be a disconnect in allocating the framebuffer somewhere
03:12 nyef: There'd be some trickery involved somewhere, yes.
03:15 nyef: Otherwise we pretty much have to revise the xrandr protocol for this.
03:16 airlied: that's usually the correct answer
03:20 nyef: We probably want to expose the "official" resolution as well as the framebuffer size for each eye... Plus have some way to indicate that a non-3D mode is capable of 3D via the 3D Vision kit (basically, anything up over about 110 Hz on a suitable monitor)...
03:21 nyef: ... and some indication of *which* 3D type a given mode uses for the HDMI modes (SBSH, TB, or FP).
03:22 nyef: And this does boil down, overall, to having two framebuffers that need updating for each desktop rendering command.
03:23 airlied: or just don't draw into them unless a 3D app is running
03:23 airlied: we are facing similar problems trying to stop the desktop drawing into VR HMDs
03:24 nyef: If we're in a 3D display mode, we need to draw into both framebuffers unless we're specifically doing 3D rendering.
03:26 nyef: Hrm. Does xf86-video-nouveau export a bare framebuffer to the X server in any way, or does all rendering (and framebuffer reading) occur via APIs?
03:27 airlied: glamor does it via glamor, EXA via EXA
03:29 nyef: If I've ever dug this deeply into how the X server works, it was a long time ago. /-:
03:33 nyef: I guess I know what I need to be digging into next, then: How the X server and xf86-video-nouveau communicate, and how xrandr works and can be revised for 3D support.
03:34 nyef: airlied: Thank you. I'm sure I'll have more questions later. (-:
03:35 nyef: gnarface: Thank you, as well. Knowing that the "commonly done thing" boils down to the interfaces that I already knew about is useful.
03:59 imirkin: Lyude: that sounds right. fwiw, xwd works much better than photos ;)
04:03 Lyude: imirkin: my only question is, is it supposed to always be the entire window? changing the vertices to make the triangle smaller doesn't seem to change the size
04:03 imirkin: Lyude: that has to do with the shader you're using
04:04 Lyude: ah, I figured it'd be something like that
04:04 imirkin: Lyude: so a vertex shader outputs vertices in a normalized space... -1..1
04:04 imirkin: and then there's a viewport mapping which maps that onto the framebuffer
04:04 imirkin: s/mapping/transform/
04:06 Lyude: ah
04:07 imirkin: you're using some of the older ways of writing shaders iirc
04:08 imirkin: i'd recommend just doing [vertex shader passthrough]
04:08 imirkin: and doing "draw rect"
04:08 imirkin: er
04:08 imirkin: i guess not draw rect. hm.
04:08 imirkin: if you adjust your vertices to not span the full range, you should see a smaller thing
04:10 Lyude: I was just about to try that actually, ended up reading into glsl stuff to figure out a bit of how it worked
04:12 gnarface: nyef: no problem
04:16 imirkin: Lyude: btw, i realize you're not done yet, but looking at your patches, some of the stuff seemed misordered and misgrouped. just wanted to make sure you were aware of this
04:16 Lyude: oh?
04:18 imirkin: e.g. https://github.com/Lyude/mesa/commit/e5ebe06fb1009fa8272583785d6c974a31847b25 shouldn't enable the cap anywhere
04:18 imirkin: https://github.com/Lyude/mesa/commit/62167c9e9bfdf1e64d8e6e306501774d8c8fd745 this should come before the st/mesa patch
04:19 imirkin: as the st/mesa patch will need to make use of that stuff
04:19 Lyude: right
04:19 imirkin: https://github.com/Lyude/mesa/commit/e436200576dd586e660fac992a08647201594246 should enable the feature for GM200_3D_CLASS and later
04:21 imirkin: anyways, i know that it's unfair to review a WIP patchset, just wanted to make sure you were aware of some issues
04:24 imirkin: nyef: i'd highly recommend getting the framepacking modes worked out before doing any crazy X work..
04:24 imirkin: i suspect it's just done with timing tricks on the doubleheight+vblank fb...
04:26 Lyude: imirkin: np! it's useful to know
04:27 imirkin: Lyude: oh, and you can drop the ES bits - it won't work without NV_polygon_mode, which we don't support
04:29 imirkin: Lyude: also it's not an error to set the polygon front/back modes separately. just at draw time they have to match up.
04:29 Lyude: Ah
04:29 nyef: imirkin: Essentially, "finish up the first bit, so you're only working on one thing at once"?
04:29 imirkin: (yeah, i hate specs that do that)
04:30 imirkin: Lyude: so you need to add something to _mesa_valid_to_render()
04:30 imirkin: nyef: kinda, yea
04:31 imirkin: also you've been putting it off for quite a while now :p
04:32 nyef: Somewhat, yes.
04:32 nyef: That said, O
04:33 imirkin: well said.
04:33 nyef: Err... That said, I'm looking to get a v2 patch series ready within the next week.
04:34 imirkin: and it'd be awesome if it could be used to enable the 3d feature. however i'm not sure that we can really do that until the required-by-the-spec framepacking modes work
04:34 nyef: Yeah, it's probably not good unless there's a mechanism to say "don't export frame-packing modes".
11:42 xen: hi
11:42 xen: http://pastebin.com/F7vENHPB
11:43 xen: wth am i seeing? :)
13:00 dboyan: imirkin: I have a problem implementing the binary shader cache. The pointer to FixupApply function is directly stored in each fixup entry, so it won't work across loads
13:01 dboyan: I'm not very sure how to solve it.
13:08 pmoreau: xen: Do you know if that happened when the screensaver kicked in, or when the monitor went to sleep or resumed?
13:08 pmoreau: And is that with xf86-video-nouveau 1.0.13? If so, try updating to 1.0.14.
13:08 xen: it happens when i try to set up the offload rendering magic with my script
13:09 xen: 2 nv cards 6 displays
13:09 xen: oh, i don't know the exact version, but i will check it when i get home
13:10 pmoreau: "Hardware name: To Be Filled By O.E.M. To Be Filled By O.E.M." best name! :-D
13:10 xen: yep, that caught my eye too
13:11 dboyan: imirkin: I wonder how I can get current emitter in nv50_ir_apply_fixups. If I can get current emitter, I think I can come up with a solution.
13:11 pmoreau: I think there is also another bug related to atomic modesetting, which is not solved by using 1.0.14.
13:12 xen: this is going to be a looot of fun
13:12 orbea: honestly a lot of bugs went away for me when I started using modesetting instead of the nouveau ddx...
13:13 xen: my 1070 is on the way, moar bugs ahead. ( if it boots up )
13:14 pmoreau: If you use Linux 4.12 + firmwares, you should get acceleration. Though xf86-video-nouveau does not support Pascal chipsets yet.
13:16 kwizart: aren't modern xorg server aimed to use modesetting everywhere anyway ?
13:16 orbea: i guess that recent commit for the nouveau ddx might have fixed at least one of the bugs I know of...
13:21 xen: aaah, i'm on 4.10.3, that may be a problem
13:22 xen: not a big fan of testing on that box since that's my workstation
13:22 xen: not much to lose now i guess :)
13:34 imirkin: dboyan: you only need to store the data... oh, i record the functions too, don't i...
13:35 imirkin: dboyan: yeah that stinks. you're going to have to come up with a better registration/naming system for that.
13:41 dboyan: imirkin: One way I can think of now is to pass chipset id to nv50_ir_apply_fixups, and let each emitter(?) resolve its own apply functions
13:41 imirkin: there are diff apply functions
13:41 imirkin: gtg
14:21 kattana: is there a performance boost with llvm 4?
14:27 pmoreau: kattana: Nouveau does not use LLVM, so no
14:29 pmoreau: (assuming you were asking whether updating to LLVM 4 would make Nouveau faster)
14:31 kattana: ok
14:31 kattana: well mesa and stuff
14:36 pmoreau: OpenCL on AMD cards could benefit from it, but no idea whether it does improve performance or not
16:47 dboyan: I was just taught a lesson about how linking in c++ is different from c
16:48 imirkin_: the real lesson to learn is "don't worry about linking" :)
16:48 imirkin_: and if you do want to worry about it, don't use ld. use g++. it knows.
16:49 dboyan: well, it was about overloading and symbol names. I was nearly mad just now
16:50 dboyan: I nearly decided to use extern "C" just before I found the real solution
16:56 nyef: Said "real solution" being to use C in the first place? (-:
16:56 imirkin_: harsh.
17:10 dboyan: imirkin: I think I've finished one version of binary shader cache. There are some hacks maybe, but at least Portal 2 and Dota 2 don't crash.
17:10 imirkin_: nice
17:11 imirkin_: do they even render correctly? or is that too much to ask? :)
17:12 dboyan: I didn't test very much, but I haven't seen misrendering
17:13 dboyan: I guess I have to go to sleep for now, I'll do more testing and hopefully send the patches out in the weekend
17:18 phoenixz: Hey imirkin, just FYI, had to reinstall my system again (pulling out one more card this time messed something in BIOS, had to reset that, lost RAID config, RAID refused to use existing drives, yay, so had to reinstall).. So I've tried 1 nvidia card + internal intel, didn't work at all. I'm now on 2 cards, each doing one monitor and one monitor dark, its the best I've been able to get and since I really gotta get some work done, I'll leave it at
17:18 phoenixz: this for now. I'll just wait a few months and get a single video card to power the three monitors.. In any case, thanks a lot for your help so far, and once I have some extra money for a different video card I'll ask for advice here before I buy it :)
18:19 atomi: I have what seems like a simple request for the nouveau driver: I want to set performance level to p8 or p14 (as low performance as possible)
18:19 atomi: I've gone as far as doing this https://sites.google.com/site/akohlmey/random-hacks/nvidia-gpu-coolness (genius script by the way)
18:20 atomi: but it seems like something simple nouveau should be able to do, which is just set 'performance level' to p12
18:21 imirkin_: which gpu do you have?
18:21 atomi: it's an old laptop 8600m
18:21 imirkin_: lspci -nn -d 10de:
18:21 atomi: 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation G84M [GeForce 8600M GT] [10de:0407] (rev a1)
18:22 imirkin_: sorry, no reclocking on G84 =/
18:22 atomi: no I know
18:22 imirkin_: so what's your question?
18:23 atomi: I got it done anyway using the cool_gpu script and nvidia-settings
18:23 imirkin_: ok. doing stuff like that requires a driver which knows how to reclock.
18:23 nyef: Is it specifically g84, or is there a gpu family thing involved in the lack of support?
18:24 imirkin_: currently there's reclocking support for nv4x, GT21x, GKxxx, and GM10x
18:24 imirkin_: oh, and G94-G200
18:24 imirkin_: although that's less-well-tested
18:24 nyef: ... So, the only hardware that I have that might not be supported is the GF119?
18:25 imirkin_: nyef: well, GF119 is definitely not supported by reclocking. also i'm not sure if it's hooked up for MCP89
18:25 imirkin_: the IGP's are fairly different in that regard
18:25 imirkin_: it is hooked up for MCP77/79 though
18:26 imirkin_: my guess is that it should mostly work with maybe a little fiddling
18:26 atomi: so I did get to P12 from p0 on the darn thing but the sad thing is the Temp is the same
18:27 atomi: at least that's what nvidia-smi is reporting
18:27 atomi: haha
18:27 atomi: is there a way to just not send power to the card?
18:27 imirkin_: sure - requires system to be able to do that though
18:28 atomi: yeah like a kernel patch
18:28 imirkin_: many dual-gpu systems will have a platform method (via ACPI) to power off the "discrete" GPU
18:28 imirkin_: no, i mean on a physical level
18:28 atomi: yeah this old thing doesn't have a bios option for that
18:28 atomi: it's not dual-gpu
18:29 atomi: I was trying to repurpose an old laptop as a pfsense router
18:29 imirkin_: a lot of G84/G86's had issues with heat
18:29 imirkin_: and de-soldering themselves
18:30 atomi: yeah it runs too hot
18:31 atomi: okay I just checked the laptop and it's at the same temp as it was but the fan is completely off now
18:31 atomi: so P12 performance level seems to be doing its job, I think that will be good enough
18:39 atomi: can I go lower than `nvidia-settings -a GPUOverclockingState=1 -a GPU2DClockFreqs=165,237 -a GPU3DClockFreqs=165,237`
18:39 atomi: you know what nevermind that's not really a nouveau question
18:39 imirkin_: correct :)
18:40 atomi: yeah I know
19:23 kattana: in linux, what generally uses gpu ram?
19:24 kattana: is a 1GB gpu card enough for wine/dolphin games?
19:27 nyef: I don't know if 1GB is enough, but I haven't really had any problem with a 2GB card for dolphin or pcsx2... but I'm running the nvidia blob. YMMV.
19:27 kattana: ok
19:29 imirkin_: depends on the game - you should ask the dolphin folks for recommendations.
19:31 nyef: That's also fair. At this point I use dolphin for GC emulation, not Wii emulation or anything later.
20:32 Lyude: btw imirkin_ you said the rectangle from the fill_rectangle_nv shader test only took up the whole screen because of the vertex shader? Switching to vertex shader passthrough didn't seem to change anything here
20:32 imirkin_: it wouldn't
20:32 imirkin_: pastebin your shader_test again
20:33 Lyude: imirkin_: https://paste.fedoraproject.org/paste/BBhGD0K7umOPjzcrsrG8nF5M1UNdIGYhyRLivL9gydE=
20:33 imirkin_: hmmmmmmmmmmm
20:33 imirkin_: are you sure?
20:33 imirkin_: ;)
20:33 imirkin_: so when you don't enable it
20:34 imirkin_: you see an isosceles triangle, pointing up, right?
20:35 Lyude: yep, this is just with me running the shader on the other machine directly with ./bin/shader_runner. And yeah, using "polygon mode GL_FRONT_AND_BACK GL_FILL" I can make it larger or smaller by playing with the vertices. With GL_FILL_RECTANGLE_NV though it always fills the entire window
20:35 Lyude: regardless of the vertex
20:35 imirkin_: that sounds wrong.
20:36 imirkin_: let's read the spec again...
20:36 imirkin_: and filling its axis-aligned screen-space bounding box
20:36 imirkin_: so it should basically be the screen-aligned bounding box that gets drawn
20:36 imirkin_: so something is wrong.
20:36 Lyude: so the smallest rectangle we can fit over the triangle, correct?
20:37 imirkin_: screen-aligned, yeah
20:38 imirkin_: i.e. from the min of all the x,y coordinates to the max of all the x,y coordinates
20:38 imirkin_: so .... this means that my analysis was incomplete
20:38 imirkin_: as it often is
20:39 imirkin_: BUT!
20:39 imirkin_: there are 2 separate bright sides
20:39 imirkin_: (a) you can test this shader_test with the blob and see what the blob does
20:40 imirkin_: (b) there already are mmt traces for this, so you can glance at them and see what i missed
20:40 Lyude: with the binary blob, do you still have to do all that annoying Xorg.conf stuff?
20:40 Lyude:hasn't used it in a while
20:40 imirkin_: dunno, sorry
20:41 Lyude: I'll take a look at the mmt traces though
20:41 imirkin_: i generally have a config to tell it to (a) use a specific gpu and (b) to load up without any monitors attached
20:41 imirkin_: the config can definitely be pretty spartan, doesn't have to be the giant thing their configurator generates
20:41 imirkin_: unless you have the weirdest of setups
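[editor's note: the spartan config imirkin mentions is never pasted in the log; for reference, one of that shape might look like the fragment below. The BusID is machine-specific (check lspci), and AllowEmptyInitialConfiguration is the NVIDIA driver option for starting with no monitors attached]

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"   # adjust to the GPU you want, per lspci
EndSection

Section "Screen"
    Identifier "screen"
    Device     "nvidia"
    Option     "AllowEmptyInitialConfiguration" "true"
EndSection
```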
20:41 Lyude: mind showing me? that sounds like something that would be useful to have
20:41 imirkin_: while i don't mind showing you, i don't have the config on you
20:42 imirkin_: i'll try to remember tonight
20:42 Lyude: sounds good
20:42 imirkin_: on me*
20:51 Lyude: getting this when I try to use demmt on that trace: unknown type: 0x1b
20:51 imirkin_: that's unfortunate.
20:51 imirkin_: well, i can say with some certainty that i used demmt on the trace...
20:51 imirkin_: let's see...
20:52 imirkin_: worksforme
20:52 imirkin_: let's try updating the repo
20:52 Lyude: whose, mine?
20:52 imirkin_: mine
20:52 Lyude: ah
20:53 imirkin_: yep, looks like it still works
20:53 imirkin_: ~/src/envytools/demmt/demmt -l ~/traces/gm206-fill_rectangle.mmt.xz
20:53 imirkin_: is how i run it
20:53 Lyude: ah, I didn't use -l
20:54 imirkin_: oh
20:54 imirkin_: i bet i know.
20:54 Lyude: -l makes it work
20:54 imirkin_: did you set POLYGON_MODE_FRONT/BACK to FILL?
20:55 imirkin_: [coz you should]
20:55 imirkin_: my guess is you're setting them to something that causes explosions
20:55 Lyude: You're talking about https://github.com/envytools/envytools/blob/master/rnndb/graph/gf100_3d.xml#L1341?
20:56 Lyude: whoops, remove the ? at the end of that
20:56 imirkin_: sure, why not
20:56 imirkin_: well
20:56 imirkin_: that's the NVRM macro
20:56 imirkin_: (NVRM = nvidia resource manager)
20:56 Lyude: ah
20:56 imirkin_: i think we have our own macro
20:56 imirkin_: or maybe we don't use a macro at all
20:56 imirkin_: either way, you have to feed it fill
20:56 imirkin_: rather than "illegal value"
20:56 Lyude: ohhhh, I see
20:57 imirkin_: SB_BEGIN_3D(so, MACRO_POLYGON_MODE_FRONT, 1);
20:57 imirkin_: SB_DATA (so, nvgl_polygon_mode(cso->fill_front));
20:58 imirkin_: https://cgit.freedesktop.org/mesa/mesa/tree/src/gallium/drivers/nouveau/nouveau_gldefs.h#n128
20:58 imirkin_: hmmmm
20:58 imirkin_: it already defaults to FILL
20:58 Lyude: I override the fill front/back mode in the rasterizer state with the new fill mode value though
20:58 Lyude: so it won't be on its default value
20:59 imirkin_: "the new fill mode value"?
20:59 imirkin_: branch please
20:59 Lyude: https://github.com/Lyude/mesa/tree/wip/NV_fill_rectangle
21:00 imirkin_: right, but fill_front/back still go through nvgl_polygon_mode
21:00 imirkin_: which defaults to FILL
21:00 imirkin_: so all's well
21:00 Lyude: nvc0_state.c:265 would end up ... no, I'm wrong
21:00 imirkin_: ALL IS WELL! REMAIN CALM!
21:00 Lyude: hehe
21:03 imirkin_: ok, well, your impl seems fine. and yet it's not working =/
21:04 imirkin_: i'd like to point out that counter to your earlier claims, you did not increase the array by 1 in nvc0_stateobj.h
21:04 Lyude: did I not?
21:04 imirkin_: not in this commit
21:04 imirkin_: maybe it's elsewhere
21:05 Lyude: we're talking about nvc0_rasterizer_stateobj?
21:05 imirkin_: yes
21:05 imirkin_: perhaps you didn't push it out
21:05 Lyude: Yeah, I amended pipe_rasterizer_state to match but I didn't touch that struct directly
21:05 imirkin_: right, so the structu has an array of commands
21:05 imirkin_: and since you're adding a new command
21:05 imirkin_: you have to increase the array size
21:05 Lyude: ahhh
21:06 imirkin_: (which is where the SB_* things write to)
21:06 imirkin_: and then that array is just copied directly into the cmdstream
21:15 Lyude: that doesn't seem to fix it, but I'll keep messing around with that trace and see what I can figure out
21:16 imirkin_: oh, no. you'd get an assert() if you were hitting it
21:17 imirkin_: but like if the right combination of rasterizer options were enabled, you could
21:21 Lyude: imirkin_: btw, the only register write for the fill rectangle register I'm seeing in that dump is: PB: 0x8000044f GM204_3D.FILL_RECTANGLE = { 0 }
21:22 imirkin_: keep looking
21:22 imirkin_: you should see one with { ENABLE } eventually :)
21:22 Lyude: :/, i'm puzzled how I missed that
23:27 Horizon_Brave: greetings everyone