• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • Yeah, I reckon splitting the frontend from the backend results in about half the complexity in each. If you have multiple frontends, you can upgrade the least important one first to see if there are any problems.

    I didn’t really answer your original question.

    When I was using NUCs I was running Linux Mint, which uses Cinnamon by default as the desktop environment. Originally I changed it to some really minimal window manager like twm, but at some point it became practical to not use one at all and just run Kodi directly on X.

    If I was going back to a Linux frontend I’d probably evaluate LibreELEC, as it has a lot of the sharp edges sorted out.


  • I used to run Kodi on Linux on Intel NUCs connected to all our TVs a while ago. I don’t remember it being particularly unreliable. The issue that made me change that setup was the lack of 4K hardware decoding support for newer codecs.

    What I’ve had doing that frontend function ( Kodi, Jellyfin, Disney Plus, Netflix, etc. ) for the last few years is three Nvidia Shield TV Pros, which have been absolutely awesome. They are an old product now and I suspect Nvidia are too busy making money elsewhere to work on a newer generation.

    The biggest surprise improvement was how easy it was to configure their remotes to generate power on / off and volume up / down IR codes for the TV or the AV amp they were connected to, so you only need a single remote.

    Separating the backend function out from the frontend in the lounge has drastically reduced the broken mess that happens around OS upgrades.


  • I replaced MythTV with Tvheadend on the backend and Kodi on the frontend five or six years ago.

    At the time, MythTV’s setup and configuration were slanted towards old ( obsolete ) analog tuners and static channel setup. Tvheadend was a breath of fresh air in comparison: you could point it at a DVB mux or two and it would mostly do what you wanted without having to fight it.

    I’m not sure how much longer I’ll want something that can tune DVB-S2 and DVB-T, though. Jellyfin and friends handle everything other than legacy broadcast TV better than Kodi these days.



  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did · 7 months ago

    I’m not the PR department for desktop Linux for everyone, man.

    People who only have Windows experience see an Nvidia card as a premium-priced product with a premium experience and assume that will translate to a Linux environment. It does not. I’ve been using Linux for about 27 years now, and that was my opinion too until a couple of years ago.

    Hopefully the folks who read this thread ( like the OP, a 20-year IT veteran ) can take away that Nvidia cards on Linux are the troublesome, subpar choice and are only going to get worse going forward ( because of the Wayland migration that Nvidia are ignoring ).


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did · 7 months ago

    Oh yeah. That video of Linus Torvalds giving Nvidia the finger, linked elsewhere in this thread, was the result of a ton of frustration around them hiding programming info. They also popularised a dodgy system of LGPL’ing a shim which acted as the licence go-between for the kernel driver API ( drivers are supposed to be GPL’d ) and their proprietary obfuscated code.

    Despite that, I’m not really that anti them as a company. For me, the pragmatic reality is that spending a few hundred bucks on a Radeon is so much better than wasting hours on arcane acts of fault-finding and trial and error.


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did · edited · 7 months ago

    If you go back a bit further, multi-monitor support was just fine. Our office in about 2002 was full of folks running dual ( 19-inch tube! ) monitors off Matrox G400s with Xinerama on Red Hat 6.2 ( might have been 7.0 ). I can’t recall that being much trouble at all.

    There were even a bunch of good years for the proprietary Nvidia drivers; the poor quality is something I’ve only really noticed in the last three or so years.


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did · edited · 7 months ago

    Support for larger numbers of monitors, mixed resolutions, and odd layouts in KDE vastly improved in the Ubuntu 23.04 release. I wouldn’t install anything other than the latest LTS release on a server ( and generally a desktop ), but KDE was so much better that it was worth running something newer with short-term support on my desktops.

    We aren’t too far off the next LTS, which will include that work anyway. I’m probably going to make the move to Debian rather than try that one out, though.


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did · 7 months ago

    It isn’t something that is in the distro vendors’ control. Nvidia do not disclose programming info for their chipsets. They distribute an unreliable proprietary driver that is obfuscated to hell so that no one can help fix its problems.

    If you use an AMD card it will probably work fine in both Windows and Linux. If you use an Nvidia card, you are choosing to run Windows or have a bad time in Linux.


  • I have two AMD Radeon cards for Linux that I’m pretty happy with; they replaced a couple of Nvidia cards. They are an RX 6800 and an RX 6700 XT, both ex-mining cards that I bought when the miners were dumping their Ethereum rigs, so they were pretty cheap.

    If I had to buy a new card to fill that gap, I’d probably get a 7800 XT, but if you don’t game on them you could get a much lower-end model like an RX 7600.


  • Sorry to hear about that mess.

    I posted here https://lemmy.nz/comment/1784981 a while back about what I went through with the Nvidia driver on Linux.

    From what I can tell, people who think Linux works fine on Nvidia probably only have one monitor, or maybe two that happen to be the same model ( with unique EDID serials, FWIW ). My experience with a whole bunch of mixed monitors and refresh rates was absolutely awful.

    If you happen to give it another go, get yourself an AMD card. Perhaps you can carry on using the Nvidia card for the language modelling; just don’t plug your monitors into it.


  • I had a Brother black-and-white laser ( I think an HL1240? ) for almost 10 years. Then we started having to print a ton of education-related stuff for our kid and colour made sense, so I got the closest thing I could find to the colour one I use at work, which ended up being a DCPL3551CDW.

    We print a little from Windows and Linux, but more often from apps on my Android phone and my partner’s iPhone.

    I absolutely hate printers, but they have been fine.



  • I’ve been using Linux for something like 27 years; I wouldn’t say I’m evangelical or particularly obsessed.

    I started using it because some of the guys showing up to my late-90s LAN parties were dual booting Slackware, and it had cool-looking boot-up messages compared to DOS or Windows at the time. The whole idea of dual booting operating systems was pretty damn wild to me back then too.

    After a while it became obvious to me that Slackware '96 was way more reliable than DOS or Windows 95. A web browser like Netscape could take out the whole system pretty easily on Windows, but when Netscape crashed on Linux, you opened up a shell, killed off whatever was left of it, and started a new one.

    I had machines that stayed up for years in the late 90s, and that was pretty well impossible on Windows.




  • I’ve been running Linux for 100% of my productive work since about 1995. I used to compile every kernel release and run it for the hell of it from about 1998 until something like 2002, and I worked for a company that sold and supported Linux servers as firewalls, file servers, etc.

    I had used ET4000s, S3 968s and Trio64s, the original i740, Matrox G400s with dual CRT monitors, and tons of different Nvidia GPUs over the years, and hadn’t had a whole lot of trouble.

    The Nvidia Linux driver made me despair for desktop Linux for the last few years. Not enough to actually run anything different, but it did seem like things were on a downward slide.

    I hit a long list of bugs:

    - Weird flashing of sections of other windows when dragging a window around.
    - Individual screens that would just start flashing sometimes.
    - Chunky, slideshow-like window dragging when playing video on another screen.
    - Screens rearranging themselves into baffling orientations after the machine came back from the screen being locked.
    - The animation rate stuck at 60 Hz on three 170 Hz monitors, because I also had a TV connected to display network graphs ( that update once a minute ).
    - Panels on Cinnamon, and later KDE, that I must have set up again a hundred times because they would move to another monitor, sometimes underneath a different one, or just disappear altogether when I unlocked the screen.
    - A desktop environment at home that would sometimes freeze up if the screen was DPMS-blanked for more than a couple of hours, requiring me to log in from another machine and restart X.

    I had two different 6 GB 1060s and a 1080 Ti in different machines that would all show different combinations of these issues.

    I fixed maybe half of the issues I had. I loaded custom EDIDs on specific monitors to stop KDE swapping them around, did wacky stuff with environment variables to change the sync behaviour, and used a totally different machine ( a little NUC ) to drive the graphs on the TV on the wall.
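    For what it’s worth, the custom-EDID trick can be done with the kernel’s drm.edid_firmware override rather than any desktop-level hackery. A rough sketch follows; the connector name (DP-1), file paths, and GRUB usage are assumptions for a typical Debian/Ubuntu-style setup, not what I actually ran:

```shell
# Sketch: pin a fixed EDID to one output so the desktop stops
# re-identifying monitors. Connector name and paths are assumptions.

# 1. Capture the EDID the monitor on DP-1 currently reports:
cat /sys/class/drm/card0-DP-1/edid > my-monitor.bin

# 2. Install it where the kernel firmware loader can find it:
sudo mkdir -p /usr/lib/firmware/edid
sudo cp my-monitor.bin /usr/lib/firmware/edid/

# 3. Tell the kernel to use that file for the connector on every boot,
#    by appending to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub:
#      drm.edid_firmware=DP-1:edid/my-monitor.bin
sudo update-grub
```

    Loading the EDID at the kernel level means every compositor sees the same stable monitor identity, which is the whole point of the workaround.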

    Because I got bitten pretty hard by the Radeon driver being a piece of trash back in something like 2012, I had the dated opinion that the proprietary Nvidia driver was better than the Radeon one. It wasn’t until I saw multiple other folks adamant that the current amdgpu driver is pretty good that I bought some ex-mining AMD cards to try out on my desktop machines. It turned out that most of the bugs driving me nuts were Nvidia bugs rather than Xorg or any other Linux component. KDE also did a bunch of awesome work on multi-monitor support, which meant I could stop all the hackery with custom EDIDs.

    A little after that I built a whole new work desktop PC with an AMD GPU ( and CPU, FWIW ). It has been great. I’m down from about 15 annoying bugs to none that I can think of offhand running KDE. It all feels pretty fluid and tight now without any real work from a fresh install.


  • A 2-gigabit event isn’t big enough to be considered a real attack; a service like Cloudflare can sink a 2-terabit attack every day of the week.

    Building a DDoS protection service ( one that isn’t just black-holing traffic ) starts with having enough bandwidth to throw away the attack volume, plus keep your desired traffic working, plus a bit of overhead for your mitigation strategies.

    What this means is that to DIY a useful service, you start by buying a couple of terabits of bandwidth in ‘small’ chunks of a hundred gigabits or so at most peering locations around the globe, and then you build a proxy layer like Cloudflare’s on top of it, with a team of smart people to automate outsmarting the bad guys.
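    Back-of-the-envelope, the capacity math above works out something like this; every number here is an illustrative assumption, not a real provisioning figure:

```python
import math

# Illustrative sizing for a DIY scrubbing network.
attack_gbps = 2000   # the ~2 Tb/s attack mentioned above
legit_gbps = 200     # assumed legitimate traffic you still want to serve
headroom = 1.5       # assumed overhead for mitigation and failover

needed_gbps = (attack_gbps + legit_gbps) * headroom

chunk_gbps = 100     # bandwidth bought in ~100 Gb/s chunks per peering site
chunks = math.ceil(needed_gbps / chunk_gbps)

print(f"{needed_gbps:.0f} Gb/s total across {chunks} x {chunk_gbps} Gb/s chunks")
```

    Even with these made-up numbers you’re shopping for dozens of 100 Gb/s commitments spread around the globe before you’ve written a line of mitigation code, which is why the barrier to entry is so high.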

    I don’t like Cloudflare either, but the barriers to entry in this industry are epic.