• Glide@lemmy.ca · 110 points · 10 months ago

    A 1080 Ti still plays every release at medium or higher settings. /shrug

    Unless you’re worried about 4K or VR, I wouldn’t upgrade anyway.

    • schmidtster@lemmy.world · 19 points · 10 months ago

      If you care about frame rate it matters; not a lot of people can stand 30-40 fps with hard drops into the single digits just to be able to play a game.

      • Glide@lemmy.ca · 26 points · 10 months ago

        I’m curious: what game do you think drops to single-digit fps on medium settings with a 1080 Ti?

        I was recently playing Darktide on a 1060 at a minimum of 30 fps, and that game is optimized like absolute trash.

        • schmidtster@lemmy.world · 7 points · 10 months ago

          Starfield.

          The 1080 is the minimum card, and the Ti is decently more powerful (about 30%), but you’ve got to make concessions on medium settings to get 30 fps, and there are still drops.

          • Viking_Hippie@lemmy.world · 22 points · 10 months ago

            To be fair though, that’s a VERY new game and they cared so little about optimizing it that they went out and said “you’re probably going to need a new computer to play this” …

            I hear the 1080ti runs Doom just fine 😛

            • RobertoOberto@sh.itjust.works · 2 points · edited · 10 months ago

              My 1070 handled Doom Eternal just fine with pretty high settings. I’m sure it helps that I only use a single 1080p monitor for games, but it was still pretty enough for me.

              *Edit - I picked up on the sarcasm after posting this reply. Oh well.

            • schmidtster@lemmy.world · 2 points · 10 months ago

              Every release means every release, and the requirements aren’t going to get lower. It’s a great card and I know people hate losing it, but it’s on its last legs and likely won’t be able to play new releases at all next year.

          • Perfide@reddthat.com · 6 points · 10 months ago

            That’s Bethesda’s fault. There is no fucking reason that game can’t run well on a 1080ti for how mediocre it looks.

        • Psaldorn@lemmy.world · 3 points · edited · 10 months ago

          CoD MW2/3 are total crapshoots with frame rates; even on a 3080 set to performance settings it can still just turn to crap. It seems to run more stably, and at higher settings, on a 2070 laptop. I don’t understand. (I tried to get as much hardware as possible running DMZ for friends and family; lots of machines.)

    • banneryear1868@lemmy.world · 7 points · 10 months ago

      I actually prefer crisp edges without as many post-processing effects sometimes. Source engine games look great to me: just minimal, crisp, clean geometry. I find a lot of modern graphics distracting, but it depends on the game. I do love really pushing graphics in a game like Skyrim.

      • spader312@lemmy.world · 7 points · 10 months ago

        Modern game engines don’t use the amazing SSAA (supersampling anti-aliasing). Most use post-processing anti-aliasing like FXAA or TXAA, which always makes edges look fuzzy. Source engine is one of the few that still supports supersampling.
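
        For intuition, here’s a minimal sketch of what supersampling buys you (a hypothetical numpy illustration, not any engine’s actual code): render at a multiple of the target resolution, then average each block of samples down to one pixel, so edges get true sub-pixel coverage instead of a post-process blur.

        ```python
        import numpy as np

        def ssaa_downsample(hi_res: np.ndarray, factor: int = 2) -> np.ndarray:
            """Average factor x factor sample blocks of a supersampled frame.

            hi_res: (H*factor, W*factor, 3) image rendered above target size.
            Each output pixel is the mean of its block of samples, which is
            why SSAA edges look clean instead of smeared.
            """
            h, w, c = hi_res.shape
            blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
            return blocks.mean(axis=(1, 3))

        # 2x SSAA for a 1920x1080 target means rendering at 3840x2160:
        frame = np.random.rand(2160, 3840, 3)   # stand-in for a rendered frame
        final = ssaa_downsample(frame, factor=2)
        print(final.shape)                      # (1080, 1920, 3)
        ```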

        • banneryear1868@lemmy.world · 4 points · 10 months ago

          Yeah, that’s exactly it. MSAA isn’t too bad, but FXAA makes edges look pretty blurry. Temporal anti-aliasing also looks really blurry sometimes, while giving the impression that the edges could be crisp.

    • kakes@sh.itjust.works · 7 points · 10 months ago

      I do 4K and VR on my 1080 Ti with no issues, on the highest settings, too. That said, I don’t do a lot of AAA gaming, so take that as you will.

    • Khrux@ttrpg.network · 2 points · edited · 10 months ago

      My 980 Ti is still a toss-up between amazing and mediocre performance. The big issue is that I bought it for £600, which is a lot of money to me, and new GPUs are three times that, or more.

  • HexesofVexes@lemmy.world · 50 points · 10 months ago

    I’ve said it before and I’ll say it again: when humanity is wiped out, future species will find a Nokia on half battery and a fully working 1080 Ti.

    • Corroded · 7 points · 10 months ago

      I feel like I’d need a display that can do 4K before I upgrade.

    • AdrianTheFrog@lemmy.world · 1 point · 10 months ago

      I have a 3060, which is a bit faster overall, and it sometimes struggles with my Reverb G2. It’s a pretty high-resolution headset, though.

      • OmnislashIsACloudApp@lemmy.world · 1 point · 10 months ago

        Yeah, my VR headset is a couple years old as well and not the highest quality; it’s the Odyssey.

        I guess I practice good enough gaming lol.

  • textik@sh.itjust.works · 29 points · 10 months ago

    If my 1080 gives up in the near future, I’ll probably just give up AAA gaming. BG3 is literally the only game from the last 5 years I have loved that would require more than a potato to run.

    • spader312@lemmy.world · 6 points · 10 months ago

      Honestly, I have a GTX 1080 and I can run BG3 at close to ultra settings at around 30-60 fps. I think my actual issue is my CPU: I have an i7-3770, which came out in like 2012. And I have DDR3 RAM from 10 years ago as well.

      • Baylahoo@lemmy.world · 2 points · 10 months ago

        My SO upgraded from a 1050 Ti and some old integrated CPU to a Ryzen 5600 and a 6600 GPU and saw massive improvements in BG3. I don’t know which CPU it was, but I think that made the difference for this game. Loading in went from crashes or naked people to no issue at all.

      • Rev3rze@feddit.nl · 1 point · 10 months ago

        Same here, but with an i7-6700K. If I happen to find a decent deal on something like a 12th-gen i5 or i7 and chuck in an extra 16 GB of RAM, I think I’d be all set for another couple of years. I still don’t desperately need an upgrade, though; everything works well enough, except maybe that one nightclub map in Ready or Not with more NPCs than my CPU can keep up with.

    • adhocfungus@midwest.social · 8 points · 10 months ago

      My 960 runs Unity games like Overcooked at 4K, so I probably won’t be upgrading any time soon. With a toddler I don’t have time for AAA games anymore, but I’m guessing the frame rate would be painful.

  • Coaster@lemmy.world · 16 points · 10 months ago

    I’m having to run -400 MHz on the VRAM to prevent mine from crashing all the time, but I’m hanging in there 🥲👍

      • Coaster@lemmy.world · 7 points · 10 months ago

        I thought it was driver problems at first because it happened so infrequently, but it has gotten worse this year.

        It took ages to narrow it down to memory faults since I’ve been running stock settings forever. Stumbled upon this tool: https://github.com/GpuZelenograd/memtest_vulkan/

        It found a load of errors, which went away after the downclock, and it’s been stable since. It must just be age-related degradation; temps were never high, but I know the repeated temperature cycles have an impact.
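
        For anyone wanting to reproduce that kind of downclock on Linux, here’s a minimal sketch; everything in it is an assumption to check against your own setup (X11 with the proprietary driver, Coolbits enabled, performance level 3, and GDDR5X’s 2x clock-to-transfer-rate factor). On Windows, MSI Afterburner does the same job from a GUI.

        ```python
        import subprocess

        def underclock_vram(offset_mhz: int, gpu: int = 0, perf_level: int = 3) -> None:
            """Apply a negative VRAM clock offset via nvidia-settings.

            nvidia-settings takes the offset as a *transfer rate*; for GDDR5X
            that is 2x the clock offset shown in tools like Afterburner. The
            performance-level index varies by card, so check yours first.
            """
            attr = (f"[gpu:{gpu}]/GPUMemoryTransferRateOffset"
                    f"[{perf_level}]={offset_mhz * 2}")
            subprocess.run(["nvidia-settings", "-a", attr], check=True)

        underclock_vram(-400)  # the -400 MHz mentioned above
        ```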

    • Max17@lemmy.world · 1 point · 10 months ago

      Same problem as mine! I thought I was the only one. I’m using MSI Afterburner and turning the power limit down to 90%… sigh.

  • AdrianTheFrog@lemmy.world · 14 points · 10 months ago

    It’s still a decent card; it can probably still do well at 1080p max settings in most games. It’s very similar to a 3060 in terms of performance, which is the card I have.

    • daq@lemmy.sdf.org · 3 points · 10 months ago

      I’m playing D4 at 4K on medium settings. No complaints. This is on a 3-year-old laptop with a 1080 Ti in an external enclosure hooked up via Thunderbolt.

    • crashoverride@lemmy.world · 1 point · 10 months ago

      1080 here; I got it used for like $220 before the graphics card nightmare. I may look at trying to get a 2080 or 2070. I wonder if I can do that on a 550 or 600 watt PSU.

  • Kadath (she/her)@lemmy.world · 13 points · 10 months ago

    Depends on the games. My 980 Ti can still rock 3440x1440 in most of the games I play.

    The fact that what I play is mostly metroidvanias shouldn’t be an issue, right? 😅

    • smooth_jazz_warlady@lemmy.blahaj.zone · 5 points · 10 months ago

      My 980 Ti still holds up pretty well at 1920x1440 (high-end CRT monitors were beautiful things; restart production, you cowards) in most 3D games I play on Linux, but it is starting to have performance issues in some games, and I’m getting real sick and tired of the dumb shit Nvidia keeps pulling with their Linux drivers. The current driver gives me horrible black flickering in a lot of games, and of course it arbitrarily locks me out of maxing out my CRT monitor, which doesn’t have a fixed resolution, only a balance of resolution vs refresh rate, by blocking me from a whole range of refresh rate/resolution combinations. So I confess I am starting to eye the higher-end AMD 6xxx GPUs, and I would definitely try to grab one as cheaply as I could if I ever got a 3440x1440 ultrawide.
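
      Rough math on that resolution-vs-refresh tradeoff, as a sketch with hypothetical numbers (a CRT’s real budget is its maximum horizontal scan rate; the 121 kHz here is roughly FW900-class, and the ~5% vertical blanking overhead is a guess that varies by modeline):

      ```python
      def max_refresh_hz(h_scan_khz: float, visible_lines: int,
                         v_blank: float = 1.05) -> float:
          """Refresh ceiling for a CRT: horizontal scan rate / total scanlines."""
          total_lines = visible_lines * v_blank  # visible lines + blanking
          return h_scan_khz * 1000 / total_lines

      for lines in (1080, 1200, 1440):
          print(lines, round(max_refresh_hz(121, lines)))  # 1080: ~107, 1440: ~80
      ```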

      Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?

      • Kadath (she/her)@lemmy.world · 2 points · 10 months ago

        > Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?

        Awesome. For work (even though I am a Linux system engineer) I need to use W11 due to corporate policy. I have two 34" monitors in landscape and a 27" in portrait. I split the screens with FancyZones.

        Time for my bad drawing skills, lol.

        In order:

        1. SSH
        2. SSH
        3. SSH
        4. Outlook
        5. Edge for work
        6. Teams
        7. Firefox with YouTube running. Firefox is the only browser that allows for in-window full screen.

        • smooth_jazz_warlady@lemmy.blahaj.zone · 1 point · 10 months ago

          I see

          I’m debating getting a 3440x1440 monitor for coding, and because I hear they work well with tiling window managers (hence the question). It’s just annoying that I have almost no chance to try one out for free, and the cost is enough that I wouldn’t get one without serious consideration first. Although you have nudged me a bit closer to “maybe I could get one without testing it first, if it’s second-hand and cheap(er).”

          Also, I’d be replacing my existing 27-inch LCD with it and keeping the 4:3, 21-inch CRT, for a highly cursed monitor setup where everything gets letterboxed or pillarboxed. And then, to make things worse, I could grab a 16:10 monitor to put in portrait beside one of the other two, for maximum “what is 16:9 and why do I have black bars on everything.”

  • Mars@beehaw.org · 13 points · 10 months ago

    My 970 seems as old as the wheel if you read threads like this.

    I really need to upgrade my rig…

    • MystikIncarnate@lemmy.ca · 2 points · 10 months ago

      I had the 3 GB version of the 1060 and it just couldn’t hack it anymore. I picked up a 20-series card this year and it was such an improvement.

        • MystikIncarnate@lemmy.ca · 2 points · 10 months ago

          Yeah. I’m about 90% sure that if I had gotten the 6 GB 1060 and hadn’t gotten a really good deal on a 2080, I’d probably still be using the 1060.

          For now, though, I don’t suspect I’ll be replacing the 2080 anytime soon. So when the 50 series comes out, this meme will be me with my 2080.

    • JPSound@lemmy.world · 1 point · 10 months ago

      Same here. Works great and does all I need it to. It would be nice to have a new GPU but I’m driving this one until the wheels fall off.