What games do you play on GNU/Linux?

Yes, the graphics look amazing. If you don't mind, could you please list your current hardware, what screen resolution you are running at, and what average FPS you are getting in Crysis Remastered?

It would give me an idea of what I might expect when I get a chance to run it on my system.



In-game settings: What games do you play on Linux? - #185 by Michel

Specs: I upgraded my notebook a year and a half ago with 2x 4 TB Samsung 990 Pros and 2x 32 GB Samsung DDR4-3200 RAM.



Edit: Between 80 FPS (lowest) and 110 FPS (highest) on this map (the FPS counter is in the upper left of each shot).





Finished Crysis Remastered and installed Crysis 2 Remastered. Thought the FPS would be worse...but it is not :grin:

Settings are left at their defaults:









Graphics are looking very nice! I even remember that courtyard. No ray tracing though?


Did not try it yet, will try it later :slight_smile:

Update: Tried ray tracing set to performance mode, and it cost about 30 FPS. So nah, I'll keep the settings as they were auto-detected.

With ray tracing (FPS upper left):


Without ray tracing (FPS upper left):


I got a small FPS drop 2-3 times in the same map, and it recovers to around 100-140 FPS. If I remember right, this was also an issue years ago on my older setup. (FPS upper left)

So far the FPS is stable; Linux doesn't fall short for gaming at all. (FPS upper left)






My currently installed games:


I found this gem on Steam. It's called Jotunnslayer: Hordes of Hel, a bullet-hell roguelike where you level up, buy equipment, and so on. I've tried many of these kinds of games, but this one stands out. Oh, and it's cheap too.


Huh. Looks like a Survivors-like (one of the many games that took after Vampire Survivors) that put some actual effort into its graphics, which is nice. I enjoy the genre a lot, but don't favor pixel art, which the majority are. Hordes of Hel is early access still. How complete does it feel? I don't mind early access, but I prefer a "vertical slice" over lots of unfinished systems.

(A "vertical slice" is what we call a game in development at work when you have a small portion of content but the core systems are mostly complete. Demos of finished games could be considered vertical slices.)

So far it looks like there's tons of stuff in it. It doesn't feel "unfinished", and as you said, the graphics are good in this one. Moreover, the gameplay is more fun than others in this genre.


I know I liked Hades a fair bit when I played it, but I couldn't get over the style making it very hard for me to actually see my character when a lot of lighting effects were going on. This seems like a mix of Diablo and Survivors-likes; interesting.


Finished Crysis 2 Remastered. Good game, but it sadly had technical issues (random crashing). Picked up Crysis 3 Remastered, and so far it's running great. Not a single crash during gameplay. FPS peaks at 170, lowest around 76. Running with Proton Hotfix (for testing).

Game settings: I did change a few things. By default it enabled ray tracing, but for some reason the FPS drops insanely (30-40 FPS) with it on. Turning that off and setting DLSS to performance gives a great boost. The resolution stayed the same, except that by default it used 60 Hz while my monitor is 165 Hz. I also turned off vsync and changed textures from high to very high.


FPS screenshot: Just one, because if I take a screenshot the game doesn't return to focus afterwards :sweat_smile:. I know this can be fixed with a launch option in Steam, but I'm not going to add it.
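(The exact launch option isn't named above, so as a hedged sketch only, not necessarily the fix meant here: one commonly suggested workaround is wrapping the game in gamescope so fullscreen focus can't break when you tab out or take a screenshot. The resolution and refresh values below are just examples.)

```sh
# Steam > right-click the game > Properties > Launch Options
# Assumes gamescope is installed; match -W/-H/-r to your own display
gamescope -W 2560 -H 1440 -r 165 -f -- %command%
```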

Some random in-game shots:


Gun reflections:


Some more random in-game shots:





Ray tracing started off as a gimmick that nobody could really use at the time of its release, as most folks were still using Nvidia GTX 1660-class GPUs and few had Nvidia RTX 20-series cards.

We learned years ago that if you want to use ray tracing, you'd better have an Nvidia 3080 or better; not even the 3060 could produce useful FPS with it on!

Now, according to the latest news, ray tracing is coming in every game from now on, and you won't have an option to turn it off. I see it as their little trick to force you to buy a new computer.

Because, as you know, we can't just make fun games anymore; games are now used to push users into spending far more money than they should on new hardware, just to play a game.

I guess you could say it's always been that way. Having said that, gaming studios have never come off as scummy as they do today! So we're living in a time where, if you want to use all those fancy features in games, you'd better get that 4080/4090 or 5080/5090!

Or the other option is to switch to AMD, but their top cards don't remotely compete against Nvidia's best offerings. And Intel's Arc GPUs don't come close to offering an alternative to either of them.

A basic function that both Nvidia and AMD cards handle, DirectX 12, Intel can't even do without black-screening on people. So Intel is still in diapers when it comes to GPUs; I don't even consider them an option.

Intel is like a toddler still learning to walk; that's how I see their GPUs. They have years to go before they can even begin to compete against Nvidia and AMD.



I have a 3070 Ti with 8 GB (yeah, low by today's standards), but I can't get the FPS people post (assuming they have 12 GB or higher)... not even on Windows. If I use AMD's FSR I get boosted FPS too... which is weird :joy: (Jedi: Survivor).

Anyway, I'm an old gamer and I don't really care how all that tech stuff (DLSS/ray tracing/FSR) works... high settings are enough for me. If a game runs great and I can get above-average FPS, I'm fine with that. Games should look like games, not like real life, if you know what I mean. If a game doesn't run well, I play it on medium or try to find out which setting is causing the lag/FPS drops.

I rarely play new games; I think my latest "new" games are Star Trek: Resurgence, Jedi: Survivor, and Halo Infinite.


I'd be curious to know which game or games you've heard won't let people turn off ray tracing. Triple A PC gaming will always chase new tech, but that's hardly any different from before. You get games like Crysis which very few people could run acceptably at launch (thus the whole "will it run Crysis?"), but by and large game developers consider it very important to maintain options that lower system requirements.

For games that publish on consoles: The developer already needs it to run on consoles, so it's no extra work to keep requirements at or below the console requirement. PS5 and especially Xbox Series S aren't good at ray tracing and have lower power GPUs than a 2080 Ti. That was an expensive GPU at launch, but it's several cycles old now. Since they need it to run on console, it's also going to run on similarly powered PCs.

For PC exclusive games: MOST games that are PC only are indie these days and don't have the budget to chase top end graphics. They're also not on console, which limits their potential sales, and the developers don't want to limit them further by unnecessarily excluding older PCs.

The success of the Steam Deck further incentivizes not requiring ray tracing. It's just not up to it.

No matter how you slice it though, this isn't new. CGA graphics debuted in 1981. EGA debuted in 1984. VGA debuted in 1987, and that same year cards immediately started going to "Super VGA," which wasn't a real standard. XGA became a thing in 1990, and by the mid-'90s 3dfx had released its first 3D accelerators.

My first experience with Quake was in 1996 on a 486 that absolutely could not handle it; it ran at MAYBE ten FPS. When Crysis came out, that was nearly a slideshow for me as well.

You're not wrong that the game industry is more profit driven and more risk averse than ever. That's what happens when large companies buy small ones, and the industry has seen an insane amount of consolidation. But the industry has always chased the best graphics and progressively pushed new hardware for the bad reason that it's a very clear, unmistakable way to show something new and exciting, and for the much better reason that game developers really want to make things they find amazing, and staying where they are graphically doesn't do that.

There are exceptions in the indie space in particular, but a lot of the games that eschew high tech graphics there and go pixel art are still trying to look amazing in their more limited space. You can perhaps argue that triple A games should take the same tack and try to limit their use of expensive hardware, but I really do expect the vast majority of games to work on older machines for quite some time, barring optimization failures like the launch state of Cities Skylines 2.


I watch Linus Tech Tips, Gamers Nexus, and JayzTwoCents on YouTube; this was mentioned on one of them. This is just how it works: technology is an experiment until proven, then it becomes permanent. Just like how modern cars snoop on their users now, with no way to turn off the snooping features.


So no confirmed games not allowing people to disable ray tracing. Yes, there will come a time when ray tracing is a default feature. There also came a time when color graphics were a default feature, and there came a time when 3d acceleration became a default feature, and so on. And yes, there will be people still using older hardware who won't be able to use it. On the whole though, the industry is not eager to make it impossible to turn off ray tracing; it doesn't benefit them any. In addition to being computationally expensive, it doesn't replace anything. It's a feature that, once implemented, is easy to turn off, and it doesn't actually help them make money.

Compare that to the things that actually have become commonplace at user expense, like always online games, publisher specific launchers, and things like that. Publishers actually DO care about pushing those, because always online allows for them to do things like heat maps, showing what players are spending their time on (so they can add more features like it, or avoid wasting money on features they don't use). Launchers, meanwhile, let them directly advertise in a window that has no competitors.

Raytracing doesn't bring them that value. They have no reason to force it since it's an easy toggle (once implemented). If something is actually going to force hardware updates, it's much, MUCH more likely to be needing more VRAM. 8 GB is low end already, my 4090 has 24, and 5090s have 32 GB.

https://gamefaqs.gamespot.com/boards/916373-pc/80900563

This post confirms Linus's statement. All one has to do is a Google search to find these things. In the end, you can choose to believe what you want, but ray tracing is the future reality, and if you don't have a GPU that can do it, then you aren't playing that game. It's that simple.


I believe the new Indiana Jones game won't let you disable ray tracing. I think it'll be a long time before we get to the point that EVERY new game requires it, if ever. And even if it gets to that point someday, there are presently so many fantastic games released that if you quit working and played all day long, you'd never even get close to the end of the list. I don't even think anyone has the full list available; it's disgustingly large.


They did something particular with global illumination such that they weren't actually ray tracing in the usual sense, but needed the support:

Indiana Jones and the Great Circle uses a technique called global illumination to light the environment and characters. Our engine MOTOR, used by MachineGames, uses hardware raytracing to calculate this. While this isn’t what players may typically think of as “ray tracing,” the compatible hardware is required to do this in a performant, high-fidelity way.

So yeah, I stand corrected, though in order for it to run well on consoles, it can't demand more than first-gen RTX, which was six years old when Indiana Jones came out. I can understand and accept that not everyone can or wants to update their PC every five or six years, but I'd also say that the top tier of triple A still targeting six-year-old hardware isn't exactly rushing anyone to upgrade.


Honestly, just because a game comes to console doesn't always mean it will run well on that console.

I know two examples for the Nintendo Switch specifically (yes, I know it's underpowered compared to the PS5 and Xbox, but the examples I'm going to give have absolutely no excuse to run badly there):

  • Minecraft: you'd think the best-selling video game of all time, with pixelated graphics and known to run well on previous-gen hardware (including the Wii U), would care about running fine on the consoles it supports, right? Well... it kinda doesn't run very well on the Switch. In fact, I've even heard PS4 users have been having lag too. And having gotten many updates since launch is really not an excuse, as the version it launched with was the same as the last update the Wii U received (a slightly less powerful console). The Wii U still ran the game fine at 60 FPS, while the Switch struggled with the framerate on some versions, or with world loading speed on others.

  • Pokémon Scarlet/Violet: a Switch exclusive from one of the most profitable gaming companies, sometimes running at 1 FPS on the Switch... they kinda fixed it with updates, but you still drop to around 10 FPS sometimes.

Sometimes it feels like big AAA companies would rather sell very few copies of their games at a very high price than many copies at an affordable price.
