Also most monitors only go up to 60 fps, and even if you have a fancy monitor that goes higher, your OS probably doesn’t bother to go higher than 60 anyway. Even if the game itself says the fps is higher, it just doesn’t know that your PC/monitor isn’t actually bothering to render all the frames…
my man, just because you’ve never seen the refresh rate option in the monitor settings doesn’t mean it hasn’t been there since basically forever
That was true before high-framerate monitors were a thing, which was 10+ years ago…
no it wasn’t true back then either, CRTs were doing 100 Hz and more decades ago and it was very much supported by OSes and games
This is blatantly false.
Windows will do whatever frame rates the EDID reports the display as being capable of. It won’t necessarily pick the highest rate by default, but changing it is a simple toggle in the Settings app.
Macs support displays above 60 Hz these days, and some of the laptops even have one built in. They call it by some stupid marketing name (ProMotion), but it’s a 120 Hz display.
Linux requires more tinkering with modelines, and it’s complicated by the fact that you might be running either X or Wayland, but it’s supported as well.
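For anyone curious what that modeline tinkering actually involves: a modeline encodes a pixel clock plus horizontal and vertical timings, and the refresh rate falls out of those numbers. Here’s a minimal Python sketch that computes the refresh rate from a standard CVT modeline (the example mode is the output of `cvt 1920 1080`, not tuned for any particular monitor):

```python
# Sketch: what an X11 Modeline encodes. The vertical refresh rate is the
# pixel clock divided by the total (visible + blanking) pixels per frame.

def modeline_refresh(modeline: str) -> float:
    """Compute the refresh rate in Hz from a Modeline string."""
    fields = modeline.split()
    clock_mhz = float(fields[1])   # pixel clock in MHz
    htotal = int(fields[5])        # total pixels per scanline, incl. blanking
    vtotal = int(fields[9])        # total scanlines per frame, incl. blanking
    return clock_mhz * 1_000_000 / (htotal * vtotal)

# CVT 1920x1080 mode: name, clock, 4 horizontal timings, 4 vertical timings
mode = '"1920x1080" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync'
print(f"{modeline_refresh(mode):.2f} Hz")  # ~59.96 Hz
```

On X you’d feed such a line to `xrandr --newmode` and then `xrandr --addmode`; the `cvt` tool can generate one for a higher rate, e.g. `cvt 1920 1080 144`.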