Honestly, training your eyes to be able to easily tell the difference between framerates seems like one of the worst things you can do to yourself.
??????????????????????
So long as it's stable, I can't tell what a game's frame rate is (unless I've just been playing something with a different frame rate). Bloodborne at 30 fps? Didn't bother me at all.
I don't see how becoming one of the "30 fps is unplayable" people would improve my life as a gamer in any way.
You're acting like we wanted this lol
Yeah, I didn't train for nothing. Metroid Prime, for example, was super obviously running at 60 FPS which was really rare for 3D games at that time.
I don't know how people can't see the difference. I get not caring, but fully not being able to tell any difference? Seems so strange to me.
For me the diff is a feeling more than a visual. Like it's pretty obvious from when I do an input and the resulting action whether it's 30 or 60fps. 30fps is just so sluggish.
Morninglord:
That's frame timing.
(PSN: Morninglord) (Steam: Morninglord) (WiiU: Morninglord22) I like to record and toss up a lot of random gaming videos here.
Speak of the devil: A Plague Tale: Requiem unlocked at midnight and I played it for a bit because it's a sequel I'm very excited about. I finished the opening chapter.
It does not have a 60 fps mode. It’s locked to 30 fps and that looked passable to my eyes, but apparently it also has a 40 fps mode if you have a 120 hz TV. I enabled 120 hz (I have it turned off unless a game will make use of it), and 40 fps is very good. Honestly, the main reason I was very adamant about future-proofing and making sure I had a 120 hz TV at the beginning of this gen was 40 fps and it’s really paying off in this game. I’m betting that by the end of this gen it’ll be my most common mode if more devs jump on this bandwagon.
Interestingly, there’s no menu choice to select the different modes (or even tell you they exist). It’ll default to 40 fps if it’s possible, and 30 if it’s not. Is this the first console game where the upper cap is 40? I’ve seen games with a 60 fps and a 40 fps option, but I can’t think of one with only 30 fps and 40 fps.
I still don't fully understand this variable refresh rate thingamajig. My brain still operates under the understanding that framerate is mostly tied to the power of the machine and the game engine. If there's too much shit going on that it can't handle, then you get a lower framerate. Ok maybe the TV literally can't display 120 fps so that's why you need one that supports it, I can sort of get that. But... if the game can run at 40 fps... why can't it just freakin' run at 40 fps? What on earth can the TV be possibly doing to pull an extra 10 frames?
"The sausage of Green Earth explodes with flavor like the cannon of culinary delight."
Morninglord:
OK, so TVs etc. have refresh rates, which is the number of times the TV will redraw the entire screen per second. So at 60hz it will refresh and redisplay the entire screen 60 times per second.
Frame rate, however, is how fast the game (PC, console, etc.) is actually capable of sending frames to the TV.
If you have a fixed refresh rate, but your frame rate varies (as often happens in games), you can have a circumstance where the console sends a frame to the TV in the middle of a refresh, so you get weird visual stuff happening. You might get half a frame's worth of information displayed instead of the full frame, so you get visual disruptions and it doesn't feel as smooth. This is tearing.
VRR means the TV will synchronise with the incoming frame rate. If the frame rate slows, the refresh rate slows. If the frame rate increases, the refresh rate increases. This results in a smoother image.
Now, games without VRR deal with this by fixing the frame rate so it can't be out of sync with the refresh rate. So instead of allowing the game to hover around 40, dipping down to 35, occasionally going up to 42, etc., which causes tearing, it just says "you can reliably do 30, so you only do 30, and that's that". This fixes the issue, but means the consoles aren't actually outputting full power.
With a VRR TV, you can let the game do its inconsistent 40 without any visual distortion from desynching, as the TV handles the syncing, so the console can output whatever it wants.
It isn't that the TV magically makes the console able to run the game faster. It's just that it allows it to run as fast as it's actually capable of without causing the problems that fixed frame rates were previously needed to solve.
You can think of it as two people, A and B, cooperating to throw things to each other. B previously could only catch things at a fixed rate, so A had to do the accommodating by throwing at a fixed speed so nothing was dropped. With VRR, B can now adapt to however fast or slow A wants to throw things.
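For what it's worth, the fixed-refresh vs VRR behaviour described above can be sketched in a few lines of Python. The numbers are made up; it's just a toy model of frames finishing mid-refresh:

```python
import random

random.seed(1)

REFRESH_HZ = 60
refresh_ms = 1000.0 / REFRESH_HZ  # the TV redraws every ~16.67 ms

# A game doing an "inconsistent 40": frame times wobble around 25 ms.
frame_times = [random.uniform(22.0, 29.0) for _ in range(200)]

# Fixed refresh: if a frame finishes mid-refresh, one redraw shows part
# of the old frame and part of the new one -> tearing.
tears = 0
t = 0.0
for ft in frame_times:
    t += ft
    # tear unless the frame happens to land right on a refresh boundary
    if t % refresh_ms > 0.5:
        tears += 1

# VRR: the TV waits for each frame and refreshes when it arrives,
# so every frame is aligned by construction.
vrr_tears = 0

print(f"fixed 60 Hz: {tears}/{len(frame_times)} frames tear")
print(f"VRR:         {vrr_tears} tears")
```

Almost every frame misses a refresh boundary at a fixed 60 Hz, which is exactly why consoles either tear, double frames (judder), or lock to a safe cap.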
I will say that I can for sure tell the difference between 1080p and 4k in PS5 games but only when watching my kid play them. Like, Horizon: Forbidden West is a treat visually, but when I am the one driving I am way too focused on whichever machine(s) is/are trying to murder me at the moment to fully enjoy it. Like a kind of inverse of foveated rendering, I guess? I can only really “see” the difference when the stakes are very low.
I run into that a lot when it comes to performance/quality modes. Like, the quality modes do tend to look a bit nicer, but in motion they both pretty much look the same and the increased fps is much more noticeable in motion as well.
DemonStacey:
I am sad it is not in time for Ragnarok.
But yea imma grab one of those to never click a stick again haha.
Man, you should write tech manuals. If the documentation that came with my TV read like this I would read it for fun.
... I guess I'd pay that much if it had a decent battery.
Heh, I forgot to complain about that when I got the PS5, it was easily the worst thing about it. At first I was not thrilled that I was forced to buy an extra controller, but I was so glad during my first marathon session of GoW as I’d alternate between charging one and playing with the other.
US price is not that crazy, though. It’s right between an Elite ($180) and a Scuf ($220), and it has haptics. European prices are a different question, though. €240 is high.
Morninglord:
I freelance for a game guide company sometimes, and I also work in the tech section of a department store, where I have to explain arcane magical bullshit to people so they know what the magic runes on the boxes mean when they're trying to buy tech stuff.
I just play the PS5 with the controller wired half the time. That battery life absolutely sucks :< I wish Sony would finally stop having integrated batteries and let me slap in my own rechargeable batteries. Another reason why the Xbox controller is significantly better. Also would mean that I wouldn't have a dead controller in 5 years when the already short battery life tanks to nothing.
DemonStacey:
I have luckily not had issues there so far. But if I'm playing all day or something I'll always plug it back in when I stop to eat or whatever, and that has been enough that I've never seen a low battery warning pop up yet.
I can't find a cable that doesn't suck ass that isn't 4ft long. I have some nylon ones that don't get all curved and caught around things but they're heavier than normal and pull it down a bit. And any of the > 8ft cables I buy are also way thicker than normal and are heavy too.
e: I have like 5 USB-C cables in the room and I can't find one I like to actually use. And sometimes they don't charge in my outlets so I have to run it to the console... But then I can't charge my phones... Ugh. I fucking hate it.
The PS4 controller also has an underwhelming battery but there was an easy fix for it. You could buy a battery with literally double the charge and it only took about 10 or 15 minutes to install into the controller. I haven't looked into the PS5 controller batteries yet because I have 2 and I have charging cables I can use while I play. I am just glad they made the switch to USB-C at least.
I bought another controller that I've been using sparingly, as this launch controller keeps having drift issues. It's actually playing nice so far, I'm just tired of having to open the whole thing up to clean it. Also "supposedly" this new model of controller has different analog parts to help... but I doubt it. Either way, having two controllers has given me the flexibility to be a lot lazier in charging them. And I'm finding out that when the low battery warning pops on screen, the controller still has a good hour and a half to it.
As annoying as the price is going to be, I actually think this controller is going to help with these drift issues. If the whole analog stick is removable, then it should be a lot easier to clean them. And an adjustable deadzone setting will do wonders. These sticks are just too fucking sensitive. The barest minimum pressure will cause my character to move in Genshin Impact. Just a 1% global deadzone setting and I betcha 90% of all drift issues people have will vanish.
I took my PS4 controller apart and couldn't get it back together to sit properly. I'm just not that inclined.
Yeah, I guess I should have made it clearer that it is a bit tricky. Of the three I changed out, one of them has very sticky buttons. Like when you press the button it makes an audible click and you have to press pretty hard. It still works, but it is definitely a bit impaired.
Gamertag: KL Retribution
PSN:Furlion
DemonStacey:
Having cracked open a PS5 controller for some back buttons, it doesn't seem like it would be too much harder to do a battery upgrade if you had a good option.
I have to think battery life might be a tiny bit better if there wasn't this giant useless glowing light coming out the back of your controller at all times.
The batteries have always been front and center in all the DualShocks. The hardest part is probably just pulling the connector pins out if you've got fat fingers.
Aegeri:
I have taken 3 days off work from the 9th to 11th of November.
I have no interest in ever cracking open a controller. Feels like it'd be too easy to mess stuff up, especially with analog sticks and whatnot. And really, it should never come to that anyway if they had just designed them better.
Every time controller battery life comes up, I feel like I'm living on Mars or something. I bought two controllers to handle couch co-op games and one sits on a charger whilst I use the other. I use the one on the couch until it goes flat, then swap them out. I think I change them every 5 to 7 days, which seems like a pretty decent battery life to me. And yeah, when the "low battery" warning pops up, I usually get another day of use out of them before they actually disconnect.
This is exactly my point. Instead of just having rechargeable batteries you can swap out, you have a second $70 controller you can swap out, both of which will have their batteries diminish in a few years.
For me the diff is a feeling more than a visual. Like it's pretty obvious from when I do an input and the resulting action whether it's 30 or 60fps. 30fps is just so sluggish.
That's frame timing.
Do you mean frame pacing? Because no, it's not that. Yes, a frame at 60 is shorter than a frame at 30, but no, that doesn't necessarily equate to the jitters from terrible frame pacing. A 60fps game can have bad frame pacing and a 30fps game can have good frame pacing.
Isn't it also that (even without VRR) 40 frames divides into 120hz nicely so there are no uneven frames? Whereas if you try to push 40 frames into 60hz, you get every second frame showing for twice as long, or other weirdness.
I am definitely *not* an expert so I may be off base here.
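That matches my understanding: without VRR, each frame has to be held for a whole number of refreshes. A quick arithmetic sanity check (the helper name is made up, it's just the division):

```python
def refreshes_per_frame(fps: int, refresh_hz: int) -> float:
    """How many display refreshes each game frame has to be held for."""
    return refresh_hz / fps

# 40 fps on a 120 Hz panel: every frame is shown for exactly 3 refreshes,
# so pacing is perfectly even.
print(refreshes_per_frame(40, 120))  # 3.0

# 40 fps on a 60 Hz panel: 1.5 refreshes per frame is impossible, so
# frames alternate between 1 and 2 refreshes -> judder.
print(refreshes_per_frame(40, 60))   # 1.5
```

The same logic is why 30 fps works fine on a 60 Hz set (exactly 2 refreshes per frame) but 40 fps never did until 120 Hz TVs came along.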
Switch Friend Code: SW-3944-9431-0318
PSN / Xbox / NNID: Fodder185
Yes, a fixed 40@120hz is the new hotness, which made it a slightly unfortunate example. But the logic of everything said remains accurate: to stay smooth at 40 without VRR you're either still leaving some performance on the table, or else it's dipping below 40 often and getting jittery. It works the same at 30 or even 60 FPS.
I feel like every generation goes through the same motions.
At the start: 60 frames per second! Max resolution! Best graphics ever! But games haven't tapped into the full power of the machine yet. It runs dead silent.
Midway: 30 fps, pretty solid performance. Now we know what the machine can properly handle. Expect to be able to hear the fan at this point.
Near the end: Depending on the developer, everything is a struggle. Frame drops everywhere as the machine is pushed to its absolute limit. The fan is a goddamn jet engine now.
We'll see how long this lasts, but I would not put stock in the 60fps renaissance lasting very long. When it comes to performance or fidelity, the former will always be sacrificed for the latter. We'll see how long this golden age of load times lasts, too. If they can sacrifice load times for moar powar, they'll do it.
It depends on the game and what the developer is trying to achieve. You can often get more out of the same console as the devs get better at developing for it. Look at early-gen Xbox 360 games vs late-gen. I think the current trend of two modes, performance and quality, is probably more likely than everything going back to 30.
The Gotham Knights thing seems related to their co-op mechanic being crucial to the game, which can be a nightmare to get working, so that's more likely the culprit than any broader trend back to 30. Developing around co-op, especially in a large world, is tough to get working without compromises, especially if there are other headwinds in the engine or development.
Morninglord:
That's frame timing.
Do you mean frame pacing? Because no, it's not that. Yes, a frame at 60 is shorter than a frame at 30, but no, that doesn't necessarily equate to the jitters from terrible frame pacing. A 60fps game can have bad frame pacing and a 30fps game can have good frame pacing.
Bad frame pacing is a consequence of inaccurate frame timing. Frame timing inconsistencies are the cause; pacing issues are the effect.
Both 30fps and 60fps are smooth if the frame timing is accurate and doesn't vary.
However, I didn't fully understand you before, and it seems like you are talking about game feel, which can also vary based on frame timing, since that's not just refreshing the image: the underlying logic loop of the game, including inputs, is dependent on it as well. Slower frame timing means longer input updates (33.333ms at 30fps vs 16.667ms at 60fps per update, so it updates your inputs 60 times a second vs 30). It can make things feel sluggish.
But that's not based on fps, it's based on frame timing. A 60fps game with bad frame timing will also fail to update your inputs smoothly, and it won't show the image smoothly either, so the benefits of the higher frame rate (being able to see more of the animation) are lost. You aren't really seeing more information in an animation if it's duplicating frames constantly to keep up. In that case it will look jittery and it will also "feel" erratic to play, as inputs are all over the place without smooth consistency. You'll press things at the same timing, but it won't visually update what you did at the same timing.
Higher fps means frame timing has to be faster, which is more taxing. If the machine can't keep up, it's worse than 30fps with a solid frame timing that can actually be achieved. If you can achieve it, you reduce the time between input updates, resulting in increased responsiveness.
Naturally, 60fps with solid frame timing is just better than 30fps with solid frame timing, but it's always a compromise in games like this.
Frame timing is the key thing you need to look at, not frame rate. You need frame timing to be consistent. It solves frame pacing issues and it makes the game feel smooth to play. If you can get that at 60fps, great! If you can't (and you often can't on consoles; that's why so many games in previous generations locked to 30), you are better off locking your fps to something achievable.
edit: I just realised the easiest way to describe the input thing is input lag. Faster frame time, less input lag. Inconsistent frame time, variable input lag. Very frustrating. (Note, of course, that it's not the only thing that influences input lag. If you have smooth frame timing at a high frame rate and the game still has inconsistent input shenanigans, the game just has crap code for handling inputs.)
edit2: I should also explain that most games usually lump all your polled inputs (polling is how often your computer accepts inputs from the controller) into each frame. But this isn't necessary; you can have polled inputs accepted and acted on independently of the frame rate. An example of lumping polled inputs is Overwatch. With low frame rates in Overwatch, if you sweep your mouse over the head of a target, fire, and keep going, you'd think the game takes the mouse input, then fires, then continues to take the mouse input, in sequence. But it doesn't do this. Instead, if you sweep and fire fast enough it will lump it all into the frame and update it one frame later, so you'll actually completely miss your target even though you fired accurately on the head. With higher frame rates there's less "time" for this to happen in, so it's less likely to happen. If you make the polling happen independently of the frame rate and frame timing, it'll process things in the right order as fast as you allow the code to accept the inputs and act on them, so you'll actually hit the target no matter how fast you snap and shoot.
You can see why a variable frame rate, and variable frame timing, would cause problems in a game like Overwatch. The timing of when it allows the shot to actually happen is suddenly all over the place. You get 16.667ms at 60fps; if the frame rate drops, the time increases so you miss by even more, and if it rises, the time decreases so the shot is more accurate to where you aimed. But if you've adjusted to a slower average and it sometimes goes faster, etc. etc., it's a bit of a nightmare.
So, here's a thing. I've got my PS5 set to turn on my TV automatically when I turn my PS5 on. And then half the time it just... doesn't. There's no rhyme or reason to why it does this. Anyone run into this before?
Hate to say that it's probably the most consistent thing on my side. Is it a newer HDMI cable? Is your TV setup to fully support it? It's possible your TV has 2 off states: off and sleep.
For me the diff is a feeling more than a visual. Like it's pretty obvious from when I do an input and see the resulting action whether this is 30 or 60fps. 30fps is just so sluggish.
That's frame timing.
It does not have a 60 fps mode. It’s locked to 30 fps and that looked passable to my eyes, but apparently it also has a 40 fps mode if you have a 120 hz TV. I enabled 120 hz (I have it turned off unless a game will make use of it), and 40 fps is very good. Honestly, the main reason I was very adamant about future-proofing and making sure I had a 120 hz TV at the beginning of this gen was 40 fps and it’s really paying off in this game. I’m betting that by the end of this gen it’ll be my most common mode if more devs jump on this bandwagon.
Interestingly, there’s no menu choice to select the different modes (or even to tell you they exist). It’ll default to 40 fps if it’s possible, and 30 if it’s not. Is this the first console game where the upper cap is 40? I’ve seen games with both 60 fps and 40 fps options, but I can’t think of one with only 30 fps and 40 fps.
OK, so TVs (and monitors) have refresh rates, which is the number of times the TV redraws the entire screen per second. So at 60Hz it will refresh and redisplay the entire screen 60 times per second.
Frame rate, however, is how fast the game (PC, console, etc.) is actually capable of sending finished frames to the TV.
If you have a fixed refresh rate but your frame rate varies (as often happens in games), you can have a circumstance where the console sends information to the TV in the middle of a refresh, so you get weird visual stuff happening. You might get half a frame's worth of information displayed instead of a full frame, so you get visual disruptions, and it doesn't feel as smooth. This is tearing.
VRR means the TV will synchronise with the incoming frame rate. If the frame rate slows, the refresh rate slows. If the frame rate increases, the refresh rate increases. This results in a smoother image.
Games without VRR deal with this by fixing the frame rate so it can't be out of sync with the refresh rate. So instead of allowing the game to hover around 40, dipping down to 35, occasionally going up to 42, etc., which causes tearing, it just says "you can reliably do 30, so you only do 30, and that's that". This fixes the issue, but it means the console isn't actually outputting its full power.
With a VRR TV, you can let the game do its inconsistent 40 without any visual distortion from desynching, as the TV handles the syncing, so the console can output whatever it wants.
It isn't that the TV magically makes the console able to run the game faster. It's just that it allows the game to run as fast as it's actually capable of without causing the problems that previously needed fixed frame rates to solve.
You can think of it as two people A and B cooperating to throw things to each other. B previously could only catch things at a fixed rate, so A had to do the accommodating by throwing at a fixed speed so nothing is dropped. With VRR, B can now adapt to however fast or slow A wants to throw things.
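To make the tearing point concrete, here's a toy Python sketch. In this simplified model, a frame "tears" if it arrives strictly in the middle of a fixed-rate refresh interval rather than on a boundary. The numbers and the tear test are illustrative only, not how a real display pipeline works.

```python
# Toy model: frames that arrive mid-refresh on a fixed-rate display "tear".
# All times in milliseconds; purely illustrative.

def count_tears(frame_times, hz):
    """Count frames that arrive strictly inside a refresh interval,
    i.e. not aligned with a refresh boundary (with a small float tolerance)."""
    step = 1000.0 / hz
    tears = 0
    for t in frame_times:
        offset = t % step  # distance past the previous refresh boundary
        if 0.001 < offset < step - 0.001:
            tears += 1
    return tears

# A game locked to 30fps on a 60Hz display: every 33.33ms frame spans
# exactly two 16.67ms refreshes, so every frame lands on a boundary.
locked_30 = [i * (1000.0 / 30) for i in range(30)]
# An unlocked 40fps game on the same 60Hz display: 25ms frames land
# mid-refresh every other frame.
unlocked_40 = [i * (1000.0 / 40) for i in range(40)]

print(count_tears(locked_30, 60))    # 0
print(count_tears(unlocked_40, 60))  # 20 -> every other frame arrives mid-refresh
```

This is why the fixed-cap strategy above works: locking to a rate that divides the refresh rate keeps every frame on a boundary, while VRR instead moves the boundaries to match the frames.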
https://blog.playstation.com/2022/10/18/dualsense-edge-wireless-controller-for-ps5-launches-globally-on-january-26/
I run into that a lot when it comes to performance/quality modes. Like, the quality modes do tend to look a bit nicer, but in motion they both pretty much look the same and the increased fps is much more noticeable in motion as well.
But yea imma grab one of those to never click a stick again haha.
$200!?!?!?
... I guess I'd pay that much if it had a decent battery.
Man, you should write tech manuals. If the documentation that came with my TV read like this I would read it for fun.
Heh, I forgot to complain about that when I got the PS5, it was easily the worst thing about it. At first I was not thrilled that I was forced to buy an extra controller, but I was so glad during my first marathon session of GoW as I’d alternate between charging one and playing with the other.
US price is not that crazy, though. It’s right between an Elite (180) and a Scuf (220), and it has haptics. European prices are a different question though. €240 is high.
I freelance for a game guide company sometimes, and I also work in the tech section of a department store, where I have to explain arcane magical bullshit to people so they know what the magic runes on the boxes mean when they're trying to buy tech stuff.
e: I have like 5 USB-C cables in the room and I can't find one I like to actually use. And sometimes they don't charge in my outlets so I have to run it to the console... But then I can't charge my phones... Ugh. I fucking hate it.
https://www.amazon.com/gp/product/B08PVPTNZL/
That's the one I got. Works fine for me. Idk if it meets your specifications, tho.
PSN:Furlion
As annoying as the price is going to be, I actually think this controller is going to help with these drift issues. If the whole analog stick is removable, then it should be a lot easier to clean them. And an adjustable deadzone setting will do wonders. These sticks are just too fucking sensitive. The barest minimum pressure will cause my character to move in Genshin Impact. Just a 1% global deadzone setting and I betcha 90% of all drift issues people have will vanish.
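To make the deadzone idea concrete, here's a toy Python sketch of a radial deadzone. The 1% figure comes from the post above; the function name and the rescaling choice are illustrative, not any console's actual implementation.

```python
import math

def apply_radial_deadzone(x, y, deadzone=0.01):
    """Zero out stick input whose magnitude is below the deadzone, and
    rescale the rest so output still ramps smoothly from 0 to 1.
    x, y are raw stick axes in [-1, 1]; deadzone=0.01 is the 1% from above."""
    mag = math.hypot(x, y)
    if mag < deadzone:
        return 0.0, 0.0
    # rescale so magnitude runs 0..1 over the remaining range
    scaled = (mag - deadzone) / (1.0 - deadzone)
    return x / mag * scaled, y / mag * scaled

# Drift-level noise disappears entirely:
print(apply_radial_deadzone(0.005, 0.0))  # (0.0, 0.0)
# Deliberate input passes through almost unchanged (~0.798):
print(apply_radial_deadzone(0.8, 0.0))
```

The rescaling step matters: without it, crossing the deadzone edge would make your character jump from standing still to 1% speed instantly instead of ramping up from zero.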
I took my PS4 controller apart once and couldn't get it back together so it sat properly.
Yeah I think I bought this one too. I just don't think I like nylon cables.
Yeah, I guess I should have made it clearer that it is a bit tricky. Of the three I changed out, one of them has very sticky buttons. Like when you press the button it makes an audible click and you have to press pretty hard. It still works, but it is definitely a bit impaired.
The batteries have always been front and center in all the dualshocks. The hardest part is probably just pulling the connector pins out if you've got fat fingers.
No particular reason why.
Just felt like it.
You know?
Does this new edge controller have weight to it? I really LOVE the feel of my Elite.
This is exactly my point. Instead of just having rechargable batteries you can swap out, you have a second $70 you can swap out, both of which will have their batteries diminish in a few years.
Do you mean frame pacing? Because no, it's not that. Yes, a frame at 60 is shorter than a frame at 30, but no, that doesn't necessarily equate to the jitters from terrible frame pacing. A 60fps game can have bad frame pacing and a 30fps game can have good frame pacing.
Isn't it also that (even without VRR) 40 frames divides into 120Hz nicely so there are no uneven frames? Whereas if you try to push 40 frames into 60Hz, you get every second frame showing for twice as long, or other weirdness.
I am definitely *not* an expert so I may be off base here.
PSN / Xbox / NNID: Fodder185
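The arithmetic behind this checks out, and a tiny Python sketch (purely illustrative) shows why 40fps needs a 120Hz display: each game frame must cover a whole number of refreshes, or some frames display longer than others and you get judder.

```python
def refreshes_per_frame(fps, hz):
    """How many display refreshes each game frame should cover.
    A non-integer result means uneven frame display (judder)."""
    return hz / fps

print(refreshes_per_frame(40, 120))  # 3.0 -> every frame shown for exactly 3 refreshes
print(refreshes_per_frame(30, 60))   # 2.0 -> even pacing, the classic console lock
print(refreshes_per_frame(40, 60))   # 1.5 -> frames alternate 1 and 2 refreshes: judder
```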
At the start: 60 frames per second! Max resolution! Best graphics ever! But games haven't tapped into the full power of the machine yet. It runs dead silent.
Midway: 30 fps, pretty solid performance. Now we know what the machine can properly handle. Expect to be able to hear the fan at this point.
Near the end: Depending on the developer, everything is a struggle. Frame drops everywhere as the machine is pushed to its absolute limit. The fan is a goddamn jet engine now.
We'll see how long this lasts, but I would not put stock in the 60fps renaissance lasting very long. When it comes to performance versus fidelity, the former will always be sacrificed for the latter. We'll see how long this golden age of load times lasts too. If they can sacrifice load times for moar powar, they'll do it.
The Gotham knights thing seems related to their coop mechanic being crucial to the game, which can be a nightmare to get working in a game so that’s more likely the culprit than any broader trend to go back to 30. Developing around coop especially in a large world is tough to get working without compromises, especially if there are other headwinds in the engine or development
Bad frame pacing is a consequence of inaccurate frame timing. Frame timing inconsistencies are the cause; pacing issues are the effect.
Both 30fps and 60fps are smooth if the frame timing is accurate and doesn't vary.
However, I didn't fully understand you before, and it seems like you are talking about game feel, which can also vary based on frame timing, since that's not just refreshing the image; the underlying logic loop of the game, including inputs, depends on it as well. Slower frame timing means longer gaps between input updates (33.333ms at 30fps vs 16.667ms at 60fps per update, so it updates your inputs 30 times a second vs 60). It can make things feel sluggish.
But that's not based on fps, it's based on frame timing. A 60fps game with bad frame timing will also fail to update your inputs smoothly, and it won't show the image smoothly either, so the benefits of the higher frame rate (being able to see more of the animation) are lost. You aren't really seeing more information in an animation if it's duplicating frames constantly to keep up. In that case it will look jittery and it will also "feel" erratic to play, as inputs are all over the place without a smooth consistency. You'll press things with the same timing, but it won't visually update what you did with the same timing.
Higher fps means frame timing has to be faster, which is more taxing. If the machine can't keep up, it's worse than 30fps with a solid frame timing that can actually be achieved. If you can achieve it, you reduce the time between input updates, resulting in increased responsiveness.
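A toy Python sketch of the "same average fps, different feel" point. Both runs below average 60fps, but only one of them would feel smooth; what matters is how consistent the individual frame times are. The erratic frame times are made up for illustration.

```python
from statistics import mean, pstdev

# Frame times in milliseconds. Both sequences average ~16.7ms (~60fps).
smooth  = [16.7] * 12
erratic = [10, 25, 12, 22, 11, 20, 14, 24, 13, 21, 12, 16.4]

for name, times in (("smooth", smooth), ("erratic", erratic)):
    avg = round(mean(times), 1)      # same average for both: 16.7ms
    jitter = round(pstdev(times), 1) # 0.0 for smooth, ~5.2 for erratic
    print(f"{name}: {avg} ms avg, {jitter} ms jitter")
```

A frame-rate counter would report 60fps for both, which is exactly why frame timing, not frame rate, is the number to watch.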
Naturally 60fps with solid frametiming is just better than 30fps with solid frametiming, but it's always a compromise in games like this.
Frame timing is the key thing you need to look at, not frame rate. You need frame timing to be consistent. It solves frame pacing issues, and it makes the game feel smooth to play. If you can get that at 60fps, great! If you can't (and on consoles you often can't, which is why so many games on previous generations locked to 30), you are better off locking your fps to something achievable.
edit: I just realised the easiest word to describe the input thing is input lag. Faster frame time, less input lag. Inconsistent frame time, variable input lag. Very frustrating. (Note, of course, it's not the only thing that influences input lag. If you have smooth frame timing at a high frame rate and the game still has inconsistent input shenanigans, the game just has crap code for handling inputs.)
edit2: I should also explain that most games usually lump all your polled inputs (polling is how often your computer accepts inputs from the controller) into each frame. But this isn't necessary. You can have polled inputs accepted and acted on independently of the frame rate. An example of lumping polled inputs is Overwatch. At low frame rates in Overwatch, if you sweep your mouse over the head of a target, fire, and keep going, you'd think the game is taking the mouse input, then firing, then continuing to take the mouse input, in sequence. But it doesn't do this. Instead, if you sweep and fire fast enough, it will lump it all into the frame and update it one frame later, so you'll actually completely miss your target even though you fired accurately at the head. With higher frame rates there's less "time" for this to happen in, so it's less likely to happen. If you make the polling happen independently of the frame rate and frame timing, inputs are processed in the right order as fast as the code can accept and act on them, so you'll actually hit the target no matter how fast you snap and shoot.
You can see why a variable frame rate and variable frame timing would cause problems in a game like Overwatch. The timing of when it allows the shot to actually happen is suddenly all over the place. You get 16.667ms at 60fps; if the frame rate drops, that window grows, so you miss by even more; if it rises, the window shrinks, so the shot is more accurate to where you aimed. And if you've adjusted to a slower average and it sometimes runs faster, etc. etc. It's a bit of a nightmare.
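Here's a toy Python sketch of the two input-handling models described above. The class names and event format are made up for illustration; this is not Overwatch's actual code, just the behaviour the post describes: per-frame lumping resolves a shot against where the aim *ended up* that frame, while decoupled polling resolves it at the instant you fired.

```python
class PerFrameInput:
    """Inputs polled during a frame are resolved together at frame end,
    so a fast 'aim then fire' uses wherever the aim finished the frame."""
    def __init__(self):
        self.events = []

    def poll(self, event):
        self.events.append(event)

    def end_frame(self):
        aim = None
        fired_at = None
        for kind, value in self.events:  # first, settle the frame's final aim
            if kind == "aim":
                aim = value
        for kind, _ in self.events:      # then resolve fire against that aim
            if kind == "fire":
                fired_at = aim
        self.events.clear()
        return fired_at

class DecoupledInput:
    """Each input is applied the moment it is polled, in order."""
    def __init__(self):
        self.aim = None
        self.fired_at = None

    def poll(self, event):
        kind, value = event
        if kind == "aim":
            self.aim = value
        elif kind == "fire":
            self.fired_at = self.aim  # fire uses the aim at the instant of firing

# Sweep across a target at x=100, fire on it, keep sweeping, all in one frame:
sweep = [("aim", 90), ("aim", 100), ("fire", None), ("aim", 110)]

lumped = PerFrameInput()
for e in sweep:
    lumped.poll(e)
print(lumped.end_frame())   # 110 -> the shot misses; the aim kept moving

decoupled = DecoupledInput()
for e in sweep:
    decoupled.poll(e)
print(decoupled.fired_at)   # 100 -> the shot lands where you fired
```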