FF16. I’m talking about running quality mode (which still doesn’t guarantee native 4K; it upscales a lot of the time) at a higher FPS than 30 on the better PS5 Pro hardware.
Yeah, but some games like Final Fantasy run capped at 30 FPS, and they suck.
They should start moving those games from a fixed 30 FPS to a higher FPS. It doesn’t even need to be a fixed 60 FPS. If they enable VRR between 30 and 60/120, they will massively improve the experience.
I understand that some TVs can run VRR down to 20 or 30 Hz.
Honestly, I prefer this to Microsoft Edge stealing (importing) all my Chrome passwords and bookmarks without my consent, something it does from time to time.
After Microsoft Edge decided to import all my Chrome passwords and data, I decided to get rid of Windows as much as possible.
I think that upgrading it with just a 22% faster processor kind of kills it.
The current CCWGTV performance sucks.
The CCWGTV performance is shit at 4K. The UI is far from smooth, some UHD shows randomly drop frames, and sometimes I have to restart the device because Netflix and Prime stop working.
Eventually I’ll replace it; if the new device sucks, I might try Apple TV.
I mean it’s a tiny device after all.
I also now have an HDR 4K 144Hz VRR OLED TV, so it would be nice to get a device that supports that max resolution (like the PS5 does).
News exaggerate everything nowadays.
I think “populism rule” would fit a bit better.
I have a Dell laptop (Latitude 7390) where I changed an option in the BIOS to maximize battery longevity, and six years later the battery still lasts quite a long time.
So are offline (external) BitLocker drives that are unlocked with the key only, or internal BitLocker drives that are unlocked with AMD fTPM, excluded from this exploit?
All good unless you replace Hawaiian with Pineapple in that name!
One thing I also noticed is that my monitor (an LG 27GL850-B, which has 350 nits I think) is much easier and clearer to see in direct sunlight because of the anti-glare screen.
But I doubt that anti-glare/matte displays are a thing you find on TVs.
Thanks for the hints. So that means that in a bright room, a TV with 1500+ nits is ideal for HDR, right?
But even with a 1500-nit TV, will HDR still be much better in a dark room (where OLED shines)?
I have a budget Samsung 55" NU7400, and I can’t see shit while playing a PS5 game with HDR during the day. I need to close the blackout curtains, otherwise I see my face reflected.
For the next TV I buy, I will do some research and spend a bit more money: 120Hz, more nits, VRR, etc.
FF16 in quality mode runs at 1440p, locked at 30 FPS, upscaled to 4K. And in my experience, it sometimes drops resolution while playing.
In performance mode it runs at 1080p targeting 60 FPS. The lower resolution is very noticeable.
Both modes can be improved by better hardware. That’s my point.
Now, things like “30 FPS locked sucks” or “a performance mode targeting 1080p isn’t enough quality” are just my personal opinion. Of course some people enjoy 30 FPS or 1080p on a big 4K TV. I personally try to avoid 30 FPS locks or very low-resolution performance modes. Games like GT7 or Warzone, which can handle a lot of FPS at a high resolution (on the current PS5 hardware), are the ones I don’t complain about at all and truly enjoy. And for those two games I wouldn’t care about upgrading to better PS hardware.