Once again, gamers are being treated like idiots: the developers of LOTR: Gollum have handed us an apology for its awful launch.
This is beyond a joke at this point. LOTR: Gollum was released recently with NVIDIA regularly boasting about its DLSS and RTX capabilities, when in reality they should've left it in the bin, as the graphics rival those of a PS1 game. The UI utilises a very poor Calibri-like font, Gollum himself looks dreadful, and it wouldn't be a 2023 release if it didn't run like a one-legged spider being chased by a hammered dog. As with many other 2023 releases, several reports are claiming high VRAM usage and stuttering, which backs up everyone's complaints about the 8GB of VRAM on the 4060 Ti.
Following this release, the developers have published an apology letter to add to the massive collection of apology letters. You can read it in the tweet above, but it doesn't contain anything more than any of the others we've received and mostly boils down to "we're sorry that simply releasing a game that functions and isn't a disaster is completely beyond our capabilities as a company." Of course, I understand that most of the issues surrounding these games come from the muppets at the top who have no idea what they're doing and force the developers to work to ridiculous deadlines to churn out poor products. I also understand that we shouldn't stand for this, and change has to come very soon for the games industry, because if I have to read another one of these apologies I'm going to go insane.