What a nice board!
READ THE RULES https://zzzchan.xyz/v/custompage/rules.html
It's a conspiracy >>>/hikki/
>BTW New World fried my EVGA FTW3 3090. I have another 3090 that's an iCX version; played for 15 hours and it's not dead. Anyone know if it's exclusive to the FTW3?
>It is probably not a good idea to play New World right now. The closed Beta and Alpha builds of this game have reportedly been a reason for the bricking of GeForce RTX 3090 graphics cards, multiple users on the official game’s forum have reported.
>The issue appears to affect mainly GeForce RTX 3090 graphics cards, which are reportedly overheating and seeing power spikes. The game has an uncapped framerate in the main menus, which is usually accompanied by buzzing capacitors. Most users, however, have reported that EVGA RTX 3090 cards specifically are the most affected brand. A number of RTX 3090 cards have been bricked in the process.
>It is highly recommended not to play this game right now, at least not until developers publicly acknowledge the issue, not to mention release a patch.
>Thank you all for sharing your reports about this problem, we believe this is related with driver settings and frame rate limiters.
>1. Disable the overrides in the driver settings,
>2. Make sure to press “APPLY”
>3. Restart the game client.
>Also you can cap your FPS.
>This will help prevent issues with the GPU’s utilization.
>Go to Settings > Visuals > Max fps > Set this to 60, this should help to bring the utilization back down.
>Additionally, please be sure to check in your NVIDIA Control Panel under Manage 3D Settings > Program Settings > Select New World, and check that Max Frame Rate shows either 'Use Global Settings (Off)' or just 'Off'.
>Please let us know if you need more assistance and thank you all for your contributions to the Beta.
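The FPS-cap steps above boil down to a frame limiter: instead of letting the render loop spin as fast as possible (which is what an uncapped menu does), you sleep out the remainder of each frame's time budget. A minimal, hypothetical sketch of the idea — `run_loop` and the `render_menu` placeholder are made-up names, not anything from the actual game:

```python
import time

def run_loop(frames, cap_fps=None):
    """Run a dummy render loop; return total elapsed seconds."""
    budget = 1.0 / cap_fps if cap_fps else 0.0
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        # render_menu() would go here; an empty menu "renders" near-instantly,
        # which is why an uncapped menu loop can hit thousands of FPS.
        remaining = budget - (time.perf_counter() - frame_start)
        if remaining > 0:
            time.sleep(remaining)  # idle out the frame budget instead of hammering the GPU
    return time.perf_counter() - start

capped = run_loop(30, cap_fps=60)   # ~0.5 s: 30 frames at 60 FPS
uncapped = run_loop(30)             # near-instant: nothing limits the loop
print(f"capped: {capped:.2f}s  uncapped: {uncapped:.6f}s")
```

The capped loop spends almost all of its time asleep, which is the whole point: the hardware sits idle between frames instead of redrawing a static menu at maximum speed.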
This is what, Amazon Studios' third bomb in a row? Does anyone even remember Crucible?
>bricking video cards
modern gaming really finds new and interesting ways to fuck you over
but the truth is this isn't really a game issue. The video card shouldn't be running hot enough to destroy itself, but the RTX 3090 idles at 80C even with cooling. It was only a matter of time before a piece of software came along and destroyed the piece of shit.
The main menu pushing hardware to critical levels is inexcusable though, regardless of how shoddy the card's failsafes are.
A single polygon at uncapped framerates would push the hardware to "critical level"
Getting infected with a bitcoin miner would brick a 3090 just the same. This is a hardware-level issue and Nvidia is passing the buck to developers. The people affected are the ones who don't use Vsync, which shouldn't be required in the first place just to avoid bricking your fucking GPU. What are you going to do when a driver update breaks global Vsync? Sit on your $1000 brick?
My EVGA cards have always had issues.
>our game somehow literally destroys your graphics card when you open the main menu
>here's a list of things that YOU should do in order to avoid our software from damaging your equipment
Let's forget for a second how deep we have sunk into insanity, because I have a question for the tech folks: how is this even possible? Shouldn't the operating system's kernel be responsible for ensuring user-space software can't misuse the hardware badly enough for shit like this to happen?
>Games have become so fucking bloated and unoptimised that they can kill your GPU now
What the fuck happened to programmers
Good. Amazon supporters must suffer.
The gist of it is game dev is a lot of work and not for a lot of pay. A competent enough programmer can work at a "real" programming job, while making their own games on the side or as a hobby. So a lot of programmers in the vidya industry are just plain bad, passionate but not very good, and diversity hires.
t. programmer fag
Actual programmers don't work on video games. It is an extremely oppressive industry that pays like shit, demands at least an extra 50% of your time as free labor (which they call "crunch") and has no real path for meaningful career advancement as a programmer. Whatever they call "programmers" are /agdg/ nodev equivalents except with the resources of a large company to help them do the same terrible job with a nice coat of paint, or outsourced pajeets.
What this anon also said.
Has it always been like this, or only became so as gaming went mainstream? Pretty sure that game developers used to be pretty competent guys who were able to make the most of much weaker hardware than today's norm.
It's happening because the 3090 is a "gamer card" and doesn't throttle itself or shut itself down when it reaches dangerous temps.
This is probably a manufacturing flaw with the card itself, since even decade-old Nvidia cards are capable of throttling. But the 3090 isn't widely available, and most people who spend a grand on it will also buy half a dozen fans to keep it at 60C, so it's a really rare use case that just wasn't detected until now.
Internet and GPU cards got better, so there's no reason to optimize now.
You can have a game's file size explode due to new voice files which aren't compressed (something that can easily add a gigabyte to your game's size) and random unoptimized garbage that an old computer wouldn't even be able to run, and since it's 2021, that's fine.
Back in the day you needed actually competent programmers because of the reasons you mentioned. You didn't have what is effectively infinite memory to play with (as we have today) or infinite CPU cycles (again, as we have today for 99% of use cases) and you can't just hire any incompetent boob to do the necessary optimizations to work in that environment. So yes, older games were made by MUCH more competent programmers than the ones working in the industry today, who basically just learn how to operate pre-built game engines (barely).
>It's happening because the 3090 is a "gamer card" and doesn't throttle itself or shut itself down when it reaches dangerous temps.
What can men do against such reckless braindead retardedness? Truly every large corporation deserves to burn.
You mean the pay or just the quality of work?
The pay, I assume, because as much as vidya is supposedly a gorillion-dollar industry, there are also bloated studios in everything nowadays. Credit lists are just massive, with seemingly hundreds or thousands of people all working on minuscule things, and ultimately I believe most or all of the sales of a game go to the publisher, so unless you got really lucky as an indie dev you're making peanuts.
The quality of the work? I assume this is because putting something out is easier now. You have pre-built engines, models, assets, books, a shitton of games and references to work off of, and ultimately the mindset for a lot of work, vidya or not, has shifted from "give it your all" to "just get something out". I remember watching a video of Jackie Chan in his early days saying that you have to make everything the best it can be, because what the consumer sees is the end product, not the trials and tribulations you went through or the excuses you have for what turned out wrong. If a movie's shit, then that's all the viewer sees, and I feel that type of mindset is slowly dying out. In vidya this is because you can now very easily patch things, and all sorts of excuses show up for your game that make fucking it up seem like less of a big deal than it should be.
Oh just download the patch, just mod it, just buy better hardware, it worked "fine" when we made it so it's not our fault, we brought our game out, that's the important part!
Anon, games need as much control of your hardware as possible in order to run well. You would not want Microsoft anywhere near your drivers, and I have never seen satisfactory performance from open source drivers like nouveau. Simply put if it isn't drivers from the manufacturer, it shouldn't be used unless necessary.
Yes, I am an "actual programmer" and I avoided games for that reason. However game programming was always more difficult than enterprise. It has become too much of a waste of time for large companies to find the crazy bastards they need to make their games work well. Games aren't the cutting edge of technology anymore so you won't see many John Carmacks or his like who are interested (Carmack also only cared about games in the capacity that they represented virtual reality). Back in those days every new device that came out for the purpose of playing games was completely different than the last. Solving the challenges that come with these radically new devices and getting to play with new stuff is the type of thing that interests competent programmers. But now everything is homogenized so anyone who is remotely as autistic as Carmack would stay in his basement and write the game himself.
Even if such programmers were to appear, they would necessarily have to be indie. id Software was indie-but-not-quite-indie, since they got their start working for another small company who didn't mind letting them make games every 6 months to put in a shrink wrapped disc for a magazine. Then Romero and a couple of people left on good terms after getting some valuable experience. There aren't companies willing to take small risks like that any more, it's just not in their business doctrine and they don't believe it will lead to financial success. You either are a unicorn who has all the skills to succeed immediately in a trial by fire as an indie or you're a worthless cog in the AAA machine.
Maybe there's hope in Europe somewhere. They still have a markedly different market for this stuff than we do. However for some reason (perhaps lack of top tier education for software) they tend not to be as refined at programming.
>This is a hardware level issue and Nvidia is passing the buck to developers.
Are there any other games that are killing GPUs, anon?
Holy shit, I knew gamers were retarded but this is a new low. The GPU should NEVER overheat; no hardware fucking should. It has nothing to do with Amazon's game. Nvidia literally sold a malfunctioning GPU.
In any modern title the GPU usually bottlenecks the system, because if it actually ran flat out all the time it would overheat (uncapped demand is effectively infinite; it can only sustain any given clock speed for a short period). The RTX 3090 is overclocked to unstable levels at factory settings; it is 100% Nvidia's fault. Bugmen will eat it up though, they like it when a corporation fucks them. Proprietary software is also at fault, since it basically lets any corporation fuck up the userspace. After I saw what they let Riot do to the kernel, I don't trust any megacorp.
AMD have good OSS drivers
The only reason there weren't more is that New World is the first game to brick these trash cards; the 3090 is quite overkill for most titles. If there were more modern games with no FPS lock, there would be more games bricking it. Nvidia will fix the drivers and pay journos to say the New World devs fixed their game.
Nvidia's every move is so disgusting it's unreal.
>Are there any other games that are killing GPUs, anon?
The same GPUs would die from running an emulator with fast-forward on, or from running shit at higher than native resolution.
It's just that the mouthbreathers who spend money on this piece of shit GPU won't even know what emulators are.
Programmers used to be really cool, and limited hardware really pushed creativity. I remember Crash Bandicoot being basically magic.
>one lazy google search later
Better hardware should mean even better optimizations, not worse, but most of us, me included, are lazy.
The best programmers were all racists.
>Better hardware should mean even better optimizations
Sorry, I'm speaking in half-hearted nonsense. I guess if I were to attempt to articulate a thought, it would be that, generally, one thing advancing should open up opportunities for cooperative/symbiotic effects to be lifted up alongside it, instead of allowing the rest to atrophy. Basically, I'm proposing there's a compounding effect if you improve all facets of an operation, as opposed to improving one and trying to get it to compensate for the weaknesses of the system. A chain is only as strong as its weakest link, so to speak. If we can develop improved storage, circuits, and computational power, we should be able to develop better data structures and more efficient implementations with them. In fact I'd argue we're almost over-specialized: the brain power spent improving hardware could manifest the same effects if focused elsewhere, and if designed in tandem would yield a whole greater than the sum of its parts.
Or whatever. tldr if we can improve one thing in a system we can improve all things in the system for synergistic effect
They do, videogames are just such a big industry now that good games drown in the mass of shitty games by shitty developers. If you play a game that is well made and works well, you don't notice it because that's how you expect it to run. Factorio and Noita immediately come to mind as games that are very technically impressive. I haven't played Teardown personally, but that also seems pretty neat.
I can imagine that even if you're a good developer in a big AAA studio, there's not much you can do to make the code good when there's 100 other people contributing into the game.
Your mistake is assuming that any business' goal is to make the best product, and not to make the most money. It's far more cost effective to lean on the better hardware and hire legions of shit-tier programmers, then never promote them.
I'm no economist but it appears to me that there is not nearly enough forward planning. Instead of accepting short term losses for long term gains it seems like every business leader only focuses on quarterly goals. It's my impression that investing in quality and making realistic plans turns the most long-term benefits. I mean, it's easy to tell when a product is well received because companies will immediately jump on the bandwagon and make cheap replicas. I want to blame marketers. They're more persuasive than engineers and you don't need to be the best value, just convince people you are.
>all these niggers blaming software when its a hardware issue
You can achieve the exact same effect with literally anything that uses hardware rendering and doesn't get throttled by limiters or bottlenecks. A card that can't handle 100% utilization is defective.
Was there other software, another game probably, that produced a similar or the same effect? If so, then it's 100% on the hardware manufacturer. If not, then it needs to be investigated further, and the software developer should participate in the investigation and acknowledge the possibility of error.
When was the last time you used your hardware to 100% of its capacity? The card just shitting itself and dying is one thing, but reaching this within a few seconds of use is another matter entirely.
Most games peg your CPU because of a shitty timing function that just loops at infinite speed. It doesn't necessarily show in your process manager, because that only shows 100% if all the cores in the CPU are at 100%.
GPU utilization is hard to measure because the bottleneck may be retarded data transfers from the CPU/RAM to the GPU. I think you may see 100% utilization even if the GPU is basically idle, just waiting for data to arrive.
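To sketch the per-core point above: an "overall CPU %" readout is (roughly) the average across cores, so one core pegged by a busy-wait timing loop barely registers on a many-core box. Numbers here are illustrative, not from any real profiler:

```python
def overall_utilization(per_core_loads):
    """Task-manager-style overall CPU %: the mean of per-core loads."""
    return sum(per_core_loads) / len(per_core_loads)

# One core spinning at 100% in a busy-wait loop, seven cores idle:
print(overall_utilization([100.0] + [0.0] * 7))  # 12.5
```

So a game can have a whole core maxed out doing nothing useful and the overall readout still looks like a mostly idle machine.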
Probably the last time a game I played didn't get CPU-bottlenecked by some pajeet spaghetti, and certainly when I was last fucking around with SDL, which was doing something as simple as rendering a couple of PNGs at a few thousand FPS.
Did it fry any other cards? Do you have a 3090 FTW? I'll send you a build of the aforementioned SDL test project and you can try frying as many cards as you like.
That would be really cool except I'm just some dumbass. Just like it's easy to tell if food tastes bad but hard to say why or even better improve it if you aren't a cook, I can tell something went wrong with the way the gpu handled the software but since I don't have the expertise all I can do is speculate at the problem. Fact is I see nothing about similar failures on the same gpu with other software, and I see nothing about similar failures on different gpu's with the same software. Maybe there is a common industry safeguard that wasn't implemented? Maybe this or that? I dunno I'm just some faggot, but don't just brush it under the rug.
If better hardware enables worse software, then surely the inverse is also true. Perhaps this might open the software market to a wider audience, but who is that but poorfags? Poorfags, mind you, who also have to be idiotic enough to spend some of their already dwindling cashstacks on software. From what I can tell there seems to be more money in hardware, whether due to patent jewry, limited resources, or impossible-to-replicate-by-an-individual technology used to justify high prices, and so on. If anything, it would seem to be in a company's best interests to produce software as bloated and garbage as possible, without upsetting the masses or limiting its reach, such that it necessitates over-the-top hardware and thus generates more hardware sales. Perhaps I'm too shortsighted, though, and can't see the potential in the mass production of more efficient, cheaper-to-produce, mid-end hardware that this would enable.
As a certified /tech/ autist, the most likely scenario is a mix of shit software (a poorly coded demo loop abusing DLSS/RTX, or a pajeet sneaking a bitcoin/monero miner into the game) taxing much more of the GPU chip than usual, and shit QC/design from EVGA leading to a heatsink that doesn't sink heat. The last couple generations of GPUs have had all kinds of issues with heatsinks that either have shit thermal pads that don't conduct heat fast enough, or outright shit cooler design and assembly so the cooler doesn't even make adequate contact with the chips it needs to cool.
If only you knew how bad things really are.
It's easier to lock down hardware because it is a tangible product, unlike software. Hardware follows all the rules of the market we've known for centuries. Software broke most of the rules involving production, distribution, supply and demand, and there are still many suits out there who just don't fucking get it. It's also quite hard to design for and requires a very specialized mindset if you want it done well.
Nvidia has followed the model you described for a while. They don't produce software, but they make deals with the developers who make software that runs on their hardware, and they design their drivers such that their hardware runs as shittily as possible while still producing the end user's desired result (see GameWorks). This shortens the lifespan of their own hardware, forcing people to buy more hardware more frequently. It also helps to eliminate the competition because games will use Nvidia libraries that don't run nearly as well on other hardware. Fortunately for us, Vulkan threw a giant wrench in their business model and Nvidia is forced to come to terms with it due to developers' interest in multiplatform releases. I don't know how many devs use GameWorks anymore but it doesn't seem to be very many.
Artificial need instead of artificial scarcity? Interesting concept. Also I guess if you make something too well people won't be incentivized to replace it. Probably why manufacturers want to push for services instead of purchases (though even if this would enable "getting it right the first time" I doubt they'd actually bother to do any more than they could get away with).
Yeah something more in the middle like a volatile cocktail makes sense.
Hey man I think you're really getting it there. Despite intellectual property bullcrap, distribution rights really scares away production and vendors, huh? In a perfect world information would be shared rather freely and only monetized if used for industry and not for personal projects or learning, but there needs to be some sort of protections and incentives for content creators, eh? Just wish I could offer more than "yup that's a problem *whistles nonchalantly*"
Coincidentally, I just saw a video from a Star Citizen shill about this particular issue right before this was published, which he attributed to a faulty GPU.
>the AMD cards that are actually available are wildly overpriced
>all the Nvidia cards except the 3090 and 3060 have half the VRAM they should
>3090 is a piece of shit housefire that fries itself even if you're willing to pay the ridiculous price
There are literally no good high-end cards. The absolute state of this market.
I bought a "refurbished" (ie sat in a warehouse long enough for the warranty to expire) enterprise level card. Still more expensive than a consumer equivalent but generally performs better with better driver support.
I'm no expert either, but I'm pretty sure a GPU ought to throttle itself rather than brick itself.
I thought we'd gotten beyond games running with uncapped framerates in menus. This is "storing passwords in cleartext" levels of stupidity, and I know that ALSO still happens from time to time, which just goes to show that the square wheel will keep getting reinvented. Hell, even healthcare businesses can't always get basic "never do this" things straight without nationwide "don't do stupid shit" campaigns by a private accreditation body that can put them out of business.
A lot of "gamer" cards are overclocked out of the box to give them more performance than the hardware can safely support. It's cheaper than all the R&D needed to actually improve the architecture. Technology has started to bottleneck simply because it's too complex for most of the retards with their trash degrees, which they get just for being willing to pay ridiculously inflated education costs.
There isn't going to be any investigation
You know what it means when it gets exposed as a hardware issue right?
Massive recalls and replacements. While the 3090 isn't as widely used as their other cards, the fact that it's priced at $1000+ is going to sting Nvidia a lot, and they will also take a stock hit.
>even healthcare businesses
You say that like the healthcare industry is held to a higher tech standard than others. Like banks and the government, they're scraping by on ancient technology that they're too afraid to touch. These sectors have some of the most stunningly tech-averse morons you will ever meet.
Nvidia strikes again.
>amd has no driver
That's pretty clever.
Reminder: ATi drivers have never really improved, even on Linux. It's just that Nvidia shits the bed more frequently than ever now and AMD outsourced 90% of their Linux drivers to Red Hat by using Mesa.
>defunct company has shit drivers in CY + 6
Huh, I did get the impression my R9 Fury lasted pretty long, but I didn't think anything of it since it was my first PC.
While uncompressed audio is a scourge, most of the audio bloat is from having multiple languages stuffed in there, which can easily double the base size of a game. Repacks really show it up: a 20gb game turns into 43gb or more because the fucking chinks, hues, frogs, spics etc want all their garbled noise catered to. Delete all that shit and they're far lighter.
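The 20gb-to-43gb repack figure works out if you assume a shared asset base plus one uncompressed audio pack per shipped locale. A back-of-envelope sketch with made-up numbers (not actual figures from any game):

```python
def install_size_gb(base_gb, audio_per_locale_gb, locales):
    """Total install size: shared assets plus one audio pack per locale."""
    return base_gb + audio_per_locale_gb * locales

full    = install_size_gb(20.0, 2.3, 10)  # ship every locale's audio
trimmed = install_size_gb(20.0, 2.3, 1)   # ship only the user's locale
print(f"all locales: {full:.1f} GB, one locale: {trimmed:.1f} GB")
```

With ~2.3 GB of audio per locale and ten locales, the full install hits 43 GB while the single-locale version stays near 22 GB, which matches the kind of savings repacks see from stripping language packs.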
It shocks me how in the modern age, language files aren't just selected automatically based on your region/interface language, and the rest aren't downloaded. Maybe on PCs this can be different but on consoles it should be as simple as I describe it, especially considering that they don't make physical CDs anymore and it's all just online downloaded shit.
Anon you're asking people to spend extra effort for the sake of saving disk space. Developers who would even have that thought enter their brains do not exist in current year.
They clearly think that just because storage is becoming cheaper, they have permission to not care about file size
>selected automatically based on your region/interface language
Instead of leaving the choice to the machine it should be available to users.
I personally like having English interface while having Japanese voice files while playing Japanese games.
Sounds like consoomers trying to get free shit to me.
I mean core, non-tech things that should never be fucked up, things like "don't leave sponges in a patient."
It's like a shit sandwich.
>Reminder: ATi drivers have never really improved, even on Linux. It's just that Nvidia shits the bed more frequently than ever now and AMD outsourced 90% of their Linux drivers to Red Hat by using Mesa.
>Thinking mesa/xf86-ati-dri/whateverthefuckitscalledthesedays is actually as bad as catalyst
>uncapped framerate in the main menus
I am not shocked this has occurred; I've seen that dumb crap before with games that have runaway framerates on menus. Has nobody ever run these damn games with an FPS counter enabled during testing? It's really stunning from a major studio, but they likely hired the same indie retards who have done it before and never corrected the oversight, even though it's a pretty common occurrence.