Fallout 4 Locked At 30 FPS on Consoles



Bethesda Softworks confirmed that when it comes to Fallout 4's frame rate, nothing ever changes.

We may be well into the latest console generation, but like war, one thing never changes: consoles' inability to offer high frame rates. This time the offender is none other than Fallout 4, which Bethesda has confirmed will be limited to 30 fps on PlayStation 4 and Xbox One. On the plus side, 1080p will be standard, and the PC version has no frame rate limitations - but that likely won't matter to fans and Vault Dwellers hoping this game would be different.

The information was first revealed during an E3 interview last week, and confirmed on Bethesda's Twitter account today. "Fallout 4 is 1080p & 30fps on Xbox One and PS4," the post reads. "Resolution and fps are not limited in any way on the PC."

Now before we attack Bethesda about Fallout 4's frame rate - or attack the attackers by suggesting no one would notice a difference - let's remember this is the same challenge Assassin's Creed: Unity faced last year. It's unlikely that software developers have fixed the technical hurdles yet, especially for a game world that's probably bigger than Ubisoft's Paris.

Yet at the same time, I get the frustration. This is Fallout 4 for Dogmeat's sake - if any game was going to usher in a consistently high console frame rate, you'd think a series this groundbreaking had the best chance. That said, I'm sure it won't impact Fallout 4's gameplay when it launches on Nov. 10, 2015.

Source: Twitter

Permalink

Fallout never really was technologically groundbreaking...

And Bethesda isn't that kind of company either, people :P ... they are mediocre coders without very high ambitions to push tech. This is not GSC Game World, nor is it 4A Games or CDPR.

*And that is fine actually, not every company has to be like that*

It is disappointing because The Witcher 3 does that (well, the 1080p part on PS4) at 30 fps too (most of the time), and it looks a lot better than Fallout.
And Unity, for all the shit we give it (it was coded by monkeys or hairy land-whales), is a better-looking game too :P

haha, I like that the thumbnail's still Vault Boy giving the thumbs up. "Healthy Fallout! Now with 50% less FPS!"

Oh no! Now how will I enjoy the glitchy, buggy NPCs and incomplete textures?!

Fallout has never had a high frame rate on consoles, as far as I've seen. Neither have Skyrim or its predecessors. In fact, I'm really confused as to why this should be a disappointment. If anything, locked at 30fps sounds like an upgrade compared to previous console Fallouts.

I'm OK with this as long as it stays at a consistent 30FPS. I have grown used to playing at 60FPS, sometimes, on my laptop, but I can make do with 30 on my Xbox.

Who cares? I've been playing Witcher 3 on PS4 for a while and it looks fine at 30fps. I don't buy into this 'more cinematic' crap for one moment, but while 30FPS generally looks like shit on a PC, it looks fine on consoles, at least to me anyway.

PC it is then. Well, I already planned on the PC version from the get-go. Anyone familiar with Bethesda knows that you should get the PC versions of their games if you have that option.

Though, this doesn't surprise me much. A lot of games are either making sacrifices to hit 60 FPS or just go with 30 FPS for this generation. (Or at least those games are getting attention in the news for their frame rate, since resolution and frame rate are the two buzzwords for this generation.)

Eh, even though my computer would be fine with Fallout 4 I kinda have to get it on the console anyway, so I was expecting something like this.
I'm not that bothered, personally. As long as it looks and plays alright, I wont be complaining. I do wish there were more options for console, but at the end of the day I just wanna play a fun game.

Not surprised, given that it's a massive open world game, so I'm fine with it. Good that there isn't a limit on PC; players shouldn't be restricted if their PC can handle it.

As long as they don't utter the word "cinematic"...

Well gee whiz, who couldn't see that coming?!

Next you'll tell me the console version will necessitate a controller.

I dunno, I had the PS3 and PC versions of Fallout 3 and never noticed any difference between them in either framerate OR quality of the graphics.

I am still going to buy F4 for PC though (because I don't have PS4, yet).

Perhaps we should just accept that the PS4 and Xbone likely should have spent more time in development, since hitting 60fps seems to be a massive issue in most games for them. That said, Bethesda open world games tend to have so much going on in the background that a console can't really handle it as well as a PC could. These are the same games where NPCs can get themselves killed when you are nowhere near them. Hell, it's getting to the point that I am tempted to just drop the money on a gaming PC rather than moving on to a new console, for better bang for my buck.

Honestly though, it's looking more and more like the 30/60fps is a bit of a hardware issue at this point.

Given the VATS system, I think 30FPS is going to be a lot less of an issue with Fallout 4 than it would be with a lot of other games.

As far as the main point, I have to be blunt. I think waiting for some technical miracle to bring about universal 1080P 60FPS on the current generation of consoles is a lost cause, and people should stop expecting it. Some games will achieve it, but it's going to be about limiting the number of characters on screen, limiting the draw distance that's necessary to show at any given moment, or limiting the fidelity with which characters or objects are shown. (Or tricks like the "letterboxing" bars in "The Order", effectively limiting the resolution.)

The underlying hardware of both the XBox One and the PS4 is essentially PC hardware. This isn't obscure proprietary parts that some arcane process, some sudden developer insight, is suddenly going to find a way to crank up to 11.

And it's only going to get worse if 4K television actually gains traction as a common standard.

I think if people want better performance, they're going to have to grit their teeth and accept that there's only so much oomph you can get out of a box that costs under $500 anymore. And I think the needle's already at the red line.

Not too sure that anyone really expected anything different. 1080p 30 is pretty much standard on consoles, these days, at least when it isn't 900p 30 on the Xbox One.

"It begins"

*eats popcorn*

Or, if we're being more topical.

"Platform war, platform war never changes"

*Continues to eat popcorn*

Eh, this is not the type of game that is dependent on a huge framerate. 60 definitely looks and feels better, but 30 is far from unplayable. I'm a PC gamer, so I will be glad to get more than 30 though. It matters a bit more on the PC since we tend to have our turn sensitivity much higher.

Quellist:
Who cares? I've been playing Witcher 3 on PS4 for a while and it looks fine at 30fps. I don't buy into this 'more cinematic' crap for one moment, but while 30FPS generally looks like shit on a PC, it looks fine on consoles, at least to me anyway.

Framerate doesn't really have a big impact on how a game looks, generally speaking. As long as frame rate is consistent, it won't be especially noticeable because it doesn't affect graphics at all.

What high framerate will do though, is make a game FEEL a lot smoother, faster, and more fluid. It will also reduce the input lag so you'll feel like you can react more quickly.

This is why high framerate (and low input lag) are so important for fast paced games like shooters and action RPGs. Framerate is arguably more important than even graphics because it improves your reaction time and makes the game so much smoother.
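The input-lag point above is easy to quantify: at 30 fps a new frame is only drawn every 33.3 ms, versus every 16.7 ms at 60 fps, so every input can sit up to an extra frame before it shows on screen. A quick back-of-the-envelope sketch (the function name is just for illustration):

```python
# Frame time at a given frame rate, in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

delay_30 = frame_time_ms(30)       # ~33.3 ms between frames
delay_60 = frame_time_ms(60)       # ~16.7 ms between frames

# The gap (~16.7 ms) is the extra worst-case delay added to every
# input before it can be reflected on screen at 30 fps vs 60 fps.
extra_lag = delay_30 - delay_60
```

In other words, halving the frame rate roughly doubles both the time between visual updates and the worst-case delay before your input is visible, which is why fast-paced games feel so much more responsive at 60.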

OT: This doesn't surprise me in the least. This console generation has been nothing but disappointment in terms of performance. They put all their budget into making their games as shiny as possible and don't care at all about keeping a reliable framerate.

I am surprised considering how dated the visuals looked.

I thought this was the standard (just a console limitation) and our beef was with developers locking it on PC too, so that consoles don't look bad. Deals with Sony and MS and so on..

I wouldn't really expect Bethesda to find new ways to optimize stuff. I am happy if they reach results like other companies on that end.

Hmm, maybe the consoles will get a mid-gen update. AMD is coming out with a really small new GPU that's supposed to have some terrific performance figures.

I think maybe in 2 years it'll be cheap enough to put in consoles.

Callate:
Given the VATS system, I think 30FPS is going to be a lot less of an issue with Fallout 4 than it would be with a lot of other games.

As far as the main point, I have to be blunt. I think waiting for some technical miracle to bring about universal 1080P 60FPS on the current generation of consoles is a lost cause, and people should stop expecting it. Some games will achieve it, but it's going to be about limiting the number of characters on screen, limiting the draw distance that's necessary to show at any given moment, or limiting the fidelity with which characters or objects are shown. (Or tricks like the "letterboxing" bars in "The Order", effectively limiting the resolution.)

The underlying hardware of both the XBox One and the PS4 is essentially PC hardware. This isn't obscure proprietary parts that some arcane process, some sudden developer insight, is suddenly going to find a way to crank up to 11.

And it's only going to get worse if 4K television actually gains traction as a common standard.

I think if people want better performance, they're going to have to grit their teeth and accept that there's only so much oomph you can get out of a box that costs under $500 anymore. And I think the needle's already at the red line.

But then the question becomes: If the new consoles are not significantly more powerful than the generation that came before and can't do what mid-range PCs have been doing for ages, then what's the point of getting a next-gen console?
Particularly given the complete lack of backwards compatibility, there was sort of an implication that these consoles were a Great Leap Forward in games presentation; however, if they're still limited in this way, without the benefit of hundreds of last-gen games to pad out the library, then why release them at all?
I'm aware that frame rate doesn't matter to everyone, and that a lot of current-gen games run at 60; however, they aren't doing a great job of justifying themselves over even a similarly-priced PC.

Whoops, double-posted there.

I like this 1080p 30fps on consoles thing. It means games run at 1080p 60fps on my computer. If games were 1080p 60fps on console with downgraded visuals, the visual options could well be worse on PC as well, meaning my hardware wouldn't be fully utilized. As it is, we get good looking games on consoles which run at a high framerate on my PC.

PCMASTE...no? ok. ;)

Really, though, can this/last gen's console hardware even support 60FPS?
I seem to always be hearing that this or that game won't be 60FPS on console X.
My last console could not. (Not so surprising, considering it was the PlayStation.)

Casual Shinji:
Well gee whiz, who couldn't see that coming?!

Next you'll tell me the console version will necessitate a controller.

Nice one, but no, not that, though I am afraid that controllers will become a necessity on PCs.. o.0
I already hear of lots of people playing games on PC with a controller.
I for one prefer a sharp turn instead of running around the parking lot..

Quirkymeister:
If the new consoles are not significantly more powerful than the generation that came before and can't do what mid-range PCs have been doing for ages, then what's the point of getting a next-gen console?
Particularly given the complete lack of backwards compatibility, there was sort of an implication that these consoles were a Great Leap Forward in games presentation; however, if they're still limited in this way, without the benefit of hundreds of last-gen games to pad out the library, then why release them at all?
I'm aware that frame rate doesn't matter to everyone, and that a lot of current-gen games run at 60; however, they aren't doing a great job of justifying themselves over even a similarly-priced PC.

PS4/Xbone are significantly more powerful than the PS3/360. Open world PS4 games look better than the PS3's best looking linear games. The PS3/360 only had 512MB of RAM. No matter how powerful the consoles are, devs are always going to choose to make the graphical fidelity as good as they can at 30fps vs lowering the graphical fidelity and running at 60fps. If PC had set hardware, the same thing would happen there. Only the really, really fast-paced games even benefit from 60fps anyways. I played TLOU on PS4 and the game doesn't play any better, mainly because the camera is too sluggish to take advantage of 60fps anyways.

Consoles are at least $100 cheaper, they play the newest games (which don't play better on PCs), the game you buy will fucking play, guaranteed, and you put your console conveniently by your TV, sound system, and couch/recliner. There's plenty of pros for consoles. Building a PC with a $100 CPU and $100-$150 video card costs just about $500 (a little below or over depending on the other parts) and can't play the newest games on high graphical settings at 60fps.

Quirkymeister:
But then the question becomes: If the new consoles are not significantly more powerful than the generation that came before and can't do what mid-range PCs have been doing for ages, then what's the point of getting a next-gen console?
Particularly given the complete lack of backwards compatibility, there was sort of an implication that these consoles were a Great Leap Forward in games presentation; however, if they're still limited in this way, without the benefit of hundreds of last-gen games to pad out the library, then why release them at all?
I'm aware that frame rate doesn't matter to everyone, and that a lot of current-gen games run at 60; however, they aren't doing a great job of justifying themselves over even a similarly-priced PC.

I think last generation, there was a real benefit to the standardization of the console market. Even if the XBox 360 and PS3 ran on fairly different hardware, occasionally resulting in different performance on cross-platform games, there was still a certain sense of a baseline level of performance that both systems would be able to achieve. This was both bane and boon to the PC market; while it prevented developers who wanted to reach the widest possible audience from creating games that were wildly outside the capabilities of the PS3 and XBox 360, the "reining in" also meant that any half-way decent PC ought to be able to capably play the same games that were coming out for the consoles (barring exclusivity and shoddy ports, of course, of which there was certainly a fair amount).

But it feels like we've reached a place where display technology is outpacing processing technology. I suspect it's true that someone could create a PC today that would rival the PS4 or XBox One for power for the same price, but maybe only just... And I think that a $399 computer would also struggle with 4K resolutions. With economies of scale working in their favor, what I've heard suggests that Sony and Microsoft are still taking a small loss on their console hardware. Hardcore PC gamer "rigs" blow them out of the water, certainly, but many of those are spending as much or more on video cards alone as the consoles are spending on the entire box.

So, why did they release them at all...? I don't know that I have a single, definitive answer.

It was, arguably, time; the generation had gone on far longer than the previous one, customers were clamoring for something new and something that would look good on HD televisions, and at a certain point they either had to release something or cede the market. Yet I haven't gotten any real sense that either company thought of this generation's hardware as some sort of "stopgap" measure that would just have to hold until the real advance came out.

Part of me wonders if there is or was a hope that, for lack of a better word, "gimmick" hardware would make up for a lack of sheer processor power. Microsoft may have thought the Kinect would do for them what the Wiimote did for the previous generation; of course, we've seen how that played out. Both companies may still be banking on the new resurgence of interest in VR, technology that in some cases requires a lower resolution and that the new hardware itself may be able to take up some of the processing work.

But I think the bottom line is that another year of hemming and hawing wasn't necessarily going to create a monumentally better console. For all the various snafus, major and minor, that both Sony and Microsoft have endured in birthing the current console generation, I suspect nothing would have been as painful as trying to convince Christmas shoppers that $600 or more was a price worth paying for a new console. Maybe in the long run it would have been the smart move, but I think a fair number of executives might have lost their jobs in the process.

Instead, we have a "Meh, just about good enough" generation. And I think the next few years may only diminish its already lackluster splendor.

And I'll do what I always do regardless. I get the Xbox version first because my kid likes playing on Xbox. Then, 6 to 8 months later, once the mod community has made the game spectacular, I'll grab a PC copy as well and crank up the settings, because I have a reasonably high-end machine at the moment (an ASUS R.O.G. laptop with the best NVIDIA laptop GPU and, I think, 32GB of RAM, with an SSD). Likely during a Steam Sale. Ta-Da, best of both worlds.

Makes me wonder whether this game will be optimized all that well for PC. I mean, it's running on a modified Skyrim engine, and Skyrim wasn't exactly top-of-the-bill graphically either. I can buy that it wasn't running on 60 FPS on the ancient Xbox 360/PS3 hardware, but surely the vastly more powerful Xbox One and PS4 must be able to run a modified Creation Engine at 60FPS. And if not, well, that isn't reassuring for the PC version either, maybe.

WouldYouKindly:
Hmm, maybe the consoles will get a mid Gen update. Amd is coming out with a new really small gpu that's supposed to have some terrific performance figures.

I think maybe in 2 years it'll be cheap enough to put in consoles.

Nope. Consoles are NEVER updated with significantly faster hardware. It violates all sorts of assumptions made about the hardware during game development, especially when it comes to micro optimizations. Every game for a console platform, whether launch titles or the last one ever made, will run the same on the first one off the assembly line and the last.

The only processor modifications made are process/die shrinks and component integration.

I'd prefer lower res and higher fps, but whatevs - we all know the consoles are fairly pitiful (I'll be getting one of the two consoles, btw, but I'm not going to pretend they're as powerful as they bloody well should be. both Sony and MS came out with rather embarrassing hardware).

Vendor-Lazarus:
Really, though, can this/last gens console hardware even support 60FPS?

If you're talking about XB1/PS4? It depends on the game, but yes. The Forza 6 trailer actually bothered to advertise 1080p/60fps on the reveal, and Halo 5 will be 60fps, but not at 1080.

Darth Rosenberg:

Vendor-Lazarus:
Really, though, can this/last gens console hardware even support 60FPS?

If you're talking about XB1/PS4? It depends on the game, but yes. The Forza 6 trailer actually bothered to advertise 1080p/60fps on the reveal, and Halo 5 will be 60fps, but not at 1080.

I might be? I'm not a console player, so I have no idea what the current gen is, nor when the next is due.
Therefore I had to be a bit creative in my usage of the term. Apologies for the confusion.

Ah, I see, so some games do run at 60FPS on this gen at least.
It's no contest for developers to market their games on graphics and spin 30FPS as "cinematic" instead.

Thanks for the info!

You guys ready for some deadpan?

Okay? Here goes!

"Oh, the humanity. Who'll think of the inane fps fanatics?"

For a game that is going to be this big, 30fps is fine for my Xbox/PS4, but I'm getting it for PC as well, so it doesn't really matter to me. That being said, maybe it's because it has been in development for so long that they can only achieve 30fps on the consoles, as it is their first game on the new consoles, right?

