
View Full Version : Release = Optimized?


BlueSmiley
2012-11-18, 07:46 AM
Sup dawgs,

I didn't notice any performance increases over the last few patches and I'm tired of playing this game at 20-30 FPS and getting killed all the time just because I can't aim for p00 with these poor framerates.
I do understand that it was a beta, BUT, since the game is releasing, do they have a build with decent optimisations ready for launch?

I have an AMD Phenom II x6 1100T and a Zotac Geforce GTX 670 AMP! Edition, so I should be able to run the game on medium easily. I play games like BF3 and Crysis 2 on ultra without any problems. I know those games ain't CPU-bound, but seriously... This ain't cool. :(

Tycho
2012-11-18, 08:02 AM
I'm running the game on high settings with upwards of 75 fps. I have an Intel i7 at 3.4GHz, a Galaxy GeForce GTX 670, and 16 gigs of RAM. I get 50 fps if it's a large battle.

Snipefrag
2012-11-18, 08:08 AM
So does this just mean the game still runs shit on AMD for certain people? Personally I have an Intel 3930K, 3.2GHz, 6 cores (12 hyper-threaded), OC'ed to 4.2GHz, and normally stay around 100 fps. 50 when frapsing, so I can't say I notice any optimisations :D.

ArchangeI
2012-11-18, 08:08 AM
X6 1090T, 32GB RAM, ATI 5800-series card.
Running at 50 fps with everything maxed

Sent from Tapatalk

BlueSmiley
2012-11-18, 08:15 AM
X6 1090T, 32GB RAM, ATI 5800-series card.
Running at 50 fps with everything maxed

Sent from Tapatalk

Weird. I don't have any problems with other games at all though.

SlotJockey
2012-11-18, 08:19 AM
Have you upgraded the Nvidia drivers (the beta drivers, that is) and seen if that made any difference?
I tested various video cards in beta on different platforms....
As for seeing any noticeable gain, I honestly had better performance using Virtu MVP. I'm still not convinced that DX is fully optimized.
Perhaps the framerate issues are due to the debug parameters that were being used at various times in beta.

There was a slight memory leak detected while using the Nvidia card on one run; however, the server was patched and I couldn't re-create the oddity.

Also, I strongly believe that after launch we will see new official drivers rolled out by both AMD and Nvidia.... that could be the one common denominator behind all these issues that some people are bringing up.

BlueSmiley
2012-11-18, 08:24 AM
Have you upgraded the Nvidia drivers (the beta drivers, that is) and seen if that made any difference?

I always download the latest drivers. I'm using the new beta drivers at the moment. I did not see anything different after installing them.
Believe me, I tried everything. I've updated my Realtek drivers and BIOS, tried multiple GPU drivers, closed all the background programs, edited the useroptions.ini file dozens of times, et cetera. Nothing helped. =/
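(For anyone else hand-editing the ini: I've started scripting my tweaks so a bad edit is easy to roll back. Rough Python sketch below; the install path and the [Rendering]/RenderQuality names are only placeholders, so check your own UserOptions.ini for the real section and key names.)

# backup_and_tweak.py - back up UserOptions.ini, then change one value.
# Path, section and key below are placeholders, not verified against the real file.
import shutil
from configparser import ConfigParser

INI = r"C:\Games\PlanetSide 2\UserOptions.ini"   # placeholder install path

shutil.copy2(INI, INI + ".bak")        # keep a copy so any edit is easy to undo

cfg = ConfigParser()
cfg.optionxform = str                  # preserve the file's key capitalisation
cfg.read(INI)

# Hypothetical example tweak: lower render quality if such a key exists.
if cfg.has_option("Rendering", "RenderQuality"):
    cfg.set("Rendering", "RenderQuality", "0.8")

with open(INI, "w") as fp:
    cfg.write(fp, space_around_delimiters=False)   # keep the Key=Value style

Run it once before experimenting and the .bak is always there to restore.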

Jaybonaut
2012-11-18, 08:40 AM
Helps to repost this sometimes:

Current Recommended System
OS: Windows 7
Processor: Intel i5 processor or higher / AMD Phenom II X6 or higher
Memory: 8 GB RAM
Hard Drive: 15 GB free
Video Memory: 1,024 MB RAM
Video Card: nVidia GeForce 500 series or higher / AMD HD 6870 or higher
Sound Card: DirectX compatible

BlueSmiley
2012-11-18, 08:43 AM
Helps to repost this sometimes:

Current Recommended System
OS: Windows 7
Processor: Intel i5 processor or higher / AMD Phenom II X6 or higher
Memory: 8 GB RAM
Hard Drive: 15 GB free
Video Memory: 1,024 MB RAM
Video Card: nVidia GeForce 500 series or higher / AMD HD 6870 or higher
Sound Card: DirectX compatible

I have 8GB. Windows uses 2GB though.
Do you mean I should have 8 for the game itself?

SlotJockey
2012-11-18, 08:45 AM
I always download the latest drivers. I'm using the new beta drivers at the moment. I did not see anything different after installing them.
Believe me, I tried everything. I've updated my Realtek drivers and BIOS, tried multiple GPU drivers, closed all the background programs, edited the useroptions.ini file dozens of times, et cetera. Nothing helped. =/

Now I'm curious as to what the issue could be; with your listed setup you should be holding decent FPS regardless....

What resolution did you play on?

If you have nothing to do, you could always benchmark your video card and actually see what your baseline is.
At least then you can see if perhaps the card itself is degrading, since there is no other beta or anything else out there that uses ForgeLight to compare against.

Sounds very much like a bottleneck issue.
Something is not playing nice with the other bytes on the block, so to speak....

Knocky
2012-11-18, 08:48 AM
Helps to repost this sometimes:

Current Recommended System

Video Card: nVidia GeForce 500 series or higher / AMD HD 6870 or higher


I wish whoever is posting the system requirements knew what nVidia's naming convention was.

Who here thinks a $40 GT 610 video card is going to run PS2?

IHateMMOs
2012-11-18, 08:49 AM
Last time I played it, it had decent optimization; no doubt it will be much better after launch.

TheStigma
2012-11-18, 09:23 AM
I have 8GB. Windows uses 2Gb though.
Do you mean I should have 8 for the game itself?

No. The game client only has 32-bit support - so by definition it can only use a maximum of 4GB of addressed memory for the game. Actually, I don't think it even uses Large Address Awareness (at least by default, although that could possibly be enabled manually), because I don't think I've seen the game push above 2GB. This is fairly normal, by the way, since most games easily fit within 2GB of memory.

So no - I would interpret that 8GB as meaning you should reserve up to 4GB for the game to function optimally; your OS and other stuff in the background use a little memory too, so they just say 8 since that is the next normal step up after 4. If they said 5GB that would just confuse people, since no computer actually ships with (exactly) 5GB of memory.

Realistically, from what I have seen of ACTUAL memory usage when playing, it doesn't go above 2GB, so unless they are going to expand memory usage later in development and planned for that in the recommended specs (which is possible, I suppose), you could easily get away with 4GB to run the game as it is: 2GB for the game, and 2GB more to easily take care of the OS and various background tasks - with probably nearly a gig to spare, in fact...
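By the way, if anyone wants to check the 32-bit / LAA part for themselves instead of taking my word for it, both flags live in the executable's PE header and a few lines of Python will read them (the .exe path is only an example - point it at wherever your client is installed):

# Reports whether a Windows .exe is a 32-bit (x86) binary and whether its
# LARGEADDRESSAWARE flag is set (without it, a 32-bit process is limited to
# 2GB of user address space instead of 4GB on a 64-bit OS).
import struct

EXE = r"C:\Games\PlanetSide 2\PlanetSide2.exe"   # example path, adjust to your install

with open(EXE, "rb") as f:
    data = f.read(4096)                          # the headers fit in the first few KB

pe_offset = struct.unpack_from("<I", data, 0x3C)[0]        # e_lfanew -> "PE\0\0"
assert data[pe_offset:pe_offset + 4] == b"PE\0\0", "not a PE file"

machine, = struct.unpack_from("<H", data, pe_offset + 4)           # COFF Machine
characteristics, = struct.unpack_from("<H", data, pe_offset + 22)  # COFF Characteristics

print("32-bit (x86) binary:", machine == 0x014C)           # 0x8664 would be x64
print("Large Address Aware:", bool(characteristics & 0x0020))

If the LAA bit comes back False, the 32-bit client is effectively capped at 2GB of user address space, which lines up with what I've been seeing.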

-Stigma

TheStigma
2012-11-18, 09:30 AM
From a patch almost a week ago or so I got a fairly substantial framerate increase - large enough to easily notice just from playing normally. Patches after that haven't seemed to touch the performance at all. I'm completely CPU-bound, just to be clear (even with a 2500K @ 4.2GHz).

I doubt you will see huge performance increases at launch. Maybe another step up from what we had at beta's end, if we are lucky... but most likely the performance issues will be something they have to keep working on for a long time. I don't think any miracle patch is realistically going to come for this - and unfortunately I think that is going to reduce the number of people who can play the game, simply because their performance is too poor. Having such poor scalability is very much not a good thing for a F2P model that needs a huge audience to succeed. I'm crossing my fingers...

-Stigma

BlueSmiley
2012-11-18, 10:55 AM
1. Now I'm curious as to what the issue could be; with your listed setup you should be holding decent FPS regardless....

2. What resolution did you play on?


3. Sounds very much like a bottleneck issue.

1. Yeah, I should.

2. I'm using 1920x1080

3. Uhuh. A few guys told me this GPU wouldn't bottleneck my CPU, though.

PS: I tried benchmarking it with PCMark, but I got errors and I couldn't benchmark anything.


No. The game client only has 32bit support - so by definition it can only use a maximum of 4GB of addressed memory for the game. Actually I don't think it even...

Thanks for the explanation!

Mechzz
2012-11-18, 10:58 AM
double post. Derp.

Mechzz
2012-11-18, 11:01 AM
Sup dawgs,

I didn't notice any performance increases the last few patches and i'm tired of playing this game with 20 ~ 30 FPS and getting killed all the time just because I can't aim for p00 with these poor framerates.
I do understand that it was a beta, BUT, since the game is releasing, do they have a build with decent optimisations ready or so?

I have an AMD Phenom II x6 1100T and a Zotac Geforce GTX 670 AMP! Edition, so I should be able to run the game on medium easily. I play games like BF3 and Crysis 2 on ultra without any problems. I know those games ain't CPU-bound, but seriously... This ain't cool. :(

I have the same CPU and found that overclocking it to 3.9 GHz gave me an extra 10-15 frames. I also OC'ed my card (HD 6870 1GB), but beta closed before I could try it out.

Have you considered this approach? I'm a total novice at it, but I just used the auto-tune utilities that came free with my mobo and the card. (There was also a utility for viewing temps and fan speeds that I hadn't found earlier.)

BlueSmiley
2012-11-18, 11:02 AM
I have the same processor and found that overclocking it to 3.9 GHz gave me an extra 10-15 frames. I've also overclocked my card (HD 6870 1GB) but beta closed before I could test it.

Have you considered overclocking? I'm a total novice, but just used the auto-tunes tools

Yep, Turbo Boost is on.
My GPU is already overclocked.

ThePackage
2012-11-18, 11:44 AM
So does this just mean the game still runs shit on AMD for certain people? Personally i have a Intel 3930k 3.2Ghz 6 cores (12 hyper-threaded), OC'ed to 4.2Ghz and normally stay around 100 fps. 50 when frapsing, so cant say i notice any optimisations :D.

No one, and I mean no one, gets 100 FPS all the time. It's simply impossible.

I'm running a 3570K at 5.1GHz and can't even touch 60 FPS in large battles.

Rago
2012-11-18, 11:49 AM
I would also not believe everything people write down here on the PSU forums.

Hamma
2012-11-18, 12:27 PM
Optimization is something they never stop working on.

ThePackage
2012-11-18, 12:39 PM
I understand why they chose to go with DX9 for initial development. However, they're really going to have to work hard to bring in the substantial performance increases they need. I don't see any large improvements for high-end rigs happening until they switch to something more threaded.

They really need to migrate to DX11 as quickly as possible if they want to showcase a well performing game.

SpcFarlen
2012-11-18, 12:42 PM
The game utilized 2 cores, and less than 2 gigs of RAM, on my rig as of the last patch of beta. According to their recommendation list I should be fine with:
AMD 955 @ 3.6GHz
8 GB RAM
GTX 660 Ti

But I get 70 fps at the warpgate and 30 fps in any battle, no matter what graphics settings I have on. The game just was not optimized that well. Even people with high-end i5s are struggling to get a constant 60 frames, and recommended specs should mean playing the game on medium settings and getting 60+ frames.

Dkamanus
2012-11-18, 01:17 PM
I understand why they chose to go with DX9 for initial development. However, they're really going to have to work hard to bring in the substantial performance increases they need. I don't see any large improvements, for high end rigs, happening until they switch to something more threaded.

They really need to migrate to DX11 as quickly as possible if they want to showcase a well performing game.

They won't change to DX11 for the same reason CCP hasn't moved EVE Online beyond DX9. They WANT people with lower-end machines to play their game. The game was built with that perspective. Changing would force people into buying new GPUs, which most won't do for a single game.

So yes, the performance increase in this game will have to be astronomical. Hopefully they can pull it off.

SpcFarlen
2012-11-18, 01:23 PM
They won't change to DX11 for the same reason CCP hasn't moved EVE Online beyond DX9. They WANT people with lower-end machines to play their game. The game was built with that perspective. Changing would force people into buying new GPUs, which most won't do for a single game.

So yes, the performance increase in this game will have to be astronomical. Hopefully they can pull it off.

They can always add an option for both. BF:BC2 had the option to switch between the two DX versions, so it's not like that's new technology.

Dkamanus
2012-11-18, 01:26 PM
They can always add an option for both. BF:BC2 had the option to switch between the two DX versions, so it's not like that's new technology.

I'm not an expert in computer technology, but they'd need to spend a whole lot of development time to put DX11 in the game. I wouldn't mind, but, at least for now, it isn't cost-effective for them.

Mechzz
2012-11-18, 01:32 PM
They won't change to DX11 for the same reason CCP hasn't moved EVE Online beyond DX9. They WANT people with lower-end machines to play their game. The game was built with that perspective. Changing would force people into buying new GPUs, which most won't do for a single game.

So yes, the performance increase in this game will have to be astronomical. Hopefully they can pull it off.

But if you look at the Steam stats, less than 5% of Steam users have DX9-or-older machines. 95% are on DX10 or newer.

Are people with Core Duos and 512MB cards getting the same frames as people with high-end i5s and SLI'd monster cards? Or is the game just a write-off for them, and has SOE wasted their time and our money by targeting these older machines?

Dkamanus
2012-11-18, 01:55 PM
But if you look at Steam Stats, less than 5% of Steam users have DX9 or older machines. 95% are on DX10 and newer.

Are people with Core Duo's and 512MB cards getting the same frames as people with high end i5's and SLI'd monster cards? Or is the game just a write-off for them and SOE have wasted their time and our money by targeting these older machines?

We must remember (and this is just my opinion): the Forgelight engine is going to be used in a huge array of new games coming from SOE, with Planetside 2 being the first, followed by EverQuest Next. Planetside, from my point of view, is being used as a test bed for technologies SOE is developing for its future games. Yes, not everyone nowadays uses DX9 video cards, but a lot can still be achieved with them.

EverQuest players might not be as quick to upgrade their PCs as FPS players, for whom fluidity is everything. I believe that's why they decided to go with DX9.

They prefer to be safe and sound. The more the merrier, especially with this type of business model. They will try to get those people in as well and, failing that, minimize the problems of low-end PCs not being able to get better performance, and start catering more to the middle-to-high-end PC gamers.

ChipMHazard
2012-11-18, 02:12 PM
Aye, I haven't seen any surveys for years that show WinXP as having a really substantial user base among PC gamers.
I'm rather certain that the majority of all PC owners use either Vista, Win7 or Win8. Of course there are still a lot of XP users.
I don't know how old the Forgelight (Forge Light?) engine is or how much of a factor that is.

LaggMaster
2012-11-18, 02:14 PM
I run a 1055T with a 5770. I get between 40-70 fps. The last week of patching fixed all the frame lagging for me.

Morsong
2012-11-18, 02:55 PM
Going to be running medium I believe. Going to have to get a 670 to play high me thinks. Currently I have a 460. Not really worth playing on high settings with that card I'd imagine.

BlueSmiley
2012-11-18, 03:01 PM
Going to be running medium I believe. Going to have to get a 670 to play high me thinks. Currently I have a 460. Not really worth playing on high settings with that card I'd imagine.

I'm CPU-bound, and I don't think my (uber) 670 (which I bought for this game before beta started :rolleyes:) changes anything in the game.

Morsong
2012-11-18, 03:02 PM
I'm CPU-bound and I don't think my (uber) 670 (that I have bought for this game before beta started :rolleyes:) changes anything in the game.

What is your CPU? I just finished upgrading to an i5 (2500K) today and am curious to see how it will play with a GeForce 460. I still haven't played PS2, since my old comp before today was a Core 2 Duo.

BlueSmiley
2012-11-18, 03:15 PM
what is your cpu? I just finished upgrading to an i5 (2500k) today and am curious to see how it will play with a geforce 460. I still haven't played PS2 since my old comp before today was a core 2 duo.

I have an AMD Phenom II x6 1100T. This CPU is quite good, but PS2 doesn't really like it.

SpcFarlen
2012-11-18, 03:15 PM
I'm not an expert in computer technology, but they'd need to spend a whole lot of development time to put DX11 in the game. I wouldn't mind, but, at least for now, it isn't cost-effective for them.

The fact that they didn't future-proof a new flagship engine (Forgelight is being used in the new EverQuest)... is not reassuring. We have had three generations of video cards with DX11 capabilities, and a majority of the PC gaming community actually has DX11-compatible components.

Yes, it is a bit more work, but I feel that not upgrading to DX11 will only hurt them in the long run, when other games that may copy PS2 come along and run better.

Tatwi
2012-11-18, 03:41 PM
I run a 1055t with the 5770. I get between 40-70fps. Last week of patching fixed all the frame lagging for me.

That's what I was getting near the end on my Core2 Quad Q8200 / GTS 450 1GB GDDR5, when running around doing very little. In battles it's way better now, 15-30 FPS as opposed to 0-15. So that's nice.

I also noticed smoother frame rates when I took out 2GB of my slower RAM and only used my 4GB of DDR2 800, while I also overclocked the CPU from 2.33GHz to 2.8GHz. I had two random 1GB 667MHz modules and a matched pair of 2GB 800MHz modules. Apparently one of the 1GB modules was causing a blue screen from not liking the overclock from 667MHz to 800MHz.

For those talking about the game being single-threaded, you're only partly correct. The DX9 rendering path is indeed single-threaded, but other aspects of the game are obviously able to make use of more cores. I've seen PS2 use all four of my cores like so: 100/90/50/50, +/-15% depending on the load/time. That's not too bad for a game, but it would be a whole lot better with DX11 multi-threaded optimizations...
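If you want to see the same thing on your own rig, you can log per-core load from a console while you play. Rough sketch using the third-party psutil package (pip install psutil); note it samples the whole system, not just the PS2 process:

# Per-core CPU monitor: run it alongside the game and watch how the load
# spreads across the cores. Requires the third-party psutil package.
import psutil

try:
    while True:
        # one-second sample of every logical core, system-wide
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(" ".join(f"{p:5.1f}" for p in per_core))
except KeyboardInterrupt:
    pass   # Ctrl+C stops the logging

Leave it running on a second monitor (or alt-tab out during a fight) and you'll see whether one core is pegged while the rest idle.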

From what I have read about the technology, the deferred lighting used in PS2 is heavily affected by memory latency. My ancient computer, with its front-side bus and DDR2 RAM, suffers significantly when there are a lot of shots being fired, even on low lighting settings. That's just the way it's going to be with this lighting system.

With that in mind, a new AMD A10-5800K APU with DDR3 2133 RAM would likely cream my system in PS2, given that it's equal to or better than my Core2 Quad in everything but CPU-based video encoding/transcoding. The best part? The A8-5600K and A10-5800K APUs are new "budget" processors on a modern platform that is very affordable.

It's not going to be stupid awesome playing PS2 on my old hardware or on the very affordable AMD quads, but it's going to be playable in all but the most extreme cases.

All that said: I still end up flying into the ground a lot due to massive frame rate stops, where I am facing in a completely different direction from one frame to the next... I wish it were related to the game loading new assets off the hard drive, but that does not appear to be the case. All is well until I get close to a battle, and then the frame rate drops and stops randomly, even if I land and sit there a while or fly around the same spot in a circle. Indar is WAY worse for this than Amerish, come to think of it...

Anyhow, if someone has anything better than...

Core2 Duo 2.6GHz
4GB DDR2 800MHz RAM
GTS 8800GT 512MB

They should be able to enjoy the game in all but huge battles. I think that's a fair trade off.

ThePackage
2012-11-18, 04:22 PM
They won't change to DX11 for the same reason CCP hasn't changed EVE Online to DX9. They WANT people with lower end machines to play their game. The game was build with that perspective. Changing would force people into buying new GPUs, which most won't do because of a single game.

So yes, the performance increase should be astronomical in this game. Hopefully they can do so.

DX10 is almost 6 years old at this point, and has true threading support. If you're on a PC that's 6 years old, you're not going to run PS2 with any sort of acceptable FPS.

Also, allowing DX11 support doesn't mean you can't use DX9. When WoW introduced real DX11 support the FPS gains were fairly substantial.

Knocky
2012-11-18, 05:35 PM
I floated a gtx660 trial balloon over the wife's head.

Sadly her AA fire is completely OP. :(

Schisist
2012-11-18, 05:57 PM
I wish whoever is posting the system requirements knew what nVidia's naming convention was.

Who here thinks a $40 GT 610 video card is going to run PS2?

How's this for stumping?

I run this game on everything at max, with a CRT monitor on 1280x1024, all graphics settings on full with occlusion and motion blur.

I have a Core2 Quad Q9300, 4GB RAM, Windows 7 64-bit, and a GeForce GTS 450 512MB.

I run with everything on full at 40-50 FPS depending on the area; in large battles it dips to ~35.

Shouldn't I barely be able to play with those specs, let alone with all settings on full? I don't feel the FPS lag one bit.

Dkamanus
2012-11-18, 06:02 PM
How's this for stumping?

I run this game on everything at max, with a CRT monitor on 1280x1024, all graphics settings on full with occlusion and motion blur.

I have a Core2 Quad Q9300, 4GB Ram, Windows7 64bit, and a GeForce GTS 450 512mb.

I run with everything on full at 40-50FPS depending on area, in large battles it dips to 35~.

Shouldn't i barely be able to play with those specs with all settings on full? I don't feel the FPS lag one bit.

Which is quite odd, 'cause my brother has a somewhat similar config, except the GPU, which is an HD 4890, and he can't get good FPS. Maybe at launch it'll be better, but for now it's hideous.

Must be an AMD driver problem =/

Knocky
2012-11-18, 06:05 PM
Must be an AMD driver problem =/


This is probably the reason.

My FPS more than doubled when I installed last month's driver update from nVidia.

Dkamanus
2012-11-18, 06:17 PM
This is probably the reason.

My FPS more than doubled when I installed last month's driver update from nVidia.

Well, that would explain the clog on my GPU. I'm using an OC'ed HD 6870 Black Edition, and I can't constantly reach 60 fps on medium =[

With updated drivers

Knocky
2012-11-18, 06:22 PM
Well, that would explain the clog on my GPU. I'm using an OC'ed HD 6870 Black Edition, and I can't constantly reach 60 fps on medium =[

With updated drivers

I don't keep track of AMD equipment. I use a GTX 460 for my GPU and a 2500K for the CPU. Maybe you will be able to compare the two. I get 100 FPS at the gate, 60ish in battle and 30ish in heavy battle, on high, on a 32" HDTV with another 17" widescreen monitor running as well.

Dkamanus
2012-11-18, 06:28 PM
I don't keep track of AMD equipment. I use a GTX 460 for my GPU and a 2500K for the CPU. Maybe you will be able to compare the two. I get 100 FPS at the gate, 60ish in battle and 30ish in heavy battle, on high, on a 32" HDTV with another 17" widescreen monitor running as well.

http://www.hwcompare.com/8368/geforce-gtx-460-2gb-vs-radeon-hd-6870/

That's using the strongest 460. My 6870 should be able to do what yours does, only better, and still I only reach a constant 60-70 at the warpgate. In heavy battles I drop to 30 fps. Must be the nVidia slavery =/

BigpermCML
2012-11-18, 07:44 PM
My i5 2500K @ 4.3 and 7970 jump between 45-70ish in the largest of battles, and hang around 80-100 when just out bullshitting around.

I'm sure they'll optimize more and more (especially when it comes to AMD CPUs), but as of now, if you didn't run very well towards the end of beta...

You might just need to upgrade your Game Boy

Knocky
2012-11-18, 07:51 PM
http://www.hwcompare.com/8368/geforce-gtx-460-2gb-vs-radeon-hd-6870/

That's using the strongest 460. My 6870 should be able to do what yours does, only better, and still I only reach a constant 60-70 at the warpgate. In heavy battles I drop to 30 fps. Must be the nVidia slavery =/


Don't worry about it too much. The devs have 3 days to accidentally leave out some minuscule line of code that will have our machines running at 5 FPS on the 20th. :D

AThreatToYou
2012-11-18, 07:54 PM
If 20-30 FPS is unplayable for you, then your standards are too high...

Dkamanus
2012-11-18, 08:31 PM
If 20-30 FPS is unplayable for you, then your standards are too high...

I'm not saying it's unplayable. But in an FPS game, where twitch movements and de facto skill (not saying I'm uber skilled) are needed, 60 FPS contributes much more than 30 fps. It doesn't seem like it would, but once you see the difference you start to understand.

The problem here is the powerhouse of a rig needed to run a game such as Planetside.

BlueSmiley
2012-11-19, 04:36 AM
Since I probably won't be able to play this game with decent framerates for the next couple of months... ->

Ordered: i7-3770, Gigabyte GA-H77-DS3H, Scythe Mugen 3

This game is the only game I play, so I think it's worth teh cash.

Faidwen
2012-11-19, 05:23 AM
Let's all remember that the average human cannot see / recognize framerates greater than 30 ;) (and that turning on vsync / vertical sync will limit your framerate to whatever your monitor / screen refresh rate is, typically 60)

;)

MrSmegz
2012-11-19, 06:25 AM
Lets all remember that the average human cannot see / recognize framerates greater than 30 ;) (and that turning on vsync / vertical sync, will limit your framerate to whatever your monitor / screen refresh is, typically 60)

;)

This is true, but when you are moving fast with a mouse, the movements are smoother with higher FPS. Most people can notice this below 40-45 FPS (which is where I am). I cannot tell the difference between 45 and 60 FPS. Some people notice all kinds of jitter and stuff; I see it on forums all the time, jitter that is caused by Hyper-Threading on the CPU or SLI/Xfire stuttering. It's practically imperceptible to most unless you are looking for it.

Yutty
2012-11-19, 06:54 AM
Since I probably won't be able to play this game with decent framerates for the next couple of months... ->

Ordered: i7-3770, Gigabyte GA-H77-DS3H, Scythe Mugen 3

This game is the only game I play, so I think it's worth teh cash.

Dude, I got that exact same motherboard, except I got an i5-3570K. If you ever figure out how to overclock the CPU by adjusting the turbo speed, please let me know. In the advanced BIOS default view I can't change the clock speed; the only way I can overclock it is to go into the 3D BIOS and change it manually there. But that method changes the default speed so it's overclocked 24/7, when I only want the turbo overclock to kick in when needed.

BigpermCML
2012-11-19, 07:33 AM
http://frames-per-second.appspot.com/

Piper
2012-11-19, 07:40 AM
Because I assume most of you have fragile minds and are susceptible to arguments from authority, here is another industry icon talking about framerates (i.e. why 30 FPS is unacceptable):

http://beefjack.com/news/carmack-prefers-pretty-at-60fps-than-graphics-as-the-only-thing/




Didn't you just suggest that we not believe everything we read, and then give us a link to something to read....:p

BlueSmiley
2012-11-19, 08:52 AM
Lets all remember that the average human cannot see / recognize framerates greater than 30 ;) (and that turning on vsync / vertical sync, will limit your framerate to whatever your monitor / screen refresh is, typically 60)

;)

I definitely see the difference between 30-35, 35-40 and 40-60.
I've been playing CS 1.6 for like six years and I need higher FPS to enjoy the game. Once the framerate drops to 30-something, I can't even aim for shit. So that reply of yours: BS. I would like to have 50+ FPS. I'm getting new hardware, but if it isn't smooth on high graphics, I'll move over to medium or low for sure. I don't care about graphics. I want to aim and pwn.

Dude, I got that exact same motherboard, except I got an i5-3570K. If you ever figure out how to overclock the CPU by adjusting the turbo speed, please let me know. In the advanced BIOS default view I can't change the clock speed; the only way I can overclock it is to go into the 3D BIOS and change it manually there. But that method changes the default speed so it's overclocked 24/7, when I only want the turbo overclock to kick in when needed.

If you want to overclock: buy a new one.
I've read some reviews of this mobo and they all mentioned the fact that this motherboard sucks at OC'ing.
I'm not planning to overclock my CPU (I would have bought a 3770K otherwise), so I'm good I guess.

ThePackage
2012-11-19, 09:40 AM
Lets all remember that the average human cannot see / recognize framerates greater than 30 ;)

This is so incredibly incorrect. There's no real answer to how many frames we can see per second. And how many frames we can see, or recognize, doesn't even correlate with how high a framerate a game needs to run at to feel smooth.

CyclesMcHurtz
2012-11-19, 10:59 AM
Just a quick FYI ... I am not the lead programmer. Brad Heinz is the lead on the game client (he's got a little write up on the official website)

I'll see about re-posting the info from the other forums.

[Sent from the outskirts of the Oort cloud]

rTekku
2012-11-19, 11:06 AM
If 20-30 FPS is unplayable for you, then your standards are too high...

I'd like to see you try to play a competitive FPS game at 20 FPS.

Lets all remember that the average human cannot see / recognize framerates greater than 30 ;) (and that turning on vsync / vertical sync, will limit your framerate to whatever your monitor / screen refresh is, typically 60)

;)


Go play Bad Company 2 on a console, then play Bad Company 2 on a PC at 60 FPS. Come back and tell me you don't see or feel a difference.

drdeadlift
2012-11-19, 02:31 PM
Lets all remember that the average human cannot see / recognize framerates greater than 30 ;) (and that turning on vsync / vertical sync, will limit your framerate to whatever your monitor / screen refresh is, typically 60)

;)

http://www.reactiongifs.com/wp-content/uploads/2012/10/facepalm-shame.gif

Eduard Khil
2012-11-19, 03:04 PM
Lol, so my true 120Hz monitor was a scam. Science (I don't know whose) says I can't see 120 frames, yet I could swear this little app http://frames-per-second.appspot.com/ looks somehow different when I set it to 120. All in the mind.

Dkamanus
2012-11-19, 03:26 PM
The human eye CAN notice the difference between 30 fps and 60 fps quite easily. Movies nowadays run at 24 fps; you'd see smoother movement if they were running at 60 fps. THAT's the main difference. Where precision is needed, 60 FPS is much better than 30 FPS.
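The easiest way to see why it matters for aiming is to look at frame times instead of FPS. A couple of lines of Python make the gap obvious:

# Frame time in milliseconds for a few common framerates.
for fps in (24, 30, 45, 60, 120):
    print(f"{fps:>3} fps = {1000 / fps:5.1f} ms per frame")

Going from 30 to 60 fps roughly halves how stale every frame of feedback is, from about 33 ms down to about 17 ms, before you even count input and display lag on top.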

blbeta
2012-11-19, 04:00 PM
i5-2400
8GB Ram

GTX260 @1680x1050
Settings @ Medium + Low Shadows + Low Flora
FPS = 25-35 @ warpgate, 40-60+ fighting in open, 15-35 in the bigger bases.

GTX660 @1680x1050
Settings @ High with everything on
FPS = 45-60+ @ warpgate, 45-60+ fighting in the open, 35-60+ in the bigger bases. (I am using the adaptive vsync that the GTX 600 series has, so I never go above 60)

The GTX 660 with the GTX 260's medium settings was 70+ FPS everywhere I tried it, but I cranked up the graphics within 10 minutes of making sure the card was a good upgrade.

Faidwen
2012-11-20, 02:21 AM
The human eye CAN notice the differences between 30 fps and 60 fps quite easily. Movies nowadays run at 24 fps. You'll see smoother movements if they were running at 60 fps. THAT's the main difference. Where precision is needed, 60 FPS is much better then 30 FPS.

Absolutely, I wasn't saying that "more" FPS is a bad thing; obviously video games are substantially different from movies. But to play anything fluidly, 30 fps sustained is sufficient! If you can reach this benchmark, then you won't be negatively impacted by that much.

Smoothness, crisp edges, and motion blur will definitely play into this.

The goal of 30 fps on ALL systems should be the minimum to play successfully. Anything greater than this is PURE eye candy.

Albeit mighty nice eye candy!!!!

camycamera
2012-11-20, 02:52 AM
^ This.

If it runs, I'll play it. But if it were a competition, obviously you'd want more FPS, and more FPS is more than welcome, but I don't mind 30 FPS at all.

Rolfski
2012-11-20, 09:49 AM
Besides the ongoing optimization from the SOE team, I really hope we will soon see new driver versions from Nvidia and AMD that give a significant performance increase in this game.

Knocky
2012-11-20, 09:59 AM
Besides the ongoing optimization from the SOE team, I really hope we will see new driver versions from Nvidia and AMD soon that give significant performance increase to this game.

Nvidia's Oct driver release more than doubled my FPS.

Have you tried it yet?

ChipMHazard
2012-11-20, 10:06 AM
Nvidia released a new beta driver today.

Bear
2012-11-20, 10:13 AM
Besides the ongoing optimization from the SOE team, I really hope we will see new driver versions from Nvidia and AMD soon that give significant performance increase to this game.

This might be of interest to you then! :)

http://www.reddit.com/r/Planetside/comments/13i3ha/geforce_31061_beta_drivers_an_essential_upgrade/

zerozeroseven
2012-11-20, 12:11 PM
3570K @ 4.8GHz
GTX 670 @ 1300/7000 or so
2560x1440

I manage 40-150 fps depending on what's on screen.

I was going to go SLI but have managed to hold off, as it has only annoyed me a few times.

TheBladeRoden
2012-11-25, 12:14 AM
I am proud to report I got a 66% increase in FPS in the largest battles!

CPU: Intel Core2 Quad Q6600 2.40 GHz
Motherboard: Intel DP35DP
Memory: 6 GB Kingston ValueRam

6fps to 10 fps :p