View Full Version : Regarding PhysX and PS2


Death2All
2012-01-04, 08:19 PM
So we've been told that PS2 is going to utilize Nvidia PhysX to process its physics... Or at least I'm pretty sure that's what I heard.

I'm not especially adept with this, but would an Nvidia card presumably run PhysX better than an ATI card would? Thanks.

Zulthus
2012-01-04, 09:25 PM
ATI cards do not have the capability to process PhysX; they are not built to do so. You'll need an Nvidia card or have your CPU do all the work, and I wouldn't recommend that.


Also, nice flying.

SKYeXile
2012-01-05, 12:53 AM
you can run physx on a second nvidia card even with an ati card as primary. but i wouldn't worry about it, games have been processing physics for years on the CPU.

Zulthus
2012-01-05, 01:11 AM
Not necessarily true. Whether you can mix an ATI card and an Nvidia card depends on whether your motherboard supports it. Just because Windows 7 lets you have two separate drivers installed does not mean you can mix card brands. This motherboard, for example, http://www.amazon.com/dp/B0033C47XI/ can mix ATI and Nvidia.

So, I'd recommend you just get a single Nvidia card unless your motherboard can support both types.

EDIT: Okay, I'll clarify; you can use two different cards by THEMSELVES on the motherboard, but if it doesn't support mixing them, you can't CrosSLI or SLIFire them. You can use the ATI for one monitor and the Nvidia for another though.

Mutant
2012-01-05, 04:20 AM
So we've been told that PS2 is going to utilize Nvidia PhysX to process its physics... Or at least I'm pretty sure that's what I heard.

I'm not especially adept with this, but would an Nvidia card presumably run PhysX better than an ATI card would? Thanks.

PS2 is almost certainly using only the PhysX SDK (somewhat similar to Havok from Intel), which is currently in use by over 300 games, and not the hardware-accelerated version (which fewer than 30 games have ever used), so whether your graphics card is Nvidia or ATI won't matter for PhysX in PS2.

Goku
2012-01-05, 02:27 PM
I am still unsure what is going to be used. I just wish SOE would say which one it is for sure. If you are building now and want to use PhysX, you may be better off going with Nvidia just in case. You'll probably get better overall performance too, since SOE makes use of Nvidia-based systems.

Sentrosi
2012-03-09, 02:06 PM
Running Physx natively on the CPU with an ATI card (http://www.tomshardware.com/forum/334290-33-running-physx-natively-card)
I was thinking of giving this a try as I can't really upgrade my PC due to lack of funds, and thought my 6850 could get by with this enabled for PS2.

Thoughts?

Vancha
2012-03-09, 11:25 PM
Mutant/Goku has it right. We just don't know which "PhysX" is being used. I get the feeling it's the SDK and not the graphical effects, which would make GPU brand irrelevant, but we won't know for sure until it's confirmed.

The problem is, Nvidia are...sponsoring? SOE, which means they'll probably try and steer people towards Nvidia regardless, so the only way we'll really know is once we have some benchmarks.

Ailos
2012-03-09, 11:54 PM
I'm currently running a single 5870 myself, and though it's more than a year old I don't feel like upgrading it just yet. I'm leaning more towards the dedicated PhysX GPU - come beta if I find my framerates drop whenever there's an explosion in front of me, I'd buy this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130612 and slot that into my second PCI slot. Plus, I feel like that card will be capable of serving as the PhysX engine for a good while.

IronMole
2012-03-09, 11:59 PM
Higby stated that Nvidia cards will have an advantage with the PhysX engine.

Whalenator
2012-03-10, 01:13 AM
PhysX runs absolutely fine on any decent CPU.
Don't worry guys, I play Arma 2.

Vancha
2012-03-10, 02:19 AM
Higby stated that Nvidia cards will have an advantage with the PhysX engine.
Sebastian Vettel says Red Bull is the best tasting drink in the world.

IronMole
2012-03-10, 09:37 AM
Sebastian Vettel says Red Bull is the best tasting drink in the world.

Good for him?

Goku
2012-03-10, 10:22 AM
PhysX runs absolutely fine on any decent CPU.
Don't worry guys, I play Arma 2.

Is that the GPU or CPU type? If it's the GPU type, I highly doubt it's playable.

I could have sworn the devs confirmed that it is going to be the GPU-accelerated type, not CPU, as well.

Ailos
2012-03-10, 09:17 PM
I still don't know but like I said, I'm hoping to use Beta as a tester as well. I'll make the decision to buy a dedicated PhysX card or just use an nVidia card at that point. Even in the worst-case scenario, I feel like this problem will be easily solved. The only drawback is increased power consumption, but this rig is currently scheduled for a major tuning come Holiday season 2012 anyway, so it'll be a short-term solution.

Sibercat
2012-03-12, 08:19 AM
I'm pretty sure the devs told us that the PhysX they are implementing in PlanetSide 2 will work on both brands: NVIDIA and ATI/AMD GPUs.

Goku
2012-03-12, 09:27 PM
I'm pretty sure the devs told us that the PhysX they are implementing in PlanetSide 2 will work on both brands: NVIDIA and ATI/AMD GPUs.

The GPU-accelerated PhysX can only be run on Nvidia GPUs. No game has ever managed to use GPU PhysX on an AMD card. You can attempt to use your CPU to run the GPU PhysX when using an AMD card, but expect really awful frame rates. From what I understand they are using the GPU PhysX with Nvidia. I'm going to dig this up to see what they said again.

basti
2012-03-13, 12:22 PM
The actual question is whether you should get an ATI or an Nvidia GFX card. And the answer is simple: get an Nvidia if you want quality, an ATI if you want to throw your cash away.

Vancha
2012-03-13, 03:58 PM
The actual question is whether you should get an ATI or an Nvidia GFX card. And the answer is simple: get an Nvidia if you want quality, an ATI if you want to throw your cash away.
Would you like to elaborate, or are you being purposefully unconstructive?

I could just as easily say "Get an ATI if you want quality, or Nvidia if you want to throw your cash away!"

Right now, the only arguable waste of cash would be a high-end Nvidia (hence the price drop they're getting soon). After that, everything should be fairly well priced in relation to each other, albeit rather expensive in general.

Ailos
2012-03-13, 08:36 PM
The actual question is whether you should get an ATI or an Nvidia GFX card. And the answer is simple: get an Nvidia if you want quality, an ATI if you want to throw your cash away.

That's just you being an nVidia fanboy. If the game isn't specifically built to use GPU PhysX for every special effect, there is no appreciable difference between nVidia and ATI cards, except that nVidia cards tend to be slightly more expensive.

Would you like to elaborate, or are you being purposefully unconstructive?

I could just as easily say "Get an ATI if you want quality, or Nvidia if you want to throw your cash away!"

Right now, the only arguable waste of cash would be a high-end Nvidia (hence the price drop they're getting soon). After that, everything should be fairly well priced in relation to each other, albeit rather expensive in general.

I'll agree with you on that one. Personally, I've never bought anything I considered to be beyond the point of reason (generally, around the $300 mark for each component). Such equipment depreciates incredibly quickly, because the cutting edge of any electronics gets dull faster than the silicon it's made from, and also because the newest cutting-edge technologies frequently suffer from bugs and performance issues that slipped past quality control.


Back to a more relevant topic: the reality is that if SOE decide to, they can make the game run just as well on an AMD system as it would on a system with an nVidia GPU, by tweaking some settings in the PhysX engine to let it run on the CPU. And in all reality, they may end up having to do that anyway if they are to meet their goal of making this game run on 4-to-5-year-old hardware. I also don't see any drawbacks to them doing that here. Here's why:

The kind of PhysX that absolutely requires an nVidia GPU (at least currently) is the one that simulates special effects (such as explosions) as particle phenomena. In other words, things like explosions, waterfalls, or fog are treated as millions, billions, and sometimes trillions of individual tiny particles and are rendered as such. Rendering these kinds of occurrences requires insane amounts of repetitive, simple, mind-numbing calculations - something a GPU is much better suited for. Even though GPUs typically have clock speeds of only around 1 GHz (compared to a CPU's ~3 GHz), their GDDR4 or GDDR5 memory (which can run as high as 4 GHz) allows a vastly higher rate of data transfer between the processor chips and memory, letting the pixels be calculated in time for the frame to be refreshed. To describe it another way, this kind of PhysX modelling is very similar to the kind fluids engineers use to study and simulate aerodynamics and fluid flow, when they simulate the flow by modelling the fluid's individual molecules.
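To make the scaling concrete, here's a toy sketch (plain Python, not the PhysX API) of a naive per-frame particle update. The point is that every particle needs its own read-modify-write each frame, so the cost grows linearly with particle count, and at millions of particles the memory traffic, not the arithmetic, becomes the bottleneck on a CPU:

```python
def step_particles(positions, velocities, gravity=-9.81, dt=1.0 / 60.0):
    """Advance every particle by one frame (semi-implicit Euler).

    Each particle is independent: the same trivial update repeated
    over and over, which is exactly the workload a GPU parallelizes.
    """
    for i in range(len(positions)):
        vx, vy, vz = velocities[i]
        vy += gravity * dt                      # accumulate gravity
        velocities[i] = (vx, vy, vz)
        x, y, z = positions[i]
        # Move using the updated velocity (semi-implicit Euler)
        positions[i] = (x + vx * dt, y + vy * dt, z + vz * dt)

# A single explosion effect might use tens of thousands of particles;
# a CPU walks this loop serially, while a GPU runs thousands of these
# identical updates at once.
positions = [(0.0, 10.0, 0.0)] * 1000
velocities = [(1.0, 0.0, 0.0)] * 1000
step_particles(positions, velocities)
```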

CPUs, on the other hand, are more complex chips, use a higher level of instructions, and are suited to more complex operations. Furthermore, CPUs don't have immediate access to gigabytes of memory - their on-board caches typically total on the order of a dozen MBs. And while they may have access to 8 GB or more of RAM, that sits further away, past a controller, and usually doesn't operate above 1.6 GHz. Though calculating the trillions of pixels per second needed to render isn't difficult for the CPU, the amount of data exchange required to do this in a timely manner presents a problem for the memory controllers. Even if the interface bus is wide enough to accommodate the bandwidth, the CPU's playground (RAM) will simply be unable to keep up (by the time the CPU's 3 GHz clock requests another chunk from RAM, the memory is still writing back the last chunk the CPU sent it).

However, the CPU has absolutely no problem assisting with, or outright controlling, the macroscopic simulations - larger and more complex objects, like bullets and tanks, are at their heart still single-step calculations. A CPU would have absolutely no problem performing these calculations in time; in fact, a faster clock speed and multi-threading matter much more here, because the calculations are no longer as repetitive and each one will be different from the next. There is less data that must change hands, but more manipulation must be done between exchanges.
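For contrast with the particle case, here's another toy sketch (again not PhysX code): a projectile's flight can be answered with a couple of closed-form multiplies per query, which is why even thousands of shells per frame are trivial for a CPU:

```python
import math

def time_of_flight(v0, angle_deg, g=9.81):
    """Seconds until a projectile launched from ground level lands."""
    return 2.0 * v0 * math.sin(math.radians(angle_deg)) / g

def impact_range(v0, angle_deg, g=9.81):
    """Horizontal distance travelled before landing."""
    return v0 * math.cos(math.radians(angle_deg)) * time_of_flight(v0, angle_deg, g)

# One tank shell = a handful of arithmetic operations, not millions of
# coupled particle updates; this is the "macroscopic" workload above.
shell_range = impact_range(100.0, 45.0)
```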

Based on what I saw from the GDC demo, the game is probably not using the particle special effects version of the PhysX engine, primarily because muzzle flashes look like animations. It's a bit harder to tell with the fog and the dust that comes up from tank cannons and whatnot, but with a bit of CPU optimization, it shouldn't be too hard for the team to make it run well on any decent-speed CPU.

Sibercat
2012-03-14, 06:55 PM
I asked Higby about it and he told me this :


Matthew Higby:

"PhysX will work on all GPUs. Some features are exclusive to nVidia GPUs, but they'll be FX not gameplay." from the Tech Dir.

Goku
2012-03-14, 07:04 PM
FX?

Sibercat
2012-03-14, 07:06 PM
FX?

Like particle effects, grass, most likely cloth physics, stuff like that.

Goku
2012-03-14, 07:19 PM
Like particle effects, grass, most likely cloth physics, stuff like that.

Yup sounds about right. Thanks for the clarification.

Sibercat
2012-03-14, 07:26 PM
Np bro

Ailos
2012-03-14, 09:20 PM
Ah well then, I am gonna be needing an nVidia PhysX dedicated GPU.

Fenrys
2012-03-14, 11:15 PM
I'm OK with not seeing peoples' majestically flowing capes, and I'm glad I won't need to add a dedicated PhysX card :)

Ailos
2012-03-19, 10:00 AM
Just thought of an interesting idea for those of us who (like me) opt to have a dedicated PhysX GPU and an AMD primary - you could have a PhysX-enabled Eyefinity group! 4800x900 resolution anyone?

Mutant
2012-03-20, 06:51 AM
Just thought of an interesting idea for those of us who (like me) opt to have a dedicated PhysX GPU and an AMD primary - you could have a PhysX-enabled Eyefinity group! 4800x900 resolution anyone?

I already have a 6144x1158 Eyefinity setup, but there's no confirmation on if/how well it will be supported.

I also have a GTX 260 sitting on my desk I could use for PhysX, although I could probably pay for a newer card in a short time with the power savings alone.

Ailos
2012-03-20, 06:14 PM
Well, that's the hitch, innit? I think the really simple way for SOE to support such a wide resolution is letting us pick an FOV cone of up to 120 degrees if we're using a resolution with a wider ratio than 16:9. Can't imagine having any other issues beyond that.
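For what it's worth, the usual way engines widen the view for wider aspect ratios (a generic trigonometry sketch, no claim about how PS2 actually does it) is to hold the vertical FOV fixed and derive the horizontal FOV from the aspect ratio:

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Horizontal FOV implied by a fixed vertical FOV and an aspect ratio.

    hFOV = 2 * atan(tan(vFOV / 2) * aspect)
    """
    half_v = math.radians(vertical_fov_deg) / 2.0
    aspect = width / height
    return math.degrees(2.0 * math.atan(math.tan(half_v) * aspect))

# A ~59 degree vertical FOV is about 90 degrees horizontal at 16:9;
# the same vertical FOV over a 6144x1158 Eyefinity group comes out
# far wider without distorting the vertical view.
normal = horizontal_fov(59.0, 1920, 1080)
eyefinity = horizontal_fov(59.0, 6144, 1158)
```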

Oryon22
2012-04-03, 12:57 PM
GTX680 powered grass baby!

Sabin
2012-05-05, 04:19 PM
Not to necro... but couldn't a big disadvantage of PhysX for PS2 be that certain effects could obscure your vision? Be it fog or some type of fabric blowing around - people who don't have it enabled won't have the same issue, and could possibly see you more easily.

PhysX seems nice for single player games, but seems like it could put you at a disadvantage in multiplayer competitive games.

Rbstr
2012-05-06, 11:24 AM
They're probably not adding any PhysX effects solely for aesthetic purposes. It's mostly used for vehicle and projectile physics.

Yeah and that's why PhysX performance isn't something to be terribly worried about. PhysX hardware acceleration doesn't help much with rigid body stuff.