
Thread: DX11

  1. #31
    DX11 performance often depends on the implementation: which features are used, and how.


    For example, WoW just uses DX11 for multithreaded rendering, so in DX11 mode you get an FPS boost along with more eye candy, even though that eye candy is perfectly doable in DX9 alone.

    TSW doesn't seem to use multithreaded rendering, but it does have tessellation and SSAO (AO is actually DX10 stuff, but it's become more popular recently). So DX11 mode without the extra eye candy is pretty pointless, but with the eye candy on it looks very nice (though it stutters like a mother****er in London).

    Just two examples of MMOs with DX11 modes and how differently they've used it.
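    The multithreaded-rendering idea mentioned for WoW can be sketched conceptually in Python. This is a stand-in for the D3D11 pattern, not the actual API: worker threads each record a "command list" for their slice of the scene (deferred contexts in D3D11 terms), and the main thread then submits the finished lists in order, the way the immediate context executes them. All names here are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(chunk_id, draw_calls):
    # Stand-in for a worker thread recording draw calls into a
    # deferred command list (a D3D11 deferred context per thread).
    return [f"draw {chunk_id}:{i}" for i in range(draw_calls)]

def render_frame(num_chunks, draw_calls_per_chunk):
    # Record each chunk of the scene on its own worker thread...
    with ThreadPoolExecutor(max_workers=num_chunks) as pool:
        lists = list(pool.map(
            lambda c: record_command_list(c, draw_calls_per_chunk),
            range(num_chunks)))
    # ...then submit the finished lists in order on the main thread,
    # the way the immediate context executes command lists.
    submitted = []
    for cmd_list in lists:
        submitted.extend(cmd_list)
    return submitted

frame = render_frame(num_chunks=4, draw_calls_per_chunk=3)
print(len(frame))   # 12
```

    The win is that recording (the CPU-heavy part) happens in parallel, while submission stays serial and ordered, which is exactly why a DX11 mode can boost FPS without adding any new eye candy.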


    Although I very much like tessellation, not having it doesn't bother me at all. SSAO I couldn't care less about, since it's so subtle.

    ArcheAge also has DX11 and some other related Nvidia eye candy. It looks pretty nice in the video and in screenshots. The game is already going the wrong way, though: going from sandpark to WoW clone.

  2. #32
    Senior Member Pasha's Avatar
    Join Date
    Jan 2013
    Location
    Moscow
    Posts
    568
    On PC this game looks a bit better than LOTRO (which was designed for DX8). It's not ugly like The Secret World, but it's not "stunning" either. Unless the release is very different from the beta version.

    And either the whole engine, including animations, will be rebuilt for DX11 and the upcoming generation of consoles, or adding DX11 effects to these graphics and animations will be a waste of resources, just like adding DX9-10 effects to LOTRO didn't make the characters there any less primitive.

    Be ready to play this game without any graphical changes for a year (or years).

  3. #33
    Senior Member
    Join Date
    Mar 2013
    Posts
    250
    Please don't mention console games.

    Let's take Tomb Raider into account: such sexy graphics options for PC. Consoles, however, were not so lucky.

    So why should we be held back by consoles and people with five-year-old PCs? If they can do DX11, then great.

    WoW has DX11 support, and that game is ancient now.

    And console users don't care about PC, so why should we care about consoles?

  4. #34
    Senior Member HexCaliber's Avatar
    Join Date
    Mar 2013
    Posts
    115
    While this is not directly relevant to which version of DX Defiance uses, most of the talk about Defiance's graphics stems from its apparent quality.

    In talking to a couple of people in PMs, it has become abundantly clear that many people do not know what overscan does in this game's graphics options; as a result, many left it at the default and then complained about the graphics quality.

    Overscan in Defiance causes the game to render at a lower resolution to reduce memory overhead, improve FPS, etc., and then upscales that image to your monitor's resolution and renders it to your screen. At its default setting, the mid point, the game was being rendered at half resolution; anyone who did not touch overscan and thought there was a problem with Defiance's image quality needs to try the game with overscan set to zero.
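    To make the overscan trade-off concrete, here's a small Python sketch of the kind of scaling described above. The linear per-axis mapping is my assumption about how the slider works, not Trion's actual formula; the point is just that "half resolution" per axis means only a quarter of the pixels get shaded before the image is upscaled.

```python
def overscan_render_size(width, height, overscan):
    """Hypothetical model of the overscan slider: 0.0 renders at
    native resolution; 0.5 (the default mid point) halves each axis,
    so the upscaled image starts from 25% of the native pixel count."""
    scale = 1.0 - overscan  # assumed linear per-axis scale
    return int(width * scale), int(height * scale)

# At the default mid point, pixel count drops to a quarter of native:
w, h = overscan_render_size(1920, 1080, 0.5)
print(w, h)                      # 960 540
print((w * h) / (1920 * 1080))   # 0.25
```

    Shading a quarter of the pixels is a big FPS and memory win, but the upscale is also exactly why the image looked soft to anyone who left the slider at its default.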

    Oh, and as to the comment about DX9 being more machine friendly, I am afraid you are wrong: DX10/11 improved on texture compression, memory management, shader routines and more, significantly reducing overheads and improving performance.
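    As a rough illustration of the texture-compression point: Direct3D's block-compressed (BC) formats store each 4x4 pixel block in a fixed number of bytes, 8 for BC1 and 16 for BC7 (BC7 being one of the formats added in D3D11). A quick Python sketch of the memory math, with the texture size chosen purely for illustration:

```python
def uncompressed_rgba8_bytes(w, h):
    return w * h * 4  # 4 bytes per pixel, no compression

def block_compressed_bytes(w, h, bytes_per_block):
    # BC formats encode each 4x4 pixel block at a fixed size:
    # BC1 = 8 bytes per block (8:1 vs RGBA8), BC7 = 16 bytes (4:1).
    return (w // 4) * (h // 4) * bytes_per_block

MIB = 1024 * 1024
w = h = 1024
print(uncompressed_rgba8_bytes(w, h) / MIB)    # 4.0 (RGBA8)
print(block_compressed_bytes(w, h, 8) / MIB)   # 0.5 (BC1)
print(block_compressed_bytes(w, h, 16) / MIB)  # 1.0 (BC7)
```

    BC7 spends more bytes than BC1 but at far higher quality, which is part of why the newer API generations could raise texture quality without raising memory overhead.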

    As a result of the improvements in DirectX 10, many hobbyists were able to get away with DX10 emulation on DirectX 9 machines after 10 was released, and did so with a number of early DX10 titles. Most of the emulation didn't see general release, as the implementations were hacks and more than a little buggy, but one that allowed use of DirectX 10 functions in Stalker on DirectX 9 machines did appear as a mod for general use.
    Mankind's greatest weakness, and greatest strength, hope.

    Regards HexCaliber.

  5. #35
    Senior Member
    Join Date
    Mar 2013
    Posts
    151
    Quote Originally Posted by DustOfDeath View Post
    No PhysX - it's Nvidia card exclusive. Not everyone uses Nvidia cards, you know.
    Is it? I have it on my puter and I use AMD.

  6. #36
    Senior Member Otis Spunks's Avatar
    Join Date
    Mar 2013
    Location
    Sikeston, MO
    Posts
    112
    Quote Originally Posted by DustOfDeath View Post
    No PhysX - it's Nvidia card exclusive. Not everyone uses Nvidia cards, you know.
    Not true; PhysX can be forced on ATI cards via .ini files, which all games have if there are graphics settings. It's a simple .ini tweak. PhysX is only Nvidia-only when you have no clue what you are doing.
    "Your momma is so fat that her UV won't fit in a 1-0 space."
    "Your momma is so ugly that not even Vray can save her."

  7. #37
    Quote Originally Posted by Harsk View Post
    Is it? I have it on my puter and I use AMD.
    Nvidia owns PhysX:
    http://en.wikipedia.org/wiki/PhysX

    Yes, there were times Nvidia asked AMD about PhysX. AMD said no to it. Then again, AMD has its own physics. Hell, even Intel has its own physics (Havok). You don't need an Nvidia card to do physics. It's just another marketing gimmick from Nvidia.

    AMD uses Bullet Physics:
    http://www.amd.com/us/press-releases...009sept30.aspx
    http://en.wikipedia.org/wiki/Bullet_(software)

    Intel bought out Havok:
    http://www.pcworld.com/article/137232/article.html
    Havok Physics:
    http://www.havok.com/products/physics

    Defiance:
    http://www.havok.com/client-projects/games/defiance
    Developer Trion Worlds utilized Havok Physics and Havok Animation to power Defiance on Xbox 360, PlayStation 3, and PC.
    Guess who else uses Havok:
    http://www.havok.com/client-projects/games/guild-wars-2

    PS: If you want to see nice DX11, try BioShock Infinite.
    You could also try Heaven 4.0, now with DX11: http://unigine.com/products/heaven/
    and their new Valley: http://unigine.com/products/valley/

  8. #38
    Member Nelob's Avatar
    Join Date
    Mar 2013
    Location
    England
    Posts
    47
    Quote Originally Posted by HexCaliber View Post
    While this is not directly relevant to which version of DX Defiance uses, most of the talk about Defiance's graphics stems from its apparent quality.

    In talking to a couple of people in PMs, it has become abundantly clear that many people do not know what overscan does in this game's graphics options; as a result, many left it at the default and then complained about the graphics quality.

    Overscan in Defiance causes the game to render at a lower resolution to reduce memory overhead, improve FPS, etc., and then upscales that image to your monitor's resolution and renders it to your screen. At its default setting, the mid point, the game was being rendered at half resolution; anyone who did not touch overscan and thought there was a problem with Defiance's image quality needs to try the game with overscan set to zero.

    Oh, and as to the comment about DX9 being more machine friendly, I am afraid you are wrong: DX10/11 improved on texture compression, memory management, shader routines and more, significantly reducing overheads and improving performance.

    As a result of the improvements in DirectX 10, many hobbyists were able to get away with DX10 emulation on DirectX 9 machines after 10 was released, and did so with a number of early DX10 titles. Most of the emulation didn't see general release, as the implementations were hacks and more than a little buggy, but one that allowed use of DirectX 10 functions in Stalker on DirectX 9 machines did appear as a mod for general use.
    No, you're only right about a few games, and those games didn't do DX11 correctly; DX9 definitely uses fewer resources.

    You're telling me DX9 has worse FPS than DX11 with maxed tessellation? Just no. On good DX11 games, all the DX11 features tend to cost 5-20 FPS, heck, sometimes more, depending on how well the game is optimized.

  9. #39
    PhysX can be run on the CPU as well, and is then not exclusive to Nvidia, without any sort of "hack" or .ini tweak.

    It's also actually more than a marketing gimmick, but whatever makes you feel better about your AMD purchase. Assuming you have an AMD card, which I think is evident.

    Quote Originally Posted by Nelob View Post
    No, you're only right about a few games, and those games didn't do DX11 correctly; DX9 definitely uses fewer resources.

    You're telling me DX9 has worse FPS than DX11 with maxed tessellation? Just no. On good DX11 games, all the DX11 features tend to cost 5-20 FPS, heck, sometimes more, depending on how well the game is optimized.
    You're wrong. If anything, he was saying that DX9 vs DX11 with the exact same settings will prove DX11 to be overall superior in both resource management and in-game performance (arguably IQ to some extent as well). Obviously, using features such as tessellation will impact performance much more severely than not using them, and their impact will vary with their implementation and optimization.
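    A rough model of why maxed tessellation is so expensive, regardless of how well the rest of the DX11 path is optimized: with an integer tessellation factor f, each patch is subdivided into on the order of f² sub-triangles (D3D11 caps the factor at 64), so geometry work grows quadratically with the slider. A Python sketch, with the base triangle count purely illustrative:

```python
def tessellated_triangles(base_triangles, tess_factor):
    # Rough model: an integer tessellation factor f subdivides each
    # patch into on the order of f**2 sub-triangles, so geometry work
    # grows quadratically with the factor (D3D11 caps f at 64).
    return base_triangles * tess_factor ** 2

BASE = 10_000  # illustrative base mesh size
for f in (1, 8, 64):
    print(f, tessellated_triangles(BASE, f))
# 1 10000
# 8 640000
# 64 40960000
```

    That quadratic growth is why "DX9 vs DX11 at the same settings" and "DX9 vs DX11 with maxed tessellation" are two very different comparisons.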

  10. #40
    Member Nelob's Avatar
    Join Date
    Mar 2013
    Location
    England
    Posts
    47
    Quote Originally Posted by Smokey the Bear View Post
    PhysX can be run on the CPU as well, and is then not exclusive to Nvidia, without any sort of "hack" or .ini tweak.

    It's also actually more than a marketing gimmick, but whatever makes you feel better about your AMD purchase. Assuming you have an AMD card, which I think is evident.



    You're wrong. If anything, he was saying that DX9 vs DX11 with the exact same settings will prove DX11 to be overall superior in both resource management and in-game performance (arguably IQ to some extent as well). Obviously, using features such as tessellation will impact performance much more severely than not using them, and their impact will vary with their implementation and optimization.
    I see what you're saying; I'm talking about all the features of DX11.

    Again, *sigh*, that's not DX11 at its fullest.

    But the main argument, should DX11 be implemented... yes.
