Intel Haswell-E review: The best consumer performance chip you can buy – with some caveats
on August 29, 2014 at 12:00 pm
Today, Intel is launching its long-awaited update and refresh of its top-end enthusiast platform: Haswell-E. It’s a refresh that’s been a long time coming — the old X79 chipset was long in the tooth 12 months ago, while the top-end six-core Ivy Bridge didn’t overclock well and wasn’t a huge improvement over the six-core Sandy Bridge. Last year’s Core i7-4960X wasn’t panned, but it didn’t do much for hardcore enthusiasts — and the debut of Devil’s Canyon, with its significantly faster clock and far cheaper price, eroded that difference even more.

Today, Intel is changing that with a new eight-core CPU — Haswell-E — a new lineup of LGA2011 products, and the X99 chipset. Will the new hardware put a fresh coat of paint on a lackluster lineup? Let’s take a look.
The CPUs

In the past, Intel followed the same pattern for product launches. The SNB-E and IVB-E series both debuted at approximate price points of $1050, $600, and $330 respectively; the top two chips were hexa-core, while the third was a quad-core. Today, that changes — Intel’s quad-core enthusiast-platform processor is going away altogether, as shown below:

Intel’s new Haswell-E lineup

This shift introduces some significant changes to Intel’s total product stack. The price for a six-core desktop chip is coming down sharply, from roughly $580 to $380. The clock drop isn’t quite as significant as it seems, and there’s an important difference between the Core i7-4790K and the Core i7-5000 family: these new chips clock up to full Turbo mode and stay there, even when running a program like Prime95. The Core i7-4790K and Core i7-4770K, in contrast, tend to top out around 4.2GHz and 3.8GHz respectively when running all cores simultaneously.

Thus, the Core i7-5820K offers a 50% core increase for an effective 16% frequency decrease — along with more PCI Express lanes and twice the RAM channels. Overall, it’s probably the strongest part in the lineup for pure CPU work — assuming you don’t want to add a second GPU. The Core i7-5960X, in contrast, is a bit harder to pin down. Intel’s octa-core chip still adds 33% more cores compared to the hexa-core variety, but trades off 12.5% of its frequency to do so and runs about 17% slower than the effective top frequency of the Core i7-4790K. That’s still more than enough frequency to beat out Intel’s other chips on multi-threaded workloads, but single-threaded performance is going to be another story. Before we dive into that comparison, though, let’s take a look at the new chipset.

Intel’s X99 reclaims the feature crown

The X99 chipset is an incremental update, but it’s significant enough to reclaim the overall chipset crown. Unlike the X79, which was limited to just two SATA 6G ports, the new X99 packs 10 SATA 6G ports. The X79 also lacked integrated support for USB 3.0, whereas the X99 has six USB 3.0 ports and eight USB 2.0 ports.
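The core-count-versus-frequency trade-offs described above can be sanity-checked with a naive throughput model. This is only a sketch: the sustained all-core clocks below are illustrative assumptions chosen to match the article's percentages, not Intel's official specifications, and real scaling is rarely this linear.

```python
# Naive scaling model for the core-count vs. frequency trade-off:
# multi-threaded throughput ~ cores * sustained clock.
# Clock values are illustrative assumptions, not official specs.

def relative_throughput(cores, ghz, base_cores, base_ghz):
    """Throughput relative to a baseline chip, assuming perfect scaling."""
    return (cores * ghz) / (base_cores * base_ghz)

# (cores, assumed sustained all-core GHz)
i7_4790k = (4, 4.2)   # quad-core Devil's Canyon baseline
i7_5820k = (6, 3.6)   # six-core Haswell-E
i7_5960x = (8, 3.5)   # eight-core Haswell-E

for name, (cores, ghz) in [("5820K", i7_5820k), ("5960X", i7_5960x)]:
    ratio = relative_throughput(cores, ghz, *i7_4790k)
    print(f"i7-{name}: {ratio:.2f}x the naive throughput of the 4790K")
```

Under these assumptions the 5820K comes out roughly 29% ahead of the 4790K on perfectly threaded work, despite its lower clock — which is why the single-threaded story is the interesting one.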
Next up, there’s the most obvious upgrade — DDR4-2133 support. Intel’s marketing documents are tilted to make this appear like a bigger jump than it really is; the X79 may have only formally supported DDR3-1600, but manufacturers regularly validated DDR3-2133 and even DDR3-2400 on X79 motherboards. The good news is that this is somewhat replicated on the X99: the motherboard supports up to DDR4-2800 (we tested 16GB of DDR4-2667). Unfortunately, support for this standard is still a little flaky — the Asus board we benchmarked could only run its multipliers at DDR4-2667 if we increased the CPU base clock to 125MHz and brought the multipliers down to compensate.

Now, the real-world impact on benchmarks should be negligible — the base clock boost won’t impact PCI Express frequencies or other peripherals — but early support is still, well, early. Also bear in mind that DDR4 still commands an enormous price premium, even if DDR3 prices are up sharply from their trough nearly two years ago. 16GB kits of CAS 11 DDR3-2400 can be had for around $180, compared to $400 for our Corsair DDR4-2667.
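The base-clock workaround above is simple arithmetic: every clock on the platform is the base clock times a multiplier, so raising BCLK from 100MHz to 125MHz while lowering the CPU multiplier leaves the CPU frequency unchanged but shifts the available memory straps. The multiplier values below are assumptions chosen to make the arithmetic concrete, not the Asus board's actual firmware settings.

```python
# Sketch of the base-clock / multiplier arithmetic behind the
# DDR4-2667 workaround. Multiplier values are illustrative
# assumptions, not the board's real firmware settings.

def effective_clock(bclk_mhz, multiplier):
    """Final clock in MHz: base clock times multiplier."""
    return bclk_mhz * multiplier

cpu_stock = effective_clock(100, 35)      # 3500 MHz at stock 100 MHz BCLK
cpu_tuned = effective_clock(125, 28)      # still 3500 MHz: multiplier lowered to compensate
mem_tuned = effective_clock(125, 21.33)   # ~2666 MT/s memory strap at 125 MHz BCLK

print(cpu_stock, cpu_tuned, mem_tuned)
```

The point is that the CPU ends up exactly where it started; only the memory clock (and anything else derived from BCLK) moves.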
Your question
DirectX 9.0 vs 10.0
Graphics card
1. You can install DirectX 10, assuming you have Vista.
2. Umm... generally better visual effects
3. Download DirectX 10a from Microsoft's site if you are paranoid.
I don't have Vista, does it make a difference?
um... reply to the op...
1... dx10 is NOT INSTALLABLE
2... dx10 looks a little better than dx9... very little...
3... again... dx10 is not a downloadable program...
dx10 is only for win vista... and it comes with it...
dx10 is NOT compatible w/ xp or anything below
Just to clarify, I thought DirectX 10 was for people with Windows Vista Ultimate edition. I was unaware it was offered with any other version, or is this not true anymore???
Update: Never mind, I just looked it up. I was incorrect; it's on all of Vista, not just Ultimate.
Hi Rojito.
Ok, let's give you a little background here. DirectX is what is called an API. API stands for Application Programming Interface. An API is basically software that responds to requests from either hardware or other computer programs. In DirectX's case, the API responds to requests from 3D graphics cards as well as the PC games that run on them.
Unfortunately for gamers, or those who are perfectly satisfied with using Windows XP, Microsoft has elected to only allow DirectX 10 to be used on Windows Vista. This means that if you're running Windows XP, chances are that you'll never be able to use DirectX 10 unless you purchase a copy of Windows Vista and install it on your machine. This is, in my opinion, one tactic Microsoft is using to force gamers into adopting their new operating system.
Anyway, to answer your questions in order - here you go:
1. You can't install DirectX 10 Without Windows Vista. If you're okay with Windows Vista then yes, install it.
2. The benefits of DirectX 10 are many, primarily once developers really start to optimize for it. The number one benefit of DirectX 10 is pixel shader 4.0 also known as the geometry shader. As I understand it, the geometry shader allows greatly increased detail and realism without a serious performance impact. DirectX 10 is also supposed to run faster and more efficiently than DirectX 9. Key word is SUPPOSED....doesn't mean it really will....yet.
3. See the above answer to question 1.
Hope this helps.
jpmeaney speaks da truth.
Just one thing to add to his statement. Don't worry much about DX10 for now. Games usually have a development cycle of two years or more, so games that FULLY utilize DX10 are still to come. And I mean FULLY.
You already have a few examples, but they're just a few, and basically they all play in DX9 (which you have installed). So don't worry much about it.
radnor said:jpmeaney speaks da truth.
Just one thing to add to his statement. Dont worry much about DX10 for now. Games ussually have a Developing Cycle of 2 Years or more, so games that FULLY utilize DX10 are still to come. And i mean FULLY.
You have already few examples, but their just few, and basicly they all play in DX 9 (that you have installed). So dont worry much about it.
Totally agree, don't worry about it. For now you have a graphics card that rocks and will play any game you want to throw at it. I don't think DX10 is a reason to switch to Vista.
Quote:...It adds scheduling and memory virtualization capabilities to the graphics subsystem and foregoes the current DirectX practice of using "capability bits" to indicate which features are active on the current hardware. Instead, Direct3D 10 defines a minimum standard of hardware capabilities which must be supported for a display system to be "Direct3D 10 compatible". Microsoft's goal is to create an environment for developers and designers where they can be assured that the input they provide will be rendered in exactly the same fashion on all supported graphics cards. This has been a recurring problem with the DirectX 9 model, where different video cards have produced different results, thus requiring fixes keyed to specific cards to be produced by developers.
According to Microsoft, Direct3D 10 will be able to display some graphics up to 8 times faster than DirectX 9.0c because of the new improved Windows Display Driver Model. In addition, Direct3D 10 incorporates Microsoft's High Level Shader Language 4.0. However, Direct3D 10 is not backward compatible like prior versions of DirectX. The same game will not be compatible with both Direct3D 10 and Direct3D 9 or below. Games would need to be developed for both APIs, one version for Direct3D 9 and below if targeting Windows versions prior to Windows Vista and another version using Direct3D 10 if targeting only Windows Vista. Windows Vista does, however, contain a backward compatible Direct3D 9 implementation.
The Direct3D 10 API introduces unified vertex and pixel shaders. In addition, it also supports Geometry Shaders, which operate on entire geometric primitives (points, lines, and triangles), and can allow calculations based on adjacent primitives as well. The output of the geometry shader can be passed directly onwards to the rasterizer for interpolation and pixel shading, or written to a vertex buffer (known as 'stream out') to be fed back into the beginning of the pipeline.
D3D10 functionality requires WDDM (Windows Display Driver Model) and new graphics hardware. The graphics hardware will be pre-emptively multithreaded, to allow multiple threads to use the GPU in turns. It will also provide paging of the graphics memory.
The version of Direct3D 9 available in Windows Vista is called Direct3D 9Ex. This modified API also uses the WDDM and allows Direct3D 9 applications to access some of the features available in Windows Vista such as cross-process shared surfaces, managed graphics memory, prioritization of resources, text anti-aliasing, advanced gamma functions, and device removal management.
When comparing an XP machine to a Vista machine, what people are getting/seeing in games now is the difference between DX9a/b/c and DX9ex.
There are *no* true DX10 games on the market today, as evidenced by the simple fact that you can run the same titles, from the same discs, on both XP and Vista.
A true DX10 game would not run at all on the XP box.
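The geometry-shader "stream out" path described in the excerpt above — a stage that consumes whole primitives, may emit extra geometry, and can write its output to a buffer that feeds back into the start of the pipeline — can be modeled with a toy example. This is purely a conceptual sketch in Python, not real Direct3D; the subdivision rule is an arbitrary choice to show the feedback loop.

```python
# Toy model of the D3D10 geometry shader + stream-out loop:
# a stage operating on whole primitives (here, line segments)
# whose output can be fed back into the pipeline.

def geometry_shader(primitive):
    """Split one line segment into two at its midpoint (emits extra geometry)."""
    (x0, y0), (x1, y1) = primitive
    mid = ((x0 + x1) / 2, (y0 + y1) / 2)
    return [((x0, y0), mid), (mid, (x1, y1))]

def stream_out(primitives, passes):
    """Run the shader repeatedly, feeding its output buffer back as input."""
    for _ in range(passes):
        primitives = [p for prim in primitives for p in geometry_shader(prim)]
    return primitives

segments = stream_out([((0.0, 0.0), (8.0, 0.0))], passes=3)
print(len(segments))  # one segment subdivided into 8 after three feedback passes
```

On real hardware the same loop runs on the GPU: the stream-out buffer is a vertex buffer, so no round-trip to the CPU is needed between passes.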
jpmeaney, mostly good points, but a little wrong in subtle ways.
DirectX enhancements are of a relatively minor variety, to the end user.
They are pretty significant architectural changes under the hood, though.
That means eventually, once the developers optimise for DX10, you should get better visuals and faster code.
Unified shaders, more constants, SM4.0, virtualization, no cap bits - all good good things.
It CAN NOT be ported to WinXP without a major rewrite of the XP kernel, also known as making XP like Vista - makes absolutely no sense whatsoever.
In terms of downloading - I wouldn't worry about it.
Every game that requires DX comes with it on the media, so it will get installed when you install the game if you need it.
In terms of worrying about DX10 - right now, running most games in DX10 mode comes with a performance hit (that is, the most recent eye candy is generally modest and comes with a big performance hit, until developers optimise better and the hardware gets better).
And last but not least, 8800 GT, in fact, IS a DX10 GPU.
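The "no cap bits" point above is worth unpacking: under DX9, developers queried per-card capability bits and branched, while DX10 defines one guaranteed minimum feature set. The sketch below contrasts the two styles; the feature names are illustrative, not real Direct3D capability flags.

```python
# Contrast: DX9-style capability bits (per-card branching) vs. the
# DX10 approach (one guaranteed minimum standard).
# Feature names here are illustrative, not real D3D caps.

DX10_REQUIRED = {"unified_shaders", "sm4", "geometry_shader"}

def render_path_dx9(card_caps):
    # Every combination of caps potentially needs its own fix-ups,
    # which is how per-card workarounds crept into DX9 games.
    if "float_textures" in card_caps and "sm3" in card_caps:
        return "high-quality path"
    return "fallback path"

def render_path_dx10(card_caps):
    # Binary contract: either the card meets the minimum or it isn't
    # "Direct3D 10 compatible" at all.
    if DX10_REQUIRED <= card_caps:
        return "single guaranteed path"
    raise RuntimeError("not Direct3D 10 compatible")

print(render_path_dx9({"sm3"}))
print(render_path_dx10(DX10_REQUIRED | {"dx10_1_extras"}))
```

The trade-off is exactly the one debated in this thread: developers get one predictable path, at the cost of excluding every card (and OS) below the line.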
Scotteq said:Quote:...It adds scheduling and memory virtualization capabilities to the graphics subsystem and foregoes the current DirectX practice of using "capability bits" to indicate which features are active on the current hardware. Instead, Direct3D 10 defines a minimum standard of hardware capabilities which must be supported for a display system to be "Direct3D 10 compatible". Microsoft's goal is to create an environment for developers and designers where they can be assured that the input they provide will be rendered in exactly the same fashion on all supported graphics cards. This has been a recurring problem with the DirectX 9 model, where different video cards have produced different results, thus requiring fixes keyed to specific cards to be produced by developers.
According to Microsoft, Direct3D 10 will be able to display some graphics up to 8 times faster than DirectX 9.0c because of the new improved Windows Display Driver Model. In addition, Direct3D 10 incorporates Microsoft's High Level Shader Language 4.0. However, Direct3D 10 is not backward compatible like prior versions of DirectX. The same game will not be compatible with both Direct3D 10 and Direct3D 9 or below. Games would need to be developed for both APIs, one version for Direct3D 9 and below if targeting Windows versions prior to Windows Vista and another version using Direct3D 10 if targeting only Windows Vista. Windows Vista does, however, contain a backward compatible Direct3D 9 implementation.
The Direct3D 10 API introduces unified vertex and pixel shaders. In addition, it also supports Geometry Shaders, which operate on entire geometric primitives (points, lines, and triangles), and can allow calculations based on adjacent primitives as well. The output of the geometry shader can be passed directly onwards to the rasterizer for interpolation and pixel shading, or written to a vertex buffer (known as 'stream out') to be fed back into the beginning of the pipeline.
D3D10 functionality requires WDDM (Windows Display Driver Model) and new graphics hardware. The graphics hardware will be pre-emptively multithreaded, to allow multiple threads to use the GPU in turns. It will also provide paging of the graphics memory.
The version of Direct3D 9 available in Windows Vista is called Direct3D 9Ex. This modified API also uses the WDDM and allows Direct3D 9 applications to access some of the features available in Windows Vista such as cross-process shared surfaces, managed graphics memory, prioritization of resources, text anti-aliasing, advanced gamma functions, and device removal management.
When comparing an XP machine to a Vista machine, what people are getting/seeing in games now is the difference between DX9a/b/c and DX9ex.
There are *no* true DX10 games on the market today, as evidenced by the simple fact that you can run the same titles, from the same discs, on both XP and Vista.
A true DX10 game would not run at all on the XP box.
Didn't know that info from Wikipedia. Although it seems hard to believe. If it were true, I guess everybody would welcome it with open arms, instead of what is happening atm. There isn't much to the discussion between two different APIs (games have been made for DX/OpenGL before, for example) and that didn't seem to be too much of a problem. I believe there is more to DX10 than meets the eye. For example, Microsoft charging dearly for digitally signed drivers from the manufacturers. And other off-topic related situations.
Not calling you a liar or so, don't get offended; the info comes from Wikipedia after all (it isn't bulletproof, but usually pretty reliable). This I'll wait to see proven; it's too good to be true. And M$ isn't known for making flawless products. But it is known for great marketing campaigns.
According to Microsoft, Direct3D 10 will be able to display some graphics up to 8 times faster than DirectX 9.0c because of the new improved Windows Display Driver Model.
No one will make a DX10-only game yet, since it would hurt sales of the game. Not enough demand.
Rad - Thanks for not calling me a liar...
The simple fact is the two are different APIs, and code written for one is not compatible with the other.
In the case of a Vista machine, the game runs on DX9ex - Which is a DX9 API set that includes some extra functionality and the ability to communicate graphics via WDDM as Vista requires.
There are a couple of layers of marketing fluff here:
- On the MSFT side, add a new functionality (shaders, or whatever) in 10 that didn't previously exist, then figure out how to duplicate the effect in 9, measure the difference...
10 is faster!!
Or, if you don't like that one - How about taking into account that the original purpose of DirectX was to cut through the layers of the Operating System to talk 'Direct'~ly to the hardware for best performance...
And then have the MSFT turn around and add hooks to a specific OS??
What about standards??
But then when you have 80+% of the market, and can rightfully call yourself "The Standard", does it really matter??
What about the perception/reality/idea/want/need that DX9 isn't sufficient any more, that a new model really is needed to drive performance and visual improvements, and coming to the conclusion that 9 has to be trashed...
How do you make people swallow that The Cord Must Be Cut when the install base is *that* large??
On the side of the game creators - Is it really right to add some (undefined, unexplained, and unnamed) new DX9ex commands to your DX9 game and call that "DX10"??
They can say "it's DX10 functionality" for sure, and that because it is DX10 functionality, then the game must be DX10...
But it isn't really DX10, after all, is it??
But then, they've also been put in the position of literally having to develop TWO COMPLETE versions of the same game for their XP and Vista user bases.
Is it a sound business decision to do that??
Especially when a Vista machine will run your DX9 stuff??
Or is it easier just to put a DX10 label on it and keep your mouth shut?
At the various benchmarkers and reviewers - Is it right to measure the difference between DX9c on an XP box and DX9ex on a Vista machine, and call that a comparison of 9 to 10??
You would think these people intelligent enough to (1) understand and communicate that nothing exists for 10 yet, and (2) understand and communicate that what's really being measured is 9c and 9ex , and (3) understand and communicate the extra work involved with DX9ex having to translate what is really a DX9 game to the new WDDM??
Also - and understand that the following doesn't apply to the far-less-than-5% of the market who are technicians and enthusiasts -
Since it *is* a bigger and more complex story than anyone is really saying, and since the general user population only cares that they got their email, that the internet works, and that the game plays - does it really make sense to widely publicise the actual differences to people who...
*cue Old Guy Voice*
Don't Know...
Don't Care...
DX10.1 is where it is at
maximiza said:DX10.1 is where it is at
OpenGL FTW !! *Tinfoil hat* ok ok, i stop trolling.
maximiza said:DX10.1 is where it is at
Hardly......
Other posters have hit it on the head.
You will barely notice the difference while playing unless you pause the game and look at the DX9 screen next to a DX10 screen.
Only one thing is for sure: your wallet will be lighter, because you have to buy Vista, and then depending on your setup you'll need to bump to 4 gigs of RAM, because Vista is a pig.
maximiza said:DX10.1 is where it is at
Yesh lol, by the time that gets ported to games I think we'll all be installing Windows 7 and going to DX11, which will be in the same 'state' DX10 is in right now.
DX10.1 IS where it's at, but there's no one else there. Which means it's ultra exclusive, of course.
Anywhoo, unlike what alpha says, it is downloadable, but strictly for Vista users, and thank god that's true because there are a ton of corrupt installs out there.
But if you're on XP, just check for the latest DX9.0c redistributable if you have concerns.
However, as mentioned, most games ship with the latest DX level they require, which is also a good thing because stuff is added/subtracted all the time.
I was under the impression that DX10.1 was what DX10 was supposed to be, had Nvidia been able to make it work. Indexable cube map arrays (sweet global illumination) and AA standardization (no more crappy EdgeAA!) are not features to be scoffed at.
I suspect that by the time we start seeing "true" DX10 games (no, Crysis does not count) Nvidia will have DX10.1-compliant hardware available.
Jpmeaney said:Hi Rojito.
Ok, lets give you a little background here. DirectX is what is called an API. API stands for Application Programming Interface. An API is basicly software that responds to requests from either hardware or other computer programs. In DirectX's case, the API responds to the requests from 3d graphics cards as well as the PC games that run on them.
Unfortunately for gamers, or those who are perfectly satisfied with using Windows XP, Microsoft has elected to only allow DirectX 10 to be used on Windows Vista. This means that if you're running Windows XP, chances are that you'll never be able to use DirectX 10 unless you purchase a copy of Windows Vista and install it on your machine. This is in my opinion, one tactic Microsoft is using to force gamers into adopting their new Operating System.
Anyway, to answer your questions in order - here you go:
1. You can't install DirectX 10 Without Windows Vista. If you're okay with Windows Vista then yes, install it.
2. The benefits of DirectX 10 are many, primarily once developers really start to optimize for it. The number one benefit of DirectX 10 is pixel shader 4.0 also known as the geometry shader. As I understand it, the geometry shader allows greatly increased detail and realism without a serious performance impact. DirectX 10 is also supposed to run faster and more efficiently than DirectX 9. Key word is SUPPOSED....doesn't mean it really will....yet.
3. See the above answer to question 1.
Hope this helps.
Yes, mostly.
I thought Vista came with DX10, and SP1 came with 10.1?
I read the same load of wonderful things about geometry, detail, realism, etc. without a big performance impact. I still believe it's possible. However, I don't know of ANY current game (even Crysis) that can pull this off.
Maybe we need to wait for Elder Scrolls 5 - but by then, DX11 will be out.
Yeah, the main drawback is that while all of these things are more efficient, the devs haven't just gone that route; they've also increased the workload. So while it may be 3 times as efficient, often they've increased the amount of work by 5+ times, and then say "Oh yeah it's slow, but it's doing so much work that a DX9 card would take X times as long to do it".
To me there are two options for DX10, and they're only doing the slightly shinier version rather than the more efficient version. And not everyone is going to have a high-end card, and when the Shiny version runs like crap even on a higher-end card, what's the point? A little more focus on offering some efficiency to the mid range would be nice. Seriously, Crysis at DX9 high settings running with a 20-50% speed boost would probably be more attractive to many people than the slightly shinier model running at 70% of the speed of DX9.
MS claimed that DX10 would dramatically reduce the instruction length needed to communicate with the video card.
However, they also made it default to multi-threaded mode (DX9 does not), which means that it must use locks when updating resources.
They thought that having the video card do more would make it faster.
Then, the games that were written for DX10 were not just rewrites of current games.
They (Crysis developers for example) wanted to make their games even more realistic than previous games.
So, in spite of any efficiency achieved with DX10, the games written for it piled on visual effects that more than compensate for that efficiency.
Meanwhile, Nvidia and AMD are trying to make DX10 cards that can do the eye candy without being too expensive.
This has not worked out, as it is obvious you need to spend a lot of green to get card(s) that can do it.
Now we have quad core CPUs and, as Intel is saying, the rendering pipeline can be moved back to the CPU.
We will have to see where that goes.
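The locking point above — an API that is thread-safe by default must guard every resource update, so single-threaded callers pay overhead for safety they don't use — can be shown with a small concurrency sketch. This is purely an illustration of the general pattern in Python's `threading` module, not real Direct3D behaviour.

```python
# Toy illustration of default thread-safety: every resource update
# takes a lock, which keeps multi-threaded callers correct but
# costs single-threaded ones overhead. Not real D3D code.
import threading

class ResourcePool:
    def __init__(self):
        self._lock = threading.Lock()
        self.buffers = {}

    def update(self, name, data):
        # Serialized update: required if multiple threads share the pool.
        with self._lock:
            self.buffers[name] = data

pool = ResourcePool()
threads = [threading.Thread(target=pool.update, args=(f"vb{i}", i))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(pool.buffers))
```

The design tension is the same one the post describes: drop the lock and single-threaded code gets faster, but concurrent updates can then corrupt the shared state.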
TheGreatGrapeApe said:Yeah the main draback is that while all
of these things are more efficient, the devs haven't just gone that route, they've also increased the workload, so while it may be 3 times as efficient, often they've increased the amount of work by 5+ times, and then say "Oh yeah it's slow, but it's doing so much work that a DX9 card would take Xtimes as long to do it".
To me there's two options for DX10, and they're only doing the slightly shinier version rather than the more efficient version. And not everyone is going to have a high end card, and when the Shiny version runs like crap even on a higher end card, what's the point? A little more focus on offering some efficiency to the mid range would be nice. Seriously Crysis DX9 high settings running at 20-50% speed boost would probably be more attractive to many people than the slightly shinier model running at 70% the speed of DX9.
Sorry, I still don't see how Crysis in DX9 can be faster than in DX10 - WITH the same number of objects, textures, etc.? I don't see this taking any more work.
So, DX10's advantages are all just about "shiny-ness": effects, blur, depth, etc.? (from a visual perspective). Actually I think Crysis has depth of field in DX9 as well.
Confused...
enewmen said:Sorry,
I still don't see how Crysis in DX9 can be faster than in DX10 -WITH the same number of objects, textures, etc? I don't see this taking any more work.
So, DX10 advantages is all just about "Shiny-ness" , effects, blur, depth, etc? (from a visual perspective). Actually I think crysis has depth of field in DX9 as well.
Confused...
Crysis does have depth of field and a screen based motion blur in DX9, but these effects are much better with DX10. Especially the motion blur, which is object based under DX10. Actually it is possible to enable object based motion blur under DX9, but it's glitchy as hell.
In any case Crysis isn't really a DX10 game; most of its development took place before DX10 hardware even existed. I still don't know of any games under development that are native to DX10, as in they won't run on DX9 hardware at all.
homerdog said:Crysis does have depth of field and a screen based motion blur in DX9, but these effects are much better with DX10. Especially the motion blur, which is object based under DX10. Actually it is possible to enable object based motion blur under DX9, but it's glitchy as hell.
In any case Crysis isn't really a DX10 most of its development took place before DX10 hardware even existed. I still don't know of any games under development that are native to DX10, as in they won't run on DX9 hardware at all.
It seems I'll answer my own question after a true DX10 "only" game comes out.
Perhaps the Crysis Episode 1 expansion will have major engine parts reworked? Then I can see what DX10 should be like.
Thanks a lot you guys, you've been very helpful, not only in answering my question but in giving me additional knowledge on the subject.
To look at this from a different angle: let's say I have a graphics card that supports DX10. The question is, will that card still run DX9 apps also?
^ Just download Cuban's Ultra High graphics mod for Crysis. It gets you ultra-high GFX settings and enables DX10-style gfx for DX9 Windows XP.
^ You can NEVER get "true" DX10 graphics in XP.
What you can do is use a "hack" like bluescreen suggested to get near-Ultra settings in Crysis in DX9 mode. This looks very good and close to DX10, but is still only DX9.
This helps some people, even with DX10 cards and Vista - if the DX9 hack can make the game run smoother than in DX10 mode while not looking worse.
@rmaster: DX10 cards can run DX9 apps as well as or better than DX9 cards.
Again, there is no point to DX10.
A few people above mention the two-year development cycle, but this is the first time since DX became the dominant API that the coding path is split (DX9 for XP, DX9Ex/DX10+ for Vista).
As a result, developers will code all their programs based on the lowest common standard in an effort to sell more games, so DX 9.0c will be with us for a while yet.
Anything that uses DX10 will simply be a DX9 program with some DX10 content, and not a DX10 focused game.
Note, it's possible to port DX10 to XP, if a group were willing to put forth enough work.
The only major incompatibility is WDDM, which was added in Vista.
The files are still .DLLs, and are still called by the host program; it's just an extra compatibility layer that needs to be hacked.
No true implementation of DX10 on XP has been done yet, although I know the WINE guys are currently attempting this...
What's funny is that even if they do DX10 for XP, there still won't be true DX10 games anyway.
The closest thing I saw to DX10 is 3DMark Vantage, which seems to use geometry shaders and crude GPU physics.
I've given up on DX10 already.
There is still hope for DX11...
enewmen said:The closest thing I saw to DX10 is 3DMark Vantage, which seems to use geometry shaders and crude GPU physics.
I've given up on DX10 already.
There is still hope for DX11...
My point is, the same split API that killed DX10 will kill DX11.
DX will stall until devs stop coding for XP, which will not happen until XP's market share drops to below 10%.
From their point of view: you can get 80% of the market by coding for DX10, with much higher minimum requirements as a result (scaring off some potential sales), or code for 100% of the market with DX9.0c, with lower minimum requirements (if a 7800GTX can play it, my 8800GTS should be fine!).
To devs, it's a no-brainer.
DX11 will fail for the same reasons DX10 did.
I understand what you're saying.
The difference with DX11 is the next-gen consoles will use DX11.
So, all titles written for consoles, then ported to the PCs will use DX11.
I'm personally hoping DX11 will be enough of a well-delivered / major update that developers will WANT to use this API.
I think Windows 7 will get a vast # of gamers to switch from XP.
But that could just be hope talking.
Actually, consoles already have DX10 and DX11 parts in them. XP has run its course. I read something that made a lot of sense: every other M$ OS release is the one that triumphs. As we saw with Vista, which got all the blame for many things, it was the HW makers and their lack of driver compatibility that "made" Vista as "bad" as it was. What preceded XP? Didn't it also suffer from the same things? When W7 comes out, DX10/11 will fly.
Faulty logic.
XP still has share, and it will take time to phase out Win7.
DX9 will be the standard for at least the next two years, regardless of how XP does, and if XP maintains >20% market share, DX9 could be around even longer, regardless of how 7 does.
For every other OS release, M$ would have the same DX versions for a period of time (98, 2000, and ME all support DX 9.0c), which made it easier to code for one unified standard, as you did not have to worry about people with an old OS.
That is no longer possible, and most of you underestimate the fact that just because DX11 will be available doesn't mean it will be used.
Never mind the fact that a lot of people will be sticking with their DX10-capable cards for some time.
Also, why would the PS3 use DX, and pay its competition licensing fees?
All the Playstations used OpenGL as far as I know, not DX.
And the reason why every "other" Windows release stinks is because every other release comes out of the Florida office, which brought us 95 and ME.
The good stuff (98, NT, 2000, and XP) comes out of the Seattle office, which is doing Win7.
Well, it seems that most of you have a vast amount of information on DX10. I have a home-built machine: 2.8GHz X2 AMD Black Edition processor, 4GB RAM, XP Pro, and an XFX 8500GT video card that does support DX10; I currently use 9.0c. I am not a big-time gamer, but I do play Homeworld 2, BF1942, and Command and Conquer Generals... So do I really need DX10???? I do at times get video lag in these games, but not too bad, and they do play... please help me out with this issue. (the lags)
Thank you.
Snakebite7734
cobra7734 said:Well it seems that most of you have a vast amount of information on DX10.0, I have a home built machine 2.8g X2 AMD black edition prosser 4g ram XP Pro and a xfx 8500GT vid card that does support 10X, I currently use 9.0X. I am not a big time gamer, but do play Home World2, BF1942 and Command and Conquer Generals.......So do I really need 10.0X???? I do at times get vid. lag on these games, but not to bad and they do play....please help me out with this issue. (the lags)
Thank you.
Snakebite7734
The 8500GT was never a gaming card; try a 9800GTX+ or a GTS 250/GTX 260. And no, you do not need DX10 or 11, as the games you listed are DX9 titles. HTH
enewmen said: I understand what you're saying.
The difference with DX11 is the next-gen consoles will use DX11.
So, all titles written for consoles, then ported to the PCs will use DX11.
I'm personally hoping DX11 will be enough of a well delivered / major update, then developers will WANT to use this API.
But by then, DX13 will be out.
Besides, the PS4, like the PS3, will use OpenGL, and not directX.
Not to mention the fact that the architectures are completely different (data bus width and CPU register sets), and the different coding styles (coding down to the registers) are the main reasons why console ports are so bad.
Until XP dies, DX9.0c will be the dominant architecture, plain and simple.
And it won't be an immediate jump to DX11, because of all the people who will still have DX10-class hardware that can not run DX11 (split-API games were rare prior to Vista and the split DX API for a reason).
When the first DX11-only games come out, then it's clear it's time to upgrade to a DX11 card.
Until then, like the limited implementations of DX10, DX11 features will all be watered down and will give little visual improvement.
It's still far easier to code for DX11 & DX10.1 from DX10.0 than it is to add a separate code path just for XP.
Look at today's new deck released by AMD about DICE's experiences, pretty interesting that it addre
Like I told you last time, there's no major API split within Vista between DX10 and DX11. DX11 allows a game to run on down-level hardware within the same package, just by changing calls on the fly and adding the two libraries to create if/else options. Running on XP, by contrast, actually requires the difficult split you talk about, where limits in the implementation of the old versus the new WDDM driver model mean the two don't act the same even DX9-to-DX9. Three hours of work to add DX10.1 and DX11 support to a game that's built for DX10 doesn't sound like much of an impediment, whereas re-building for XP certainly would be. DX9 to DX10 is the tougher stumbling block, not DX10 to DX11, and that initial DX9-to-DX10 jump is a given nowadays.
As for XP retention, the 'Games for Vista' program already started the XP/Vista split back in the Shadowbane, Gears of War, and Halo 2 days, making those titles Vista exclusives. Making games Vista & Win7 compatible is nothing, and as for the XP vs Vista split, it's not like two years ago when there was only a small install base. For the enthusiast crowd, which actually pays for games, the Vista install base is no longer a tiny fraction of the market.
This will be no different than the split from games that supported Win 98SE, which everyone said would take forever; then one day 98SE was gone, fondly remembered only by those still wanting to boost their 3DMark2001 scores.
Fail. Developers HATE split APIs with a passion.
Yes, you could simply encapsulate every DX11 function in an if/else clause to fall back to a DX10 codepath if DX11 isn't available, but it adds to the code (DVD games are already approaching their 9GB limit, and more discs = less profit) and means extra work needs to be done on the code itself, testing with different setups, etc.
It's not 3 hours of work to add a DX11 code path, more like 300 (coding, peer review, revisions, testing, more revisions, more testing, etc).
XP is still the dominant OS and won't go away anytime soon, and with three exceptions (all Xbox 360 exclusives, i.e. M$ whoring its products) all games can run on XP and Vista.
Until XP drops below 20% market share, it makes no sense to sell a product with a handful of extra features that could potentially lose 20% in sales as a result of coding to a DX10+ standard.
Throw in the lag in the general public acquiring DX11 hardware, and the difficulty of having to ensure three separate graphics APIs work (plus the OS codepaths: XP running DX9, and Vista/7 running either DX9Ex, DX10, DX10.1, or DX11), and you see why it's so easy to simply stick with DX9.
Using your logic, we'd be seeing a heck of a lot more DX10.1 games, considering 10.1 is such a minor addition to DX10.
In short, game developers couldn't care less about "the enthusiast crowd".
They want to make money, and coding to DX9, while spending only minimal time on advanced DX features, is the best way for developers to do that.
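For what it's worth, the if/else fallback both sides are arguing about boils down to a one-time runtime capability check that every draw-call wrapper then consults. A minimal sketch, using hypothetical names (`RendererCaps`, `pick_render_path`) rather than any real engine or DirectX API:

```cpp
// Hypothetical sketch of runtime render-path selection; not a real
// engine or Direct3D interface.
enum class RenderPath { DX9, DX10, DX11 };

struct RendererCaps {
    bool dx10_supported;  // DX10-class hardware on Vista/7
    bool dx11_supported;  // DX11 runtime and hardware present
};

// Choose the highest API level the hardware/OS combination exposes,
// falling back down the chain. On XP neither flag can be set, so the
// game lands on the DX9 path regardless of the GPU.
RenderPath pick_render_path(const RendererCaps& caps) {
    if (caps.dx11_supported) return RenderPath::DX11;
    if (caps.dx10_supported) return RenderPath::DX10;
    return RenderPath::DX9;
}
```

The cost argument in the post above is that this branch is cheap, but every path it selects still needs its own shaders, testing, and QA passes, which is where the real work multiplies.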
Dunno about you all, but I notice a huge difference between Crysis High and Very High (XP supports only High)
EDIT: This is Warhead.
It does not matter whether the PS4 uses DirectX or OpenGL; either API will still have to be a newer version with newer features that make graphics more realistic, and then people are going to move into the new era of graphics with DX11 and OpenCL, or possibly even OpenGL 4.0.
However, Microsoft's Xbox 720 will use DX11, for sure.
Microsoft will also stop officially supporting Windows XP customers, including Windows Update for XP, by 2014.
Nobody is forcing XP lovers to move on to Windows Vista or Windows 7, but they might have problems, like not being able to use DirectX 11 and maybe even DirectX 12 in the future; they will be stuck with DirectX 9.0c and its old code structure. So there is no need for XP lovers/users to complain, since they wanted to stick with older tech whose older coding structure keeps them from using DX10/11. Please correct me if I am wrong...
rags_20 said:Dunno about you all, but I notice a huge difference between Crysis High and Very High (XP supports only High)
EDIT: This is Warhead.
There is a "hack" that allows Crysis to run at Very High settings in DX9 mode. This looks very similar to Very High in DX10 mode.
The reason is that DX10 didn't exist when development on Crysis began, so Crytek used advanced DX9 rendering to "fake" DX10.
Even so, screenshots aren't everything. Motion blur looks much better in DX10 mode. I didn't play Warhead, but I guess this will also be true there.
Like the Ape said, the learning curve from DX10 to DX11 won't be as great as DX9 to DX10.
So I'm hoping DX11 titles will better utilize the full feature set, rather than just have DX10 features using the DX11 API.
They also won't benefit from MT in gaming nearly as much, and since dual cores are the norm, that'll hurt as well. Plus the memory savings; there's a lot more to having a DX11-capable OS than just eye candy. Once we see the differences put on display, it'll get people off XP.
When Relic first released a DX10 patch for CoH, I thought that'd be the trend for game houses: release DX10 patches for standard DX9 games so that the transition would be a lot more natural than completely obliterating DX9 support.
I know about the hack, but that's what it is: a hack. It's not the real deal. Anyway, from what I've read, DX10.1 cards will play DX11 games better than DX10 cards; some of DX11's features run on 10.1 hardware, so the 10.1 cards deliver better performance in DX11 games than DX10 cards do. So I'm not upgrading to a DX11 card for at least 1.5-2 years.