Intel Announces 10-Core i9 CPU Starting At $999
(Last Updated On: May 30, 2017)

If you needed any kind of hint that PCs are moving above and beyond anything that consoles are doing this generation, look no further than Intel's recently announced Core i9. This mean mamma jamma packs 10 cores across 20 threads, offering unparalleled power for $999.99. And you know what the crazy part about it is? That's the weakest of the bunch.

Over on the official Intel website, the company announced that the 14-nanometer i9 chip is the start of a whole new generation of high-powered central processing units for desktop computers.

The i9 comes in five different tiers, with the weakest and newbiest being the i9-7900X, a deca-core processor running 20 threads, and Blues reports that it starts at $999.99. The next step up from there is the i9-7920X, a 12-core processor handling 24 threads at a time, for $1,199. After that comes the 14-core i9-7940X, juggling 28 threads for $1,399, and then the i9-7960X featuring 16 cores and 32 threads for $1,699. The final and most beastly of the group is the $1,999 CPU labeled the i9-7980XE Extreme Edition, which features an unbelievable 18-core processor running up to 36 threads.
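
For a rough sense of how that pricing scales, here's a quick back-of-the-envelope sketch in Python using the core counts and launch prices above (nothing official, just arithmetic):

    # Price per core across the announced Core i9 lineup (launch prices above).
    lineup = {
        "i9-7900X":  (10, 999),
        "i9-7920X":  (12, 1199),
        "i9-7940X":  (14, 1399),
        "i9-7960X":  (16, 1699),
        "i9-7980XE": (18, 1999),
    }

    for sku, (cores, price) in lineup.items():
        threads = cores * 2  # Hyper-Threading: two threads per core
        print(f"{sku}: {cores}C/{threads}T, ${price} (~${price / cores:.0f}/core)")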

That processor is just insanely powerful. At $2,000 it seems like overkill, especially considering that there are no games on the market right now that would even come close to making use of a processor that powerful.

Nevertheless, it’s coming soon and this is obviously Intel’s answer to AMD’s recently released Ryzen CPU line.

Unfortunately for AMD, the cost-effective CPU only barely outperformed the i7 in certain benchmarks, and it seemed to flounder in others. The highlight of Ryzen, however, is that it's a cheap little chip for what it has to offer, and it obviously won't cost you $2,000.

I suppose the i9-XE would be for those people running high-end software that relies on heavy-duty floating point calculations, or people doing lots of intensive graphic design work that relies on real-time rendering of near photo-realistic images.

Otherwise, the i9-XE doesn’t serve much of a purpose in the realm of gaming right now unless you’re planning on brute-forcing emulation software to run at a certain speed, or you have an impressive multi-monitor setup. But it’s nice to see Intel making progress. Now if only the software industry could take note and follow suit.



31 comments
  • But can it run Crysis?

  • Phasmatis75

    Honestly, whenever tech like this comes out my first thought is always: cool, but what really needs it? Or rather, what can I do with it that I can't do with something cheaper? The answer is always virtually nothing, unless I'm mining a cryptocurrency.

    • 8K video is a thing. Just sayin’…

      • Phasmatis75

        Most people don't own an 8K monitor, and there's little reason to care when 4K is sublime already.

        • This is true.

          I think the majority who use 8K are on multi-monitor setups anyway (from the last time I checked the Steam hardware survey).

          • Phasmatis75

            Well, for people who have that kind of money and want that kind of performance, I'm happy for them. For the average hardcore gamer like myself this is useless. It's like buying a mansion but still owning only enough stuff to fill an apartment.

  • Ax

    What's the point if nothing properly uses even a Core i7, for example? It's kind of overkill. Or maybe it's there to hold up the bad code that plagues the industry nowadays. Is your game badly coded and filled with memory leaks? Recommend at least 8GB of RAM and a GPU with 2GB; the code is still dog shit, but hey, at least it can hold the leaks/improper coding now.

    • Nikusuke Doguro

      You can run much more powerful AI with such a CPU. Or a lot more AI agents. Think of a game like Skyrim that simulates social interactions between NPCs on a whole new scale. Or a real-time strategy game with a huge number of units and AI players. Or a game with very realistic animations that are generated on the fly.
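
      As a rough sketch of that idea in Python (update_npc() here is just a hypothetical stand-in for per-agent AI work, not anything from a real engine), independent agents spread naturally across cores:

          # Sketch: update many independent NPC agents in parallel.
          # update_npc() is a hypothetical placeholder for per-agent AI work
          # (pathfinding, dialogue scheduling, etc.).
          from multiprocessing import Pool

          def update_npc(npc_id):
              return npc_id, sum(i * i for i in range(10_000))  # fake workload

          if __name__ == "__main__":
              with Pool() as pool:  # defaults to one worker per CPU core
                  results = pool.map(update_npc, range(1_000))
              print(f"updated {len(results)} NPCs")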

  • Migi

    Wish they'd announce the prices of the newer motherboard series that were shown. If I recall, the release dates are gonna be announced this June?

  • Bitterbear

    And right on cue, the MacHeads are starting to say that this is the reason why Apple hasn't refreshed the Mac Pro lineup.

    The poor things, still clinging to the idea that people at Apple care about anything other than phones and laptops.

  • Waifu Engineering

    I'm wondering if this is effectively the Xeon market. I can see this chip being useful for real-time military genome-finding-cure-for-cancer sequencing simulations in 4K, and for render stations.

    And starting at $999 a chip! Within the span of a month I just bought three $200 crappy laptops for my siblings and mother; they don't use anything except YouTube and Microsoft Word, and they're even cheaper than smartphones/tablets.

    The market segmentation is becoming pretty clear to me:

    * branded (Asus/Acer/HP) crapware for normal people, with built-in graphics so they can use office applications and surf
    * mid-tier self-built rigs for gamers and prosumers (i5/i7 and GTX-level cards), good for standard HD, which is sufficient for now
    * extremely niche super-high-end parts like this product, for server/render farms, VFX studios and the like.

  • Blake

    It seems crazy unless you consider that over the next 5 years things will have to be rendered at 4K HDR. Video cards can output that resolution, but for a game to run at a solid 60 frames, 18 cores would help a lot.

    • That's assuming the games are designed with hyper-threading in mind. A lot of them are poorly designed in that respect and rely on just one or two cores.
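
      As a rough illustration of what single-core reliance leaves on the table (a minimal sketch, not a benchmark), here's the same batch of work run on one core and then across all of them:

          # Sketch: serial vs. all-core execution of the same workload.
          import os
          import time
          from multiprocessing import Pool

          def chunk(n):
              return sum(i * i for i in range(n))  # stand-in for frame work

          if __name__ == "__main__":
              jobs = [200_000] * 64
              t0 = time.perf_counter()
              for n in jobs:                          # one core, one thread
                  chunk(n)
              t1 = time.perf_counter()
              with Pool(os.cpu_count()) as pool:      # every core
                  pool.map(chunk, jobs)
              t2 = time.perf_counter()
              print(f"{os.cpu_count()} cores: serial {t1 - t0:.2f}s, parallel {t2 - t1:.2f}s")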

  • Gorgon

    Impressive, if a bit excessive for non-professionals. I have a Core i7 6700K, and it's already leagues beyond consoles, including the PS4 Pro and Scorpio. I really want to see carbon nanotube processors now; that seems to be the easiest next step for CPUs.

    • I thought quantum processing was next? I haven’t been keeping up on the futurism outlook, though.

      • Gorgon

        Nah, that's a ways off yet. Next is either carbon nanotubes or graphene-based chips. Nanotubes are more likely, since they're the more developed tech so far.

      • Mr.Towel

        There have been some developments regarding light-based circuits, rather than electricity-based transistors (in short, transistors that react to photons, which seem to be much faster than transistors based on electricity). However, the biggest drawback holding development back is the fact that you can't have a hybrid system (converting light to electricity would cost too much time, making light-based circuits redundant; their whole point is that they're faster than electricity). If you want a light-based CPU, you will need a light-based socket, RAM, microcontroller, data bus lines, etc.

        Photons are defined as sub-atomic particles and are therefore quantum. That's the closest we're getting to a quantum CPU for a few decades. Optical fiber already proved its superiority to electrical cable lines when it comes to speed, so it shouldn't take more than a century to have light-based computing circuits. Plus, electron-based transistors are near their scaling limit; it's getting harder and harder to decrease the size of transistors to increase processing power, and we're getting to the point where they are becoming too small for electricity itself. That's why we've been getting bigger and bigger CPUs, just adding more cores, rather than making each core more powerful by design. Going for transistors that react to sub-atomic particles will be the future sooner or later.

        Fantasy quantum stuff, like non-binary transistor states, is waaaaaay far off.

        This was the latest news I've seen from this field, and it's from 2015: http://www.colorado.edu/today/2015/12/23/breakthrough-light-based-microprocessor-chip-could-lead-more-powerful-computers-network

        Another very interesting one, also from 2015, on light-based memory: http://www.ox.ac.uk/news/2015-09-21-light-based-memory-chip-first-ever-store-data-permanently

        This is a big deal because today the biggest bottleneck in computers is the communication between the CPU and RAM (which is why SoCs have been getting more popular). It doesn't matter how fast your CPU is; it will always get held back by the much slower RAM and the slow data bus between the two. A combo of light-based memory with a light-based CPU would already provide much greater data bandwidth than anything in the near future; it could be one of the big technological jumps of this century.

        • Phasmatis75

          People like you are why I love comments; you learn so much from passionate people.

          • Mr.Towel

            Thank you. It overlaps with my field, so I have to keep up with it.

            Surprisingly, it's a terrible conversation starter; glad it was interesting to someone.

          • Phasmatis75

            Tech is fascinating, but oftentimes I have nothing to say in return. That might be why it's not a good conversation starter except amongst tech enthusiasts.

        • Very informative post, thanks man. Haven’t had much time to dive back into that stuff. I remember they were making breakthroughs on the quantum front after they were able to freeze light and study it for use with data transfer, but I didn’t follow through to see what became of that.
