I think it has something to do with the fact that, for the first time in many years, Intel no longer has the "latest and greatest" mainstream processors
I'm not sure I completely agree. On the top end, Intel is clearly ahead. However, multi-threaded workloads and higher IPC are becoming more important for many things. And given the price premium that Intel is charging, AMD is definitely competitive.
For that matter, ARM solutions could also be very competitive, but they seem to be overpriced compared to x86 in any server solution I've seen so far.
See this panel about Intel’s “Project Crush” and the defeat of the 68000 [1]. Intel Inside was yet another brilliant marketing campaign to take pricing power from the OEMs.
Intel at its core is a marketing company. If they also happen to have the best tech, that is nice too, but it is not a necessity.
I don’t agree with his point, but I can see it. For years CPU performance has largely been irrelevant, but they convinced the market to overpay for underperforming hardware.
Intel Inside is great marketing, and creating different brands of processors like i7 vs i3, independent of form factor, pulls the focus away from absolute performance. Getting software manufacturers to customize for your chip again gets you out of playing the actual head-to-head game.
It's really not hyperbole; I can confirm it after being one of the main contacts at a cloud service provider. Marketing can push engineering and allocate budgets. Everything is driven by a "Plan Of Record" agreed with marketing. If you need something done, you push marketing to add to or change the POR.
Project Crush and Intel Inside are simply brilliant. It’s extremely rare for a component manufacturer to capture mindshare from the OEM.
Given they’ve led the charge technologically in CPUs for decades, this is extremely high praise. I think people are taking this as suggesting Intel does not innovate technologically, which is not the case.
The video provides evidence of the tech being nice but not required. They openly admit their CPU was “a dog” and all the software people hated it. Yet their strategy of going around them to management worked perfectly.
People forget how awful segmented 16-bit was. And other CPUs like the 68000 and Alpha were faster at various points in time.
I wouldn't count Intel down and out just yet. When you look at the fundamentals they are still number 1, and when you add in extra bonuses like the amazing strides they have made in increasing their diversity, I think Intel is going to bounce back hard.
Worth noting: the guy who invented the FinFET, Chenming (Calvin) Hu, was TSMC's CTO.
Immersion litho inventor Burn Lin was TSMC's R&D VP.
Full depletion tech - TSMC
Modern metal gate - TSMC
Copper - IBM/TSMC
First EUV initiative at TSMC was started in... surprise, 2000!
TSMC does play a very long game. Intel was great at commercialising the "latest and greatest" tech quickly, but TSMC was always ahead at mastering yields and delivering more "mature" processes for mass manufacturing.
I didn't make an analogy. And I was replying to my parent's claim that they would lie about it.
It's one thing to have been wrong (even carelessly) about 10nm (more than once). It's another to say that a company officer would intentionally lie (to a group of professional investors, no less).
> It's another to say that a company officer would intentionally lie (to a group of professional investors no less)
You would have to look back at every single statement in their public and investor meetings to judge whether they were intentionally lying about 10nm. To me, they were.
Check the links provided in my reply. Intel has already committed such fraud multiple times.
Let's not forget that its ex-CEO sold all his shares before Meltdown and Spectre went public, yet he managed to walk away free of any charges despite such obvious wrongdoing.
If it is on track, then one would expect at least a description of that track, but all I see is dates related to 10 nm, and that was years late. They cleverly skip over that by not offering a hard date, instead stating 'four years after 10 nm', but 10 nm itself is not yet a done deal.
Intel is still in the planning stages for manufacturing capacity, and there is zero proof that there is an actual POC or a workable process yet to mass-produce 7 nm devices at all, which is going to be much harder than 10 nm to bring to acceptable yield.
I'll park this one right next to 'home fusion' and '5 ct/kWh installed solar power' until there is more proof. I would be quite surprised if there really are mass-produced devices for consumer use anywhere in the next 5 years; if the 10 nm experience is any guide, somewhere in the next decade is more realistic, so this isn't really news.
In practice, TSMC's 7nm node is actually functional and producing working chips.
Intel's 10nm... well, is garbage, because they were overconfident.
Now Intel has to either rework their 10nm node, after wasting many years of work on nothing, or jump straight to 7nm, which seems equivalent to TSMC's 3nm node plans.
To me, this sounds entirely plausible. Trying to describe a semiconductor process by a single number is almost as abstract as describing a song with a single number. First of all, the definition of that number is somewhat vague. It describes only a single design rule, the gate length of a transistor. But that already assumes a certain fundamental layout of the transistor, which no longer applies in advanced nodes with, for example, FinFETs and all kinds of 3D structures. It also does not account for the many hundreds of design rules characterizing that process. To compare two processes, you not only need to account for the transistor density, but also the performance characteristics, like energy used, leakage currents, switching speeds and so on.
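For what it's worth, Intel itself proposed a density metric that goes beyond the node name (Mark Bohr's 2017 formula, which weights a small NAND2 cell against a larger scan flip-flop cell). A minimal sketch in Python; the cell areas and the scan-FF transistor count below are illustrative assumptions, not real process data:

```python
# Sketch of the weighted transistor-density metric Intel published in 2017:
#   density = 0.6 * (NAND2 transistors / NAND2 cell area)
#           + 0.4 * (scan-FF transistors / scan-FF cell area)
# Cell areas here are made-up placeholders, not data from any real process.

NAND2_TRANSISTORS = 4    # a 2-input CMOS NAND uses 4 transistors
SFF_TRANSISTORS = 36     # assumed transistor count for a scan flip-flop

def density_mtr_per_mm2(nand2_area_um2, sff_area_um2):
    """Return density in millions of transistors per mm^2."""
    nand2_density = NAND2_TRANSISTORS / nand2_area_um2  # transistors per um^2
    sff_density = SFF_TRANSISTORS / sff_area_um2
    # 1 transistor/um^2 equals 1 MTr/mm^2, so no unit conversion is needed
    return 0.6 * nand2_density + 0.4 * sff_density

# Two hypothetical processes with the same "nm" label can differ widely:
process_a = density_mtr_per_mm2(nand2_area_um2=0.040, sff_area_um2=0.400)
process_b = density_mtr_per_mm2(nand2_area_um2=0.055, sff_area_um2=0.500)
print(f"process A: {process_a:.0f} MTr/mm^2, process B: {process_b:.0f} MTr/mm^2")
```

Even this captures only density; two processes with the same MTr/mm² can still differ in speed, leakage and cost.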
With the 7nm process, TSMC seems to have clearly pulled ahead of Intel's 14nm process. The delayed 10nm would be competing with TSMC's 7nm process. The "10nm" label does not sound very advanced, but it hides the fact that other parts of the process had very aggressive specs. As everyone in the tech industry knows, complex projects can get stuck easily, sometimes over obvious mistakes, but often on a road block where no one expected one. So, for whatever reason, the 10nm project was quite delayed; we will see in 2019 if they manage to hit the market on their current schedule.
The 7nm project is run independently of the 10nm process, so it runs on its own time scale. If they are not directly based on each other, it is quite plausible that the 7nm team made a completely different set of design decisions and thus was not hit by the same road blocks the 10nm process was. As the 7nm process is in an earlier phase of its development, it might of course simply not have reached some of those road blocks yet, but it won't necessarily run into the same problems as the 10nm one did.
Technically they did sell some 10nm chips in 2H'17. It's just that the GPUs didn't work at all and the CPU cores had poor yields and worse power usage than 14nm so they got shuffled off to obscure OEMs.
Hearing how much commotion news from the semiconductor industry has been making recently in the mainstream media, it seems that people have only now begun to realise that semiconductor manufacturing is the "water tap of modern civilisation" - not facebooks, amazons, googles, or other dotcoms.
People have said Intel contracted out fab capacity that is now competing with the 8700k, because they expected to have moved to another node by now and to have excess capacity on this one, but now they don't.
It will start having real problems when Zen 2 ships on TSMC's high-performance 7nm process instead of Zen 1's low-power process from GlobalFoundries. I.e., Zen 2 should reach clock and IPC parity with current and future Intel parts no problem, while costing much less due to the increased yields of the multi-chip approach.
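The yield argument behind the multi-chip approach can be sketched with the classic Poisson die-yield model, Y = exp(-D0 × A): yield falls off exponentially with die area, so several small chiplets yield far better than one big die. The defect density and die areas below are made-up illustrative numbers, not figures for any real process:

```python
import math

def poisson_yield(area_cm2, d0_per_cm2):
    """Classic Poisson yield model: Y = exp(-D0 * A)."""
    return math.exp(-d0_per_cm2 * area_cm2)

D0 = 0.2                            # defects per cm^2 (assumed)
mono = poisson_yield(6.0, D0)       # one large 600 mm^2 monolithic die
chiplet = poisson_yield(0.8, D0)    # one small 80 mm^2 chiplet

# A package assembled from several known-good chiplets keeps per-die
# yield high, so more of each wafer ends up in sellable parts.
print(f"monolithic die yield: {mono:.1%}")
print(f"single chiplet yield: {chiplet:.1%}")
```

With these assumed numbers the monolithic die yields around 30% while each chiplet yields over 85%, which is the cost lever the comment is pointing at.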
The 8700k's cost might have as much to do with supply constraints on Intel's end as with demand-side issues. There are certainly workloads where the 8700k is a better choice than AMD's offering (and vice versa), some companies locked into the 8700k in advance, and some people are Intel fans. I expect the price to drop in a few months when Intel finishes expanding its 14nm fab capacity. They'd expected to be building their chips on 10nm by now, so they'd moved chipsets up to 14nm.
You've got to admit that they were accurate about Intel's 10nm problems pretty consistently. And they were right about NVidia's excess inventory problem delaying consumer Volta. They aren't always accurate, but 2018 has been a pretty good year for them.
Every single piece of news out of the site about Intel/Nvidia is either inaccurate, technically false, fake news, or hyperbole. And all of its headlines do nothing but drive you to buy a thousand-dollar subscription for their "professional" detailed reports.
Where is the $20B 10nm customer going bankrupt? Why is the 8160 modem fake, just because the article reuses a photo? I could still have given them points if any of their statements or rumours had any technical or business merit, but they don't. As much as I support AMD and like to bash Intel, this is not how journalism should work.
Like with the Snapdragon 810: they kept running articles claiming there were no thermal issues even after it had become more or less common knowledge that it was deeply flawed.
Probably because Charlie doesn't like Intel due to past conflicts between them. Therefore, the reasoning goes, Charlie is a bad guy, because he doesn't mindlessly spout praise in blind favour of Intel.
However, that doesn't mean that Charlie is suddenly an AMD fanboy, as I've seen the pro-Intel crowd proclaiming in an attempt to defend Intel by smearing his reputation.
EUV is the semiconductor industry's equivalent of commercial fusion: every 5 years, the prognosis is made that we will get commercial EUV in the next 5 years. And it has been like that for the last 30 years.
They are doing 7FF+ in 2019 with a few EUV steps already. GF and Samsung are on similar schedules. You don't have to trust me; you can look at their investor meeting notes or the reports from AnandTech.
Yes, but they are not going to use EUV for every layer. The only way it makes economic sense is to replace the horrific SAQP layers with a single EUV exposure, and use conventional litho for the rest to maximise throughput.
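A back-of-the-envelope way to see the economics: SAQP patterns one critical layer with several DUV litho/deposition/etch passes, while EUV does it in one more expensive exposure. All step counts and per-pass costs below are rough assumptions purely to illustrate the trade-off, not real fab economics:

```python
# Normalised, illustrative cost units -- not real fab economics.
DUV_COST_PER_PASS = 1.0   # assumed cost of one DUV litho/etch pass
EUV_COST_PER_PASS = 3.0   # assumed; EUV tools cost more and have lower throughput

def layer_cost(passes, cost_per_pass):
    """Total patterning cost for one layer."""
    return passes * cost_per_pass

saqp = layer_cost(5, DUV_COST_PER_PASS)  # SAQP: ~5 passes for one layer (assumed)
euv = layer_cost(1, EUV_COST_PER_PASS)   # EUV: a single exposure

# Replacing only the multi-patterned layers with EUV wins, while layers
# that need just one DUV exposure stay cheaper on conventional litho.
single_duv = layer_cost(1, DUV_COST_PER_PASS)
print(f"SAQP layer: {saqp}, EUV layer: {euv}, single-exposure DUV layer: {single_duv}")
```

Under these assumptions EUV beats SAQP per layer but loses to single-exposure DUV, which is exactly why it only gets deployed on the worst multi-patterned layers.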
It is probably a safe bet to ignore any news from there until they actually deliver something, and to read independent benchmarks instead.
It is really sad; they used to be very accurate in their communication.