Supernova1138 said:
Moore's Law is pretty much dead at this point. Strictly speaking, it means that every 18 months you get double the number of transistors onto the same size of die. That's not happening anymore because we're approaching the limits of how far we can shrink transistors on silicon. Case in point: it looks like Intel will now have four generations on 14nm because they're having so much trouble getting 10nm to work; it used to be that Intel would only have two generations per process node. It's part of the reason CPU advancement has more or less plateaued over the past 6 years, and the only way to get big improvements on CPUs now is to add more cores and move to a larger die.

It's not as linear as Moore described, but in the 1990s we got the Pentium II, AKA the "Candy Bar". The theory, 20 years ago, was pretty much what you're describing now. [Oh dear. I'm wicked old :-(]
I anticipate great things, even if things appear stalled at the moment. (Will Quantum computers ever meet the proposed possibilities?)
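Just to put rough numbers on that "every 18 months" figure from the quote, here's a back-of-the-envelope sketch. The baseline transistor count and the 6-year window are made-up illustrative values, not specs from any real chip:

```
# Back-of-the-envelope Moore's Law math: transistor count doubling
# every 18 months, starting from an arbitrary baseline design.
# The 2_000_000_000 baseline and the 6-year window are illustrative
# numbers only, not figures from any specific CPU.

DOUBLING_PERIOD_YEARS = 1.5   # the "every 18 months" claim
baseline_transistors = 2_000_000_000
years = 6

doublings = years / DOUBLING_PERIOD_YEARS          # 6 / 1.5 = 4 doublings
projected = baseline_transistors * 2 ** doublings  # 2^4 = 16x the baseline

print(f"{doublings:.0f} doublings in {years} years -> "
      f"{projected / baseline_transistors:.0f}x the transistors "
      f"({projected:,.0f} total)")

# If the 18-month cadence had held, a 6-year-old design would have
# roughly 16x the transistor budget today -- nowhere near what has
# actually shipped, which is the point of the quote above.
```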
Supernova1138 said:
GPUs face a similar situation, but they aren't hurt nearly as badly, since they're designed from the ground up for parallel computing, versus CPU tasks that are more serialized and where software developers have a hard time scaling that stuff across many cores.

I haven't seen the slowdown there yet. And while GPUs are getting much faster, we're getting much more bang for the buck too. My HD 7970 cost $600. Five years later, for $240, I got an RX 480 with 8 GB of GDDR5.
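That "hard time scaling across many cores" point is basically Amdahl's law in action. A minimal sketch below; the parallel fractions (0.75 for a CPU-style task, 0.999 for a GPU-style one) are assumed numbers for illustration, not measurements of any real game or workload:

```
# Amdahl's law: speedup on n cores when only a fraction p of the work
# can be parallelized. The fractions below are illustrative only.

def amdahl_speedup(p: float, n: int) -> float:
    """Theoretical speedup with parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16, 64, 1024):
    cpu_like = amdahl_speedup(0.75, cores)   # lots of serial work left over
    gpu_like = amdahl_speedup(0.999, cores)  # almost perfectly parallel
    print(f"{cores:5d} cores: CPU-like task {cpu_like:5.2f}x, "
          f"GPU-like task {gpu_like:7.2f}x")

# The CPU-like task tops out near 1/(1 - 0.75) = 4x no matter how many
# cores you add, while the GPU-like task keeps scaling -- which is why
# GPUs dodge the single-thread wall far better than CPUs do.
```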
The next big change over the next 10 years?
In the past, you had to pay, adjusted for inflation, about $150 for a good sound card. I can now get 5.1 surround sound from my motherboard. I think that, 10 years from now, we'll see APUs built into motherboards that can match today's best CPU and GPU combos. Making that kind of power available to the masses is an advance as well, and game development will make better use of it under such conditions.
Supernova1138 said:
As for DX12, so far it hasn't been very impressive. Not that many games have supported it so far, and those that do often run worse than in DX11 mode without any real improvement in visual fidelity.

I'm liking it so far. And if I understand it correctly, AMD optimized its cards to work well with DX12 going back as far as my HD 7970 (replaced twice over since), while the Nvidia GTX 500 (and 600?) series are stuck at DX11. And I think things will only get better as developers get used to working with it.