Valve Boss: New Intel CPU Allows a "Console-Like Experience" on the PC


InsaneOne10

New member
Feb 9, 2010
68
0
0
As much as this is a good step for simplicity's sake, like some of the other people on here I really don't like the idea of piling more work onto a single piece of hardware. Also, I enjoy putting computers together, so this just takes the fun out of it.
 

zfactor

New member
Jan 16, 2010
922
0
0
Jandau said:
I don't know about the elitists, but as a PC gamer, I think this might really be a good thing. The variety of graphics cards has undeniable negative side effects on PC gaming and I'd be glad to see it gone, or at least the eternal arms race between the few major manufacturers reduced.

The problem that I see with this is that integrated graphics cards (I know this isn't the same thing, but it's similar) were already attempted and were an unmitigated disaster. I hope this turns out better.
In my understanding...

Integrated graphics cards are built into the already crowded motherboard. They usually suck because there's no room for all the components. Dedicated graphics cards are like mini-computers: they have their own RAM, processor, I/O ports and software (technically drivers...).

When they say "on-chip graphics processor," they mean "integrated graphics card"; they seem to be the same thing. The only "new" thing is that they'll use a smaller version of the i7 processor (which is epic fast) as the graphics processor, which means it should (hopefully) be faster than everything else, because the i7 is a beast.

So they're basically saying they'll put more effort into making integrated graphics better than they were. Which shouldn't be too hard, because they suck now...

Who thinks ATI and Nvidia will release drivers that let you Crossfire or SLI with an integrated card? Because that would be awesome.
 

zfactor

New member
Jan 16, 2010
922
0
0
QuadFish said:
Woodsey said:
Anyway, CPU and GFX cards should be kept separate - if one fucks up then it's not the end of the world. Relying on something that essentially combines the two makes things more expensive. Can't really see this completely taking over that setup in any case.
Did you mean that as a pun? Because that's how I read it the first time.
I pretty much agree with you. Given the almost completely compartmentalized nature of PC (hell, even console) components, I find it hard to believe that any system would comfortably merge components like that. Even combinations that have some logic to them, like combined WiFi and Bluetooth cards, haven't taken off, much less the two most important parts of your system (to a gamer, anyway).
Well, the computer will still see them as separate; they'll just be built onto the same chip. So where the data currently goes through the mobo to the GPU, with this it goes through the mobo to the GPU part of the chip...

Of course, you can find the processor and RAM modules on an integrated motherboard and replace them with better ones yourself. You just need a soldering kit and a microscope... Maybe Intel will make it modular?

I think it is a step in the right direction, because, ideally, you want everything connected to everything else. That way you never bog down one component getting information for another component. I'm pretty sure that's how the human brain works too...
 

stone0042

New member
Apr 10, 2009
711
0
0
GamesB2 said:
Integrated graphics cards were never a good idea ._.
I agree completely. I made the mistake of buying my first computer without considering PC gaming. Even though I now have a large Steam library and enjoy it quite a bit, there are multiple games I can't play, and those that I can are all at minimal settings, all because I have a shitty integrated graphics card.
 

QuadFish

God Damn Sorcerer
Dec 25, 2010
302
0
0
stone0042 said:
GamesB2 said:
Integrated graphics cards were never a good idea ._.
I agree completely. I made the mistake of buying my first computer without considering PC gaming. Even though I now have a large Steam library and enjoy it quite a bit, there are multiple games I can't play, and those that I can are all at minimal settings, all because I have a shitty integrated graphics card.
Ha, I learned the hard way as well, when I bought a retail copy of Borderlands despite having only a desktop with a GeForce 7200 and a laptop with a GeForce 7600 (which is just enough to run TF2 well at the lowest settings and high resolution). Then BioShock also didn't work, then CoD4 was laggy... you get the idea.

Then I got a new desktop and decided to buy an ATI (shush) 6870, which cost $273. Why? Because it looks awesome now.
 

pokepuke

New member
Dec 28, 2010
139
0
0
Tom Phoenix said:
Who said I was talking solely about the GPU? We need a standard set of system specifications (meaning, CPU, GPU, RAM...the whole shebang) that hardware manufacturers and game developers would follow for a certain number of years before switching over to a new standard.

Plus, the fact that all GPUs are DirectX compliant means jack squat. There are plenty of graphics cards that support a specific DirectX version, and there are even more potential CPUs, RAM modules, motherboards, etc. that they can be combined with, potentially resulting in technical issues and certainly resulting in confusion for less technically-savvy customers.

The fact is, if we want PC gaming to be competitive with console gaming, there is little choice but to make purchasing and using PC hardware and software as easy or nearly as easy as purchasing and using console hardware and software. It won't be easy if undertaken, but it has to be done if PC gaming is to have a future.
So if the game runs like crap, lower the settings. That's how it has always been. And all video cards are backward compatible.

The main problem is people don't know how to take care of their computers. They corrupt their systems, then they can't install a game properly, and they get upset when they can't fix it.
 
Patrunic said:
TBH, most people on here claiming that a CPU/GPU combination is a positive thing for performance are quite honestly wrong on so many levels. Look at it this way: a CPU's architecture is designed to maximise the efficiency of the CPU alone. If you attach a GPU to that, you suddenly have a potential bottleneck for your whole system, because graphics output and every other process have to go through the same device.

So in order to get a decent level of performance (I'm defining decent as 40+ FPS), you would need to increase the CPU from its original level (let's say an i7 950, for example) of 3.06GHz up to 4.36GHz in order to ensure there was no drop in performance due to the GPU running at 1.30GHz.

As already mentioned, the heat that would be generated from the voltage required to keep this stable would be astronomical, with drastically different heat-dispersion methods being required, as standard HSF systems simply cannot deal with these temperatures.

While this combined CPU/GPU idea provides a cheaper and simpler alternative to PCI-E cards, the current technology will not allow it to be competitive in the PC gaming industry given the level of graphics nowadays. So while a console experience could be had from this CPU/GPU combo, it would be that of many years past, not a current-day experience.
It just sounds like you have no idea what you're talking about. Intel already integrated the GPU with the CPU in the previous generation, and it was fine. It ran like crap for games, but it was fine.

The first issue you mentioned actually did happen. They moved the QPI around to accommodate traffic going back and forth between the parts, which slowed it down. But now it's all designed as one part, with a ring bus built for massive scaling of processing cores, so the QPI is back in place and up to full speed, and the graphics are much faster as well.

Having the GPU sitting on-die with the processor is actually a huge advantage: there's virtually no travel time required for one to speak with the other. No latency = faster. That's how the Intel GPU is as fast as it is now. It's still kind of slow, but the GPU part is actually very diminutive (read: not going to melt down your processor), and if it were moved back to another part of the motherboard it would take a drastic speed dive. Lots of stuff has been moving on-die with the CPU for speed and efficiency.
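To put some very rough numbers on that travel-time point, here's a back-of-envelope sketch in Python. The latency figures are order-of-magnitude assumptions for illustration only, not measurements of any actual part:

```python
# Rough sketch: why on-die CPU<->GPU communication beats a trip across
# the motherboard. All latencies below are illustrative assumptions.

CPU_CLOCK_HZ = 3.0e9             # assume a ~3 GHz core clock
CYCLE_NS = 1e9 / CPU_CLOCK_HZ    # one clock cycle in nanoseconds (~0.33 ns)

ON_DIE_HOP_NS = 5.0              # assumed: a few ns across an on-die interconnect
PCIE_ROUND_TRIP_NS = 1000.0      # assumed: ~1 microsecond round trip to a discrete card

def cycles(latency_ns: float) -> float:
    """Convert a latency in nanoseconds into CPU clock cycles."""
    return latency_ns / CYCLE_NS

print(f"one cycle at 3 GHz: {CYCLE_NS:.2f} ns")
print(f"on-die hop:         ~{cycles(ON_DIE_HOP_NS):.0f} cycles")
print(f"PCIe round trip:    ~{cycles(PCIE_ROUND_TRIP_NS):.0f} cycles")

# The off-chip round trip costs thousands of cycles; the on-die hop costs
# tens. That gap is the "no travel time" advantage described above.
```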

 

Cyberjester

New member
Oct 10, 2009
496
0
0
Intel, taking the definition of "console port" to a whole new level.


I propose a freeze on Intel by everyone: no purchasing anything Intel-branded until they smarten up.
 

Exort

New member
Oct 11, 2010
647
0
0
zfactor said:
Who thinks ATI and Nvidia will release drivers that let you Crossfire or SLI with an integrated card? Because that would be awesome.
That won't work at all.
SLI and Crossfire work by having graphics card A render one frame and graphics card B render the next. That's why (if you haven't noticed) SLI and Crossfire always pair two identical cards. It's also why an SLI or Crossfire setup of one 2GB graphics card and another 2GB graphics card still only has 2GB of memory: both cards are rendering the same scene, just at different times (there's a quick sketch of this below).

Edit: actually, it could work, but the result would look weird, with a GTX 485 rendering one frame and an Intel 3000 rendering the next...
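
For anyone who wants to see the alternate-frame idea concretely, here's a minimal Python sketch. The card names and per-frame times are made up purely for illustration:

```python
# Toy model of alternate frame rendering (AFR), the scheme SLI/Crossfire
# use: frame 0 goes to card A, frame 1 to card B, frame 2 back to A...
# Each card must hold a full copy of the scene, which is why two 2GB
# cards still behave like 2GB of memory, not 4GB.

from itertools import cycle

def render(frames, gpus):
    """Assign each frame to the next GPU in round-robin order."""
    for frame, (name, ms_per_frame) in zip(range(frames), cycle(gpus)):
        print(f"frame {frame}: {name} finishes in {ms_per_frame} ms")

# Two identical cards: every frame takes the same time, so pacing is smooth.
render(4, [("card A", 10.0), ("card B", 10.0)])

# A fast discrete card paired with slow integrated graphics: every other
# frame crawls, so the output stutters -- the "weird" result noted above.
render(4, [("fast discrete", 10.0), ("slow integrated", 60.0)])
```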
 

RhombusHatesYou

Surreal Estate Agent
Mar 21, 2010
7,595
1,914
118
Between There and There.
Country: The Wide, Brown One.
Exort said:
zfactor said:
Who thinks ATI and Nvidia will release drivers that let you Crossfire or SLI with an integrated card? Because that would be awesome.
That won't work at all.
Seeing as ATI/AMD already have Hybrid Crossfire for use between integrated and discrete cards, it's not impossible. However, the discrete cards used for it are not enthusiast-level cards, and even in Hybrid Crossfire they can't match the performance of a standalone discrete GPU.