Objectivism Online Forum

GPUs, a future rival of the CPU?



Prometheus98876

  

11 members have voted

  1. GPUs, a future rival of the CPU?

    • Yes (why do you think this is the case): 1
    • No (why do you think this is not the case): 5
    • Unsure (your thoughts): 0


Recommended Posts

I was reading through one of my favorite computer magazines when I found an article on the potential future of GPUs (go Here to read the article if you want). I will sum up what I feel to be some of the key points of the article:

As you might know, if you are able to represent data in a graphical form, then it can be processed by the GPU. Given that the GPU is a massively parallel processing unit (at least compared to most CPUs: the Pentium 4 Extreme Edition at 3.8 GHz can handle about 15 gigaflops, compared to the NVIDIA 7800 GTX's 169 gigaflops), this means that you can process massively complicated sets of data much more rapidly than you could on the CPU.

Of course, it is not easy to represent data from, say, a database, or data such as multi-track music files, in a graphical form. But there are programming languages for just this task; one of the most frequently used is BrookGPU, a C-like stream-processing language.
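To give a rough idea of the programming model (this is an illustrative sketch in plain C, not actual Brook syntax, and the names are mine): a stream language like BrookGPU has you write a small "kernel" that is applied independently to every element of a data stream, which is what lets the compiler farm the work out to the GPU's parallel units.

```c
/* Illustrative sketch of the stream-programming model that languages
   like BrookGPU expose: a small "kernel" applied independently to
   every element of a data stream. In Brook the compiler maps this
   onto the GPU; here it is plain C, just to show the shape of it. */
#include <stddef.h>

/* The "kernel": one element in, one element out, no side effects. */
static float kernel_scale(float x) {
    return 2.0f * x + 1.0f;
}

/* Applying the kernel across the stream; on a GPU, each element
   would be handled by a different parallel unit. */
void run_kernel(const float *in, float *out, size_t n) {
    for (size_t i = 0; i < n; ++i)
        out[i] = kernel_scale(in[i]);
}
```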

So, it should be clear that Intel might viably have a rival on their hands here, at least when it comes to processor-intensive applications. But do you think this will become a problem for Intel in the near future, and if so, how soon?


The essential difference between GPUs and CPUs is that GPUs are massively parallel. They derive their benefit from being able to take a set of instructions and replicate them over large tables of data: take all the vectors from this table, perform a calculation against another common vector, multiply the result by the data in a second table, and deposit the result in a third. This is implemented cheaply by having an array of inexpensive and inflexible computation units working in parallel under a single control unit. When computation unit 1 is operating on the 1st element in each of the above tables, unit 2 is working on the second, and so on through 8, 16, 32, or more parallel units. The units can only operate in tandem, performing the same operation on sequential data.
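To make that concrete, here is roughly what such a GPU-friendly computation looks like when written out in plain C; every iteration is independent, so each parallel unit can take one element (the table names and `dot3` helper are illustrative, not from the post):

```c
/* The kind of loop a GPU's parallel units handle well: every
   iteration is independent, so units 1..N can each take one element. */
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* result[i] = dot(table1[i], common) * table2[i]  -- the same
   operation applied to sequential data, exactly the pattern above. */
void gpu_friendly(const vec3 *table1, vec3 common,
                  const float *table2, float *result, size_t n) {
    for (size_t i = 0; i < n; ++i)
        result[i] = dot3(table1[i], common) * table2[i];
}
```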

Normal computer programs spend little time moving large amounts of data around. Most time is spent making decisions affecting program flow, where mass parallelism doesn't offer any advantage. Further, the result of each step in a general purpose program usually relies on the result of a very recent previous step, again thwarting parallelism: unit 1 can't operate in parallel with unit 2 if unit 2 can't start until it has the output from unit 1.
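By contrast, here is a sketch of the kind of loop-carried dependence just described (illustrative only, not from the post): iteration i needs the result of iteration i-1, so the parallel units cannot all start at once.

```c
/* A loop that defeats mass parallelism: a loop-carried dependence.
   Iteration i reads x[i-1], so unit 2 must wait for unit 1. */
#include <stddef.h>

void recurrence(float *x, const float *b, size_t n, float a) {
    for (size_t i = 1; i < n; ++i)
        x[i] = a * x[i - 1] + b[i];   /* must wait for the previous step */
}
```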

That said, most modern CPUs do have a small selection of GPU-like instructions. These are called SIMD extensions (Single Instruction, Multiple Data), and are usually known by the names of their specific implementations: MMX on Pentiums, AltiVec on PowerPC processors. But these extensions remain unused for virtually all of a computer's everyday tasks. Programmers only employ them for things like Photoshop filters and graphics tasks in games if a GPU isn't present.
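For the curious, here is a minimal sketch of what using one of these extensions looks like, via the SSE intrinsics in C (SSE being the floating-point successor to MMX on x86); the function itself is my own illustration:

```c
/* A minimal SSE sketch: one instruction adds four floats at a time.
   Compile on x86 with SSE enabled (e.g. gcc -msse). */
#include <xmmintrin.h>

void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);            /* load 4 floats from a */
    __m128 vb = _mm_loadu_ps(b);            /* load 4 floats from b */
    _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* out[i] = a[i] + b[i], i = 0..3 */
}
```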


I can't comment on how possible this is but it looks really interesting...

As the article suggests, it is very possible, albeit difficult. Of course, the real difficulty lies in representing data as images. It can be a bit of a hassle, from a programming standpoint, to find a way to convert a database into a graphical format, but possibly worth it if you are able to speed up the processing of such a large data set.
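As a hypothetical illustration of that "data as image" hassle, one common trick is to pack a column of values into the channels of a texture so a GPU pass can process it; everything below (the field names, the `texel` layout) is invented for the sketch:

```c
/* Hypothetical sketch of the "data as image" trick: pack one database
   column into the red channel of an RGBA float texture so a GPU pass
   can process it as if it were pixels. */
#include <stddef.h>

typedef struct { float r, g, b, a; } texel;

void pack_column(const float *prices, size_t n, texel *texture) {
    for (size_t i = 0; i < n; ++i) {
        texture[i].r = prices[i];                          /* the datum */
        texture[i].g = texture[i].b = texture[i].a = 0.0f; /* unused */
    }
}
```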


  • 3 weeks later...
I think they won't take over, mainly because most people won't do it that way and because CPUs still have to be in the PC. Also, Intel and AMD aren't just going to step back and let that happen.

A good question, though, is: if they do take over, will they stay on top? It is possible that CPUs will eventually start using a better architecture (at least better than the von Neumann architecture), and that they might end up more efficient than the GPUs... but this may be so much pixie dust, at least for ages to come.

Remember, Nvidia could manage to outdo Intel and AMD on this one. It is unlikely in the short term, but in the long term the attitudes of customers could change, and the need for such parallel processing (which is fairly small currently) may shift enough one day.


A big step along the road to more parallel processors may be the Cell processor. I am sure many of you already know a bit about this, especially since it drives the upcoming PlayStation 3 hardware.

For those not so familiar with it, the Cell is in fact nine processors; not nine sub-processors, but nine full processors. And it contains two different types of processor.

Basically it is comprised of eight smaller processors, each with their own local memory (and each with access to the main memory), handled by another 'master' processor. They don't have all of the advanced features of, say, the Pentium 4 CISC processors, but they are really fast.

All of this is connected by a really fast bus, the Element Interconnect Bus (EIB), over which each task can be split into smaller pieces and distributed amongst the processors.

The EIB on the Cell can also communicate efficiently with other Cells over an external bus or a network, making the Cell ideal for home multimedia networks.
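As a rough illustration of that split-and-distribute pattern, here is a master/worker sketch in ordinary C with pthreads; it runs on any normal CPU and has nothing Cell-specific in it, it just shows the structure of one master handing slices of a task to eight workers:

```c
/* Master/worker sketch: the 'master' (main) splits one task into
   eight slices and hands one to each worker thread, then waits.
   Compile with: cc demo.c -lpthread */
#include <pthread.h>
#include <stddef.h>

#define WORKERS 8
#define N 8000

static float data[N];                  /* zero-initialized, shared table */

typedef struct { size_t begin, end; } slice;

static void *worker(void *arg) {
    slice *s = (slice *)arg;
    for (size_t i = s->begin; i < s->end; ++i)
        data[i] = data[i] * 2.0f + 1.0f;   /* each worker's private chunk */
    return NULL;
}

int main(void) {
    pthread_t tid[WORKERS];
    slice slices[WORKERS];
    size_t chunk = N / WORKERS;

    /* The master splits the task and distributes the pieces. */
    for (int w = 0; w < WORKERS; ++w) {
        slices[w].begin = w * chunk;
        slices[w].end   = (w == WORKERS - 1) ? N : (w + 1) * chunk;
        pthread_create(&tid[w], NULL, worker, &slices[w]);
    }
    for (int w = 0; w < WORKERS; ++w)
        pthread_join(tid[w], NULL);     /* gather the results */
    return 0;
}
```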


A big step along the road to more parallel processors may be the Cell processor. I am sure many of you already know a bit about this, especially since it drives the upcoming PlayStation 3 hardware.

For those not so familiar with it, the Cell is in fact nine processors; not nine sub-processors, but nine full processors. And it contains two different types of processor.

[snip,snip]

This GPU, multi-processor business is old news. Read up on the Hobbit, which is the code name for the first BeOS machines. It is the precursor in spirit to these multi-core gaming machines, and something like BeOS would make very good use of the Cell.

Also check out the architecture of the Amiga. The man who started Amiga, Jay Miner, believed he could make the world's best video game machine, and his creation's architecture looks a lot like these cutting-edge gaming rigs of today: every subsystem has its own dedicated controller, leaving the CPU to do the higher-level thinking and choreography of video and music. The music system had a highly advanced multi-track architecture which the current incarnation of sound cards has embraced and extended. The independent video processor is something which you have already waxed enthusiastic about in today's machines.

For all the wishbook architectural promises the XBOX360 and PS3 bear, there's one thing that's rumored to spoil the party: they're hard to program for. If the rumors leaking out through slashdot, arstechnica and anandtech are true, then the hype is far from reality.

I'm psyched about the upcoming Nintendo, though. As for the system specs, the difference between it and the other two ‘contenders’ should be marginal, and the recently revealed control device is full of wonders. I will patiently await such wonderful delights as they may come.


This GPU, multi-processor business is old news. Read up on the Hobbit, which is the code name for the first BeOS machines. It is the precursor in spirit to these multi-core gaming machines, and something like BeOS would make very good use of the Cell.

......

I'm psyched about the upcoming Nintendo, though. As for the system specs, the difference between it and the other two ‘contenders’ should be marginal, and the recently revealed control device is full of wonders. I will patiently await such wonderful delights as they may come.

Old news maybe, but still something which I think should be put into a more practical context, in a wider range of systems, and with a better mainstream support base. More applications designed to use this technology would also be nice (although apparently a few more have started to come out lately, due to the current popularity of GPU hacking).

I have never been a big fan of the Nintendo consoles, partially because I think using proprietary disc formats was not a good decision from a sales point of view (I mean, sure, they have every right to do this if they wish). I would like to be able to play DVDs etc. in my game console, which is one reason I would not bother with the Gamecube. That, and I think that as good as the Nintendo games are, the PS2 generally has better games.

I have heard rumours that Nintendo yet again intends to use a proprietary disc format for their next console. Do you know whether this is true?


I have heard rumours that Nintendo yet again intends to use a proprietary disc format for their next console. Do you know whether this is true?

Nintendo will again be using a 'proprietary' format for the Revolution, but the discs will be 5 inches across. Part of the benefit of using their oddball format is to 'fence in' their intellectual property. While the PS2 and XBox use DVDs for their media, they also fall prey to pirates with a DVD burner and a mod chip handy. This problem is something that Nintendo does not have to worry about, because the Gamecube's optical disc is so odd that it cannot be copied.

By protecting its intellectual property rights, Nintendo makes sure that people properly pay for their product. I approve of that.

I am in favor of having discrete components for discrete jobs. I have a DVD player, and that is the device that I use to watch my DVDs, and it does a much better job than my PS2 at doing so.

As far as getting this thread back on topic: have you been following the console news on Anandtech and ArsTechnica? The preliminary, behind-the-scenes, off-the-record reports coming in so far say that the much-hyped Cell and Xenon processors [Cell: PS3, Xenon: XBox360] have only the hype going for them, and that the GPUs are going to save the day for the systems. Being the sceptical sort of person when it comes to these too-good-to-be-true press releases from Sony and Microsoft, I find it hard to tell what to believe regarding these guys. From what I can tell, I don't care if Nintendo's Revolution will only be marginally better than the Gamecube; the new controller has such wonderful promise.


Nintendo will again be using a 'proprietary' format for the Revolution, but the discs will be 5 inches across. [snip,snip]

I do see benefits for Nintendo in using their 'oddball formats', and disadvantages in the approach taken by Sony and Microsoft. And I do approve of Nintendo's reasons for doing so, after rethinking it a bit. Still, I am one of those who like having one machine do lots of different things, so the Gamecube is of less use to me.

And besides, as I said, the Nintendo games just do not seem as good these days. Mind you, all they would need is a few more games and I would be very tempted, so I will be keeping my eye on the games that come out on their next-gen console.

