
April 11, 2008

Comments

Einstein-Rosen-Podolsky

It's great :-)

Will there be a version of the GPU client for Ubuntu?

Ivoshiee

First there must be a usable driver and library set for Linux; only once those exist can a GPU FAH client be made. But there seems to be hope, as ATI is heavily pushing OSS GPU drivers and the libraries for them. http://ati.amd.com/technology/streamcomputing/faq.html#8

Stephen

It seems like OS X would be an obvious direction, since all new iMacs and Mac Pros come with an ATI 2xxx card as standard.

tyler

Works great for me so far.

ATI 3850 @ 740/829
AMD AM2+ @ 2.75GHz

It doesn't use 100% of the GPU, though (which I realize a lot of people are noticing); it runs at around 74% usage.

Ivoshiee

I very much doubt that AMD is willing to port CAL to MacOS X.

rab38505

Actually, the 20" iMacs come with an HD 2400, which is very low-end and wouldn't be all that useful for GPU2. The 24" ones come with an HD 2600, but only with 256MB VRAM, which would still be somewhat limited for FAH use, though much less so than the 2400. Linux and Windows are where the real GPUs are going to be, because that's what gamers use (mostly Windows).

Hil

I suppose the HD 2600 is fine enough, as a 3650 (512MB VRAM) gets about 950 ppd (overclocked to 920/720, though), and it seems like video memory size and speed aren't that important for this client.
But of course, a Linux version is much more interesting, keeping in mind that it can be installed as a second OS on almost any PC.

hoyy

ATI 2600XT
AMD X2 4000+ @2.4GHz

Works fine and at 99% usage.

Team PD 86565

AMD Athlon64 X2 6400+ @3.2GHz
ATI Radeon HD 3870 X2

GPU core 1 @ 17% (inactive for folding)
GPU core 2 @ 77% (active for folding)

Completing Project 2799 WUs at a rate of 1% every 2 minutes. This seems just a tad bit slow to me, though...

LarryPalmer

Would be nice to have a table of optimal processor and video card pairings. For example: ATI 2600XT (default clocks) plus AMD X2 4000+ @ 2.4 GHz equals 99% video card utilization, etc.
I think that would make it easy for newcomers to GPU folding to understand the best use of their system: be it SMP, GPU, or regular folding.

AoD

That would be cool! You download a smart tool that analyzes your hardware and then says, "It's recommended that you use the GPU folding client for best performance on your system."
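Purely for illustration, here's a rough sketch of what such a checker could look like in C. Nothing like this actually exists; the decision rules and the use of the AMD CAL runtime here are my own assumptions:

    /* Hypothetical "which client should I run?" checker -- nothing like this
     * ships with FAH; the decision rules below are made up for illustration.
     * Assumes the AMD CAL runtime (cal.h / aticalrt) is installed. */
    #include <stdio.h>
    #include "cal.h"

    #ifdef _WIN32
    #include <windows.h>
    static unsigned cpu_cores(void) {
        SYSTEM_INFO si;
        GetSystemInfo(&si);
        return (unsigned)si.dwNumberOfProcessors;
    }
    #else
    #include <unistd.h>
    static unsigned cpu_cores(void) {
        return (unsigned)sysconf(_SC_NPROCESSORS_ONLN);
    }
    #endif

    int main(void) {
        CALuint gpus = 0;
        int have_cal = (calInit() == CAL_RESULT_OK);
        if (have_cal)
            calDeviceGetCount(&gpus);

        unsigned cores = cpu_cores();
        printf("CPU cores: %u, CAL-capable GPUs: %u\n", cores, gpus);

        /* Made-up heuristic: prefer the GPU client when a CAL device exists,
         * the SMP client with 2+ cores, the classic client otherwise. */
        if (gpus > 0)
            printf("Recommendation: GPU folding client.\n");
        else if (cores >= 2)
            printf("Recommendation: SMP client.\n");
        else
            printf("Recommendation: classic CPU client.\n");

        if (have_cal)
            calShutdown();
        return 0;
    }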

WJS

How does the GPU2 client determine how many stream processors to use, and how does it manage the parallel processing?

From the above comments, it seems that 120 stream processors get taxed to 99% (Radeon 2600XT), but 320 stream processors only chalk up 74-77% utilization.

Does the GPU2 client pre-define the number of stream processors to use for each different model of GPU, or does it determine that info by itself (maybe through some CAL GPUID routine or something)?

Ivoshiee

I haven't seen any GPU code, but hints here and there indicate that the shader count is being detected at run time and the 74% utilization bottleneck is the CPU.
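For what it's worth, CAL does expose the hardware layout at run time, so detection like that is plausible. A minimal sketch of such a query — the field names are what I recall from the Stream SDK headers, so treat the details as an assumption rather than the actual GPU2 code:

    /* Sketch: query GPU layout at run time via AMD CAL.
     * Field names follow my reading of the Stream SDK headers (cal.h);
     * this is not the actual FAH GPU2 code. */
    #include <stdio.h>
    #include <string.h>
    #include "cal.h"

    int main(void) {
        if (calInit() != CAL_RESULT_OK) {
            fprintf(stderr, "No CAL runtime / no supported GPU.\n");
            return 1;
        }

        CALuint count = 0;
        calDeviceGetCount(&count);

        for (CALuint i = 0; i < count; ++i) {
            CALdeviceattribs attribs;
            memset(&attribs, 0, sizeof(attribs));
            attribs.struct_size = sizeof(CALdeviceattribs);

            if (calDeviceGetAttribs(&attribs, i) == CAL_RESULT_OK) {
                /* numberOfSIMD is the SIMD engine count, not the marketing
                 * "stream processor" number; an RV670, for example, reports
                 * 4 SIMDs behind its 320 advertised stream processors. */
                printf("GPU %u: %u SIMDs, wavefront %u, engine %u MHz\n",
                       i, attribs.numberOfSIMD, attribs.wavefrontSize,
                       attribs.engineClock);
            }
        }

        calShutdown();
        return 0;
    }

Built against the Stream SDK runtime, that would print one line per adapter; whether GPU2 actually does something like this internally is, again, just a guess.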

Tyler

I believe that to be the case; the CPU is the bottleneck. I've been experimenting with my overclock settings and have gone from 57% GPU utilization (at 2.5GHz) to ~74% (at 2.75GHz).

Hil

Well, of course it's the CPU. There are a lot of stats already, plus the O/C results above, and I'm also running 2 GPU2 clients on a single 3850 card: one client gets about 1450 ppd due to a "slow" CPU (X2 5600+ @ 3200 MHz), while two clients together get about 1980-2000 ppd, 990+ each.

7im

No offense intended, but this discussion of Points and Performance is somewhat premature. The client is only 3 days old, and we've only run ONE very small test work unit.

IMO, performance will change as the client develops and Stanford makes more tweaks as they get good feedback. Any conclusions made now could be invalidated by the next revision of the beta client, or by some good-sized work units.

The feedback is good, but don't bet the farm just yet.

Philippe

Video card overheating?!

Sapphire 3850, 512MB, PCI-E
CPU: Intel E6750
Vista Ultimate 64-bit
4GB RAM

When I set the CPU usage to 100%, the temperature is 90 to 96 degrees Celsius... wow.
And when I set it to 60%, the temperature is 76 to 82 degrees Celsius.

So, isn't the video card temperature too high at this level?! I don't think my card can keep running like this without breaking.

Jim

Athlon 64 X2 6000+ @ 3GHz
Radeon HD3870
Vista Ultimate 64-bit

Hi, it works fine, but:

GPU usage tops out at 67%, and my CPU averages 80% (2 cores used).

Is it possible to get more performance from a 64-bit compiled client? :)

Neal

Please make the GPU2 console client a priority. I find it difficult to track the progress of the GPU2 client. On the CPU console client I can see plainly where it stands. Also, I can only run the GPU2 client as Administrator. Is there a fundamental reason why it can't run without Administrator privileges?

Stephen

OS X is where the graphic arts professionals are these days. And they're pretty darn serious about their video cards.

Shaheer

The GPU client is way faster than the CPU one. I have them both running at the same time; the difference is pretty clear.

The comments to this entry are closed.