Is there a GPU version of the app in the works?

Paul D. Buck
Paul D. Buck
Joined: 17 Jan 05
Posts: 754
Credit: 5385205
RAC: 0

Heck, many MB are not coming

Heck, many MB are not coming with 3 slots for GPU cards ... Theory says I could put that many on my latest MB though it would likely suck the PS out of the wall ...

Also the case is really too small for that many high end cards ... I may add a couple lower end cards as time wends on ... but, for the moment the GTX 280 card is plenty ... :)

Gerry Rough
Gerry Rough
Joined: 1 Mar 05
Posts: 102
Credit: 1847066
RAC: 0

RE: Heck, many MB are not

Message 87151 in response to message 87150

Quote:

Heck, many MB are not coming with 3 slots for GPU cards ... Theory says I could put that many on my latest MB though it would likely suck the PS out of the wall ...

Also the case is really too small for that many high end cards ... I may add a couple lower end cards as time wends on ... but, for the moment the GTX 280 card is plenty ... :)

Actually, it would likely suck the wall right out of the power supply, dude! [cough!] I just installed a 9800 GT today, and they call for a minimum power supply of 400 W. I doubt you could run a C2D or quad core with more than one GPU card anyway with that much flame coming out of the socket. [cough!] [cough!] Remember that there are other loads too: display, router, printer, and DSL modem, perhaps a clock, etc., all on the same socket. If you live near a nuclear power station, you might be able to get away with it, but I doubt it. No doubt you would have to get a variance from the Nuclear Regulatory Commission for a power supply like that! Good luck, though. :-)



Alinator
Alinator
Joined: 8 May 05
Posts: 927
Credit: 9352143
RAC: 0

Well, I have resisted posting

Well, I have resisted posting anything regarding coprocessing capability here on EAH until now.

Here is my professional evaluation and recommendation.

Go ahead and develop a GPU app if you have the time, but I wouldn't even think of trying to roll it out here on EAH in the near term while there is anything of a fundamental science or operational nature which needs addressing.

The experience over at SAH has demonstrated that the basic BOINC framework is nowhere near able to deal with the added complexities this introduces, and the historical record indicates that the required fixes aren't going to be here anytime soon.

Since EAH is well known to be one of the most stable and well thought out projects from all aspects, I would think long and hard about damaging that reputation just to appeal to a very small segment of the total host population.

Alinator

Paul D. Buck
Paul D. Buck
Joined: 17 Jan 05
Posts: 754
Credit: 5385205
RAC: 0

RE: Well, I have resisted

Message 87153 in response to message 87152

Quote:

Well, I have resisted posting anything regarding coprocessing capability here on EAH until now.

Here is my professional evaluation and recommendation.

Go ahead and develop a GPU app if you have the time, but I wouldn't even think of trying to roll it out here on EAH in the near term while there is anything of a fundamental science or operational nature which needs addressing.

The experience over at SAH has demonstrated that the basic BOINC framework is nowhere near able to deal with the added complexities this introduces, and the historical record indicates that the required fixes aren't going to be here anytime soon.

Since EAH is well known to be one of the most stable and well thought out projects from all aspects, I would think long and hard about damaging that reputation just to appeal to a very small segment of the total host population.

Alinator

Seconded ...

Though I am literally panting at the thought of being able to run a real science project with my GPU resources ...

The issues with BOINC Manager, and Dr. Anderson's reluctance to address them squarely, mean that as more projects come online with GPU capability we will see a repetition of the angst and anger, because BOINC Manager and the system software cannot handle the needs of the participants.

Unless you also have the resources to begin addressing these shortcomings in BOINC ... I know you guys (EaH developers) read the mailing lists, so you know what the issues are ... and how tepid the proposed response is ...

@Gerry

I am pulling about 400W out of the wall for the i7 with a GTX280 and with the 9800 in the same box it was about 450 or so ... But, the production is phenomenal ...

I am running three computers from one room's power and had to run an extension cord to another room for the other two ... I guess I really do need to have an electrician come out and change the 230 V three-phase line I had installed for my old UPS to a 20 or 30 A direct line, now that I am using two smaller 3,000 VA UPSes ... I really need a third one for the two unprotected systems ...

And when I replace those two with another i7 (or dual i7?) later this year it would be nice to have all the systems protected ...
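As a rough sanity check on the UPS sizing mentioned above, here is a minimal sketch. The 0.9 power factor and the load wattages are illustrative assumptions, not measurements from this thread; a UPS rated in VA typically delivers somewhat fewer usable watts.

```python
# Hedged sketch: rough UPS sizing check.
# The 0.9 power factor and the load figures are assumptions.
POWER_FACTOR = 0.9  # typical for modern PC supplies with power-factor correction

def usable_watts(va_rating: float, power_factor: float = POWER_FACTOR) -> float:
    """A UPS rated in VA can supply roughly VA * power_factor watts."""
    return va_rating * power_factor

ups_va = 3000           # rating of one of the 3,000 VA units
loads_w = [450, 400]    # e.g. two measured at-the-wall system draws

capacity = usable_watts(ups_va)
total = sum(loads_w)
print(f"UPS capacity ~{capacity:.0f} W, load {total} W, "
      f"headroom ~{capacity - total:.0f} W")
```

So two systems drawing 400-450 W each would still fit on one 3,000 VA unit with room to spare, under these assumptions.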

MarkJ
MarkJ
Joined: 28 Feb 08
Posts: 437
Credit: 137763410
RAC: 17654

RE: Well, I have resisted

Message 87154 in response to message 87152

Quote:

Well, I have resisted posting anything regarding coprocessing capability here on EAH until now.

Here is my professional evaluation and recommendation.

Go ahead and develop a GPU app if you have the time, but I wouldn't even think of trying to roll it out here on EAH in the near term while there is anything of a fundamental science or operational nature which needs addressing.

The experience over at SAH has demonstrated that the basic BOINC framework is nowhere near able to deal with the added complexities this introduces, and the historical record indicates that the required fixes aren't going to be here anytime soon.

Since EAH is well known to be one of the most stable and well thought out projects from all aspects, I would think long and hard about damaging that reputation just to appeal to a very small segment of the total host population.

Alinator

I agree. BOINC just isn't ready for CUDA. Even 6.5.0, which gets CUDA behaving a little better, broke a whole bunch of other bits that used to work. Yes, I know it's a "development" version, so that's to be expected. But even the release version (6.4.5) has issues. Then there is the patch to the work-fetch policy from the BOINC developers to fudge things through. Obviously they've thought things through - Not.

And let's not mention the stuff happening with the S@H CUDA app, which can't handle VLAR or VHAR and quite often just hangs. Obviously it's ready for prime-time use, so they released it to the public.

The one little ray of hope in this is that GPUGRID has an app that actually works. Their problem is getting BOINC to fetch work units.

I'm sure Bernd would know all the BOINC dev stuff that's going on, and is probably not too keen to finish up and release the CUDA app until the issues with BOINC get sorted out first.

Gerry Rough
Gerry Rough
Joined: 1 Mar 05
Posts: 102
Credit: 1847066
RAC: 0

I know I'm not upgrading my

I know I'm not upgrading my BOINC manager past 6.2.19 until things mature considerably on the BOINC side; hopefully in the next version, or even the one after that. :-(



DanNeely
DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 89

I wouldn't worry about wall

I wouldn't worry about wall power much. A standard US outlet is 15A-120V, that's 1800W of power. Even the biggest PSUs don't draw much more than a thousand.

paul milton
paul milton
Joined: 16 Sep 05
Posts: 329
Credit: 35825044
RAC: 0

RE: I wouldn't worry about

Message 87157 in response to message 87156

Quote:
I wouldn't worry about wall power much. A standard US outlet is 15A-120V, that's 1800W of power. Even the biggest PSUs don't draw much more than a thousand.

Except most US homes are wired by circuit, meaning one entire room could be on a single circuit, and that circuit might be rated for 30 A total. So if you put a few computers on that one circuit you can easily max out the breaker, even though each outlet is rated for 15 A. A = W / V, so assuming 450 W per PC, each one draws about 3.75 A, and four PCs in one room is about 15 A before you add in the monitor, printer, lamp, and overhead light. I believe 15 A at 120 V is 1800 W per circuit.

Please feel free to correct my math, folks. I really stink at numbers, but you see my point.

seeing without seeing is something the blind learn to do, and seeing beyond vision can be a gift.
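The breaker arithmetic in the post above can be sketched in a few lines. This is a minimal illustration; the per-device wattages for the monitor, lamp, and printer are rough guesses, not figures from the thread.

```python
# Hedged sketch of the circuit-load math: current A = W / V
# (not "V x W = A"). Device wattages below are illustrative guesses.
VOLTS = 120  # standard US outlet voltage

def amps(watts: float, volts: float = VOLTS) -> float:
    """Current drawn by a load, in amperes."""
    return watts / volts

pcs = [450] * 4          # four PCs at ~450 W each
extras = [60, 30, 100]   # monitor, lamp, printer (assumed values)
total_amps = sum(amps(w) for w in pcs + extras)

breaker = 15  # most US outlet circuits: 15 A (older homes) or 20 A (newer)
status = "overloaded" if total_amps > breaker else "OK"
print(f"Total ~{total_amps:.1f} A on a {breaker} A breaker: {status}")
```

Under these assumptions the four PCs alone draw about 15 A, so they would already saturate a 15 A breaker before any peripherals are counted.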

Alinator
Alinator
Joined: 8 May 05
Posts: 927
Credit: 9352143
RAC: 0

RE: RE: I wouldn't worry

Message 87158 in response to message 87157

Quote:
Quote:
I wouldn't worry about wall power much. A standard US outlet is 15A-120V, that's 1800W of power. Even the biggest PSUs don't draw much more than a thousand.

Except most US homes are wired by circuit, meaning one entire room could be on a single circuit, and that circuit might be rated for 30 A total. So if you put a few computers on that one circuit you can easily max out the breaker, even though each outlet is rated for 15 A. A = W / V, so assuming 450 W per PC, each one draws about 3.75 A, and four PCs in one room is about 15 A before you add in the monitor, printer, lamp, and overhead light. I believe 15 A at 120 V is 1800 W per circuit.

Please feel free to correct my math, folks. I really stink at numbers, but you see my point.

Only comment is that in the US, most standard outlet circuits are wired for 15 amps (older homes) or 20 amps (newer homes).

Alinator

Paul D. Buck
Paul D. Buck
Joined: 17 Jan 05
Posts: 754
Credit: 5385205
RAC: 0

When I bought my latest

When I bought my latest system (the i7) I also bought one of those plug-in amp/watt meters so I can "size" my systems ... the i7 box has been as high as 450 W so far with CUDA running on two cards ... it is lower now, as I moved one of the cards ...

But, this one room has the i7, Q9300, two Dell Xeons, and a Mac Pro (3.2 GHz full tower with you don't want to know how many disks ...) one 30" monitor and a 20" ... and a partridge in a pear tree ...
