SPARC Solaris Einstein@Home?

UTSC Computing Labs
Joined: 22 Aug 05
Posts: 1
Credit: 44,328,480
RAC: 0
Message 17314 - Posted: 22 Aug 2005, 15:32:03 UTC

Is there a Solaris/SPARC application in the works?

Desti
Joined: 20 Aug 05
Posts: 117
Credit: 10,697,587
RAC: 12,046
Message 17512 - Posted: 27 Aug 2005, 14:45:14 UTC - in response to Message 17314.

Is there a Solaris/SPARC application in the works?


There are only some older statements:

http://einstein.phys.uwm.edu/forum_thread.php?id=55

http://einstein.phys.uwm.edu/forum_thread.php?id=662

http://einstein.phys.uwm.edu/forum_thread.php?id=2204


____________

Bernd Machenschalk
Volunteer moderator
Project administrator
Project developer
Joined: 15 Oct 04
Posts: 3563
Credit: 114,998,741
RAC: 76,506
Message 17535 - Posted: 28 Aug 2005, 2:54:13 UTC
Last modified: 28 Aug 2005, 2:54:42 UTC

We had a volunteer for doing the port, but I haven't heard from him for months now. Apparently he ran out of free time for it (as we all do...). Quite a bit of work is currently being done on the BOINC code as well as on the science code. Maybe we'll start another attempt once the codebase is more stable again. For now, though, the priority for new ports is rather low.

BM
____________
BM

xtbart
Joined: 9 Feb 05
Posts: 8
Credit: 332,804,511
RAC: 180,975
Message 17683 - Posted: 31 Aug 2005, 21:38:28 UTC

Very interested in a SPARC Solaris client too.

I've got about 50 Blades ready for it when it gets released...


____________

Bernd Machenschalk
Volunteer moderator
Project administrator
Project developer
Joined: 15 Oct 04
Posts: 3563
Credit: 114,998,741
RAC: 76,506
Message 25298 - Posted: 10 Jan 2006, 5:00:37 UTC
Last modified: 10 Jan 2006, 5:02:14 UTC

I just built a Solaris/SPARC version of the new Albert App, which seems to run well on Solaris 7 and 10 (and thus it should run on 8 and 9, too). We plan to release it after some testing. Our SPARCs are somewhat slow, so this may take a while. Stay tuned.

BM

____________
BM

yoyo_rkn
Joined: 16 Feb 05
Posts: 8
Credit: 358,985
RAC: 158
Message 25338 - Posted: 10 Jan 2006, 20:23:34 UTC

This sounds very good, txs.
yoyo
____________
Member of Germany's largest distributed computing community

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25408 - Posted: 11 Jan 2006, 20:47:22 UTC

I also have 5 SPARC-Solaris systems at hand and would be glad if I could use them for another BOINC project too (only SETI@home is available so far, both as binary and as source code). If you were willing to give us the source, we could try it ourselves (for example on x86-Solaris, where a number of fast AMD64 systems are waiting...). More precisely, they would run Einstein@home primarily, because it has a higher priority in my profile than SETI@home...
____________

Bruce Allen
Volunteer moderator
Project administrator
Project developer
Project scientist
Joined: 15 Oct 04
Posts: 1103
Credit: 171,768,817
RAC: 0
Message 25613 - Posted: 14 Jan 2006, 17:50:46 UTC

The Solaris Einstein@Home app that Bernd built is now being distributed (command line version only; graphical version coming soon). Please use this thread to report success or problems.

Bruce

____________

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25640 - Posted: 15 Jan 2006, 4:38:53 UTC

Looks like the same issue as observed with SETI-BOINC: "Message from server: platform 'sparc-sun-solaris2.9' not found". From the discussions on the BOINC-dev mailing list, I thought you wanted to omit the annoying version number after "solaris" to prevent exactly such problems? Is the same sparc-sun-solaris2.7 expected here as at Berkeley?
____________

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25641 - Posted: 15 Jan 2006, 4:42:34 UTC

A proposal for handling this Solaris version issue: either add a generic pattern to the MySQL database, or manually add and link all variants starting from the supported one (i.e. 2.7, 2.8, 2.9 and 2.10, and for the brave among us even 2.11 for the current (community/OpenSolaris) express versions; or just begin with 2.8 or the like). Otherwise I will try the original 2.7 client.
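
For illustration only, a rough sketch of the second option, assuming the scheduler uses the standard BOINC MySQL layout with a platform table holding name and user_friendly_name columns; the database name, the exact columns and the follow-up step mentioned in the last comment are assumptions about the project's setup, not its actual configuration:

# Hypothetical: register additional Solaris platform names so the
# scheduler no longer rejects e.g. 'sparc-sun-solaris2.9'.
for v in 2.8 2.9 2.10 2.11; do
  mysql einstein_db -e "INSERT INTO platform (create_time, name, user_friendly_name) \
    VALUES (UNIX_TIMESTAMP(), 'sparc-sun-solaris$v', 'Solaris $v on SPARC');"
done
# The existing app version would then still have to be made available
# for these platform names on the server side.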
____________

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25644 - Posted: 15 Jan 2006, 7:03:35 UTC - in response to Message 25641.

At least version 4.19 of the official BOINC SPARC client exits with a SEGV on Solaris 9 --- I could try on Solaris 10 too (dtrace!) or analyze the core file, if one were created. So I was forced to use the anonymous platform mechanism again, which is not ideal, considering the beta state and the likely updates of the albert application.
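
For reference, the anonymous platform mechanism amounts to placing an app_info.xml next to the self-built science binary in the project directory. A minimal sketch, assuming an app named "albert" at version 4.36 and a default BOINC directory layout; the app name, version number, file name and path here are assumptions and must match what the project and your own binary actually use:

# Hypothetical app_info.xml for the anonymous platform mechanism.
cd ~/BOINC/projects/einstein.phys.uwm.edu
cat > app_info.xml <<'EOF'
<app_info>
    <app>
        <name>albert</name>
    </app>
    <file_info>
        <name>albert_4.36_sparc-sun-solaris2.7</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>albert</app_name>
        <version_num>436</version_num>
        <file_ref>
            <file_name>albert_4.36_sparc-sun-solaris2.7</file_name>
            <main_program/>
        </file_ref>
    </app_version>
</app_info>
EOF
# Restart the BOINC client afterwards so it picks up the file.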
____________

Bernd Machenschalk
Volunteer moderator
Project administrator
Project developer
Joined: 15 Oct 04
Posts: 3563
Credit: 114,998,741
RAC: 76,506
Message 25647 - Posted: 15 Jan 2006, 9:11:48 UTC
Last modified: 15 Jan 2006, 9:14:28 UTC

The stock clients all request the sparc-sun-solaris2.7 platform, as they are built on a Sol7 machine and should therefore run on all systems from then on. If you compile your own client, you are advised to use --build=sparc-sun-solaris2.7 as an additional configure option, so that the client reports the same platform.

Due to an old build process (which has since been improved), the old 4.19 client links to shared libraries that are not present on all systems (to say the least), in particular libstdc++.so.3 and libgcc_s.so.1. If you have gcc installed, you can create a symlink named libstdc++.so.3 pointing to your version of libstdc++.so.

I am running the stock recommended 4.43 client without problems.
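
For illustration, a sketch of the two steps above in concrete shell terms; the source and library paths below are assumptions and need to be adjusted to your own installation:

# Build your own client so it reports the stock platform string
# (run from the BOINC source directory; the path is an assumption):
cd ~/src/boinc
./configure --build=sparc-sun-solaris2.7
make

# Work around the missing libstdc++.so.3 needed by the old 4.19 binary;
# the gcc library location below is an assumption, adjust it to your system:
ln -s /usr/local/lib/libstdc++.so /usr/local/lib/libstdc++.so.3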

BM
____________
BM

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25649 - Posted: 15 Jan 2006, 10:56:57 UTC - in response to Message 25647.

The stock clients all request the sparc-sun-solaris2.7 platform, as they are built on a Sol7 machine and should therefore run on all systems from then on. If you compile your own client, you are advised to use --build=sparc-sun-solaris2.7 as an additional configure option, so that the client reports the same platform.

Due to an old build process (which has since been improved), the old 4.19 client links to shared libraries that are not present on all systems (to say the least), in particular libstdc++.so.3 and libgcc_s.so.1. If you have gcc installed, you can create a symlink named libstdc++.so.3 pointing to your version of libstdc++.so.

I am running the stock recommended 4.43 client without problems.

BM


Indeed, I had to symlink the libstdc++.so v3 to a current v6, and I haven't tried the more recent client version 4.43 there yet, but on another machine running Solaris 10, albert v4.36 works smoothly so far (with the anonymous platform mechanism too). Your proposal to force an older Solaris version makes sense; so far it hasn't made a difference for me, because SETI@home, like BOINC, is open source and I have been compiling both myself, so in this situation I always need an app_info.xml to run the self-compiled application.

____________

[AF>ALSACE>EDLS] Phil68
Joined: 30 Dec 05
Posts: 32
Credit: 39,832
RAC: 0
Message 25657 - Posted: 15 Jan 2006, 14:03:47 UTC

Hi...
I'm very happy to have an application for my Solaris machine (other than SETI)...
But at the moment I have problems with the computing... some WUs come with no time (and they stay at 0% in my BViewer...)...
Is that a WU problem or an application problem?
____________

Bernd Machenschalk
Volunteer moderator
Project administrator
Project developer
Joined: 15 Oct 04
Posts: 3563
Credit: 114,998,741
RAC: 76,506
Message 25663 - Posted: 15 Jan 2006, 15:58:48 UTC - in response to Message 25657.

Hi...
I'm very happy to have an application for my Solaris machine (other than SETI)...
But at the moment I have problems with the computing... some WUs come with no time (and they stay at 0% in my BViewer...)...
Is that a WU problem or an application problem?


Which client are you using? I don't know about BViewer, but what do you get when you grep client_state.xml for "fraction_done" on the machine in question?
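
For example (the BOINC directory path below is an assumption; run it wherever the client is installed):

# Show the progress the client has recorded for each task:
grep fraction_done ~/BOINC/client_state.xml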

BM
____________
BM

[AF>ALSACE>EDLS] Phil68
Joined: 30 Dec 05
Posts: 32
Credit: 39,832
RAC: 0
Message 25706 - Posted: 16 Jan 2006, 7:51:20 UTC - in response to Message 25663.
Last modified: 16 Jan 2006, 8:13:04 UTC

Which client are you using? I don't know about BViewer, but what do you get when you grep client_state.xml for "fraction_done" on the machine in question?

BM


I have 2 processors, and at the moment one of them is occupied with SETI...
Is this application able to work on 2 WUs at the same time?
My version is -> BOINC client version 4.43 for sparc-sun-solaris2.7

After a kill and a restart of the boinc client... the % is 0.0 but the CPU time is 05:07:20...

Thanks

Before the kill:
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1078_S4R2a_0</result_name>
<active_task_state>1</active_task_state>
<app_version_num>436</app_version_num>
<slot>1</slot>
<scheduler_state>2</scheduler_state>
<checkpoint_cpu_time>18440.180000</checkpoint_cpu_time>
<fraction_done>0.776293</fraction_done>
<current_cpu_time>18463.590000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1077_S4R2a_0</result_name>
<active_task_state>9</active_task_state>
<app_version_num>436</app_version_num>
<slot>2</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>

After the restart:

<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1077_S4R2a_0</result_name>
<active_task_state>0</active_task_state>
<app_version_num>436</app_version_num>
<slot>2</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1076_S4R2a_0</result_name>
<active_task_state>0</active_task_state>
<app_version_num>436</app_version_num>
<slot>3</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1075_S4R2a_0</result_name>
<active_task_state>9</active_task_state>
<app_version_num>436</app_version_num>
<slot>4</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>

____________

Augustine
Joined: 22 Jan 05
Posts: 47
Credit: 474,071
RAC: 0
Message 25722 - Posted: 16 Jan 2006, 17:40:39 UTC

IMHO, it would make much more sense to have a native x86-64 client, volume-wise.

____________

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25739 - Posted: 16 Jan 2006, 21:59:19 UTC - in response to Message 25722.

IMHO, it would make much more sense to have a native x86-64 client, volume-wise.


It depends, but Solaris and Linux clients for x86_64 CPUs would indeed be fine performance-wise, as I know from SETI@home. On the other hand, they can't beat heavy SMP SPARC systems (even a 4-core system with AMD Opteron 275 CPUs cannot top a system with 32 or so UltraSPARC III CPUs, for example).
____________

Stefan Urbat
Joined: 9 Feb 05
Posts: 16
Credit: 147,672
RAC: 0
Message 25767 - Posted: 17 Jan 2006, 7:08:10 UTC

The first result seems to have completed cleanly on the fastest SPARC-Solaris system I have access to, though it is not visible on the website yet (according to the log).

But the performance looks rather poor: on this 1062 MHz UltraSPARC IIIi CPU it took almost exactly 100,000 seconds, i.e. more than one day of CPU time, to finish. I would have expected around 40,000 seconds on this hardware, taking into account the relative performance of different CPUs on SETI@home.

So there seems to be a lot of potential for optimization in the SPARC client, doesn't it? The other machine, started earlier for Einstein@home and driven by a mere 550 MHz UltraSPARC II CPU, will apparently take about two CPU days to complete...
____________

[AF>ALSACE>EDLS] Phil68
Joined: 30 Dec 05
Posts: 32
Credit: 39,832
RAC: 0
Message 25770 - Posted: 17 Jan 2006, 8:03:40 UTC - in response to Message 25767.
Last modified: 17 Jan 2006, 8:53:06 UTC

Today I have 3 WUs on my Sun machine.
I let the first one work alone; I have paused the 2 others and the SETI ones...
This WU (z1_0175.0__1072_S4R2a_0) seems to be blocked now... and I'm sure that if I restart the boinc client, the WU will have 0 CPU time...

<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1072_S4R2a_0</result_name>
<active_task_state>1</active_task_state>
<app_version_num>436</app_version_num>
<slot>1</slot>
<scheduler_state>2</scheduler_state>
<checkpoint_cpu_time>3606.320000</checkpoint_cpu_time>
<fraction_done>0.141345</fraction_done>
<current_cpu_time>3606.320000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1071_S4R2a_0</result_name>
<active_task_state>9</active_task_state>
<app_version_num>436</app_version_num>
<slot>2</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1070_S4R2a_0</result_name>
<active_task_state>9</active_task_state>
<app_version_num>436</app_version_num>
<slot>3</slot>
<scheduler_state>1</scheduler_state>
<checkpoint_cpu_time>0.000000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>0.000000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>



And after killing the boinc client...
<active_task>
<project_master_url>http://einstein.phys.uwm.edu/</project_master_url>
<result_name>z1_0175.0__1072_S4R2a_0</result_name>
<active_task_state>1</active_task_state>
<app_version_num>436</app_version_num>
<slot>1</slot>
<scheduler_state>2</scheduler_state>
<checkpoint_cpu_time>3606.320000</checkpoint_cpu_time>
<fraction_done>0.000000</fraction_done>
<current_cpu_time>3606.320000</current_cpu_time>
<vm_bytes>0.000000</vm_bytes>
<rss_bytes>0.000000</rss_bytes>
</active_task>

And the work didn't go any further... and after I cancelled it, here is the result (shown as if it were completed)...

15004119 3706440 17 Jan 2006 0:51:59 UTC 17 Jan 2006 8:45:49 UTC Over Client error Computing 3,606.32 3.66 ---
____________
