Invalid against Linux or Mac from Windows

Der Mann mit der Ledertasche
Joined: 12 Dec 05
Posts: 151
Credit: 302594178
RAC: 0

...and here comes a new one

...and here comes a new one from today, from the same host of mine.

https://einsteinathome.org/workunit/234472886

Waiting for the result. ;-)

BR

Greetings from the North

Der Mann mit der Ledertasche
Joined: 12 Dec 05
Posts: 151
Credit: 302594178
RAC: 0

...thanks a lot. BR

...thanks a lot.

BR

Greetings from the North

Logforme
Joined: 13 Aug 10
Posts: 332
Credit: 1714373961
RAC: 0

RE: I'm going to

Quote:
I'm going to investigate this a bit more in-depth in January. I think there will be enough such tasks in the database then, so you don't need to collect the task IDs in the meantime.


Something to take into account in the investigation:
Maybe the problem is not (just) Windows vs Linux? Maybe it's Pentium vs Xeon, as in this WU?

Richard Haselgrove
Joined: 10 Dec 05
Posts: 2142
Credit: 2774211346
RAC: 851229

RE: RE: I'm going to

Quote:
Quote:
I'm going to investigate this a bit more in-depth in January. I think there will be enough such tasks in the database then, so you don't need to collect the task IDs in the meantime.

Something to take into the investigation:
Maybe the problem is not (just) Windows vs Linux? Maybe it's Pentium vs Xeon as in this WU?


That particular one looks like it might be a thermal or overclocking issue.

The two validated Linux tasks both look like server-grade hardware - a 40-core Xeon, in particular, is likely to be locked to stable speeds, and the 4-core E3-1220 is a member of the Atlas cluster - need I say more?

The Windows i5-4670'K', on the other hand, is an unlocked consumer-grade CPU which allows overclocking, though I can't tell whether any overclocking has actually been applied. It's possible that the FGRP4-SSE2 application is pushing that particular CPU beyond its stable limits, or beyond the capacity of its cooling system.

Der Mann mit der Ledertasche
Joined: 12 Dec 05
Posts: 151
Credit: 302594178
RAC: 0

Hi Folks, after a couple

Hi Folks,

after a couple of months with no invalids, I'm getting three in a row again on different hosts, each crunching against Linux hosts.
Has the validator perhaps been tuned a little?

These are the WUs:

https://einsteinathome.org/workunit/247518482
https://einsteinathome.org/workunit/247468209
https://einsteinathome.org/workunit/247416249

THX

Greetings from the North

Christian Beer
Joined: 9 Feb 05
Posts: 595
Credit: 127711258
RAC: 347639

We haven't changed anything

We haven't changed anything regarding validation. The invalid results are due to normal fluctuations, and it is just coincidence that they occur between different operating systems.

Here is an example for 247518482:
563083451 (Linux) vs. 563083452 (Windows)
Those two didn't match within the boundaries of validation. There were two lines that didn't match.
563559379 (Linux)
This was calculated to find a quorum. Compared to the first Linux result it is a match, but compared to the Windows result there were again two unmatched lines.
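
For illustration, a rough sketch of that kind of comparison (the real FGRP validator is a server-side C++ program; the tolerance, line format and function names below are assumptions, not the project's actual code):

```python
# Illustration only: tolerance, line format and thresholds are assumptions.

def lines_match(line_a, line_b, rel_tol=1e-4):
    """Compare one output line field by field with a relative tolerance."""
    fields_a, fields_b = line_a.split(), line_b.split()
    if len(fields_a) != len(fields_b):
        return False
    for fa, fb in zip(fields_a, fields_b):
        try:
            va, vb = float(fa), float(fb)
        except ValueError:
            if fa != fb:                  # non-numeric fields must match exactly
                return False
            continue
        if abs(va - vb) > rel_tol * max(abs(va), abs(vb), 1.0):
            return False
    return True

def results_match(result_a, result_b, max_bad_lines=1):
    """Two results agree if no more than max_bad_lines lines differ."""
    bad = sum(1 for a, b in zip(result_a, result_b) if not lines_match(a, b))
    bad += abs(len(result_a) - len(result_b))   # missing lines count as bad
    return bad <= max_bad_lines
```

With a third result in hand, the validator then looks for two results that agree: here the two Linux results matched each other, but each had two lines outside the boundaries against the Windows result, so the Windows task ended up invalid.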

The one interesting thing in your examples is that all Windows systems were 32-bit while the Linux systems were 64-bit. But this combination is not uncommon and usually has a good validation rate. Here are the rates for the last week:

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0

Thanks for the table

Thanks for the table Christian, really interesting. I'm guessing an invalid task is counted as 0.5 quorums.

I'm not sure I understand the errors column, but I was surprised that the i686 percentage averages were lower than the x64 ones for Linux.

Is there a similar table for BRP6 and other apps?

Christian Beer
Joined: 9 Feb 05
Posts: 595
Credit: 127711258
RAC: 347639

An invalid task was always

An invalid task was always compared to two other valid tasks. So a 0.5 means that one result of the app version on the left was compared to one result of the app version on the top. This is needed in case you compare the results of three different app versions. Once you know this, it's easy to read the table. E.g. 782 to 780 shows 90 valids and 1 invalid, and looking at the cells 780-782 and 782-780 you can see that the windows_x86 result was invalid. You can also see that the third result was not a 780 either, but probably a 782, 783 or 784 result.
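
Read that way, the counting could be sketched roughly like this (the data structure and the app version IDs in the example call are hypothetical; this is only my reading of the convention described above):

```python
from collections import defaultdict

# Hypothetical sketch of the 0.5-per-comparison counting convention.
valid = defaultdict(float)      # (version_left, version_top) -> quorum count
invalid = defaultdict(float)

def record_quorum(results):
    """results: list of (app_version, was_valid) for one workunit."""
    for i, (ver_a, ok_a) in enumerate(results):
        for j, (ver_b, _ok_b) in enumerate(results):
            if i == j:
                continue
            # each pairwise comparison adds 0.5, so a result checked against
            # two others contributes a full 1.0 to its row
            (valid if ok_a else invalid)[(ver_a, ver_b)] += 0.5

# e.g. an invalid windows_x86 (782) result judged against valid 780 and 784 results
record_quorum([(782, False), (780, True), (784, True)])
```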

The errors column is for validation errors; these occur when we know, before comparing the result to any other result, that it is not valid. Possibilities are that the data contains strings like "Inf" or "NaN", which are not allowed, or that the data was truncated and does not have the correct format for whatever reason. A high percentage there also means something may be wrong with the application version.
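
Something along these lines, as a sketch (the expected number of fields per line is an assumption, not the real FGRP output format):

```python
import math

# Sketch of a pre-comparison sanity check; EXPECTED_FIELDS is made up.
EXPECTED_FIELDS = 5

def result_is_sane(lines):
    """Reject a result before any comparison if it is obviously broken."""
    if not lines:                        # empty or truncated output
        return False
    for line in lines:
        fields = line.split()
        if len(fields) != EXPECTED_FIELDS:
            return False                 # wrong format, e.g. a truncated line
        for field in fields:
            try:
                value = float(field)
            except ValueError:
                return False             # garbage where a number should be
            if math.isnan(value) or math.isinf(value):
                return False             # "NaN" / "Inf" are not allowed
    return True
```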

And yes, of course we have those tables for all our applications. FGRPB1 is the smallest and has the fewest errors in it. The BRP4G matrix is 22x22 because it also contains all the Beta application versions.
