
Issue running inter-annotator reliability

This topic contains 7 replies, has 3 voices, and was last updated by Han 2 months, 1 week ago.

Viewing 8 posts - 1 through 8 (of 8 total)
November 12, 2018 at 15:50 #12494


Hi all,

So for reasons I can’t quite figure out yet, I can’t run reliability on two files I have. The “error” message I get is: “Warning: Operation interrupted: There are no segments for the agreement calculation, process stopped.” I know this isn’t the case: opening each file separately, you can clearly see that each has its own set of segments.

Has anyone else encountered this before? Any advice as to what might be the problem?

Much appreciated.

November 13, 2018 at 14:38 #12497


Could it be that this has to do with the settings for pairing files and/or pairing tiers? Is there any more information concerning the problem in the log after the warning message (View->View Log…)?

November 13, 2018 at 19:59 #12498


Hi Han,

Thanks so much for replying! I don’t think it’s the tier pairing, as we’ve done the same process for other files without error, and the tiers are currently matched by name. I did check the log: it reported consistency errors within the transcription, namely an end time before the begin time in one of our tiers. I checked the file and that doesn’t seem to be the case, but any suggestions you have would be much appreciated.

Thanks in advance!

November 14, 2018 at 16:34 #12499


Hmm, not sure if such an error (end time < begin time) can cause the comparison to completely fail, but it seems that’s the only clue we’ve got.
Are you sure there is no such error in the file? Did you check e.g. in the Grid viewer (where a negative duration can often easily be detected)?
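If a negative-duration segment is hard to spot by eye, one option is to scan the .eaf XML directly outside ELAN. A minimal sketch, assuming the standard EAF layout (a TIME_ORDER of TIME_SLOT elements, and ALIGNABLE_ANNOTATIONs referencing them via TIME_SLOT_REF1/2); the function name is illustrative, not part of any ELAN tooling:

```python
# Sketch: list annotations in an ELAN .eaf file whose end time
# precedes their begin time. Assumes the standard EAF XML layout.
import xml.etree.ElementTree as ET

def negative_duration_annotations(eaf_path):
    root = ET.parse(eaf_path).getroot()
    # Map each time-slot id to its millisecond value
    # (unaligned slots without TIME_VALUE are skipped).
    times = {
        ts.get("TIME_SLOT_ID"): int(ts.get("TIME_VALUE"))
        for ts in root.iter("TIME_SLOT")
        if ts.get("TIME_VALUE") is not None
    }
    bad = []
    for tier in root.iter("TIER"):
        for ann in tier.iter("ALIGNABLE_ANNOTATION"):
            begin = times.get(ann.get("TIME_SLOT_REF1"))
            end = times.get(ann.get("TIME_SLOT_REF2"))
            if begin is not None and end is not None and end < begin:
                bad.append((tier.get("TIER_ID"),
                            ann.get("ANNOTATION_ID"), begin, end))
    return bad
```

Any tuples it returns point at exactly the kind of inconsistency the log complained about.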

January 23, 2019 at 23:04 #12619


Hi Han,

Sorry for the massive delay in response; the semester got the better of me. As it turns out, that was indeed an issue: looking in the Grid Viewer, there was somehow a negative-duration segment in one of the files. However, even once that spurious segment was deleted, the two files still couldn’t be compared.

In general it seems that inter-annotator reliability as a function is wholly unusable for my files. The two error messages I’ve received are either “Operation interrupted: No matching files found in the list of selected files” or “Operation interrupted: There are no segments for the agreement calculation, process stopped.”

Again, I know this is no longer the case (and never was for other file comparisons): the defunct segment existed only in the one file and has since been deleted, the segments required for this calculation exist in each of the files when opened separately, and the files are kept with their source material on a secure server.

I really can’t imagine what the problem is from this end and was hoping for a little guidance. Anything you’d have to say on the matter would be GREATLY appreciated.


January 24, 2019 at 13:37 #12621


No problem.
Just to be sure, does this state of being unusable still concern the two files you mentioned earlier (it seemed to work for your other files)?
I looked into the code where the second of the two error messages you mention, “no segments”, is produced. It occurs when both tiers involved in the comparison have 0 annotations, or when one of the tiers involved in the calculation does not exist at all. I noticed that some more messages could be logged to make it easier to discover what exactly causes the error. (I assume you didn’t find additional relevant messages in the log after seeing the error message box?)
It’s difficult to determine what the actual problem is in your case: whether the tiers are not found (maybe the comparison looks for the wrong tier in the wrong file), or the annotations couldn’t be found somehow, etc.
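Given those two failure conditions (a missing tier, or tiers with zero annotations), a quick sanity check before running the comparison is to list each file’s tier names with their annotation counts and confirm the tier you expect appears, spelled identically, with at least one annotation in both files. A hedged sketch, again assuming the standard EAF layout; the file paths in the usage note are placeholders:

```python
# Sketch: report {tier_id: annotation_count} for one .eaf file,
# assuming the standard EAF XML layout (ANNOTATION elements are
# direct children of each TIER).
import xml.etree.ElementTree as ET

def tier_annotation_counts(eaf_path):
    root = ET.parse(eaf_path).getroot()
    return {
        tier.get("TIER_ID"): len(tier.findall("ANNOTATION"))
        for tier in root.iter("TIER")
    }

# Usage (placeholder file names):
#   for path in ("annotator1.eaf", "annotator2.eaf"):
#       print(path, tier_annotation_counts(path))
# A comparison tier missing from either dict, or mapping to 0,
# would plausibly trigger the "no segments" warning.
```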

I don’t suppose the two .eaf files could be sent to me (han.sloetjes AT mpi.nl), so that I can try to reproduce the problem?

March 18, 2019 at 18:27 #12807


I am having the same issue. I’ve checked the files and the tiers have annotations, and are aligned. Did you find a solution?

March 19, 2019 at 11:40 #12808


In the previous case it turned out that the files couldn’t be matched because of inconsistencies in their names. Matching of files and tiers is case-sensitive, white space etc. matters, and, depending on the settings, matching is based on consistent use of prefixes or suffixes.
If you are sure this is all correct, you can send me two of the files whose tiers you want to compare, and I can try to reproduce the problem.
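To illustrate why naming consistency matters, here is a sketch of suffix-based file pairing of the kind described above. The suffixes `_R1`/`_R2` are illustrative placeholders for whatever annotator markers your files use, not ELAN’s actual defaults, and the exact matching rules in ELAN depend on the dialog settings:

```python
# Sketch: group file names that share a base name once a known
# annotator suffix is stripped. Matching is deliberately
# case-sensitive, mirroring the behaviour described above.
def pair_by_suffix(names, suffixes=("_R1", "_R2")):
    groups = {}
    for name in names:
        stem = name.rsplit(".", 1)[0]  # drop the extension
        for suf in suffixes:
            if stem.endswith(suf):  # case-sensitive check
                groups.setdefault(stem[: -len(suf)], []).append(name)
                break
    # Keep only complete pairs.
    return {base: files for base, files in groups.items()
            if len(files) == 2}
```

So "session01_R1.eaf" and "session01_R2.eaf" pair up, but "Session01_r1.eaf" would not match either of them, because of the case differences.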


