I am using ELAN to transcribe and code parent-child interactions for research purposes. I have assistants transcribing by creating annotations along a tier each time the parent says something and marking key terms or actions on different tiers. If I wanted to check the reliability between two sets of annotations, what would be the best way? Is there any way to have annotations listed vertically in a word document or excel so I could compare side by side?
You could merge the two assistants' files so that both sets of annotations can be inspected together in ELAN.
You could export the tiers you want to compare to tab-delimited text, open the exports in Excel (or similar) and combine them in one sheet (e.g. by copy/paste). It may be difficult to line the annotations up, though, because they can end up on different rows (e.g. if one annotator created many more annotations than the other).
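To work around the row-misalignment problem, the two exports can be paired up by temporal overlap before placing them side by side. Below is a minimal sketch of that idea; it assumes each annotation has been reduced to a (begin ms, end ms, text) tuple, which you would extract from the tab-delimited export yourself (the exact column layout depends on your export settings).

```python
# Sketch: align two annotators' annotation lists by time overlap so they can
# be compared side by side. Each annotation is assumed to be a tuple of
# (begin_ms, end_ms, text); this is NOT a fixed ELAN export format, just a
# convenient intermediate representation.

def overlap(a, b):
    """Length in ms of the temporal overlap between two annotations."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def align(anns_a, anns_b):
    """Pair each annotation from coder A with the coder-B annotation that
    overlaps it most; annotations with no overlapping partner are paired
    with None, so gaps and extra annotations stay visible."""
    pairs = []
    used = set()
    for a in anns_a:
        best, best_ov = None, 0
        for i, b in enumerate(anns_b):
            ov = overlap(a, b)
            if ov > best_ov and i not in used:
                best, best_ov = i, ov
        if best is not None:
            used.add(best)
            pairs.append((a[2], anns_b[best][2]))
        else:
            pairs.append((a[2], None))
    for i, b in enumerate(anns_b):   # B annotations unmatched in A
        if i not in used:
            pairs.append((None, b[2]))
    return pairs

# Toy example: coder B missed one utterance and added another.
a = [(0, 1000, "hello"), (1500, 2500, "look here"), (3000, 4000, "good job")]
b = [(100, 900, "hello"), (2900, 4100, "good job"), (5000, 6000, "bye")]
for left, right in align(a, b):
    print(left, "|", right)
```

The resulting pairs can be written to two columns of a spreadsheet for the side-by-side inspection you describe, with `None` rows flagging segmentation disagreements.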
You could try the (multiple file) Calculate Inter-Annotator Reliability… function to obtain, e.g., (modified) kappa values. (This does not, however, offer a side-by-side visual comparison.)
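For intuition about what such a reliability value measures, here is a sketch of plain (unweighted) Cohen's kappa on two coders' category labels. Note this is not the modified kappa that ELAN's Inter-Annotator Reliability function reports (that method also accounts for segmentation disagreements); it assumes the annotations have already been matched one-to-one, and the category labels here are made up for illustration.

```python
# Sketch: standard Cohen's kappa for two matched label sequences.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Proportion of items on which the two coders agree.
    observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    # Agreement expected by chance, from each coder's label frequencies.
    ca, cb = Counter(labels_a), Counter(labels_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes for five matched parent utterances.
a = ["praise", "direct", "praise", "question", "direct"]
b = ["praise", "direct", "question", "question", "direct"]
print(round(cohens_kappa(a, b), 3))  # → 0.706
```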