I am trying to measure inter-rater reliability between two coders who have coded the same set of interviews separately, each in their own NVivo account. So essentially there are two separate projects, one from coder A and one from coder B, both containing the same files and the same codes, but coded independently. I tried going through the steps listed in this document to conduct a coding comparison query, but I am a little confused as to how this would be able to measure agreement between the two projects. I have also tried merging the two projects together into one new project and then running the coding comparison query again. As listed towards the bottom of the document, when running the query all the Kappa coefficients are either 0 or 1, so I know something is wrong.
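For reference, here is a minimal sketch of how Cohen's kappa is calculated for a single code (all names are hypothetical; as I understand it, NVivo compares coding character by character per file and code, which this simplifies to a generic per-unit binary vector):

```python
# Minimal sketch of Cohen's kappa for one code. Each coder's coding is
# reduced to an equal-length binary vector over coding units:
# 1 = unit coded, 0 = unit not coded. Hypothetical names throughout.

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two equal-length binary coding vectors."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of units where the coders match.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal coding rate.
    rate_a = sum(coder_a) / n
    rate_b = sum(coder_b) / n
    p_e = rate_a * rate_b + (1 - rate_a) * (1 - rate_b)
    if p_e == 1:
        return 1.0  # both coded everything (or nothing): trivial agreement
    return (p_o - p_e) / (1 - p_e)

print(cohens_kappa([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))  # genuine comparison, ~0.08
print(cohens_kappa([1, 1, 0, 0, 1], [1, 1, 0, 0, 1]))  # identical coding -> 1.0
print(cohens_kappa([1, 1, 0, 0, 1], [0, 0, 0, 0, 0]))  # one side empty -> 0.0
```

Note that identical coding gives exactly 1 and comparing against an empty coder gives exactly 0. So if the merge collapsed both coders into one user profile, or left the two sets of codes unmatched, every comparison would be a coder against either an identical copy of their own coding (kappa = 1) or no coding at all (kappa = 0), which would explain the pattern I'm seeing.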