If you have ever submitted a manuscript to an academic journal for publication, there is a chance your submission was screened for possible plagiarism. Many academic journals use a screening service, such as Similarity Check, powered by iThenticate, to detect the amount of text a submitted manuscript shares with published articles. Two questions many authors have are how much similarity is too much and whether there is a cutoff level of similarity they should strive to stay under.
Unfortunately, there is no set standard across journals, which makes this a murky area for authors, especially young researchers who may be inexperienced science writers. For example, similarity below 20% may be acceptable to some journals, whereas others may reject a submission whose similarity exceeds 10%.
Moreover, large areas of highlighted text in a similarity report may simply be phrasing or terminology that is commonly used in academic papers, and rewording such text to reduce similarity can result in awkward phrasing or wordiness. Acceptable levels of similarity may also vary by type of submission, such as a review article versus an original research paper. One thing to keep in mind is that similarity does not necessarily mean intentional plagiarism.
An important step when reviewing any similarity report is to distinguish between clear plagiarism and duplication of short, commonly used phrases. Being able to interpret the results of a similarity check will help you determine which areas of your manuscript may need to be reworded. Once you understand the results, you can employ strategies for rewording the text, such as paraphrasing, using synonyms, or changing the sentence structure.