I have a maximum likelihood tree inferred from DNA sequences, and I don't know how to interpret the scale in terms of sequence divergence. I know that a given branch length represents a number of nucleotide substitutions per site. On my tree, 1 cm corresponds to 0.1 nucleotide substitutions per site. If the sum of branch lengths on the path between two terminal nodes is 10 times the scale bar, does that mean their divergence is 10 × 0.1 = 1 nucleotide substitution per site, i.e. 100% divergence?
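To make the arithmetic I'm describing concrete, here is a minimal sketch of the conversion (the numbers and variable names are just my example, not from any particular software):

```python
# Hypothetical example: the scale bar on the printed tree reads
# 1 cm = 0.1 nucleotide substitutions per site, and the path between
# the two tips (sum of branch lengths through their common ancestor)
# measures 10 cm on the figure.
scale_cm = 1.0           # cm spanned by one scale-bar unit
scale_value = 0.1        # substitutions/site per scale-bar unit
path_length_cm = 10.0    # measured tip-to-tip path length in cm

# Convert the measured path into substitutions per site.
divergence = (path_length_cm / scale_cm) * scale_value
print(divergence)  # 1.0 substitutions per site
```

Is this the right way to read the scale, and does a value of 1.0 substitutions per site really translate to 100% observed divergence between the two sequences?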