In my opinion, read merging is a good idea. Assuming all reads are merged correctly, it greatly increases the quality of the reads, and of mapping. It also increases your ability to call long indels, since the length of indel you can call is proportional to read length.
The disadvantages are:
1) Not all read mergers are equal. Using a bad read merger can make your results worse.
2) It's kind of a pain to deal with a merged reads file and an unmerged reads file.
That said, I think the optimal approach to variant calling is always to merge reads, map the merged and unmerged reads independently, and combine the resulting SAMs.
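Concretely, that workflow looks something like this with BBTools (file names are placeholders, and samtools is just one way I might do the sort/combine step; any equivalent tool is fine):
# Merge overlapping pairs; pairs that don't overlap go to outu
bbmerge.sh in=reads.fq out=merged.fq outu=unmerged.fq
# Map the merged and unmerged reads independently (the unmerged file is interleaved pairs)
bbmap.sh in=merged.fq out=merged.sam ref=ref.fa
bbmap.sh in=unmerged.fq out=unmerged.sam ref=ref.fa int=t
# Sort and combine the alignments into a single file for the variant caller
samtools sort -o merged.bam merged.sam
samtools sort -o unmerged.bam unmerged.sam
samtools merge combined.bam merged.bam unmerged.bam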
@Gabriel, who wishes to play the Devil's advocate:
1) It's pretty easy to recalibrate the quality scores prior to merging, and doing so increases the merging success rate as well. Specifically:
# Map the raw reads to the reference
bbmap.sh in=reads.fq out=mapped.sam ref=ref.fa
# Build recalibration matrices from the alignments (callvars keeps real variants from being counted as errors)
calctruequality.sh in=mapped.sam callvars
# Apply the recalibration to the raw reads
bbduk.sh in=reads.fq out=recal.fq recalibrate
# Merge the recalibrated reads
bbmerge.sh in=recal.fq out=merged.fq outu=unmerged.fq
2) Quality score output does not matter much in the overlapping portion. Raw quality scores are inaccurate anyway; your best effort at assigning a quality score to a base will generally be better than whatever Illumina assigns, since Illumina doesn't give it best effort. As long as matching bases get a higher quality score than the raw value, and mismatching bases get a lower one, then, assuming the overlap is correct, you're improving the quality scores. It would obviously be best to generate perfect quality scores, but that's not really possible given the input data.
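To put rough numbers on that (this is just a back-of-the-envelope independent-error model, not necessarily the exact formula BBMerge uses): with per-base error probabilities e_1 and e_2 for the two overlapping calls,
P(error | calls agree) ≈ (e_1 e_2 / 3) / ((1 - e_1)(1 - e_2) + e_1 e_2 / 3)
P(error | calls disagree, keep call 1) ≈ e_1 / (e_1 + e_2)
So two matching Q30 calls support something around Q65, while a Q30 call contradicted by a Q20 call drops to roughly Q10, which is exactly the "higher than raw for matches, lower than raw for mismatches" behavior you want.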
3) True, but as long as you use a good merging program, not an issue.
The main disadvantage I see is that some variant callers like to give more weight to variants seen in properly paired reads. But as long as the variant caller is crafted to give a 250bp merged read similar weight to a 2x150bp pair with a 250bp insert size, that's not an issue either.
Edit - incidentally, as to the issues with variant calling using both merged and unmerged reads (both the fact that some variant callers prioritize variants indicated by proper pairs, and the fact that it's a pain to map a merged and an unmerged file and then combine them), BBMerge offers "ecco" mode (meaning "error correction by overlap"). This overlaps the paired reads, error-corrects the overlapping portion by consensus, and then outputs them still as pairs, just with the overlapping portion error-corrected. It's quite convenient. I still recommend actually merging reads to produce longer reads for optimal indel calling, but I use ecco mode when, for whatever reason, paired reads are most useful.
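For reference, an ecco run looks something like this (assuming interleaved input, as in the commands above; for separate R1/R2 files use in1=/in2= and out1=/out2=):
# Error-correct the overlapping portions by consensus but keep the reads as pairs (mix keeps all pairs in the one output file)
bbmerge.sh in=reads.fq out=corrected.fq ecco mix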
I agree wholeheartedly that merging should be done; how I interpreted the original question was: why would some people think that read merging is not an ideal course of action? I never said I agreed with my "counter-arguments", hence the devil's advocate phrasing :-)
Right, I know. I just wanted to play the Angel's advocate :)
Sorry to necro this comment, but is merging also optimal when performing de novo hybrid assemblies?