I have tried a couple of methods. The biggest problem I found is that, since you have no idea what the real data are supposed to be, there is no way to really measure the accuracy of these methods. If I had to pick one, I would choose MNN or Scanorama (but most of the time, I prefer not to torture the data).
I don't think I have tried those methods. It has been a little while since I worked on a scRNA-Seq dataset, but those were the main things that I could think of.
I have used the Seurat wrapper around fastMNN and can recommend it. It is generally less heavy-handed than Seurat's standard integration method, and it reports the proportion of variance lost at each merge step, which is a useful check that only a small amount of (presumably technical) variation is being removed.
I can't speak to the other methods, as I haven't used them.
Agreed - if you can avoid batch correction (or have no evidence that a batch effect is present), you should definitely avoid it.
I think you already asked this previously: about batch correction in scRNA-seq
yes, thank you Igor.
I was wondering what people's experience was ..., or whether there are any other suggestions.