found no concordant and consistent mapping using Salmon
2.4 years ago
keren.danan ▴ 20

Hi,

I am new to bulk RNA-seq and am trying to convert my fastq files into a count matrix so that I can run a differential expression analysis. I have read that Salmon is suggested for this. I have a few questions:

  1. My data consists of healthy and control patients, so I built a Salmon index from the GENCODE human transcriptome, passing --gencode to the indexing command:

    salmon index -t gencode.v41.transcripts.fa.gz -i salmon_index --gencode

When running the quantification step:

salmon quant -i salmon_index --libType A -1 ${prefix}_R1_001.fastq.gz -2 ${prefix}_R2_001.fastq.gz -o /bulkRNAseq_human/INCPMPM-14146.0/quant_samples/quant/${prefix};

I get the warning "found no concordant and consistent mappings" for all of my samples. Yet when I look at the metadata, most samples show a mapping rate above 90%. Is this a problem? Should I change anything?

  2. If I want to use Salmon on bulk RNA-seq data from mice, but there is only a single read file per sample (for example, one fastq file looks like this: FGC2321_s_1_AGCTCGCT-GCAGAATC.fastq), how am I supposed to run the quantification step?

Thanks :)

RNAseq salmon bulkRNAseq

What is the read length of the experiment? Please show the salmon logs. Single-end quantification for the 2nd question is covered in the salmon manual.
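For the single-end case (question 2), the manual's approach is to pass a single -r argument instead of -1/-2. A minimal sketch, assuming an index built from the mouse transcriptome (salmon_index_mouse and the output path are placeholder names):

    salmon quant -i salmon_index_mouse --libType A \
        -r FGC2321_s_1_AGCTCGCT-GCAGAATC.fastq \
        -o quant/FGC2321_s_1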


After running:

    gunzip -c H1_S11_L001_R1_001.fastq.gz | awk 'NR%4 == 2 {lengths[length($0)]++} END {for (l in lengths) {print l, lengths[l]}}'

it prints: 67 9145551, i.e. all 9,145,551 reads in that file are 67 bp. About the logs:

[2022-07-17 14:34:08.236] [jointLog] [info] setting maxHashResizeThreads to 96
[2022-07-17 14:34:08.236] [jointLog] [info] Fragment incompatibility prior below threshold.  Incompatible fragments will be ignored.
[2022-07-17 14:34:08.236] [jointLog] [info] Usage of --validateMappings implies use of minScoreFraction. Since not explicitly specified, it is being set to 0.65
[2022-07-17 14:34:08.236] [jointLog] [info] Setting consensusSlack to selective-alignment default of 0.35.
[2022-07-17 14:34:08.236] [jointLog] [info] parsing read library format
[2022-07-17 14:34:08.236] [jointLog] [info] There is 1 library.
[2022-07-17 14:34:08.236] [jointLog] [info] Loading pufferfish index
[2022-07-17 14:34:08.236] [jointLog] [info] Loading dense pufferfish index.
[2022-07-17 14:34:09.413] [jointLog] [info] done
[2022-07-17 14:34:09.463] [jointLog] [info] Index contained 250,381 targets
[2022-07-17 14:34:09.507] [jointLog] [info] Number of decoys : 0
[2022-07-17 14:34:24.335] [fileLog] [info]
At end of round 0
==================
Observed 8335066 total fragments (8335066 in most recent round)

[2022-07-17 14:34:24.334] [jointLog] [info] Computed 426,076 rich equivalence classes for further processing
[2022-07-17 14:34:24.334] [jointLog] [info] Counted 7,629,092 total reads in the equivalence classes
[2022-07-17 14:34:24.485] [jointLog] [info] Number of mappings discarded because of alignment score : 1,020,939
[2022-07-17 14:34:24.485] [jointLog] [info] Number of fragments entirely discarded because of alignment score : 143,414
[2022-07-17 14:34:24.485] [jointLog] [info] Number of fragments discarded because they are best-mapped to decoys : 0
[2022-07-17 14:34:24.485] [jointLog] [info] Number of fragments discarded because they have only dovetail (discordant) mappings to valid targets : 0
[2022-07-17 14:34:24.485] [jointLog] [info] Mapping rate = 91.5301%

[2022-07-17 14:34:24.485] [jointLog] [info] finished quantifyLibrary()
[2022-07-17 14:34:24.489] [jointLog] [info] Starting optimizer
[2022-07-17 14:34:24.663] [jointLog] [info] Marked 0 weighted equivalence classes as degenerate
[2022-07-17 14:34:24.671] [jointLog] [info] iteration = 0 | max rel diff. = 2580.12
[2022-07-17 14:34:25.353] [jointLog] [info] iteration = 100 | max rel diff. = 18.1319
[2022-07-17 14:34:26.025] [jointLog] [info] iteration = 200 | max rel diff. = 19.3459
[2022-07-17 14:34:26.696] [jointLog] [info] iteration = 300 | max rel diff. = 15.3254
[2022-07-17 14:34:27.368] [jointLog] [info] iteration = 400 | max rel diff. = 12.866
[2022-07-17 14:34:28.040] [jointLog] [info] iteration = 500 | max rel diff. = 0.283219
[2022-07-17 14:34:28.712] [jointLog] [info] iteration = 600 | max rel diff. = 5.67953
[2022-07-17 14:34:29.383] [jointLog] [info] iteration = 700 | max rel diff. = 0.208217
[2022-07-17 14:34:30.053] [jointLog] [info] iteration = 800 | max rel diff. = 4.05825
[2022-07-17 14:34:30.725] [jointLog] [info] iteration = 900 | max rel diff. = 2.69145
[2022-07-17 14:34:31.396] [jointLog] [info] iteration = 1,000 | max rel diff. = 0.333874
[2022-07-17 14:34:32.066] [jointLog] [info] iteration = 1,100 | max rel diff. = 0.0407324
[2022-07-17 14:34:32.147] [jointLog] [info] iteration = 1,113 | max rel diff. = 0.00810947
[2022-07-17 14:34:32.163] [jointLog] [info] Finished optimizer
[2022-07-17 14:34:32.163] [jointLog] [info] writing output

[2022-07-17 14:34:32.499] [jointLog] [warning] NOTE: Read Lib [[ H15_S1_L001_R1_001.fastq.gz, H15_S1_L001_R2_001.fastq.gz]] :

Found no concordant and consistent mappings. If this is a paired-end library, are you sure the reads are properly paired? 

Can you provide the contents of the file lib_format_counts.json?
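For reference, Salmon writes lib_format_counts.json at the top level of each sample's output directory, so it can be printed with, e.g. (path abbreviated to match the quant command above):

    cat quant/${prefix}/lib_format_counts.json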

"read_files": "[ H15N_S2_L001_R1_001.fastq.gz, H15N_S2_L001_R2_001.fastq.gz]",
"expected_format": "IU",
"compatible_fragment_ratio": 1.0,
"num_compatible_fragments": 7165735,
"num_assigned_fragments": 7165735,
"num_frags_with_concordant_consistent_mappings": 0,
"num_frags_with_inconsistent_or_orphan_mappings": 7440128,
"strand_mapping_bias": 0.0,
"MSF": 0,
"OSF": 0,
"ISF": 0,
"MSR": 0,
"OSR": 0,
"ISR": 0,
"SF": 3777656,
"SR": 3662472,
"MU": 0,
"OU": 0,
"IU": 0,
"U": 0

Thanks! So it looks like you have reads mapping, but almost always as orphans. You may want to look at running your data through repair.sh from the BBMap toolkit to "repair" (re-synchronize) the files, as it looks like they may somehow have gotten desynchronized.
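A minimal repair.sh sketch for one of the samples above (the repaired and singletons output names are placeholders):

    repair.sh in1=H15_S1_L001_R1_001.fastq.gz in2=H15_S1_L001_R2_001.fastq.gz \
        out1=H15_S1_R1.repaired.fastq.gz out2=H15_S1_R2.repaired.fastq.gz \
        outs=H15_S1_singletons.fastq.gz repair

The repaired out1/out2 files would then replace the original -1/-2 inputs to salmon quant.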


That being said, did you manipulate the data in any way, e.g. trimming, renaming of reads, something like that?


No, this is the raw data I received. I will try repair.sh then. Thanks!
