Hi. Novice question here.
I've been working through a very basic exercise for one of my introductory courses, and the quality trimming and filtering of a set of reads has been going fine. However, when I use the data provided by our course facilitators, the quality-filtering step in the FASTX-Toolkit crashes with "Segmentation fault".
I have been searching the internet for answers to this error, and the most common explanation is that the computer ran out of memory. However, when I asked our core facility, where we have remote access, they told me I should not be getting a segmentation fault: the exercise is to assemble and verify an E. coli genome they have already curated, and they have provided us with more than enough memory. I really have no idea what is going wrong with my data, since all my file and tool paths are typed correctly.
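For what it's worth, here is the kind of quick sanity check I was planning to run on the input before trying again. It's a minimal sketch: reads.fastq is just a placeholder for my actual file name, and the Phred-offset hint at the end is based on posts I found suggesting the FASTX tools assume Phred+64 scores unless you pass -Q33.

```python
import sys

path = sys.argv[1] if len(sys.argv) > 1 else "reads.fastq"  # placeholder name

# FASTX tools read plain text; feeding them a gzipped file is a
# commonly reported cause of crashes.
with open(path, "rb") as fh:
    if fh.read(2) == b"\x1f\x8b":
        sys.exit(f"{path} is gzip-compressed; gunzip it before filtering.")

qmin, qmax = 255, 0
record = 0
with open(path) as fh:
    while True:
        header = fh.readline()
        if not header:
            break  # clean end of file
        seq = fh.readline().rstrip("\n")
        plus = fh.readline()
        qual = fh.readline().rstrip("\n")
        record += 1
        if not header.startswith("@") or not plus.startswith("+"):
            sys.exit(f"Record {record}: malformed @/+ lines (truncated file?).")
        if len(seq) != len(qual) or len(seq) == 0:
            sys.exit(f"Record {record}: empty read or sequence/quality length mismatch.")
        for ch in qual:
            qmin, qmax = min(qmin, ord(ch)), max(qmax, ord(ch))

print(f"{record} records look structurally OK; quality bytes span {qmin}-{qmax}.")
if qmin < 64:
    print("That range suggests Phred+33 encoding; the FASTX tools may need -Q33.")
```

If the file passes all of these checks, then I'm out of ideas.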
P.S. I haven't had the chance to talk to the core facility in person, since I've been processing the data while at a conference out of town.
The FASTX-Toolkit should really be retired. Nowadays there are plenty of better and much faster tools for quality filtering.
It's an introductory exercise, so I guess it's all right for training. I'm sure we'll be introduced to more recent and efficient tools later on.