In addition to this, we have a powerful HPC server on which we usually perform the more important computational work.
This is the really important part of the situation.
The ideal configuration, in my experience, has been to use a macOS system to SSH into your institution's HPC (or cloud). iTerm2 + CyberDuck + VS Code has been pretty much all I have needed for many years. I have some notes here with the typical extra software I ended up using on my macOS systems, among other things.
I think the question in the OP is itself a little misguided, because in real life you will almost never want to be using Windows for bioinformatics (genomics) work. However, if you had a Windows PC you could just install Linux (such as Ubuntu) on it and have a much more appropriate system. But even then, I would have a hard time recommending that for a lab's machine, simply because you want your lab workstation to "just work" and not require a tech-head who knows how to manage a local Linux workstation. Not that it's hard, especially if you are gonna be using Linux on the HPC anyway, but I really think it's important that if you are requisitioning something, it needs to be as easy and reliable to use as possible. macOS is really what you want here. Others mentioned the ability to use MS Office; I think this is a critical consideration as well. Also consider how easily the system integrates with your institution's IT infrastructure. I run a lot of Linux systems at home for personal use, and despite the huge strides that Ubuntu has made in its out-of-the-box experience and driver support, there are still occasional hiccups that I would loathe to deal with in a lab setting.
use it for analysis in python/bash/R and
Python and bash are by far easier to work with on macOS than on Windows. R (with RStudio) is surprisingly painless on Windows, and the experience is about the same on both. Linux makes all three trivial.
I want to free myself from the server for light/medium-weight work.
I do not think this is really a worthwhile expectation to have. Just about the only work I have found feasible on the local system, instead of on the HPC, was downstream prototyping with local R and RStudio, which would ultimately get back-ported to the HPC for final execution. But no matter how beefy a workstation you get, Mac or PC, you will simply never get the kind of throughput available on the HPC, to the extent that it is not worth your time trying to do much work outside of it. Also consider the headaches involved in keeping your analyses' software configurations in sync between two different systems. I think it's far more worthwhile to do two things:
For software, conda
has pretty much got you covered, especially if you stick with newer package versions (some old packages don't have M1 builds available for macOS), and you can pull and run pretty much all Docker containers flawlessly. Building containers on macOS ARM (M1, etc.) to run on an x86 Intel HPC is a bit of a headache, however. Singularity does not run natively on macOS or Windows, but that is not really an issue if you just stick with Docker (and convert to Singularity from within the HPC environment).
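A minimal sketch of that Docker-on-Mac, Singularity-on-HPC hand-off; the Biocontainers image tag here is hypothetical, so substitute one from your own registry:

```shell
# Hypothetical image tag -- replace with a real one for your tool.
IMAGE="quay.io/biocontainers/samtools:1.19--h50ea8bc_0"

# On the Apple-silicon Mac: explicitly request the x86_64 build so the
# container matches the Intel HPC nodes (Docker emulates it locally).
MAC_CMD="docker pull --platform linux/amd64 $IMAGE"

# On the HPC login node: convert the same image to a Singularity .sif file.
HPC_CMD="singularity pull samtools.sif docker://$IMAGE"

printf '%s\n%s\n' "$MAC_CMD" "$HPC_CMD"
```

The point of the `--platform linux/amd64` flag is that everything you prototype locally is the exact same image the HPC pulls, so there is no drift between the two environments.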
Ultimately, my best suggestion is to get a MacBook Pro with the most memory and storage you can budget (32 GB / 1 TB or 2 TB would be good to shoot for), use the HPC for everything possible, and use network storage for your lab's data archives (don't store all your research data on the laptop). It is also worth considering that if you want to be mobile, the 14" laptop models are significantly lighter and easier to take with you, e.g. to a conference or to the breakroom to do work while you drink the professors' coffee in the faculty lounge.
Most of the bioinformatics work will probably have to be offloaded to the HPC, especially for the shotgun metagenomic data. The statistical analysis and data visualization components can be done on any of the machines you described in your post. I personally use a 2020 M1 MacBook Air with 8 GB of RAM and a 256 GB SSD. The machine is currently going for 750 USD new. I am extremely happy with the purchase.
Yup - this exactly.
Plus, if you're using any tooling outside of the bioinformatics universe, those tools usually run very well on Macs.
I would not use a Mac these days, since they're hilariously overpriced, not upgradable, and need another layer of VMs to run things like Docker and Singularity, which I use to manage software versions. Any Windows machine with WSL2 for native Linux (I use standard Ubuntu 22) is a good choice. So in a nutshell, I'd go with the Dell.
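For reference, the WSL2 setup mentioned above is essentially a one-liner; this sketch just prints the command (which must be run from an elevated PowerShell prompt on Windows, not from this shell):

```shell
# WSL2 with Ubuntu 22.04 (a distribution name Microsoft actually ships)
# is installed from an elevated PowerShell prompt with a single command:
WSL_CMD="wsl --install -d Ubuntu-22.04"
echo "$WSL_CMD"
# After a reboot, 'wsl' drops you into a native Ubuntu shell where conda
# and Docker (via Docker Desktop's WSL2 backend) work much as on Linux.
```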
I am not sure that "overpriced" is a valid criticism anymore. Ever since the introduction of the M1, the Mac (and especially the MacBook) has had better performance than x86 systems at the same price point. And that is before considering the power efficiency, which has allowed for truly full-powered fanless systems; even the MacBook Pro with a fan is pretty much inaudible, combined with a 20-hour battery life and superb build quality. "Upgradeability" is a bit of a red herring as well, since a lot of the incredible performance comes from having the entire system on a single unified chip. I don't think it's a valid criticism, especially when most users never upgrade their "upgradeable" systems for the life of the devices anyway.
Thank you all for the replies. I think I will choose a MacBook Pro, since it is well integrated into the institute's ecosystem and many people here use it, with full support from IT for a range of software and for any hardware problems/faults. As some of you suggested, it is also a matter of choice and preference. I read all the replies carefully, and it is always fascinating to see the different ways different people approach these things, so really, thank you. As the institute will provide me this machine, I think I will also buy a personal laptop (probably a Dell) to have more freedom with other things, and also for managing personal data and research data that are not linked to my current institute.
That's a good call. It's been a while since I had a different OS on my home machine vs my work machine, so I'm not sure if this point is still valid, but context switching between Windows and macOS machines can make for a bit of friction. Hitting the Win key because muscle memory says it's the Cmd key is super frustrating.
I suggest you go with a Mac. You'll get to do all the same science, it'll look great, and you'll annoy the Linux and Windows ppl, as they deserve.
To look great and annoy others may be one of the main reasons people spend extra money in general.
ATpoint has provided reasons for what to choose in his comment. It is your choice and money at the end of the day.
At a technical level, all software should (in theory) work on the M chips, subject to the limits of the unified memory architecture they use (i.e. you are permanently limited to the RAM you chose at purchase and can't add more later). Most of the software should be installable via conda.
Surprised that no one has mentioned Rosetta 2, which lets x86 software run seamlessly on Macs' ARM-based M1/M2 CPUs. You can
docker run
containers built for x86 Intel on your M1 Mac and they will just work as if they were native, with very few issues. The same goes for a lot of other legacy software that doesn't yet have a native ARM build available.
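The emulation is opt-in per container via the `--platform` flag; a small sketch of the logic (the flag only matters on ARM hosts, and the final `docker run` line is printed rather than executed here since it needs a Docker daemon):

```shell
# On Apple silicon, uname -m reports "arm64"; asking Docker for an
# linux/amd64 image there runs it through Rosetta 2 emulation.
if [ "$(uname -m)" = "arm64" ]; then
  PLATFORM_FLAG="--platform linux/amd64"
else
  PLATFORM_FLAG=""   # already x86_64; no emulation needed
fi
echo "docker run $PLATFORM_FLAG --rm ubuntu:22.04 uname -m"
# Run on an M1/M2 Mac, that container would report x86_64, not arm64:
# the Intel image runs under emulation as if it were native.
```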