A few minutes after booting my machine, the memory usage reported by free doesn’t match the per-process usage listed in System Monitor (or top). All the processes I can find, combined, don’t use nearly that much RAM.
The user experience is horrible: my IDE gets randomly killed (I suspect this happens due to RAM starvation).
You have 15 G of total RAM and 9.5 G available (the last column of free’s output), so about 5.5 G used. The processes on the right use up at least about 3 G. The rest could just be system/kernel services that aren’t shown in the process list. Are you sure your RAM is actually getting exhausted?
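To make the arithmetic explicit (numbers in GiB, taken from the free output and process list described above; the 3 GiB process total is the rough estimate from the answer, not an exact figure):

```python
# Sanity-check free's numbers: used = total - available.
total, available = 15.0, 9.5
used_by_something = total - available
print(used_by_something)  # 5.5

# Rough sum of the processes visible in the process list (estimate).
visible_processes = 3.0
unaccounted = used_by_something - visible_processes
print(unaccounted)  # 2.5 GiB not attributable to listed processes
```

A couple of GiB of unaccounted memory is not unusual; kernel allocations, caches, and per-process shared memory all blur per-process accounting.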
Personally I’ve found that Linux isn’t very nice when it gets low on memory, so I use earlyoom to kill unimportant processes when RAM runs low so that the UI doesn’t get stuck, and I make sure I have enough RAM installed that I never risk running low unless there’s a memory leak.
Based on my own experience, I think the excessive memory use is more likely due to one of your development tools (VS Code, a language server, a compiler, a VM), or even the program you are writing and testing, than to a mysterious system/kernel service.
In most cases I can diagnose the problem using top: sort processes by memory use, increase the update frequency to 1 or 0.5 seconds, and watch it while I work. I can often spot problems before the OOM killer kills a process.
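If you want to script the same check that top’s memory sort gives you, a minimal sketch is to read VmRSS from /proc on Linux (the /proc field names are real; the choice of what to report is mine):

```python
#!/usr/bin/env python3
"""List the processes with the largest resident set size (RSS),
roughly what top shows when sorted by memory. Linux only."""
import os

def top_memory(n=5):
    procs = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        name, rss_kb = None, None
        try:
            with open(f"/proc/{pid}/status") as f:
                for line in f:
                    if line.startswith("Name:"):
                        name = line.split()[1]
                    elif line.startswith("VmRSS:"):
                        rss_kb = int(line.split()[1])  # value is in kB
        except (FileNotFoundError, PermissionError):
            continue  # process exited or is inaccessible; skip it
        if name and rss_kb is not None:  # kernel threads have no VmRSS
            procs.append((rss_kb, name, pid))
    return sorted(procs, reverse=True)[:n]

if __name__ == "__main__":
    for rss_kb, name, pid in top_memory():
        print(f"{rss_kb / 1024:8.1f} MiB  {name} (pid {pid})")
```

Note that summing RSS over all processes over-counts shared memory, which is one reason per-process totals rarely match system-wide figures.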
“free” (the column) is not very useful because memory used for caches is not counted as free, so it is quite normal for “free” to be low even on systems with plenty of usable RAM. The free manpage (man free) has a basic explanation of the fields, as well as an explanation of why “available” is the better number to look at.
I would not spend a lot of time adding up columns from different programs and expecting them to correlate. The Linux memory system is complex, and memory use often changes dramatically within short amounts of time, especially during software development tasks. For example, it is not uncommon for a single compile to use many gigabytes, and if you do parallel builds this adds up quickly. This, of course, depends on the kind of development you are doing.
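A quick back-of-envelope illustration of how parallel builds add up (the 2 GiB per compile job is an assumed figure for a heavy C++ translation unit, not a measurement):

```python
# Worst case: every parallel compile job peaks at the same time.
jobs = 8                 # e.g. make -j8
peak_per_job_gib = 2.0   # assumed peak per compiler process (hypothetical)
peak_total_gib = jobs * peak_per_job_gib
print(f"worst-case compile memory: {peak_total_gib:.0f} GiB")
```

On a 15 GiB machine that worst case alone exceeds total RAM, before the IDE, browser, or language servers are counted.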
For comparison, in recent years these are the root causes I have found for why my machine has run out of memory:
In one case my own C++ program consumed huge amounts of memory very suddenly.
In another case the compiler for the D language consumed many gigabytes when compiling a specific file in a program I had checked out from GitHub. Ironically, the program was D’s own language server.
I noticed that the clangd language server can consume many gigabytes, and I had many instances of it running.
I don’t use VS Code, but it is based on Electron which is known to consume a lot of memory.