Block malicious links via robots.txt

Hello guys,
Recently my site was infected with malware, which caused me a lot of problems. In particular, many spam links were created and indexed. I managed to remove a lot of them with Google Search Console, but they still appear in some key searches. Is there any chance of blocking the link prefix in robots.txt so the pages get removed from Google?

<snip>

I want to somehow block indexing of all these links.

I know I can block them like this:
User-agent: *
Disallow: /product/categories

But this one is different; the spam URLs don't sit under a single parent page/category like that. I would appreciate it very much if you could help me, cheers!
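
For reference, here is roughly what I have in mind (I'm not sure whether Googlebot handles wildcard patterns this way, and `/badlink-` is just a made-up example prefix, not my real URLs):

```
User-agent: *
# block every URL starting with this prefix, anywhere it appears
Disallow: /badlink-
# or, if the spam part is in the middle of the path:
Disallow: /*badlink
```

Would something like that work to get them dropped, or does robots.txt only stop crawling without removing already-indexed pages?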