"Blocked by robots.txt" count in GSC going down

The number of pages reported as blocked by robots.txt in the Google Search Console Coverage report recently started going down. Over the course of this month, it has dropped from roughly 400K pages to 200K.

There have been no changes to the robots.txt file in 6+ months, and no big structural changes, mass 404s, etc. in that time either.

What would cause this number to go down on its own?

Googlebot ignores robots.txt

I'm noticing that Googlebot is not respecting my robots.txt. I'm seeing Googlebot's user agent crawling pages that have been disallowed in my robots.txt for many months. Some of them show up in GSC as "Indexed, though blocked by robots.txt" with Last crawled dates as recent as yesterday.

Additionally, Googlebot fetches my robots.txt file a few times a day, and the URLs in question are definitely blocked according to the Google robots.txt tester.

My robots.txt is in the following format:

Sitemap: ...

User-agent: *

# ...

Disallow: ...
Disallow: ...
etc. ~ 40 lines

# ...

Disallow: ...
Disallow: ...
etc. ~ 60 lines

# ...

Disallow: ...
Disallow: ...
etc. ~ 20 lines