Google Search Console unable to fetch robots.txt - strange issue

Hello everyone. I would like to share an issue with a client's website that has been going on for several weeks and for which I can find no solution.

The Google Ads account for this site has come to a complete standstill because Google Search Console cannot read the robots.txt file.

As a consequence, it also cannot read the sitemap.xml.
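For what it's worth, an unreadable robots.txt tends to take the sitemap down with it, since the Sitemap: directive usually lives inside robots.txt. Here is a quick stdlib sanity check (the hostname below is a placeholder for the real site) that the file parses and actually declares the sitemap:

```python
# Hedged sanity check: confirm robots.txt parses, list any Sitemap: directives,
# and verify Googlebot is allowed to fetch the homepage. Hostname is a placeholder.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the file

print("Sitemap directives:", parser.site_maps())  # None if no Sitemap: line exists
print("Googlebot may fetch /:", parser.can_fetch("Googlebot", "https://www.example.com/"))
```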

Strangely, Search Console can retrieve both the robots.txt and the sitemap.xml on the version of the site without "www".
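To compare the two hosts directly, here is a minimal Python sketch (the hostnames are placeholders, and the Googlebot User-Agent string is only an approximation of what Google actually sends) that fetches robots.txt on both variants and prints the redirect chain and final status for a bot-like and a browser-like User-Agent:

```python
# Fetch robots.txt on both host variants with two User-Agents and print the
# redirect chain. Some firewalls/WAFs filter by User-Agent, so a difference
# between the two agents on the same URL points at UA-based blocking.
import requests

HOSTS = ["https://www.example.com", "https://example.com"]  # placeholders
USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for host in HOSTS:
    for label, ua in USER_AGENTS.items():
        url = f"{host}/robots.txt"
        try:
            resp = requests.get(url, headers={"User-Agent": ua}, timeout=15)
        except requests.RequestException as exc:
            print(f"{label:9} {url} -> FAILED: {exc}")
            continue
        hops = [f"{r.status_code} {r.url}" for r in resp.history]
        hops.append(f"{resp.status_code} {resp.url}")
        print(f"{label:9} {url} -> " + " -> ".join(hops))
```

If the browser-like agent succeeds on the "www" host while the bot-like one fails, something in front of the site is filtering by User-Agent even though every test in a browser looks fine.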
Beyond that, I have tested practically everything there is to test, but the problem persists:

The permissions on the robots.txt file are okay.
All redirects on the site are okay.
Canonical URLs are enabled in Gridbox.
The security certificates are okay.
The domain is not listed on any blacklist.
Googlebot is not blocked in the server's firewall.
The .htaccess file has the correct rules.
Both robots.txt and sitemap.xml are accessible in a browser.
Etc.
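Since Search Console succeeds on the non-"www" host but fails on the "www" one, one more thing that seems worth ruling out is a DNS difference between the two hosts. A hedged sketch (again with placeholder hostnames) that compares the addresses each one resolves to:

```python
# Compare the A/AAAA records for the apex and the www host. A broken record on
# just one of them (e.g. a stale AAAA entry) can leave the site loading fine in
# a browser that falls back to IPv4 while other clients fail or time out.
import socket

for host in ("example.com", "www.example.com"):  # placeholders
    try:
        infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
        addrs = sorted({info[4][0] for info in infos})
        print(f"{host}: {', '.join(addrs)}")
    except socket.gaierror as exc:
        print(f"{host}: DNS lookup failed: {exc}")
```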

Has anyone here had a similar problem?
I know this topic most likely isn't related to Gridbox itself, but could it be connected to the canonical URLs setting?
As I've been struggling with this for several weeks now, I thought I'd bring the situation up here. It never hurts to try.
Thank you.

Best regards
Nuno F.
