I just ran my Grav-based website through the Spaghetti scanner.
It probes a lot of commonly used files and folders. I tried many of them by hand, knowing they didn't exist at all on my web server.
Every URL (like https://example.com/node.xml.zip) returns an HTTP 200 with my homepage displayed; it should be a 404.
This can be an SEO disaster, and it makes discovering real vulnerabilities with a scanner a real pain…
I'm using the default .htaccess file, with Grav installed directly at the domain root.
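For reference, the rewrite block in the default .htaccess is the standard front-controller pattern, roughly like this (paraphrased from memory, not copied verbatim from my install):

```apache
# Hand every request that doesn't match a real file or directory to Grav's
# front controller. Apache itself never emits the 404 for these paths,
# so the status code of unknown URLs is entirely up to Grav.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule .* index.php [L]
```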
Am I missing something?
Regards
It turns out this is indeed a bug in Grav. I opened an issue on GitHub, and it has already been patched.
You should see a fix for this behavior in the next release.
Update:
The fix shipped in Grav 1.3.8, released October 27th, 2017.