Hello, everyone. I joined this forum specifically to get some feedback and help with my website, Digital Gravity Agency. The site was infected with malware, which created thousands of malware-generated pages. I managed to remove those pages and block them in the robots.txt file. My concern is twofold: is there any way to scan for malware that might still remain, and has this affected (or does it continue to affect) my SEO? I added the following rules to robots.txt:
Disallow: /*.html
Disallow: /*?
Apart from the URLs matched by these rules, all of my current pages remain crawlable, so that part seems fine.
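For reference, here is a minimal sketch of how these rules would sit in a complete robots.txt; note that Disallow lines only take effect inside a group that begins with a User-agent line, and the assumption here is that the rules should apply to all crawlers:

User-agent: *
Disallow: /*.html
Disallow: /*?

If it helps, the robots.txt report in Google Search Console can be used to confirm which URLs these wildcard patterns actually match.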
Kindly help.