"How do you know this, Michael?"

In the first case, the pattern I've seen is that follow-up probes arrive after a scanner detects a script it might be able to exploit. The first probe flags the detected URLs in a database somewhere, and a second app comes along - maybe 1-2 weeks later - and tries to break in. I saw that in server log file analyses many times.

In the second case, I've read a lot of research papers where the researchers said something like, "We ran a crawler on the Web looking for files of type [X]."

And in the last case, I've seen a few discussions in marketing groups where people said, "You should crawl sites looking for files of type [X] because those people are using the software you're interested in."

Of course, there could be a bajillion other reasons why people do this stuff, but those are the three reasons I keep encountering after 20+ years.