Googlebot is Google's web crawling bot—an automated program that systematically visits web pages to collect data. It scans websites by following links, reading content, and evaluating site structure. Once a page is crawled, the data is sent to Google's indexing systems, where it is processed and, if deemed useful and unique, considered for inclusion in Google Search.
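The link-following behavior described above can be sketched as a breadth-first traversal over a queue of discovered URLs. The Python sketch below is purely illustrative, not Googlebot's actual implementation: the `PAGES` dictionary stands in for fetched HTML, and a real crawler would fetch over HTTP, respect robots.txt, and manage crawl budget.

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, mimicking link discovery."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in site: maps each URL to its HTML (hypothetical, for illustration).
PAGES = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/">Home</a> <a href="/about">About</a>',
}

def crawl(start):
    """Breadth-first crawl: dequeue a URL, extract its links, enqueue unseen ones."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        url = queue.popleft()
        order.append(url)            # "visit" the page
        parser = LinkExtractor()
        parser.feed(PAGES.get(url, ""))
        for link in parser.links:
            if link not in seen:     # avoid re-crawling known URLs
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))  # → ['/', '/about', '/blog']
```

The `seen` set is what keeps a crawler from looping forever on sites whose pages all link back to one another, which is why the crawl terminates even though every page here links to the homepage.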
However, crawling a page does not guarantee it will appear in search results. That decision is made later, during the indexing and ranking phases.