**Step-by-Step Python Tutorial: Automate Google Search for Daily Keyword Rankings with Selenium and CSV Export**

  • Stop Manual Tracking: Manually checking your Google rankings is slow, error-prone, and doesn't reflect what your customers actually see.
  • Build Your Own Tool: You can create a free, automated keyword rank tracker using Python with the Selenium and Pandas libraries in about 30 lines of code.
  • Get Daily Reports: The script automatically searches Google for your keywords, scrapes the top results, and saves them to a new timestamped CSV file each day.

I once watched a friend, a brilliant founder, spend the first 30 minutes of every single day in a state of pure misery. He wasn't dealing with investor emails or bug reports. No, he was manually typing his top 20 keywords into Google, scrolling through the results, and marking his company's ranking in a spreadsheet.

He called it his "daily SEO ritual." I called it a soul-crushing waste of time. The shocking part? He was paying over $300 a month for an SEO tool that he didn't trust, which is why he was doing it manually in the first place.

That madness ends today. Forget expensive, clunky SEO software. I'm going to show you how to build your own automated keyword rank tracker with about 30 lines of Python. It's fast, it's free, and it's completely under your control.

Introduction: Why Automate Your Keyword Tracking?

The Problem with Manual Rank Checks

Let's be real: manual rank checking is a terrible use of human brainpower. It’s repetitive and prone to error (did you forget to use Incognito mode?).

Google's results are personalized, so what you see isn't what your potential customers see. It's an unreliable method that provides very little real-world value.

The Solution: A Custom Python Scraper

The solution is to command a robot to do the work for us. By using Python with the Selenium library, we can create a script that:

1. Reads a list of keywords from a simple file.
2. Opens a browser (invisibly, in the background).
3. Performs a Google search for each keyword.
4. Scrapes the top results and their positions.
5. Saves everything neatly into a CSV file with today's date.

Run it daily, and you have a powerful, historical log of your SEO performance without lifting a finger.

Prerequisites: Setting Up Your Development Environment

Before we write a single line of code, we need to get our tools in order. Don't worry, this is the easy part.

Installing Python and Pip

If you don't have Python installed, head over to the official Python website and download the latest version. The installer typically includes Pip, Python's package manager, which is what we'll use to get our libraries.
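To confirm everything is in place, run a quick version check in your terminal. (Depending on your platform, the command may be `python`, `python3`, or `py`.)

```shell
# Print the interpreter version; any Python 3.x release works for this tutorial.
# If this fails, revisit the installer and make sure "Add to PATH" was checked.
python3 --version
```

You can check Pip the same way with `python3 -m pip --version`.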

Installing Selenium & Pandas Libraries

Open your terminal or command prompt and run this simple command. This is the magic wand that installs all the packages we need.

pip install selenium pandas webdriver-manager
  • Selenium: The core library that automates browser actions.
  • Pandas: A powerhouse for data manipulation. We'll use it to easily read our keywords from a CSV.
  • webdriver-manager: My favorite little hack. This library automatically downloads and manages the correct browser driver for you, so you don't have to mess with it manually.

Downloading the Correct WebDriver for Your Browser

Just kidding. Thanks to webdriver-manager, you can skip this annoying step that trips up so many beginners. The library handles it for you.

Step 1: Preparing Your Keywords

First, let's create a simple file to hold all the keywords we want to track.

Create a new file in your project folder named keywords.csv. Inside, create a single column header called keyword and list your keywords below it, one per line.

keywords.csv:

keyword
python selenium tutorial
google search automation
daily keyword rankings

Step 2: Building the Scraper - Automating the Google Search

Now for the fun part: writing the code.

Importing Libraries and Initializing Selenium

We'll start by importing the libraries we installed and setting up the basic function.

# Step 1: Import Libraries
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
from webdriver_manager.chrome import ChromeDriverManager
import pandas as pd
import csv
import time

# Step 2: Read Keywords from CSV
keywords_df = pd.read_csv('keywords.csv')
keywords = keywords_df['keyword'].tolist()

This code block imports everything we need and then uses Pandas to read your keywords.csv file into a simple Python list.
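Pandas makes this a one-liner, but if you'd rather not pull in a heavy dependency just to read one column, the standard library's csv module does the same job. A minimal sketch (the helper name is my own; it assumes the same keywords.csv layout as above):

```python
import csv

def load_keywords(path):
    """Read the 'keyword' column from a CSV file into a plain Python list."""
    with open(path, newline='', encoding='utf-8') as f:
        # DictReader uses the header row, so each row maps 'keyword' -> value
        return [row['keyword'] for row in csv.DictReader(f)]
```

This is a drop-in replacement for the two pandas lines above if you only need the keyword list.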

Writing the Function to Perform a Search Query

Next, we'll create a function that takes a keyword, runs the search, and extracts the results. This is the heart of our script.

def google_search(keyword, max_results=10):
    options = Options()
    options.add_argument('--headless')  # Run browser invisibly
    options.add_argument('--no-sandbox')
    options.add_argument('--disable-dev-shm-usage')

    # This is where webdriver-manager does its magic
    service = Service(ChromeDriverManager().install())
    driver = webdriver.Chrome(service=service, options=options)

    # Handle Google's URL structure
    search_url = f"https://www.google.com/search?q={keyword.replace(' ', '+')}&num={max_results}"
    driver.get(search_url)
    time.sleep(2)  # Give the page a moment to load

    # --- Parsing code will go here in the next step ---

    driver.quit()
    return [] # Placeholder for now

The --headless argument is key—it runs Chrome in the background so you don't have a browser window popping up while the script runs. We also construct the Google search URL by replacing spaces in our keyword with + signs.
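One caveat: replacing spaces with + works for simple keywords, but characters like &, #, or non-ASCII text will mangle the URL. The standard library's urllib.parse.quote_plus handles all of these cases. A small sketch of building the URL that way (the helper name is mine, not part of the script above):

```python
from urllib.parse import quote_plus

def build_search_url(keyword, max_results=10):
    """Build a Google search URL with the keyword safely percent-encoded."""
    # quote_plus turns spaces into '+' and escapes reserved characters like '&'
    return f"https://www.google.com/search?q={quote_plus(keyword)}&num={max_results}"

print(build_search_url("fish & chips"))
# The '&' is encoded as %26 so it isn't mistaken for a URL parameter separator
```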

Step 3: Parsing the Results to Find Your Ranking

Once the search is performed, we need to "read" the results page.

Inspecting Google's Search Result Page (SERP) structure

Google's search results are just HTML. We can target specific elements on the page to extract the data we want. The main organic results are typically contained in <h3> tags for the title and <cite> tags for the display link.

Using XPath to Isolate Search Results

XPath is a language for navigating an HTML document. It's like giving GPS coordinates to find an element on a webpage. We can use a simple XPath to grab all the titles and links.
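You can get a feel for this positional indexing without launching a browser. The sketch below uses the standard library's ElementTree on a toy, well-formed snippet (real Google HTML is far messier, so this is purely illustrative): the XPath (//h3)[i] that Selenium evaluates corresponds to collecting all <h3> matches and picking the i-th one.

```python
import xml.etree.ElementTree as ET

# A toy, well-formed stand-in for a results page
page = ET.fromstring(
    "<html><body>"
    "<div><h3>First result</h3><cite>example.com</cite></div>"
    "<div><h3>Second result</h3><cite>example.org</cite></div>"
    "</body></html>"
)

# findall('.//h3') gathers every h3, like //h3 in XPath; indexing the
# resulting list mimics the (//h3)[i] positional selector (1-based in XPath)
titles = [h3.text for h3 in page.findall('.//h3')]
print(titles[1])  # the second result, i.e. what (//h3)[2] would select
```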

Looping Through Results to Extract Titles, Links, and Positions

Let's update our google_search function to include the parsing logic.

def google_search(keyword, max_results=10):
    # ... (previous setup code remains the same) ...
    service = Service(ChromeDriverManager().install())
    driver = webdriver.Chrome(service=service, options=options)

    search_url = f"https://www.google.com/search?q={keyword.replace(' ', '+')}&num={max_results}"
    driver.get(search_url)
    time.sleep(2)

    # NEW: Extract top results (title, link, position)
    results = []
    for i in range(1, max_results + 1):
        try:
            # This XPath finds the i-th h3 tag on the page
            title_elem = driver.find_element(By.XPATH, f'(//h3)[{i}]')
            # This XPath finds the i-th cite tag
            link_elem = driver.find_element(By.XPATH, f'(//cite)[{i}]')

            title = title_elem.text
            link = link_elem.text

            results.append({'keyword': keyword, 'position': i, 'title': title, 'link': link})
        except Exception:
            # If an element isn't found, just skip it (e.g., ads, People Also Ask boxes)
            pass

    driver.quit()
    return results

This loop iterates from 1 to max_results (10 by default), trying to find the first result, then the second, and so on. We wrap each lookup in a try-except block because Google mixes in ads and other SERP features, and we don't want a missing element to crash our script.
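If you want to unit-test the pairing logic without a browser, you can pull it out into a plain function. Here's a sketch of an alternative approach (the helper name is my own): grab all titles and links in one pass, then zip them into ranked rows. zip() stops at the shorter list, so a missing element truncates the results instead of crashing.

```python
def pair_results(keyword, titles, links):
    """Pair scraped titles and links into ranked result rows.

    zip() stops at the shorter list, so an unmatched title or link
    simply truncates the output rather than raising an error.
    """
    return [
        {'keyword': keyword, 'position': i, 'title': t, 'link': l}
        for i, (t, l) in enumerate(zip(titles, links), start=1)
    ]
```

In the real script, the inputs could come from Selenium's find_elements, e.g. `[e.text for e in driver.find_elements(By.XPATH, '//h3')]`, which fetches all matches in a single call instead of probing index by index.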

Step 4: Saving Your Data with a Timestamp to CSV

Finally, let's bring it all together. We need to loop through our keywords, call our search function for each one, and then save all the collected data.

Structuring the Data and Putting the Full Script Together

Here is the final part of the script that executes the logic and handles the file export. Add this to the bottom of your Python file.

# --- Your function definition from above goes here ---

# Step 4: Loop through all keywords and collect the results
all_results = []
for keyword in keywords:
    print(f"Searching for: {keyword}...")
    results = google_search(keyword)
    all_results.extend(results)
    time.sleep(1) # Small delay to be nice to Google's servers

# Step 5: Export to a CSV with a timestamp
filename = f'rankings_{pd.Timestamp.now().strftime("%Y%m%d")}.csv'
with open(filename, 'w', newline='', encoding='utf-8') as f:
    writer = csv.DictWriter(f, fieldnames=['keyword', 'position', 'title', 'link'])
    writer.writeheader()
    writer.writerows(all_results)

print(f"Done! Results saved to {filename}")

This code creates a unique filename for each day (e.g., rankings_20260426.csv). This ensures you can track your progress over time without overwriting previous reports.
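As an aside, pd.Timestamp works fine here, but since this is the only place the script touches dates, the standard library's datetime produces the same string if you'd rather keep the export step dependency-free:

```python
from datetime import datetime

# Same YYYYMMDD pattern as pd.Timestamp.now().strftime("%Y%m%d")
filename = f'rankings_{datetime.now().strftime("%Y%m%d")}.csv'
print(filename)  # e.g. rankings_20260426.csv, depending on today's date
```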

Conclusion and Next Steps

Reviewing Your Automated Rank Tracker

And that's it! Run the script, and in a few moments, you'll have a perfectly formatted CSV with the top 10 rankings for every keyword you're tracking. You just built a tool that SEO agencies charge hundreds of dollars for.

Ideas for Improvement (Error Handling, Proxies, Scheduling)

This script is a powerful starting point, but it's not invincible. Here are some ways you could level it up:

  • Rate Limits & Proxies: If you run this for hundreds of keywords, Google might temporarily block your IP. For large-scale operations, you'll want to integrate proxies and increase the time.sleep() delay.
  • Error Handling: A more robust version would log errors if a search fails or if Google changes its HTML structure (which it does!).
  • Scheduling: Use cron on Linux/Mac or Task Scheduler on Windows to run this script automatically every morning.
  • Find Your Domain: Modify the script to find your specific domain in the results and report back only its position.
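For that last idea, here's a sketch of a helper (hypothetical name and signature, not part of the script above) that scans the rows returned by google_search and reports the first position where your domain appears:

```python
def find_domain_position(results, domain):
    """Return the first ranking position whose link contains the domain, else None."""
    for row in results:
        # A substring check is enough here since the 'link' field holds display URLs
        if domain in row['link']:
            return row['position']
    return None
```

You could call this once per keyword after google_search returns and write only your own position to the CSV, shrinking the daily report to one row per keyword.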

You've just automated a tedious task and taken control of your own data. Now go build something amazing with all that time you just saved.


