Google Trends State-/County-Level Scraper

Data from Google searches often provides a wonderful insight into what people are thinking. There is a lot of great research out there that describes the usefulness and validity of Google Trends data that I won't discuss here. Instead, we have developed scripts that make it easier for researchers to make use of Google Trends data, especially when they want to relate it to other variables at the state and county level (thanks to Jacob Arleigh for development support!). Best of all, it comes with a simple-to-use interface that makes scraping Google Trends data as easy as a button click.

For Mac users, to make use of the Google Trends scraper:
    1) Download the following four files into the same folder: First, Second, Third, Fourth.
    2) Make sure you have Python and Selenium downloaded.
    3) Open the application "GoogleTrendsScraper" and, in Settings, check the 'Bash Path', 'Python Path', and 'Chromedriver Path'. They should be something similar to '/bin/bash', '/Library/Frameworks/Python.framework/Versions/2.7/bin/python', and '/usr/local/Cellar/chromedriver/2.29/bin/', respectively. They will likely be different for you depending on how you installed Python and Selenium.
    4) In "Search Term," enter the keyword you want to search for. For "Time Frame," select how long you want to collect data for (best results are usually had with "Last 7 Days").
    5) Click on "Scrape" and wait a few seconds. This should open a new browser window. Maximize this window. You can drag and drop it to another monitor (if you have one). Otherwise, keep the browser open and in the front of your screen. Don't do anything that might interfere with the browser. It should automatically go through the list of counties and states now. You can check the progress in the graphical interface, which updates throughout.
    6) After ~1 hour, the scraper will be finished. At this point, click on "Parse into County Data." This should run through all the data and assign it to the list of states and counties. Once this process is done, the graphical interface should say, "Finished!"
    7) Check your folder for the files "GoogleTrendsScrape.csv" and "FoundAndMissing.csv": the former is your output, and the latter is a report on what data was available. Note that data availability differs based on your search term and time frame.
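Under the hood, the scraper walks through a list of state and county geo codes and loads each region's Google Trends page in the browser. As a rough illustration, here is a minimal sketch of the kind of URL it visits for one region; the `geo` codes and `date` strings follow Google Trends' public URL scheme, but the exact parameters GoogleTrendsScrape.py uses are assumptions on my part:

```python
# Sketch only: builds the Google Trends explore URL for one region.
# The parameter names (q, geo, date) mirror the Trends web UI; whether the
# scraper constructs its requests exactly this way is an assumption.
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode       # Python 2.7, as in the path above

def trends_url(term, geo="US-NY", timeframe="now 7-d"):
    """Build a Google Trends explore URL for one region and time frame."""
    params = [("q", term), ("geo", geo), ("date", timeframe)]
    return "https://trends.google.com/trends/explore?" + urlencode(params)

print(trends_url("flu symptoms"))
```

The "Last 7 Days" recommendation above corresponds to the `now 7-d` time-frame string here.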

For PC users, to make use of the Google Trends scraper:
    1) Download the following four files into the same folder: First, Second, Third, Fourth.
    2) Make sure you have Python and Selenium downloaded.
    3) Open the Command Prompt and type: "python executable" "python script" "search term" "output CSV path" "desired time frame", where:
  • "python executable" refers to the location of your python executable file, e.g. C:\Python27\python.exe
  • "python script" refers to the location of the python script you downloaded, e.g. C:\Users\Username\Desktop\GoogleTrendsScrape.py
  • "search term" refers to whatever you want Google Trends to look for
  • "output CSV path" is the path you want the output to write to
  • "desired time frame" ranges from 0-8 (0: past hour; 1: past 4 hours; 2: past day; 3: past 7 days; 4: past month; 5: past 3 months; 6: past 12 months; 7: past 5 years; 8: all time)
    4) If everything worked, the same two output files described above ("GoogleTrendsScrape.csv" and "FoundAndMissing.csv") should be produced.
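Putting step 3 together, a full invocation might look like the following. The installation paths, search term, and output location are examples only; substitute your own:

```shell
C:\Python27\python.exe C:\Users\Username\Desktop\GoogleTrendsScrape.py "flu symptoms" C:\Users\Username\Desktop\output.csv 3
```

Here the final argument `3` selects the "past 7 days" time frame, which matches the setting recommended for the Mac version above.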

Please send me an e-mail if you have any questions, or just to let me know that you used it and it works!

Twitter State-/County-Level Scraper

We are currently working on scripts that will allow users to pull tweets from Twitter within a specific time range and location, and use LIWC (or other software) to provide state- or county-level averages for the focal observation in CSV format (building on pyCurl and twiQuery). We hope to be ready by summer/fall 2017.
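The aggregation step this describes, going from per-tweet scores to county-level averages, can be sketched as follows. The record layout and the LIWC column name ("posemo") are placeholders, not the planned scripts' actual output format:

```python
# Hypothetical sketch of the planned aggregation: given per-tweet LIWC
# scores tagged with a county, compute the mean score per county.
from collections import defaultdict

def county_averages(rows, score_key="posemo"):
    """Average one LIWC score per county from per-tweet records."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for row in rows:
        sums[row["county"]] += float(row[score_key])
        counts[row["county"]] += 1
    return {county: sums[county] / counts[county] for county in sums}

tweets = [
    {"county": "Kings, NY", "posemo": "3.2"},
    {"county": "Kings, NY", "posemo": "4.8"},
    {"county": "Cook, IL", "posemo": "2.0"},
]
print(county_averages(tweets))
```

The same shape works for state-level averages by swapping the grouping key.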

Some other code/scripts I like: