**Experimental**

As part of my attempt to streamline and automate some of the data collection the team uses and analyses on a regular basis, I have explored a few ways to collect data.

Along the way, I have found that Google Sheets has functions called IMPORTXML and IMPORTHTML, which let you pull in information that is publicly available (albeit with a few glitches).
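As an illustration, the formulas look something like the lines below. The URL, XPath query and table index here are placeholders, not the actual sources I pull from.

```
=IMPORTXML("https://example.org/stats", "//span[@class='subscriber-count']")
=IMPORTHTML("https://example.org/stats", "table", 1)
```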

I wanted something that did not need loads of coding and could be cloned for other teams and projects.

What I have discovered is that Google Sheets does not handle more than about 50 IMPORTXML requests very well.

If you hit that limit, it is going to take a while for the data to import.


Since the sheet pulls and tries to fetch the data every time the page is loaded, there are going to be intermittent issues.

What I have added is a script that pulls all the data and adds it to the “log” sheet, so that the data is saved daily (between 12am and 4am).
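If you want to adapt this, the script is roughly along the lines of the sketch below. It is a minimal sketch rather than my exact code: the sheet names, the range and the trigger hour are assumptions, so adjust them to match your own spreadsheet.

```javascript
// Minimal Apps Script sketch: snapshot the imported figures into a "log" sheet.
// The sheet names and the A2:D2 range are assumptions for illustration.
function logDailySnapshot() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var source = ss.getSheetByName('data'); // sheet holding the IMPORTXML / IMPORTHTML results
  var log = ss.getSheetByName('log');     // sheet the daily history is appended to

  // Read the imported figures as plain values rather than formulas.
  var values = source.getRange('A2:D2').getValues()[0];

  // Prepend a timestamp so each daily snapshot can be told apart.
  log.appendRow([new Date()].concat(values));
}

// One-off setup: run this once to create a daily time-driven trigger.
// Apps Script fires the function at some point within the chosen hour,
// which is how the "between 12am and 4am" window is achieved.
function createDailyTrigger() {
  ScriptApp.newTrigger('logDailySnapshot')
    .timeBased()
    .everyDays(1)
    .atHour(1)
    .create();
}
```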

 

To cut down on the number of IMPORTXML / IMPORTHTML calls in the main Google Sheet, I created a separate file for the YouTube and Facebook user numbers.
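One way to bring those figures back into the main sheet without re-running the scraping formulas there is IMPORTRANGE; the spreadsheet URL and range below are placeholders, not the real file.

```
=IMPORTRANGE("https://docs.google.com/spreadsheets/d/SPREADSHEET_ID", "Totals!A1:B2")
```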

 


