
Thursday, October 6, 2016

Find the best location for your next shop in Google Spreadsheet (Voronoi Algorithm)



About 14 months ago I wondered how the Czech retail market is divided into regions according to store locations. Where is the best place to build the next shop? I remembered the Voronoi algorithm from university, which can divide an area into small regions around the nearest points.

Let's check the definition from Wikipedia:
A Voronoi diagram is a partitioning of a plane into regions based on distance to points in a specific subset of the plane. That set of points (called seeds, sites, or generators) is specified beforehand, and for each seed there is a corresponding region consisting of all points closer to that seed than to any other.
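To illustrate the idea, here is a toy sketch in JavaScript (not the add-on's actual implementation): every location belongs to the region of its nearest seed point.

function nearestSeed(point, seeds) {
  // Return the seed closest to the given point; all points sharing
  // the same nearest seed form one Voronoi region.
  var best = null;
  var bestDistance = Infinity;
  seeds.forEach(function (seed) {
    var dx = point.lat - seed.lat;
    var dy = point.lng - seed.lng;
    var distance = dx * dx + dy * dy; // squared distance is enough for comparison
    if (distance < bestDistance) {
      bestDistance = distance;
      best = seed;
    }
  });
  return best;
}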


Why should you upload your dataset to some unknown online web application just to visualize it? There were a lot of examples on the internet, but I wanted to create something as easy as possible for everyone. Finally I found an article by Chris Zetter, whose code was written on top of Leaflet and an OpenStreetMap layer. It inspired me to rewrite it as a Google Add-on for Google Spreadsheets with a Google Maps underlay (I love the Google Maps SDK).

I will briefly show you how to use Voronoi Map from scratch in less than 5 minutes.

1) Create a new Google Spreadsheet, then insert your data into a sheet.

For this demo I've downloaded data on California retailers from this Fusion Table and filtered it down to Walgreens stores in San Francisco (my favorite place :-).

You need to prepare the data as separate columns for Name, Latitude, Longitude and Type of the point of interest. I've added a new column "brand" (green color) as the Type and filled it with the text "Walgreens".
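For illustration, the sheet might look like this (the coordinates below are made up, not taken from the dataset):

Name       Latitude    Longitude    brand
Store A    37.7749     -122.4194    Walgreens
Store B    37.7849     -122.4094    Walgreens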


2) Open the Add-ons menu and select Get add-ons. Find Voronoi Map and then click the + FREE button to install it.

3) After installation, open the menu Add-ons -> Voronoi Map and select Create Voronoi Map.

4) Select the appropriate columns and click Create map.


Now you can see the result. The blue points are the points of interest (Walgreens stores), and each region covers the area closest to its point. If you have more types of points of interest, you can filter them in the top right panel.


You are probably wondering where you should build your next shop. I would recommend looking at the intersections of regions (the spots farthest from all existing stores) and thinking about those locations :-)

Tuesday, February 23, 2016

Analyse city traffic from webcams through Google Cloud Vision API


Google has recently introduced the Cloud Vision API, which allows you to analyse image data through an API instead of creating custom algorithms.

As a Google Developer Expert I was involved in the tester program for this API during its alpha stage. My GDE friend +Riël Notermans inspired me to connect the Vision API with my favorite Google Apps Script. I was thinking about how to use it, and suddenly I came up with a useful, smart-city-like solution.

Intro

I live in Prague. The municipal transport company provides almost real-time traffic snapshots from webcams. The images are available on the web (http://www.dpp.cz/en/webcams/), so anyone can check traffic before heading out to work. I am able to fetch, download and send these images to any API, because each image has its own public URL.

The map of webcams in Prague

I created a Google Spreadsheet as a database together with a new Google Apps Script project.
The next step was to activate the required API in the Google Developers Console. The OAuth 2.0 dance was managed by the great cGoa library by +Bruce Mcpherson. The Cloud Vision "client" library was generated from Cloud Endpoints by +Spencer Easton's code.

My script fetches the image from the URL as a Blob, converts it to base64 and sends it to the Cloud Vision API with the LABEL_DETECTION feature (a label detection tutorial is available here).
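A simplified sketch of that request might look like this (the actual script used the cGoa OAuth flow and the generated client library; this version assumes an API key stored in script properties under VISION_API_KEY):

function labelImage(imageUrl) {
  // Download the webcam snapshot and base64-encode it.
  var blob = UrlFetchApp.fetch(imageUrl).getBlob();
  var payload = {
    requests: [{
      image: { content: Utilities.base64Encode(blob.getBytes()) },
      features: [{ type: 'LABEL_DETECTION', maxResults: 10 }]
    }]
  };
  var apiKey = PropertiesService.getScriptProperties().getProperty('VISION_API_KEY');
  var response = UrlFetchApp.fetch(
    'https://vision.googleapis.com/v1/images:annotate?key=' + apiKey,
    { method: 'post', contentType: 'application/json', payload: JSON.stringify(payload) }
  );
  var result = JSON.parse(response.getContentText());
  // Each label has a description (e.g. "road") and a score between 0 and 1.
  Logger.log(result.responses[0].labelAnnotations);
  return result;
}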


In the JSON response I received a description of the image as labels, each with a confidence score (from 0 to 1).
The last step was to set up a trigger that runs the Apps Script automatically every 5 minutes.
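Triggers can be created in the editor or programmatically, for example like this (analyzeWebcam is a placeholder for your main function name):

function createTrigger() {
  // Run the analysis function automatically every 5 minutes.
  ScriptApp.newTrigger('analyzeWebcam') // placeholder function name
    .timeBased()
    .everyMinutes(5)
    .create();
}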

The result

I ran my script on the webcam at Argentinska street on February 17th between 12 am and 12 pm. All of the collected data is available as an interactive visualization: http://data.kutil.org/visualization/analysis-of-traffic-by-cloud-vision-api

The label "road" means, that higher value looks like only street without street, so traffic during lunch was low.

The chart for the label "traffic" looks like the inverse of the "road" label. A higher value means that Google recognizes traffic in the image with greater confidence.


It is definitely not 100% accurate, but it may inspire you to try the Google Cloud Vision API.

Sunday, February 14, 2016

Integrate Google Apps and Slack with Google Apps Scripts (Incoming Webhooks)

Google Apps and Slack are two of the most popular productivity tools for unicorn startups (early-stage companies with a valuation of $1B+). Almost 80% of unicorn startups use Google Apps as their primary platform for email, calendar and storage, and 50% of them use Slack for internal communication [source].
It means that nearly half of these cool startups could integrate their internal processes with Google Apps Script.


The Slack API offers several ways to integrate your application or script into their platform. Today we will start with the simplest one: Incoming Webhooks. A webhook is a unique published URL that allows you to send data, which will be posted into a channel.




A small guide to integrating Google Apps with Slack in 10 minutes

1) Visit https://my.slack.com/services/new/incoming-webhook/. You will be redirected to the settings page of your team.

2) Select the target channel (e.g. #tech) - don't worry, you can change it afterwards. Then click the green Add Incoming WebHooks integration button.


3) The remaining Slack page contains settings such as the name of the bot (the message author as it will appear in Slack), its icon, the default channel and so on. We only need to copy the webhook URL (e.g. https://hooks.slack.com/services/xxxxxx/xxxxxxx/xxxxxxxxxxxxxxxxxxxxxxxxxx).

4) Create a new Google Apps Script project and copy in a code snippet like the sketch below, inserting your webhook URL from the previous step.
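A minimal sketch of such a snippet (the original embedded snippet is not reproduced here; replace the URL with your own webhook):

var SLACK_WEBHOOK_URL = 'https://hooks.slack.com/services/...'; // your webhook URL from step 3

function START() {
  // Post a simple text message to the webhook's channel.
  var payload = { text: 'Hello from Google Apps Script!' };
  UrlFetchApp.fetch(SLACK_WEBHOOK_URL, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload)
  });
}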


5) Now run the START() function in Apps Script. If you inserted the right URL, you will see a new message in the relevant channel.

Now you have connected your Google Apps Script with Slack. You can easily load data from Calendar, Gmail, Drive or Sites and create alerts on events. I will show you how in a future article.

I recommend checking the tips on how to format text and add links or attachments: https://api.slack.com/docs/formatting
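For example, a payload with a formatted link and an attachment might look like this (the values are illustrative):

var payload = {
  text: 'Build finished: <https://example.com/builds/42|build #42>', // links use <url|label> syntax
  attachments: [{
    color: '#36a64f',        // colored bar shown next to the attachment
    text: 'All tests passed.'
  }]
};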

Are you planning to integrate your Google Apps with Slack? Share your ideas in the comments section.

Sunday, January 24, 2016

Easy data scraping with Google Apps Script in 5 minutes

I'm using Google Apps Script for a lot of things - from automating tasks to data analysis. I discovered a repetitive use case: scrape data from the web and parse an exact value out of the HTML source code. If you are a novice in programming, you probably know that it is difficult to write and use regular expressions. For me too :) So I have written a Google Apps Script library which helps you parse data in 5 minutes.

Let's create a small example. In our company we have created a Google Apps application, SignatureSatori, to create and set up email signatures for all users in a domain. Like a good growth hacker, I benchmark how quickly competitors get new users. The Google Apps Marketplace shows an estimated number of users, and I need to save that number each day.


1) Create a new Google Apps Script project and add the Parser library (Resources -> Libraries) using this project key:
M1lugvAXKKtUxn_vdAG9JZleS6DrsjUUV

The Parser library takes three parameters: the input text, and the two patterns that bound the desired text.
Parser // name of library
    .data(content) // input text
    .from(fromText) // from text pattern
    .to(toText) // to text pattern
    .build(); // run parser and return value

2) Now open the desired web page (the Chrome Web Store in our case). Right-click the specific HTML element and select Inspect element.


3) Find the right part of the HTML and copy the fromText and toText patterns.

4) Now we have all the required information to complete the script:
function getData() {
    var url = "https://chrome.google.com/webstore/detail/signaturesatori-central-s/fejomcfhljndadjlojamaklegghjnjfn?hl=en";
    var fromText = '<span class="e-f-ih" title="';
    var toText = '">';
  
    var content = UrlFetchApp.fetch(url).getContentText();
    var scraped = Parser
                    .data(content)
                    .from(fromText)
                    .to(toText)
                    .build();
    Logger.log(scraped);
    return scraped;
}

5) The last and easiest step is to copy the parsed data into a spreadsheet:
function SAVE_DATA() {
  var spreadsheetId = 'YOUR_SPREADSHEET_ID'; // insert your Spreadsheet ID
  var sheetName = 'YOUR_SHEET_NAME';         // insert your sheet name
  var sheet = SpreadsheetApp.openById(spreadsheetId).getSheetByName(sheetName);
  sheet.appendRow([ new Date(), getData() ]);
}

6) If you want to log during scraping (e.g. to debug a wrong value), call the .setLog() function before the final .build() call:
 
Parser
    .data(content)
    .setLog()
    .from(fromText)
    .to(toText)
    .build();
Completed code of the Parser library. Enjoy!