Monday, February 24, 2020

Extract archived files directly in Google Drive with the serverless tool Google Colab

Today I will show you how to extract files from zip archives directly in Google Drive, without any external third-party tools. Everything is done serverless and in da cloud :-)

Google Colab is a pretty neat tool for data scientists who operate and manage Jupyter notebooks in the cloud. I prefer Google Colab because it supports a Python 3 runtime with CPU, GPU, or TPU processors. You can also mount your Google Drive as an external drive.



Colab prepares a virtual machine with a Jupyter notebook for you. The notebook contains several cells, where you can write Python code and run it with the "Play" button.

There is also "magic" syntax sugar, which allows you to use more advanced commands. One of them is %%bash, which runs the rest of the cell as bash commands.



When we combine the above knowledge, we get a powerful tool, like a Swiss army knife:

1. Open Google Colab at https://colab.research.google.com/notebooks/intro.ipynb#recent=true

2. Create a new notebook and rename it. At the top right, click the Connect button, which starts your virtual machine.

3. Now it is time to mount your Google Drive. Click the Mount Drive button. You have to authorize the Colab application to access your account. The path to your files is drive/My Drive/





Copy this snippet of code into your Colab cell:
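Below is a minimal sketch of such a cell. The Takeout folder matches my example later in this post, so adjust the path to wherever your zip archives actually live.

%%bash
# change into the Drive folder that contains the archives
cd "drive/My Drive/Takeout"
# extract every zip archive into the same location (-o overwrites existing files)
unzip -o "*.zip"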


There are two commands - change directory and unzip the desired files.

Click to "play" button to run the selected cell (=snippet of code). Files will be extracted into same locations.

In my example, I exported Google Takeout to my Google Drive, and the backup zip files were saved into the Takeout folder in my root folder.



Note: Bash runs the cell in a subprocess, so it could take some time to unzip a lot of files.



Monday, June 3, 2019

How to measure latency between Google Apps Script project and Google Cloud Platform regions

If you are debugging your Google Apps Script application, you may find it useful to measure the time spent on requests. There are several regions (currently 20) and zones (currently 61) in Google Cloud Platform (source).

G Suite and its smart and powerful tool Apps Script run on the same infrastructure. However, integration between Apps Script and GCP services (e.g. Vision API, Natural Language API, Cloud Functions) does not go through Google's private network, but through the public internet (in other words: you are calling REST APIs with UrlFetchApp). That means it is worth caring about your region.


If you would like to get more insight and find out where your application is "hosted", you can use a code snippet like the one below.
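This is only a sketch: the endpoint URLs are placeholders you need to replace with reachable per-region targets (for example the endpoints used by gcping.com), and the region list is shortened here.

function measureRegionLatency() {
  // placeholder endpoints - replace each with a real URL served from that region
  var endpoints = {
    'us-east1': 'https://us-east1.example.com/ping',
    'europe-west3': 'https://europe-west3.example.com/ping',
    'asia-east1': 'https://asia-east1.example.com/ping'
  };
  for (var region in endpoints) {
    var start = new Date().getTime();
    UrlFetchApp.fetch(endpoints[region], {muteHttpExceptions: true});
    Logger.log(region + ': ' + (new Date().getTime() - start) + ' ms');
  }
}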


It measures latency between your Apps Script application and several GCP regions around the world. UrlFetchApp makes HTTP requests to VMs running in each region.

Here is my result:
(Apps Script project timezone: Paris, Europe; computer located in Prague, Czech Republic)

[19-06-03 13:09:05:115 PDT] us-east1: 8.0 ms
[19-06-03 13:09:05:116 PDT] global: 10.0 ms
[19-06-03 13:09:05:116 PDT] us-east4: 33.0 ms
[19-06-03 13:09:05:117 PDT] northamerica-northeast1: 60.0 ms
[19-06-03 13:09:05:118 PDT] us-central1: 86.0 ms
[19-06-03 13:09:05:118 PDT] us-west1: 142.0 ms
[19-06-03 13:09:05:119 PDT] us-west2: 144.0 ms
[19-06-03 13:09:05:119 PDT] europe-west2: 183.0 ms
[19-06-03 13:09:05:120 PDT] europe-west1: 195.0 ms
[19-06-03 13:09:05:120 PDT] europe-west4: 197.0 ms
[19-06-03 13:09:05:121 PDT] europe-west3: 204.0 ms
[19-06-03 13:09:05:121 PDT] southamerica-east1: 246.0 ms
[19-06-03 13:09:05:122 PDT] europe-north1: 258.0 ms
[19-06-03 13:09:05:122 PDT] asia-northeast1: 319.0 ms
[19-06-03 13:09:05:123 PDT] europe-west6: 343.0 ms
[19-06-03 13:09:05:124 PDT] asia-east1: 389.0 ms
[19-06-03 13:09:05:124 PDT] australia-southeast1: 421.0 ms
[19-06-03 13:09:05:125 PDT] asia-east2: 433.0 ms
[19-06-03 13:09:05:125 PDT] asia-southeast1: 463.0 ms
[19-06-03 13:09:05:125 PDT] asia-south1: 563.0 ms

The inspiration and a browser-based version can be found here: http://www.gcping.com/

Tuesday, January 22, 2019

Deploy Google Apps Script web app as an Android application

The main advantage of serverless Google Apps Script over serverless Cloud Functions is the online editor (IDE), not only for writing server-side code but also client-side code. You can create an HTML web application with HtmlService.


Google recently introduced (G Suite update 12/2018) an option to deploy a web application as a native Android application for G Suite users. If you are reading this, you are probably a power user, so you can imagine what you are able to do now! Yes, deploy a native Android application with Google Apps Script.

Notes:
- To use web apps, your users' devices must have Google Chrome installed

1) Create a new Script at https://script.google.com

2) Insert a code snippet into code.gs
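The snippet can be as small as a doGet function that serves your HTML file; here is a minimal sketch, assuming the HTML file will be named view.html as in the next step.

function doGet() {
  // serve view.html when the web app URL is opened
  return HtmlService.createHtmlOutputFromFile('view');
}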


3) Create a new HTML file and rename it to "view.html"



4) Edit the HTML for your future über-cool application (for now, just insert Hello World!)

5) Open the menu Publish --> Deploy as web app and copy the URL address.



6) Now it is time to set things up in G Suite. Open the Google Admin console (https://admin.google.com) and pick the category Device management.

7) Then choose App Management from the left panel.

8) In App Management you will see a section in the middle of the page - Manage apps for Android devices. This section contains the button Manage whitelisted apps. Click it.



9) Now you are in mobile application management. First, you have to add a new whitelisted application (with the yellow circle button).


10) You will see the Google Play directory. Change the category (left panel) to Web Apps.



11) Now you have to click the Add button again (the circle button at the bottom right of the page).



12) Now you can define the behaviour of your Android application:
- Title is the name of the application on the home screen
- URL is the address of your Google Apps Script web app (from step 5)
- For Display, I recommend choosing Full screen or Standalone
- Icon is the icon of your Android application


13) After you click the Create button, it takes about 10 minutes to prepare your application.
(There is a small text "Not available yet" below it during preparation.)

14) When your application has been successfully prepared, you can click the desired icon.



15) As a distribution strategy, you can distribute the application to all G Suite users or to a subset according to organizational unit.




16) The last step contains an option to automatically install the application on all devices (this option didn't work for me).



17) When you visit Google Play on your phone, navigate to the section Work Apps. Click the application and install it.

That was the final step. Your Google Apps Script application is now on your phone's home screen.



Friday, March 9, 2018

Analyse emotion in OSCAR movies 2018 with Google Apps Script and Google Natural Language API

And the OSCAR goes to...

There are a lot of indicators that predict the winning movie. One of them, no doubt, is emotion. When a screenplay is written so that the emotion swings from positive to negative and vice versa, it creates a big impact on people's minds.

Nowadays you can use Natural Language Processing without deep knowledge of things like syntax analysis, grammar, or external libraries. Big companies like Google, Amazon, and Microsoft invest huge amounts of money to improve their algorithms and provide them as a service/API.

I am always interested in easy and fast proofs of concept, so today I will show you how I put these things together.




I have chosen the Best Picture category (90th Academy Awards) and these movies:
  • Call Me by Your Name – Peter Spears, Luca Guadagnino, Emilie Georges, and Marco Morabito
  • Darkest Hour – Tim Bevan, Eric Fellner, Lisa Bruce, Anthony McCarten, and Douglas Urbanski
  • Dunkirk – Emma Thomas and Christopher Nolan
  • Get Out – Sean McKittrick, Jason Blum, Edward H. Hamm Jr., and Jordan Peele
  • Lady Bird – Scott Rudin, Eli Bush, and Evelyn O'Neill
  • Phantom Thread – JoAnne Sellar, Paul Thomas Anderson, Megan Ellison and Daniel Lupi
  • The Post – Amy Pascal, Steven Spielberg, and Kristie Macosko Krieger
  • The Shape of Water – Guillermo del Toro and J. Miles Dale
  • Three Billboards Outside Ebbing, Missouri – Graham Broadbent, Pete Czernin, and Martin McDonagh

1. I downloaded subtitles from the Opensubtitles server and saved these files (.srt) to my Google Drive.

2. I created a new project in Google Apps Script and enabled the Google Natural Language API in the menu Resources -> Cloud Platform -> left menu API & Services -> Library -> Cloud Natural Language API.
Note: Google Apps Script creates a new Google Cloud project for you, so you don't have to create it yourself.



In the Google Cloud Platform dashboard I also created an API key, which identifies the application.

API & Services -> Credentials > Create credentials -> API key
(You should setup Application restrictions for HTTP referrer as a minimum)




3. First I did some preprocessing. I borrowed a term from machine learning - bucketing. I aggregated multiple lines of subtitles into time-framed text "windows" of 2 minutes in length. Here is the code:
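The original snippet is not embedded here, so below is a sketch of the idea. It assumes the .srt file has already been read into a string (e.g. from Drive) and groups the subtitle text into 2-minute buckets.

var BUCKET_SIZE_MS = 2 * 60 * 1000; // 2-minute window

function bucketSubtitles(srtText) {
  var buckets = {};
  // an .srt file consists of blocks separated by blank lines
  srtText.split(/\r?\n\r?\n/).forEach(function (block) {
    var lines = block.split(/\r?\n/);
    if (lines.length < 3) return;
    // the second line looks like "00:01:23,456 --> 00:01:25,000"
    var m = lines[1].match(/(\d+):(\d+):(\d+),(\d+)/);
    if (!m) return;
    var ms = ((+m[1] * 60 + +m[2]) * 60 + +m[3]) * 1000 + +m[4];
    var bucket = Math.floor(ms / BUCKET_SIZE_MS);
    buckets[bucket] = (buckets[bucket] || '') + ' ' + lines.slice(2).join(' ');
  });
  return buckets; // {bucketIndex: aggregated text}
}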





4. The rest is simple - iterate over all "buckets" and send each grouped text to the Google Natural Language API. The API responds with two numbers - a sentiment score and a magnitude. In my case, I used the sentiment score, which is in the range <-1;1>. Everything is saved into a Google Spreadsheet and the charts are rendered directly from there.
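As a rough sketch of that call (with API_KEY being the key created above), the request to the documents:analyzeSentiment endpoint can look like this:

function analyzeSentiment(text) {
  var url = 'https://language.googleapis.com/v1/documents:analyzeSentiment?key=' + API_KEY;
  var response = UrlFetchApp.fetch(url, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({document: {type: 'PLAIN_TEXT', content: text}})
  });
  // documentSentiment contains {score, magnitude}
  return JSON.parse(response.getContentText()).documentSentiment;
}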
5. Then I put the two functions from above together.
You can see how the sentiment changes during the movie (time is on the x-axis) for the best OSCAR movie of 2018:

The rest of the nominated movies:










Thursday, October 6, 2016

Find the best location for your next shop in Google Spreadsheet (Voronoi Algorithm)



About 14 months ago I wondered how the Czech retail market is divided into regions according to store locations. Where is the best place to build the next shop? I remembered the Voronoi algorithm from university, which can divide an area into small parts around the nearest spot.

Let's check the definition from Wikipedia:
A Voronoi diagram is a partitioning of a plane into regions based on distance to points in a specific subset of the plane. That set of points (called seeds, sites, or generators) is specified beforehand, and for each seed there is a corresponding region consisting of all points closer to that seed than to any other.


Why should you upload your dataset into an unknown online web application when you want to visualize data? There were a lot of examples on the internet, but I wanted to make it as easy as possible for everyone. Finally I found an article by Chris Zetter, where the code was written on top of Leaflet and an OpenStreetMap layer. It inspired me to rewrite it as a Google Add-on for Google Spreadsheets with a Google Maps underlay (I love the Google Maps SDK).

I will briefly show you how to use Voronoi Map from scratch in less than 5 minutes.

1) Create a new Google Spreadsheet. Then insert your data into the sheet.

For this demo I downloaded data on California retailers from this Fusion Tables dataset and filtered only the Walgreens stores in San Francisco (my favorite place :-).

You need to prepare the data as separate columns for Name, Latitude, Longitude and Type of point of interest. I added a new column "brand" (green color) as the Type and filled it with the text "Walgreens".
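The sheet layout can look something like this (the store names and coordinates below are made-up example values):

Name          | Latitude | Longitude  | brand
Walgreens #1  | 37.7749  | -122.4194  | Walgreens
Walgreens #2  | 37.7858  | -122.4064  | Walgreens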


2) Open the menu Add-ons and select Get add-ons. Find Voronoi Map and then click the + FREE button to install it.

3) After installation, open the menu Add-ons -> Voronoi Map and select Create Voronoi Map.

4) Select the appropriate columns and click Create map.


Now you can see the result. The blue points are the points of interest (Walgreens stores), with regions drawn around the nearest point. If you have more types of points of interest, you can filter them in the top right panel.


You are probably wondering where you should build your next shop. I would recommend finding an intersection of regions and thinking about that location :-)

Tuesday, February 23, 2016

Analyse city traffic from webcams through Google Cloud Vision API


Google has recently introduced the Cloud Vision API, which allows you to analyse image data through an API instead of creating custom algorithms.

As a Google Developer Expert I was involved in the tester program of this API during the alpha stage. My GDE friend +Riël Notermans inspired me to connect the Vision API to my favorite Google Apps Script. I was thinking about how to use it and suddenly came up with something like a useful smart-city solution.

Intro

I'm living in Prague. The municipal transport company provides almost real-time traffic records from webcams. The images are available on the web (http://www.dpp.cz/en/webcams/), so anyone can check traffic before heading out to work. I am able to connect to, download, and send these images to any API, because each image has its own public URL.

The map of webcams in Prague

I created a Google Spreadsheet as a database, together with a new Google Apps Script project.
The next step was to activate the required API in the Google Developer Console. The OAuth 2.0 dance was managed by the great cGoa library by +Bruce Mcpherson. A Cloud Vision "client" library was generated from Cloud Endpoints by +Spencer Easton's code.

My script fetched the image from its URL as a Blob, converted it into base64, and sent it to the Cloud Vision API with the parameter LABEL_DETECTION (a label detection tutorial is available here).
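Here is a simplified sketch of that request. It uses an API key instead of the cGoa OAuth flow and the generated client library mentioned above, and IMAGE_URL / API_KEY are placeholders you have to fill in yourself.

function labelWebcamImage() {
  var blob = UrlFetchApp.fetch(IMAGE_URL).getBlob();
  var request = {
    requests: [{
      image: {content: Utilities.base64Encode(blob.getBytes())},
      features: [{type: 'LABEL_DETECTION', maxResults: 10}]
    }]
  };
  var response = UrlFetchApp.fetch(
    'https://vision.googleapis.com/v1/images:annotate?key=' + API_KEY,
    {method: 'post', contentType: 'application/json', payload: JSON.stringify(request)}
  );
  var labels = JSON.parse(response.getContentText()).responses[0].labelAnnotations;
  labels.forEach(function (label) {
    Logger.log(label.description + ': ' + label.score); // e.g. "road: 0.92"
  });
}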


As a JSON response I received a description of the image as labels with a confidence score (min 0 - max 1).
The last step was to set up a trigger to run the Apps Script automatically every 5 minutes.

The result

I ran my script on the webcam at Argentinska street on February 17th between 12 am and 12 pm. All the collected data is available as an interactive visualization: http://data.kutil.org/visualization/analysis-of-traffic-by-cloud-vision-api

The label "road" means, that higher value looks like only street without street, so traffic during lunch was low.

The chart for the label "traffic" looks like the inverse of the "road" label. A higher value means that Google recognizes traffic in the image with greater confidence.


It is absolutely not 100% accurate, but it could inspire you in how to use the Google Cloud Vision API.

Sunday, February 14, 2016

Integrate Google Apps and Slack with Google Apps Scripts (Incoming Webhooks)

Google Apps and Slack are two of the most popular productivity tools for unicorn startups (= early-stage companies with a valuation of $1B+). Almost 80% of unicorn startups use Google Apps as their primary platform for email, calendar, or storage, and 50% of them use Slack for internal communication [source].
It means that nearly half of these cool startups could integrate their internal processes with Google Apps Script.


The Slack API has several possible ways to integrate your application/script into its platform. Today we will start with the simplest one - Incoming Webhooks. A webhook is a stable, unique published URL that allows you to send data, which will then be posted into a channel.




A small guide on how to integrate Google Apps with Slack in 10 minutes

1) Visit the link https://my.slack.com/services/new/incoming-webhook/. You will be redirected to the settings page of your workspace.

2) Select the target channel (e.g. #tech) - don't worry, you can change it afterwards. Next click the green button Add Incoming WebHooks integration.


3) The remaining Slack webpage contains settings like the name of the bot, its icon, the default channel, and so on. We only need to copy the webhook URL (e.g. https://hooks.slack.com/services/xxxxxx/xxxxxxx/xxxxxxxxxxxxxxxxxxxxxxxxxx).

4) Create a new Google Apps Script project and copy in a code snippet. Insert your webhook URL from the previous step.
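Here is a minimal sketch of such a snippet; WEBHOOK_URL is a placeholder for the URL you copied in step 3, and the message text is just an example.

var WEBHOOK_URL = 'PASTE_YOUR_WEBHOOK_URL_HERE';

function START() {
  var payload = {text: 'Hello from Google Apps Script :tada:'};
  UrlFetchApp.fetch(WEBHOOK_URL, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify(payload)
  });
}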


5) Now run the START() function in Apps Script. If you inserted the right URL, you will see a new message in the relevant channel.

Now you have connected your Google Apps Script with Slack. You can simply load data from Calendar, Gmail, Drive, or Sites and create alerts on events. I will show you that in a future article.

I recommend checking the tips on how to format text and add links or attachments: https://api.slack.com/docs/formatting

Are you planning to integrate your Google Apps with Slack? Leave your ideas in the comments section.