Building a Data Studio Connector with Apps Script

Data Studio is a pretty awesome way to create data visualizations. However, if the connectors you need don't already exist, you have to make them exist yourself. That is where Google Apps Script comes into play, and that is what we are going to do now!

Part 1: Building and deploying the connector
Part 2: Refactoring the code and caching the data
Part 3: Designing the dashboard and embedding it in a website

The first step is creating a file and writing some simple authentication code. There are other ways to do the authentication, but we are going to stick with the easy way to start.

var cc = DataStudioApp.createCommunityConnector();

function getAuthType() {
  var AuthTypes = cc.AuthType;
  return cc
    .newAuthTypeResponse()
    .setAuthType(AuthTypes.NONE)
    .build();
}

function isAdminUser() {
  return false;
}
Next up is building the configuration. These are parameters that get attached to the request and that we can use throughout the connection process. In this project we don't really need them, but I am going to put my StackOverflow user id on there for fun.

var cc = DataStudioApp.createCommunityConnector();
var DEFAULT_USER = PropertiesService.getScriptProperties().getProperty('STACKOVERFLOW_USER_ID');

function getConfig() {
  var config = cc.getConfig();

  config
    .newInfo()
    .setId('instructions')
    .setText('Enter the User Id to pull StackOverflow data');

  config
    .newTextInput()
    .setId('userId')
    .setName('Enter a single user id')
    .setHelpText('e.g. "4541958"')
    .setPlaceholder(DEFAULT_USER);

  //  config.setDateRangeRequired(true);

  return config.build();
}
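When Data Studio later calls into the connector, the values the user typed into this configuration arrive under `request.configParams`. Here is a quick sketch in plain JavaScript of reading them; the request object below is a hand-built example for illustration, not a captured live request:

```javascript
// Hand-built example of the request object Data Studio passes along.
// Only configParams matters for this connector's config step.
var request = {
  configParams: {
    userId: '4541958'
  },
  fields: [{ name: 'display_name' }, { name: 'reputation' }]
};

// Read the configured user id, falling back to a default if the
// user left the field blank.
var userId = (request.configParams && request.configParams.userId) || 'DEFAULT_USER_ID';
console.log(userId); // logs 4541958
```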
Now comes the schema. This is the backbone of the whole connector. The connector uses the schema to define the data that comes back from … wherever it comes from. In this case, it is coming from a couple of different APIs.

function getFields() {
  var fields = cc.getFields();
  var types = cc.FieldType;
  var aggregations = cc.AggregationType;

  fields.newDimension().setId('display_name')
    .setName('Display Name').setType(types.TEXT);

  fields.newDimension().setId('link')
    .setName('Link').setType(types.URL);

  fields.newDimension().setId('profile_image')
    .setName('Profile Image').setType(types.IMAGE);

  fields.newMetric().setId('gold_badges')
    .setName('Gold Badges').setType(types.NUMBER)
    .setAggregation(aggregations.SUM);

  fields.newMetric().setId('silver_badges')
    .setName('Silver Badges').setType(types.NUMBER)
    .setAggregation(aggregations.SUM);

  fields.newMetric().setId('bronze_badges')
    .setName('Bronze Badges').setType(types.NUMBER)
    .setAggregation(aggregations.SUM);

  fields.newMetric().setId('reputation')
    .setName('Reputation').setType(types.NUMBER)
    .setAggregation(aggregations.SUM);

  return fields;
}

function getSchema(request) {
  return { schema: getFields().build() };
}
After the backbone, now we gotta put some muscle on this thing. The getData function does that: it organizes the work that needs to be done.

function getData(request) {
  var requestedFields = getFields().forIds(
    request.fields.map(function(field) {
      return field.name;
    })
  );

  var data;
  try {
    console.log('start request');
    var apiResponse = fetchDataFromApi(request);
    data = formatData(apiResponse, requestedFields);
  } catch (e) {
    cc.newUserError()
      .setDebugText('Error fetching data from API. Exception details: ' + e)
      .setText(
        'The connector has encountered an unrecoverable error. Please try again later, or file an issue if this error persists.'
      )
      .throwException();
  }

  return {
    schema: requestedFields.build(),
    rows: data
  };
}
function fetchDataFromApi(request) {
  var url = [
    '' + request.configParams.userId,
    '&key=' + PropertiesService.getScriptProperties().getProperty('STACKOVERFLOW_KEY')
  ].join('');
  var response = UrlFetchApp.fetch(url);
  return JSON.parse(response.getContentText());
}
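The StackExchange users endpoint returns the profile wrapped in an `items` array. Here is a trimmed mock of that response body, parsed the same way fetchDataFromApi parses `response.getContentText()`; the real payload contains many more fields and the values here are made up:

```javascript
// Trimmed, made-up mock of the StackExchange users response body.
var mockBody = JSON.stringify({
  items: [{
    display_name: 'Sam',
    reputation: 3226,
    badge_counts: { gold: 1, silver: 12, bronze: 32 }
  }]
});

// Same parsing step fetchDataFromApi performs on the HTTP response text.
var parsed = JSON.parse(mockBody);
console.log(parsed.items[0].badge_counts.silver); // logs 12
```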
function formatData(response, requestedFields) {
  var item = response.items.shift();
  var row = requestedFields.asArray().map(function(field) {
    switch (field.getId()) {
      case 'display_name':
        return item.display_name;
      case 'link':
        return item.link;
      case 'profile_image':
        return item.profile_image;
      case 'gold_badges':
        return item.badge_counts.gold;
      case 'silver_badges':
        return item.badge_counts.silver;
      case 'bronze_badges':
        return item.badge_counts.bronze;
      case 'reputation':
        return item.reputation;
      default:
        return '';
    }
  });
  return [{ values: row }];
}
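To see the row shape this produces, here is the same map-over-requested-fields pattern in plain JavaScript, with a small stub standing in for the requestedFields object (the stub, the mock item, and the chosen field ids are assumptions for illustration):

```javascript
// Stub mirroring requestedFields.asArray(): each entry exposes getId().
var requestedIds = ['display_name', 'gold_badges', 'reputation'].map(function(id) {
  return { getId: function() { return id; } };
});

// Mock API item, same shape as the StackExchange response uses.
var item = {
  display_name: 'Sam',
  reputation: 3226,
  badge_counts: { gold: 1, silver: 12, bronze: 32 }
};

// Same switch-driven mapping as formatData.
var row = requestedIds.map(function(field) {
  switch (field.getId()) {
    case 'display_name': return item.display_name;
    case 'gold_badges': return item.badge_counts.gold;
    case 'reputation': return item.reputation;
    default: return '';
  }
});

// Produces one row: { values: ['Sam', 1, 3226] }
var rows = [{ values: row }];
```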

Finally, we can write the manifest and then deploy from the manifest. This will provide us with a deployment ID that we can connect as a data source in Data Studio.

{
  "timeZone": "Asia/Hong_Kong",
  "dependencies": {},
  "dataStudio": {
    "name": "Developer Data",
    "logoUrl": "",
    "company": "",
    "companyUrl": "",
    "addonUrl": "",
    "supportUrl": "",
    "description": "get developer data",
    "sources": ["stackoverflow", "github"]
  }
}
Whew! With any luck that will generate a data source that you can use to start building your dashboard. In part 2 we will be adding a cache to limit the number of requests we have to make to various APIs (because we will also be adding some other data sources!).
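As a tiny preview of part 2, the cache layer can be sketched as a wrapper that checks a cache before calling the API. In Apps Script the cache would come from CacheService.getScriptCache(), which exposes this same get/put interface; here it is injected as a parameter so the sketch runs anywhere, and the function names and the fake cache are my own:

```javascript
// Sketch of a cache-first fetch wrapper. In Apps Script, `cache` would be
// CacheService.getScriptCache() (its put() also accepts a TTL argument).
function fetchWithCache(cache, key, fetchFn) {
  var cached = cache.get(key);
  if (cached !== null) {
    return JSON.parse(cached); // cache hit: skip the API call
  }
  var fresh = fetchFn();
  cache.put(key, JSON.stringify(fresh));
  return fresh;
}

// Tiny in-memory stand-in for CacheService, for demonstration only.
var fakeCache = {
  store: {},
  get: function(k) { return this.store.hasOwnProperty(k) ? this.store[k] : null; },
  put: function(k, v) { this.store[k] = v; }
};

var calls = 0;
function expensiveFetch() {
  calls++;
  return { reputation: 3226 };
}

fetchWithCache(fakeCache, 'user-4541958', expensiveFetch);
fetchWithCache(fakeCache, 'user-4541958', expensiveFetch);
console.log(calls); // logs 1: the second call was served from the cache
```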