Bora Beran, PhD.

I was born in 1979 in Turkey. I have always had a passion for building things. Of course the medium changed over time: Lego blocks, words, musical notes, programming constructs… but the process was always the same. Envision it, build it, make sure it is elegant and beautiful, something you can be proud of, and it will surely fuel the next cycle.

I took an interesting path by first pursuing a degree in Environmental Engineering, which at the time was hailed as the job of the future. I had my side projects, of course: singing and playing the guitar in a band, designing flyers and book covers, not to mention leading a team that produced an interactive product catalog, a yearbook, and educational CD-ROMs. Ah, good ol’ Macromedia Director and Lingo.

During my Master’s, my education came a step closer to my future profession as I focused on computational modeling, while not losing touch with the hands-on environmental scientist within. I’ve seen my fair share of diatoms, among other, not-so-pretty things. Mathematica was certainly one of the prettier things.

I came to the US in 2004 for my PhD to work in a computational fluid dynamics lab, but took courses on information systems and database design in addition to the regular Civil Engineering curriculum. My work focused on applying semantic web technologies (OWL/RDF), interoperability standards (the ISO 19xxx series, OGC’s GML, O&M, SensorML…), web services, and GIS to change the way hydrologists communicate with sensors and discover, gather, and understand data. Out of this work came, among many other things, the Hydroseek search engine, which was mostly coded in Java and relied on the Jena framework’s OWL reasoner.

I met Jim Gray in 2006, and he convinced me to join Microsoft Research. It was shortly after his untimely loss that I joined his lab in San Francisco, CA. I built a similar tool named SciScope from the ground up, with a custom-built triple store and a simple inference engine in C#, and with a much bigger scope in terms of data coverage and the size of the knowledge base, in addition to collaboration (collaborative tagging, sharing datasets, comments/annotations) and visualization capabilities. Visualizations ranged from 2D time series plots of sensor observations to density visualizations with area-weighted aggregations (hydrologic units, ecoregions…), creating on-the-fly thematic maps that rely on color as well as elevation/extrusion (spatial bar charts) by taking advantage of OLAP. SciScope was showcased at the World Economic Forum in January 2009 and to the United Nations in February 2009. During this period I also led the development of APIs compliant with Open Geospatial Consortium standards for the .NET platform.

After 2.5 years in Microsoft Research, I joined the High Performance Computing team and later the SQL Server Information Services team as a Program Manager to help address the visualization, big data, and big compute problems of domain specialists like myself through parallel and distributed computing, including in-memory (MPI) and disk-bound (Dryad/Hadoop) workloads. I worked with many experts in different domains such as finance, actuarial science, and biotech in this period, and wrote a good amount of R and Python code for scenarios ranging from risk assessment and stock pricing to traditional BI (classification/clustering of structured data) as well as social media and clickstream analytics. In 2011 I joined the Lync team as a Program Manager. After shipping Lync 2013, Microsoft’s enterprise unified communications solution in the Office suite, I jumped back into the data realm by joining Tableau.

Currently I’m a Program Manager at Tableau Software. My teams focus on statistics & calculations features, query generation and technical partnerships.

35 thoughts on “About”

  1. kamesh peri says:

    Hello Bora,

    I wanted to reach out to you regarding a Tableau and R (igraph) integration problem.
    I have parent and child nodes in two columns and a completions metric in a third column. I would like to get the coordinates of the nodes and edges from R dynamically, with node size proportional to the metric. Is this possible?


  2. Tony Course says:

    Hi Bora

    We are in Australia, doing some exciting work building an off-the-shelf BI tool for retailers, using Tableau as the front end. We work closely with your Australian team.

    The next stage is to use the Tableau R integration to help identify fraud and loss. We are after some assistance with the initial analysis of the data: finding outliers and the potential for fraud.

    We have prepared a brief and some real client data. Is there someone within your team, with knowledge of both R and Tableau, who could assist?


  3. Paul Bernal says:

    Hello Bora, I created the following script to produce a neural network autoregressive forecast:



    SCRIPT_REAL("
    TimeS <- ts(.arg1, start=c(1994,10), frequency=12);
    NNETARfit <- nnetar(TimeS);
    NNETARforecastoutput <- forecast(NNETARfit, h=.arg2[1]);
    LargoInicial = length(.arg1);
    SetParcial = .arg1[1:(LargoInicial - .arg2[1])];
    LargoNuevo = length(SetParcial);
    append(SetParcial, NNETARforecastoutput$mean, after=LargoNuevo)",
    [Select Metric],
    [Select Number of Periods to Forecast])


    However, Tableau sends me an error message saying that I am passing a non-numeric argument to a mathematical function, which doesn't really make sense to me. I tried using R's forecast and auto.arima functions and was able to see the results in Tableau without any problem.

    Do you have any idea of what could be giving me trouble here?

    Best regards,


    • Tableau passes the parameters as vectors as well. So you need to either refer to .arg1 as .arg1[1] so you use the scalar in position 1 (since it is a vector where all the values are the same, the position doesn't matter), or you can pass the parameter inline, in which case it won't be passed as a vector, e.g.

      STR([MyParameter])+"1)", SUM([NumberToBeRounded]))

      I suspect you’re getting an error since most of the functions you use expect to get a scalar number but they get a vector and don’t know what to do with it.
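      To illustrate outside Tableau (a sketch with made-up values; the .arg1/.arg2 contents below are hypothetical), this is the shape of what R receives:

      ```r
      # Tableau sends each SCRIPT_* argument to R as a vector with one
      # entry per row of the partition; a parameter is simply repeated.
      .arg1 <- c(3.2, 4.1, 5.0)  # a measure: one value per row (hypothetical)
      .arg2 <- c(12, 12, 12)     # a parameter: the same value repeated per row

      # A function argument that expects a scalar (e.g. a forecast horizon)
      # should therefore be given .arg2[1], not the whole vector:
      h <- .arg2[1]
      stopifnot(length(h) == 1, h == 12)
      ```

      Since every entry of the repeated parameter vector is identical, which position you pick does not matter; [1] is just the convention.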

  4. Yılmaz Kandiş says:

    Hello Bora,
    This is Yilmaz from Istanbul, another Tableau maniac in the world. I just came across your blog. I was in Seattle for TCC 2014 but unluckily didn't get to meet you.
    I am checking each of your posts one by one. You are doing great!

  5. Hi Bora,

    Thanks for all your helpful information! I wrote a script to find t statistics using the iris data in R:

    SCRIPT_REAL('ttest <- function(v) t.test(v)$statistic;
    tapply(.arg1, INDEX = .arg2[1], ttest)',
    SUM([Petal length]), ATTR(Species))

    The code (tapply(iris$Petal.Length, iris$Species, function(v) t.test(v)$statistic)) works fine in R, but in Tableau I am seeing an error saying that there are not enough x observations.

    Can you see where I'm going wrong?


    • INDEX should have the same length as x in tapply. In this case .arg1 is the full vector while .arg2 has one member because of .arg2[1]. Can you try using

      tapply(.arg1, INDEX = .arg2, ttest)

      instead of

      tapply(.arg1, INDEX = .arg2[1], ttest)

  6. Pushkar says:

    Hi Bora,
    I am a new member of the Tableau community and am learning my way through this great visualization tool. I have come up with a requirement for which I was hoping for your expert guidance.
    I have to design a report for a shipment tracking system which maps all the points in the route of a shipment from source to destination. The shipment goes from the source to the export port, then to the import port, and then to the destination. So we have to plot all these points and the route traversed by the shipment.
    I have seen different posts about path maps, and they say we need a different data structure for this requirement. I was able to plot the points, but somehow the line always appears as a straight line with high width, making my visualization very hard to understand.
    Can you please help me with this requirement? I need a curved path and, if possible, arrow direction from source to destination.
    I am new to Tableau, so I appreciate your help and request you to make it simple for me to understand.

  7. Gauthaam says:

    Hi Beran,

    I have created a Tableau workbook which allows the user to select variables from a list of variables in Tableau and execute regression for the selected variables through R.

    I am using the above logic in three different views. Within a single view it seems to work perfectly fine, with each view taking under 40 seconds to load whenever a variable selection is made. However, when I include these views in a dashboard, they take forever to load.

    Can you please help me out here? This is kinda urgent as my project depends on it. Any help is appreciated.


    • Please check your addressing/partitioning settings for the calculation. Even 40 seconds sounds very long. Tableau makes one call to R per partition. It is likely you're unintentionally breaking the data into multiple requests, or maybe making a separate request to R for each row. You can click on the calculation and select Edit table calculation… then look under the Compute using setting. There are a number of posts that explain table calculation settings. Here is one: http://drawingwithnumbers.artisart.org/want-to-learn-table-calculations/

      Please make sure any field that you don't want to slice the data by is in the box on the right, “Addressing”. Most likely, in your case, everything needs to be on the right.

      • Gauthaam says:

        Thanks for replying so promptly, Beran. Really appreciate it.

        I double-checked the table calculation and it seems to be fine. I have just included the Date field, computing the model fit across it.

        Is it possible that it is taking such a long time to compute because I am passing ~650 variables to R? If so, are there any workarounds to reduce the computing time?

  8. Arun Prakash says:

    Hi Bora Beran,

    I want to create a sunburst chart with 4 levels: make of the car, model of the car, mobile device manufacturer, and model of the mobile device. We have about 10,000 rows of data. It's tough to create a data model in Excel like what you did, so I cannot replace the data source like you suggested. Is there any other way to create a sunburst chart for my requirements? Can you kindly help with how to create a sunburst chart using the level as a parameter?

    Thank You,
    Arun Prakash

  9. Priya Ramakrishnan says:

    Hello Bora,

    Just wanted to say that your blog is very inspiring and it has helped me a lot.

    I presented at the Sydney Tableau User group yesterday (Using Tableau and R for text analytics) and the response has been very positive. It wouldn’t have been possible without your blog.

    Thank you !!

    Priya Ramakrishnan

  10. Tushar Uttarwar says:

    Hello Bora,

    I need help forecasting weekly data using R.
    I tried the following code:

    jjearnts <- ts(u,deltat=1/52,start=c(2011,1),end = c(2015,38),frequency = 52);
    fcast <- forecast(jjearnts, h=.arg2[1]);
    append(u,fcast$mean, after = n)",
    SUM([Quantity]),[Number of Week to Predict])

    Or, Bora, please suggest which script I should use in Tableau to forecast week-wise data using R.

    Thank you

  11. Heather Lewis says:

    Hi Bora, could you recommend any SQL resources? I'm trying to learn both R and SQL to truly harness the power of Tableau. I'm a trained SPSS user. I don't quite understand the logic in the R script calculated fields. Is there some kind of map/breakdown or “recipe” for how to formulate these particular calculated fields?

  12. Anand Krishnamurthy says:

    Hi Bora,
    Is there a way to do probability density plots in Tableau? You know, the same ones that are created using the densityplot function in R. If you try to replicate it in Tableau, it does not smooth things out. I am not sure how to call the R function and then plot the result in Tableau. I am a novice in R and would appreciate some help.

      • Anand Krishnamurthy says:

        Thank you. Looking forward to it. Also, Bora, how can I find the “knowledge” to know the right way to change the R script to make it work in Tableau? I have learnt the basics of doing classification, random forests, regression, etc. in R. I would really like to learn how to tie as much of this as possible into Tableau.

      • It has to return either one row or as many rows as go into R. E.g. if the table in Tableau (what you see when you do “View Data” on a sheet) has 100 rows, R needs to return 100 rows. A lot of the changes needed to make a script work in Tableau are about achieving this goal.
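        A minimal sketch of that rule in plain R (the input values are made up): an aggregate of length 1 is fine as-is, and anything else must match the input length, e.g. by repeating it once per input row.

        ```r
        # Tableau's SCRIPT_* functions require the returned vector to have
        # either length 1 or the same length as the input vectors.
        .arg1 <- c(5.1, 4.9, 4.7, 4.6)      # hypothetical values from Tableau

        t_stat <- t.test(.arg1)$statistic   # a single number (length 1)

        # To line the aggregate up with every row instead, repeat it:
        result <- rep(t_stat, length(.arg1))
        stopifnot(length(result) == length(.arg1))
        ```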

  13. Tuya says:

    So happy I chanced upon your blog. Very interesting and helpful articles. Currently I am a Tableau BI consultant.
    Even your picture is created using Tableau, haha.

  14. Hi Bora Beran,

    I am amused by so many R scripting (statistical) integrations with Tableau. I am curious if you did something similar for Weibull plotting and curve-fitting measures such as Anderson-Darling or the likelihood ratio (the MLE approach to curve fitting). Thanks.
    Good reading your blog.
