As AI becomes increasingly integrated into our daily tools, combining it with Tableau opens up new possibilities. This post explores a toolkit of AI solutions that expands your capabilities with Tableau. We'll look at options for dashboard builders, those looking to upskill and even develop AI solutions powered by Tableau.
Community GPTs
These are community-built AI tools, typically designed to solve specific problems. They are free to use in the OpenAI GPT store but are subject to daily usage limits.
VizCritique Pro provides detailed feedback based on your submitted data visualization. It offers actionable advice across areas such as:
Chart Usage
Ease of Use
Accessibility
It also provides its overall impression and recommendations. For example, in a recent visualisation, I was advised to clarify my smaller charts with "either additional labels, clearer y-axis markings, or improved scaling." What I love about this is that it's a quick way to get very personalised feedback on your work.
Tableau Certified Data Analyst Exam GPT is a study tool for preparing for the Tableau Certified Data Analyst exam. It can provide you with:
Questions based on the syllabus content
Explanations of what you need to know on a topic for the exam
Accessibility on the go via the ChatGPT app
The GPT scored 85% in a full mock exam and has helped many in the community pass the certification. What I like about this GPT is how you can stay on top of studying even if you don't have much time - you answer a few questions on your phone before the next meeting or on a commute home.
Data Mockstar helps you quickly and easily generate mock datasets that feel like real-world data. If you've ever needed to share work externally without exposing company information, this GPT can help. To use it:
Define the dataset domain, e.g. Healthcare, Finance, or Renewable Energy.
Specify column names, data types, and constraints (e.g. cities in Europe).
State the number of rows needed; the default is 1,000.
Choose the output format; the default is CSV, but JSON, Excel, and SQL are also options.
The result is a dataset you can use in place of sensitive data, perfect for sharing dashboard designs, training sessions, and community projects.
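To show the kind of output this produces, here is a minimal Python sketch that generates a comparable mock dataset. The domain, column names, cities, and value ranges are all invented for illustration; a GPT like Data Mockstar would pick realistic ones for you from a short description.

```python
import csv
import random

def generate_mock_dataset(path, n_rows=1000, seed=42):
    """Write a mock 'Renewable Energy' dataset to a CSV file.

    The columns, cities, and value ranges here are invented examples,
    not output from Data Mockstar itself.
    """
    random.seed(seed)
    cities = ["Berlin", "Madrid", "Oslo", "Lisbon", "Vienna"]
    sources = ["Solar", "Wind", "Hydro"]
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["city", "energy_source", "output_mwh"])
        for _ in range(n_rows):
            writer.writerow([
                random.choice(cities),
                random.choice(sources),
                round(random.uniform(10, 500), 1),
            ])

generate_mock_dataset("mock_energy.csv", n_rows=1000)
```

Seeding the random generator makes the mock data reproducible, which is handy when several people need to rebuild the same dashboard from the same fake rows.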
Tableau LangChain
Tableau LangChain is a modern AI framework that lets you combine the Tableau platform with AI technologies. Large language models, embedding models, and vector data stores can now have inputs from Tableau endpoints.
For example, with VizQL Data Service you can connect to a Tableau data source and query the data within it. Add Tableau LangChain to this, and now you can have a large language model querying the data source on a user's behalf. In effect, the question "How have sales performed over the last 2 years?" returns an output of recent sales data.
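Under the hood, VizQL Data Service accepts a JSON query describing which fields and aggregations to return. The sketch below builds such a payload; the field captions ("Region", "Sales") are invented examples, and the exact payload shape should be checked against Tableau's VizQL Data Service documentation for your version.

```python
import json

def build_vds_query(datasource_luid):
    """Build a sample VizQL Data Service query payload.

    The overall shape (a datasource LUID plus a list of fields with
    optional aggregation functions) follows Tableau's documented API;
    the specific field captions are hypothetical.
    """
    return {
        "datasource": {"datasourceLuid": datasource_luid},
        "query": {
            "fields": [
                {"fieldCaption": "Region"},
                {"fieldCaption": "Sales", "function": "SUM"},
            ]
        },
    }

payload = build_vds_query("abc123-example-luid")
print(json.dumps(payload, indent=2))
```

An agent framework like Tableau LangChain essentially generates this payload from the user's natural language question, sends it to the VizQL Data Service endpoint, and hands the returned rows back to the language model.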
It opens up a lot of new opportunities. Typically, to answer that question I would open Tableau, connect to the data, and build a chart. Now I can have an AI answer the query for me, and that can be deployed as an app for multiple users.
Effectively we can start building AI solutions with the data and tools we know on the Tableau platform. This gives us more opportunities to service different stakeholder requests and develop high-demand skills.
Getting started building your own chat-with-a-data-source app takes only a few lines of code from the tableau_langchain project. The script below connects a Tableau data source to OpenAI, but you could use any large language model of your choice.
This link will show the script outputs: Basic Agent Jupyter Notebook
import os
import json
from dotenv import load_dotenv
load_dotenv('.env')
# base langchain library imports
from langchain.tools import tool
from langchain_openai import ChatOpenAI
from langchain.agents import initialize_agent, AgentType
#tableau_langchain imports
from ..utilities.tableau.metadata import read
from ..utilities.tableau.setPrompt import instantiate_prompt
from ..tools.tableau.getDataTool import get_data
#read in the environment variables to query a Tableau Published Data Source from a target Tableau Cloud Site or Server Instance
datasource = os.getenv('DATASOURCE_LUID')
read_metadata_url = os.getenv('READ_METADATA')
query_datasource = os.getenv('QUERY_DATASOURCE')
token = os.getenv('AUTH_TOKEN')
## Use the read module to pull in the necessary metadata from your target Tableau Published Data Source. This module also augments the
## base metadata provided by the read_metadata endpoint by removing unnecessary fields and adding sample field values to each string field.
metadata = read(read_url=read_metadata_url, datasource_luid=datasource, auth_secret=token)
# use the instantiate_prompt module to add the data model metadata to the nlq_to_vds prompt
instantiate_prompt(metadata = metadata)
# Initialize an LLM and an agent with the get_data tool to answer natural language questions with Tableau Published Datasources
llm = ChatOpenAI(model='gpt-4o-mini', temperature=0)
tools = [get_data]
tableauHeadlessAgent = initialize_agent(
    tools,
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,  # use OpenAI's function calling
    verbose=True
)
# Run the agent
response = tableauHeadlessAgent.invoke("avg discount rate by region")
print(response)
Bringing Modern Search Functions to Tableau Server
I'm a Tableau Server customer, and we have many data sources and dashboard views on our server, so being able to find the right data is highly important. Yet near-synonymous searches can return very different numbers of results, for example, "US" vs. "USA" or "bike" vs. "bicycle."
Seeing what's possible with Tableau LangChain, I was inspired to see whether I could bring an AI-powered search to Tableau Server. My prototype uses the REST API and the Metadata API, which should be available on Server version 2019.3 or later.
The workflow is:
Using the Metadata API, extract details of my Tableau data sources: names, descriptions, project folders, and column details.
Combine this information into my search criteria, namely the data source details with all the column names appended.
Pass the search criteria to an embedding model to convert them into vectors and store them in a vector database.
Route the user's query through the same embedding model and find the best matches.
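The retrieval side of this workflow (steps 3 and 4) can be sketched in a few lines. Here a toy character-trigram embedding stands in for a real embedding model, and a plain Python list stands in for the vector database; the data source names are invented. Note the toy embedding only captures lexical overlap, so it cannot match true synonyms like "US" and "USA" the way a real embedding model can.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: character trigram counts. A stand-in for a real
    embedding model; only the similarity-search logic matters here."""
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Step 3: embed the search criteria for each data source and store them.
datasources = [
    "US politics survey results",
    "European cycling sales",
    "Hospital fitness program outcomes",
]
index = [(name, embed(name)) for name in datasources]

# Step 4: embed the user's query and rank data sources by similarity.
def search(query, index, top_k=3):
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [name for name, _ in ranked[:top_k]]

print(search("politics polls", index))
```

In the real prototype, the embedding function would be a hosted model and the index would live in a vector database, but the ranking step is the same: embed the query, score it against every stored vector, and return the closest data sources.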
What is returned is a semantic search that performs well at matching the user's intent to the available data. For example, when I searched for "USA election", the original Server search returned no results, but the semantic search's first result was a "US politics" data source.
We can go further: the same approach can match a natural language query like "How do I improve my fitness?" to a relevant data source that fits the intent of the question. This works as a standalone application, but it could also be combined with Tableau LangChain's chat-with-your-data agent to determine which data source to use. "How do I improve my fitness?" could first find a suitable data source, and the chat application could then return insights from it.
AI is changing how we develop solutions to meet changing demands. For Tableau users, there's a growing number of options to integrate AI into existing workflows, enhancing both your skills and the value you deliver. Community GPTs offer targeted solutions to specific challenges, whether it's obtaining feedback, sharing your work securely, or preparing for certification. With tools like Tableau LangChain, you don't have to start from scratch; you can build on your existing Tableau projects to create fresh, AI-driven solutions. Adopting these tools expands what you can deliver and develops skills in a high-demand field.