Fixed issue with exceptions being thrown when attempting to read packaged files after opening a TWBX or TDSX file in certain cases.
We’ve been working on updating Power Tools: Desktop, Deployment, and Server for Tableau 10.5 over the past few months. Today, we’re releasing an update to Power Tools: Server, bringing beta support for Tableau 10.5 across the suite.
“Beta support” simply means we’ve done an initial round of testing with the latest version of Tableau and all critical functionality is working as expected. A future update will bring “official support” which means we’ve fully ensured compatibility with the latest version of Tableau.
We mentioned last month that we’ve made a few changes across all the tools to prepare for Tableau 10.5. You can read last month’s post for the details but as a quick reminder:
Power Tools: Server v1.5 includes many improvements to overall stability. Rather than focusing on new features, this release was all about hardening up the features we already have. But, of course we also added support for Tableau 10.5! Read the full release notes for v1.5.
Already running Power Tools: Server? Contact our support team and we’ll help you get upgraded right away.
We are already hard at work on the next update to Power Tools: Server. Stay tuned for more news as we continue creating the best solution for monitoring your Tableau Server performance.
Added support for Tableau 10.5 including support for Hyper data connections.
Added ability to modify the schema property of HP Vertica data connections.
Fixed bug in the handling of text marks with empty content.
Removed support for Tableau 9.1. WBSDK continues to support Tableau 9.2-10.5.
Our development team has been steadily building in compatibility adjustments for the upcoming release of Tableau v10.5. Since we know that Power Tools for Tableau users often represent the early adopter segment of the Tableau community, we wanted to go ahead and pass along some information that will be helpful as you prepare for this exciting new release of Tableau.
If you’re brand new to our toolset, this may be more detail than you want to dig into right at first. Feel free to check some of our other blogs on new features, video demonstrations or our 2017 Spotlight Series. For those who already know and love Power Tools for Tableau, the following updates will be important to be mindful of as you continue to use the tools.
We now require the REST API to be enabled on Tableau Server. Most of our users already have it enabled, but we want to be clear that it’s now required for everyone. The REST API on Tableau Server is enabled by default (and has been since Tableau released version 8.3), but administrators can go in and change that setting, so we want to make sure it isn’t overlooked.
Above: To enable the REST API, head to the command prompt.
If someone in your organization has requested that the REST API be disabled, please share the details of your scenario in the contact form below, and we’ll be happy to talk through the specifics with you. Even if you don’t plan to upgrade to v10.5 for a while, you’ll want this option enabled on Server. It will ensure there are no hiccups in the user experience with our tools.
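For reference, on pre-TSM versions of Tableau Server on Windows (which includes 10.5), enabling the REST API from the command prompt looks something like the following. This is a sketch of the tabadmin workflow; confirm the exact syntax in Tableau’s documentation for your release:

```
tabadmin stop
tabadmin set api.server.enabled true
tabadmin configure
tabadmin start
```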
Several folks have also been asking us about the brand-new capability in v10.5 to create sub-projects on Tableau Server. As you can imagine, that kind of adjustment in content hierarchy is a big deal for both Tableau and for our tools, as well. We are actively developing enhancements to allow our users to take advantage of new functionality around scanning and retrieving the workbooks in sub-projects on Tableau Server. For the time being, however, all our tools will continue to work with top-level projects only. If you’re eager to get any updates on the progress of these enhancements, please let our team know. We can keep you in the loop.
The Performance Analyzer was one of the first tools that we made available to the community, and we’re proud of how it’s helped so many users manage their Tableau workbooks. With this upcoming Tableau v10.5 release, the time has come to say goodbye to this tool in its present form. Don’t worry though – it’s not disappearing from the application. We’re just moving it to the Legacy Tools section in our next release of Power Tools: Desktop.
Above: The Performance Analyzer interface.
Performance Analyzer is tailored for inspecting individual queries that are executed in a workbook. It is also good for seeing the distribution of time spent rendering each sheet on a dashboard to understand which sheets take the most time to render. With the advanced functionality in Tableau v10.5, the Performance Analyzer will not be able to take advantage of the complex concurrency that is taking place with Tableau workbook queries. Since it will not be suited for these in-depth comparisons, we have decided to designate it as a Legacy Tool.
You will still be able to utilize it for Tableau v9.3-10.4. If you’ve been a big fan of the Performance Analyzer and you’d like to know about more of the specifics behind our decision, just drop us a line here and let our support team know how they can help.
2017 has seen some exciting announcements come out of the data community! Tableau Server is finally coming to Linux, and Hyper is replacing Tableau Extracts (TDE). Power Tools for Tableau has also had its fair share of impressive releases. From updated versions to new features and case study blogs, we’re eager to share this year’s Power Tools highlights!
Each application in the Power Tools for Tableau suite saw important updates and new features this year. Power Tools: Deployment had the most versions released, beginning 2017 at version 1.22 and ending at 1.29. Likewise, Power Tools: Desktop and Server have also been upgraded, and their versions now stand at v1.26 and v1.4, respectively. For a comprehensive view of what changed in each tool, check out our version summary blog.
Other features we were excited to roll out this year include:
Viz Templates, however, is possibly the most exciting feature announced this year. Thousands of users work with Power Tools: Desktop and it has saved them countless hours. One of the most popular tools has been Style Management, and we’ve received many requests over the years for increased functionality around customizable style sheets and themes. We responded to those requests with Viz Templates.
With Viz Templates, you have the ability to create clean worksheets with just a few clicks. Your organization’s design best practices and custom templates can be saved and quickly applied to workbooks in an automated fashion. And that’s not all! In the coming months, Viz Templates will have some brand new capabilities added to the graphical style editor and more options for field labels, data highlighters, totals and marks! Learn more about how you can get started with Viz Templates on our blog.
Above: Example of the Viz Template home screen.
We also amped up our blog content game! We participated in two webinars: Best Practices for Tableau Performance and Automation and Quality Assurance with Power Tools. Plus, we created a number of videos to help the Tableau community get better acquainted with these tools. Our demo videos looked at:
Also in 2017, we published many new blogs for our Spotlight Series, which looks at individual use cases of certain applications in the Power Tools suite. We interviewed several InterWorks consultants to find out how they have used Power Tools to help their clients. Unsurprisingly, Power Tools became a necessity for maintaining and organizing their workbooks. You can see the full list of Spotlight posts below:
Above: The Environment Overview dashboard in Power Tools: Server.
It’s been exciting to see Power Tools expand throughout the global Tableau community in 2017. These tools are now used in more than 20 countries and word is rapidly spreading about their ability to transform the way that people use Tableau. We saw several thousand users make their way to the Power Tools website to download trials this year. In fact, there was a 40% increase in the number of downloads from last year! If you haven’t yet had a chance to download Power Tools: Desktop and get started on your discovery, don’t wait any longer!
This was also our first year to get things churning with Power Tools: Server. Several clients around the world got up and running with this brand-new tool, and we’ve been excited to see them make it an integral part of their process for managing Tableau. We would be glad to help set up a free trial of Power Tools: Server so you can see how it helps monitor and troubleshoot the performance and usage of Tableau Server.
If you’d like further information regarding any of these features or tools, please get in touch with us!
I work with a lot of organizations that want to take advantage of the efficiencies of hosted data sources. Their IT and BI teams manage data that’s been provided to business users for ad hoc analysis. The challenge that these teams inevitably run into is change management across all of their centrally hosted data. When it comes time to rename a column, how do you go about handling that in the least disruptive way?
I’ve seen some teams go ahead and make the change. Depending on the number of business users, that could end up being messy and result in tons of complaints. It’s usually best to socialize the changes before they happen. But how do you go about doing that when it’s so hard to track down all the different data source dependencies?
My colleague, Jubail Caballero, shared that he recently ran into this exact scenario at an oil and gas company in Houston. They had a few hundred calculations to manage, and it would likely have taken Jubail several days to build out a list of all the different dependencies.
Fortunately, he was able to save that time and use the Data Source Audit tool found in Power Tools for Tableau: Desktop. After generating just one initial PDF export from this tool, they were convinced it was going to be a critical part of their governance process. It’s now something they can generate on a regular basis.
Above: An example of Data Source Audit’s results.
The Data Source Audit tool enables a constant awareness of how these hosted data sources are utilized throughout the various workbooks on Tableau Server. The output gives a list of the worksheets and calculations that depend on a specific field and data source. It’s easy to track down the authors of those workbooks to discuss the changes.
The great thing about this approach is there is no room for human error in the process of tracking down all of the different dependencies. When it’s a manual process, it’s easy to forget that a workbook has a dependency. That is especially the case if it was built several weeks or months ago. Data Source Audit eliminates this never-ending guessing game.
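Conceptually, this kind of dependency audit is possible because Tableau workbook files (.twb) are XML under the hood, so finding which calculations reference a field reduces to walking the document tree. The Python sketch below shows the basic idea on a simplified, invented workbook fragment; real TWB files (and what Data Source Audit actually does) are considerably more involved:

```python
import xml.etree.ElementTree as ET

# A simplified, made-up workbook fragment for illustration only.
TWB_FRAGMENT = """
<workbook>
  <datasources>
    <datasource name="Sales">
      <column caption="Profit Ratio" name="[Calculation_1]">
        <calculation class="tableau" formula="SUM([Profit]) / SUM([Sales])"/>
      </column>
      <column caption="Order Size" name="[Calculation_2]">
        <calculation class="tableau" formula="COUNT([Order ID])"/>
      </column>
    </datasource>
  </datasources>
</workbook>
"""

def calculations_referencing(xml_text: str, field: str) -> list:
    """Return captions of calculated fields whose formula mentions `field`."""
    root = ET.fromstring(xml_text)
    hits = []
    for column in root.iter("column"):
        calc = column.find("calculation")
        if calc is not None and field in calc.get("formula", ""):
            hits.append(column.get("caption"))
    return hits

print(calculations_referencing(TWB_FRAGMENT, "[Profit]"))  # ['Profit Ratio']
```

Running the same scan across every workbook downloaded from a site or project gives you exactly the kind of dependency list the tool automates.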
Instead of taking several hours to circulate a change request, you can simply run a quick scan of all the workbooks on a particular site or project on Tableau Server. You will then have the option to export the results to either a Tableau workbook, an Excel file or a PDF. I’ve worked with a lot of folks that prefer the Tableau workbook because it’s very easy to publish it back to Tableau Server where everyone can see results of the audit on a regular basis.
The efficiency happens at different points in the process. You are, of course, saving time by not breaking things. Another big gain is in adoption. It’s generally not great for the subject matter experts to be emailing business users saying they’re going to make a change and break a bunch of stuff. When workbooks start breaking in unplanned ways, it can definitely hurt user adoption.
If you can carve out fifteen minutes or so, it’s easy to run a test audit on your workbooks and see the advantages for yourself. The free trial usually takes less than a minute to download and it’s very easy to go through the process. This five-minute video will show you all the basics. We look forward to hearing how it goes, and please send us a note if you need any assistance.
Since Power Tools for Tableau: Desktop was released, I’ve seen it have a significant impact on several InterWorks clients. Any organization that’s attempting to implement iterative design or an agile process for publishing Tableau workbooks would really benefit from testing out the features of this tool. Why is that? Because when you’re moving Tableau workbooks between disparate environments on a regular basis, it’s important to consistently check the quality and make sure that workbooks are up to the highest possible standard.
If you’ve not yet set up a structured process, Power Tools: Desktop can help you accomplish the transition by giving clear standards and measures for benchmarking all your Tableau workbooks on Tableau Server. We work with a lot of Fortune 100 clients that find themselves in a place where these benchmarks become critical to scaling Tableau.
When setting up the necessary benchmarks, I usually start by looking through all the data connections with the Data Source Audit tool found in Power Tools: Desktop. It gives a quick glance at the types of data sources being used, the list of calculations in the Tableau workbooks and if there are lots of unused columns.
It also helps me see things from a bird’s eye view and immediately find out how many data sources are being utilized for a set of Tableau workbooks on Tableau Server and how well we’re utilizing those data sources. I’m then able to see where there are redundancies and where I can clean up the data to make it as performant as possible.
There are also some great exports for the Data Source Audit tool. It’s now common practice to have these exports generated every time workbooks are about to move into production. Some people will export to a PDF format and use that as a data dictionary.
Perhaps my favorite export option is the Tableau workbook export. It’s been great to see that export evolve over time to become more and more useful. In fact, I think it’s one of the most used features in the Power Tools suite as I run into a lot of situations where disparate data sources are being used.
Above: A look at the dashboard in the Tableau workbook export from the Data Source Audit tool.
In a recent scenario, the workbooks were pulling data from Snowflake, Teradata and some different flat files. We were able to quickly show where everything was coming from and validate the connections. In addition to checking data sources, most large organizations have strict style standards that need to be applied to all their Tableau workbooks when going into production.
Power Tools: Desktop is great at checking styles and bringing conformity to formatting throughout a set of workbooks. The most recent addition is a tool called Viz Templates. It lets the user save predefined style selections for their workbooks. Here’s another article that goes more in depth.
The last step I’ll take to make sure my workbooks are ready for production is to check performance with a couple other tools in Power Tools: Desktop. The Best Practice Analyzer tool shows me if there are any Tableau workbooks experiencing a significant number of performance issues. It will call out specific issues in the workbooks that are causing slower loading times.
I usually take those Tableau workbooks over to the Performance Analyzer tool and narrow down the specific loading times for the problem worksheets and dashboards. That helps me see how far the workbooks are from performance benchmarks.
After each of the best practice issues have been addressed, I’ll then run the workbooks back through the Performance Analyzer. By that point, the Tableau workbooks are usually back within the performance standards that we have for the production environment.
If you’re an organization that knows you need more structure in your workbook design process, let InterWorks show you how Power Tools: Desktop can become a linchpin in that process. I’ve seen it have a tremendous impact not only in the realm of time savings but also in how it brings a ton of insight to administrators and everyday Tableau users.
When it comes time to start moving Tableau workbooks between your QA and production environments, I’d also recommend looking into Power Tools: Deployment. It will streamline the whole process of getting your approved Tableau workbooks into production. We have several blogs and videos on Power Tools: Deployment. I hope you’ll be able to check those out, as well.
The other option is to just skip that step and talk through all the possibilities on the phone. If you’ll give us a heads up in the contact form below, we can give you a quick rundown.
Over the years, I have seen several organizations begin to lose trust in the data behind their carefully crafted Tableau dashboards. It’s easy to end up in that place, but it’s completely unnecessary. Why jeopardize sustained user adoption because data source usage is out of control? Maybe you’re at that point now, or perhaps you want to know how to avoid getting there. There’s a better way to document your data sources and bring established methods to the madness.
Let me run through the details of a scenario that played out recently with one of my clients. The client was looking to outside consultants to provide insight on getting better performance in their workbooks. Sadly, they kept coming up empty-handed. Their deadline was getting closer and they needed someone who was willing to take on a seemingly impossible situation. So, they turned to InterWorks, and I was the one tasked with helping them get a better understanding of how they could turn things around.
I immediately realized that we needed to focus on how they were using calculations. Their particular set of workbooks had over a thousand calculations and the logic was very complex. Figuring out a way to reduce the number of calculations was a big deal in and of itself. But we also needed a way to monitor how calculations were changing on a regular basis. I could see why they hadn’t set out to do this sort of thing before. It’s a daunting task to try and build out documentation for that many calculations, plus keep it updated on a regular basis.
The good news is that I was able to leverage a tool that InterWorks had already built. It’s called the Data Source Audit tool and it can be found in Power Tools for Tableau: Desktop. If you haven’t checked out a free trial yet, you can do so here.
In less than a minute, we were able to use this tool to export a list of the calculation logic in their workbooks on Tableau Server. We could quickly determine where each calculation was coming from: in this case, either an Excel worksheet, the Salesforce system or a data warehouse.
The Excel export from our Data Source Audit provided a list of all their calculations. We distributed the list to their finance team and asked them if they agreed with the calculation logic. After receiving feedback, we were able to make adjustments and reduce the number of calculations. They could easily run the exports on their own whenever they needed to go back through the process of verifying their calculations. This meant that documentation would be visible to the end users on a regular basis.
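If you want to mimic that kind of reviewable export outside the tool, the pattern is simple: flatten field names and formulas into rows. A minimal Python sketch, with invented field names and formulas for illustration:

```python
import csv
import io

# Invented example data: (field name, formula) pairs pulled from a workbook.
calculations = [
    ("Gross Margin", "SUM([Profit]) / SUM([Sales])"),
    ("Order Count", "COUNTD([Order ID])"),
]

def calculations_to_csv(rows):
    """Flatten (name, formula) pairs into CSV text a finance team can review."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Field", "Formula"])
    writer.writerows(rows)
    return buf.getvalue()

print(calculations_to_csv(calculations))
```

A spreadsheet like this, regenerated on a schedule, is what keeps the documentation visible to end users instead of going stale.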
There is also a great Tableau workbook export that is probably my favorite option in the Data Source Audit tool. It’s easy to publish to Tableau Server where everyone can see it. I also like to take that workbook export and build some of my own custom views that show calculated fields along with additional notes about the logic. I’ll typically add a tab for that before I publish it as a glossary.
The Data Source Discovery tool, also in Power Tools: Desktop, goes hand-in-hand with Data Source Audit. It helps boil down a large volume of information about your data sources. In this project, we ended up using it to answer some other critical questions. One of the data sources had over 100 fields, and we often needed to figure out which field held a particular piece of information. Instead of creating more calculations, we had several instances where we could pull the necessary information from an existing field. The Data Source Discovery tool helped narrow that down.
Above: The initial data source selection screen in the Data Source Discovery tool.
This project ended up having a drastic impact on performance, and we were able to resolve their issues before the looming deadline. You may be in a similar situation where you’re trying to avoid some really expensive, potential band-aids. Perhaps you’re in a better position where you’re choosing to invest the time up front in data architecting. These tools will help you in either situation and we would be glad to walk you through some examples of how they could be used in your environment.
If you want to learn more about these features and more, head over to the Power Tools for Tableau: Desktop page and try a free trial for yourself.
While working with one of our clients in Germany, I came across an excellent use case for Power Tools for Tableau: Deployment. For a little background, this client serves their own customers by analyzing survey data. Here’s an example of how they work with data:
Let’s say you have several gas stations and you want to get a better understanding of the customer experience at each of your locations. They would send out a survey to customers/employees to gauge things like cleanliness, convenience and overall satisfaction with the gas station. They would then compile all that survey data into the key metrics for you to provide actionable insights.
When I started working with this client, they were just starting to put together a plan that would involve building standard survey result dashboards in Tableau for each of their customers. At the time, this included about 50 different customers.
This effort soon gained a lot of visibility and demand was skyrocketing. We quickly saw the number of customers go into the hundreds and even the thousands. The problem there was that the manual effort involved with building these dashboards started to get out of hand. Here’s what we had to do on a regular basis:
The survey data would come in and need to be cleaned up before it could go into a SQL table. Once it was in a separate SQL table, I would make a Tableau dashboard that would incorporate some standard design principles but would be slightly tweaked for the customer.
With this kind of process, you’re looking at having thousands of SQL tables and thousands of dashboards that must be individually created and maintained. We were quickly getting into a realm where it was not feasible to scale anymore. Even if multiple people were tasked with doing this sort of thing all day – and every day at that – there would be so much potential for human error. What we really needed was a way to automate all of this.
The good news is that we had started seeing Power Tools: Deployment used for these types of scenarios with some of our other clients. It was my first time working with the tool, but the learning curve was not steep at all. In less than a day, I had a firm grasp on how the tool could be a game-changer for this client in particular. Here’s how we did it:
Instead of creating thousands of dashboards, we just created one. It would serve as our template dashboard that could then be replicated across all our different customers. Within Power Tools: Deployment, there is the Data Source Transformation feature, which allowed us to swap out data sources before deploying to a different destination on Tableau Server. We used that feature to make sure each report was pointing to the right SQL table, then configured the other tweaks to the dashboard. Other edits included modifying users and permissions for the report on the destination server.
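To illustrate the general idea (not the tool’s actual implementation): since workbook files are XML, retargeting a data connection amounts to rewriting attributes on the connection element before publishing. A hypothetical Python sketch, with made-up server and database names:

```python
import xml.etree.ElementTree as ET

# A simplified, invented workbook fragment; real .twb files are more complex.
TEMPLATE = (
    '<workbook><datasources>'
    '<datasource name="Survey Results">'
    '<connection class="sqlserver" server="db.example.com" dbname="survey_template"/>'
    '</datasource></datasources></workbook>'
)

def retarget_connection(xml_text: str, new_dbname: str) -> str:
    """Point every data connection in the workbook at a different database."""
    root = ET.fromstring(xml_text)
    for conn in root.iter("connection"):
        conn.set("dbname", new_dbname)
    return ET.tostring(root, encoding="unicode")

customer_workbook = retarget_connection(TEMPLATE, "survey_acme")
```

Repeating this per customer is what turns one template dashboard into thousands of correctly wired copies.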
Above: Power Tools: Deployment moving mass amounts of workbooks.
Now that we had a viable solution for replicating the template dashboard, we just needed to figure out a method of getting the data formatted in a similar way. We ended up writing a simple script that hooked into Alteryx, which automated the creation of the SQL tables. By adding Tabcmd into the mix, we could also automatically provision new users and create a new site/project on Tableau Server. We now had a solution that could be automated start to finish.
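The Tabcmd side of that automation can be sketched as a per-customer command builder. The specific flags, file names and credentials file below are illustrative assumptions; consult the tabcmd documentation for your Tableau Server version:

```python
def provision_commands(customer: str, server_url: str) -> list:
    """Build the tabcmd calls that onboard one customer: sign in, create a
    project, add the customer's users and publish the retargeted workbook."""
    project = f"Survey - {customer}"
    return [
        f"tabcmd login -s {server_url} -u admin --password-file creds.txt",
        f'tabcmd createproject -n "{project}"',
        f"tabcmd createsiteusers {customer}_users.csv",
        f'tabcmd publish {customer}.twbx -n "Survey Results" -r "{project}"',
    ]

for cmd in provision_commands("acme", "https://tableau.example.com"):
    print(cmd)
```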
If Power Tools: Deployment and Alteryx were not in the picture, it would take anywhere from two to five days to handle this sort of process manually for each individual customer. When you multiply that by thousands of customers, you are looking at some significant time savings. Beyond the time it saved by not having to manually create thousands of dashboards, it was also clutch from a security standpoint. We were able to keep client data in separate SQL tables as opposed to creating one huge table that stacked the data from thousands of customers together. It would have been way more difficult to provision security using the old way.
This brand-new approach created tremendous value for everyone in the picture and drastically cut down the amount of time involved with getting new customers online with this unique service. If you’d like to explore how Power Tools: Deployment can automate your Tableau reporting process, be sure to learn more and contact us on the official Power Tools: Deployment page.
I’ve had the opportunity to work with several organizations who have used Tableau for many years. They all eventually get to a place where they need to implement some standards. This can be a real challenge when there are a lot of users who have all become set in their Tableau ways.
I recently worked with one of the largest sports apparel providers in the world on how to consistently achieve best practices in Tableau across multiple departments. In addition to applying design standards to their workbooks, we started analyzing which manual processes were stealing time from their users and what could potentially be automated.
When you multiply each manual process by hundreds or even thousands of users, the amount of time sunk into those processes is massive. For example, it can take a very long time to ensure your workbooks point to the right data sources before they are moved into production.
As a consultant and frequent Tableau user, I always start by inspecting the data sources in the Data Source Discovery tool. It provides me with high-level statistics regarding the data sources used in the workbooks. Plus, it can quickly identify issues such as duplicate rows or excessive null values.
Next, I use the Data Source Audit tool. The audit generates a list of calculations from the workbooks and helps ensure the logic is correct. It can also build the documentation I need to easily export from the tool. Thanks to Power Tools: Desktop, the client’s organization has increased governance surrounding the auditing of data sources. I’ve seen firsthand how this tool has played a critical part in this shift.
Above: Example of the Tableau workbook export from the Data Source Discovery Tool
In addition to the Data Source Audit tool, the client also used Alteryx. Part of this process is to check terms against their data dictionary. If a term appears in a workbook but not in the data dictionary, the process catches it. If the term is genuinely new, it’s added to the dictionary; if it has simply been named incorrectly, it’s replaced with the dictionary’s term to maintain consistency.
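That dictionary check is straightforward to express in code. A minimal sketch of the logic, with invented terms and aliases:

```python
def audit_terms(workbook_fields, dictionary):
    """Compare workbook field names against a data dictionary.

    `dictionary` maps each canonical term to a set of known aliases.
    Returns fields to rename (alias -> canonical term) and genuinely
    new terms that should be added to the dictionary.
    """
    alias_to_term = {
        alias: term for term, aliases in dictionary.items() for alias in aliases
    }
    renames, new_terms = {}, []
    for field in workbook_fields:
        if field in dictionary:
            continue  # already uses the canonical name
        elif field in alias_to_term:
            renames[field] = alias_to_term[field]
        else:
            new_terms.append(field)
    return renames, new_terms

dictionary = {"Net Revenue": {"Net Rev", "Revenue (Net)"}, "Region": set()}
renames, new = audit_terms(["Net Rev", "Region", "Churn Rate"], dictionary)
print(renames)  # {'Net Rev': 'Net Revenue'}
print(new)      # ['Churn Rate']
```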
Over time, this audit has played an important part in encouraging their team to make use of Tableau features, such as field captions. The client’s documentation is now more thorough as a result of storing information in their workbooks and being able to export it through Data Source Audit.
Another move toward standardization is in workbook styles. Since this client has built one of the most powerful consumer brands in the world, they are keen for their branding to be represented accurately in their Tableau reports. Before the production stage, the Style Management tool is used to scan all their workbooks on Server. During the scan, it checks if each formatting selection is within their style guide.
Performance is another increasingly hot topic for this client, as they want to ensure each workbook loads as quickly as possible. Just like the other tools I’ve name-dropped in this article, Best Practice Analyzer is found in Power Tools: Desktop. It has become an important benchmark in the process of moving workbooks into production for this client. It quickly indicates which items can be addressed for the biggest gains in workbook performance. There’s also enough flexibility within the tool for these best practice rules to be configured for the client’s specific needs.
Don’t just take our word for how revolutionary Power Tools can be for bringing standardization to your Tableau workbooks. Try a free trial of Power Tools: Desktop today and experience the efficiency for yourself.