To kick-start our focus on collaboration, I'm looking at collaborative tools for research and performance benchmarking. We've written recently about the Technical SEO Audit Tools we use, but in this post we are focussing on three tools that combine data sourced from multiple locations to give that data a bigger picture, greater context and, as a result, more insight.
RivalIQ in a Nutshell
RivalIQ is a tool that gathers intelligence about social and SEO activity for a website, benchmarked against a set of competitors that you identify when configuring a landscape. The core feature we use RivalIQ for is reviewing social media data, which gives insight into how a website's social activities stack up against the sites you identify as your competition. It tracks the progress of your audience in that landscape over time, so you can review which of your competitors are performing activities that are really building an audience.
As well as this, RivalIQ also tracks reach (potential audience/views), interactions (comments), applause (likes/favourites etc.) and amplification (shares/retweets). This allows you to see which social media updates are really gaining traction for your own website, or gain insight into the campaigns your competition is finding success with.
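To make those four metric buckets concrete, here is a minimal sketch of how counts like these might be rolled up from raw social post data. This is purely illustrative, not RivalIQ's actual code, and the field names (`follower_count`, `comments` and so on) are hypothetical:

```python
def summarise_post(post):
    """Roll a single post's raw counts into the four metric buckets.

    The input keys are hypothetical examples of what a social API
    might return; missing fields simply count as zero.
    """
    return {
        "reach": post.get("follower_count", 0),   # potential audience/views
        "interactions": post.get("comments", 0),  # conversations started
        "applause": post.get("likes", 0) + post.get("favourites", 0),
        "amplification": post.get("shares", 0) + post.get("retweets", 0),
    }


def summarise_campaign(posts):
    """Sum the per-post metrics across a whole campaign or landscape."""
    totals = {"reach": 0, "interactions": 0, "applause": 0, "amplification": 0}
    for post in posts:
        for key, value in summarise_post(post).items():
            totals[key] += value
    return totals
```

Comparing `summarise_campaign` totals for your own posts against a competitor's is, in essence, the side-by-side view this kind of tool surfaces.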
How it uses API Data
On its own, this is a really useful feature set with all of this data from multiple sources pulled in via APIs to give you the bigger data picture across the primary social networks. Without the ability to pull in data from these social networks, RivalIQ wouldn’t exist. Through using this data you are able to determine strategies to help you improve clicks, engagement and shares by working out what elements of your strategy are already performing, whilst learning from competitor activities that might be outperforming your own.
Added to this, RivalIQ uses the SEMrush and Moz.com APIs to pull in an overview of SEO performance as well. This gives you a snapshot of organic visibility, in terms of ranked organic keywords and estimated organic traffic levels, along with data about the volumes of links and linking domains pointing at each of the sites in your landscape.
Overall, due to the varied data sets that RivalIQ has, it gives a really good big picture dashboard overview of performance for your digital marketing efforts and offers real insight into the elements that are working well for you or your competition.
Searchmetrics in a Nutshell
Searchmetrics is a search and content marketing intelligence platform. It collects its own ranking data for the top 100 million keywords searched for in Google across a number of international markets and produces visibility scores based on the search volumes and ranking positions of those keywords in organic and paid search:
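Searchmetrics' actual scoring algorithm is proprietary, but the idea of a volume- and position-weighted visibility score can be sketched in a few lines. The weights below are illustrative assumptions (loosely shaped like click-through rates by ranking position), not Searchmetrics' own figures:

```python
# Hypothetical position weights: higher rankings contribute more of a
# keyword's search volume to the score. Real weightings are proprietary.
POSITION_WEIGHTS = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}


def visibility_score(rankings):
    """Toy visibility score.

    rankings: list of (monthly_search_volume, ranking_position) tuples,
    one per keyword the site ranks for.
    """
    score = 0.0
    for volume, position in rankings:
        # Positions outside the top five still count, just with a
        # small long-tail weight.
        weight = POSITION_WEIGHTS.get(position, 0.01)
        score += volume * weight
    return score
```

Under this scheme a #1 ranking for a 1,000-search keyword contributes far more visibility than a #5 ranking for the same term, which is the behaviour the scores described above are designed to capture.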
As we’re an organic digital marketing agency, our primary purpose in using Searchmetrics is to gain insight into the overall organic visibility of the sites we are working on, or of sites we are taking an interest in (perhaps because we’re looking for a marketing opportunity on one of those websites, assessing a prospect to determine their levels of visibility, or doing competitor research).
How it uses API Data
Beyond its own dataset, gathered by collecting rankings for a particular set of terms, it is Searchmetrics’ use of API and other external data sources that makes it function as a tool. Without being able to pull in that data, it wouldn’t know which keywords had sufficient search volume to be worth checking, let alone be able to use a weighted algorithm to produce its visibility scores.
Searchmetrics offers a lot more than organic and paid search visibility intelligence, though. From various social and link APIs and data sources, it pulls insight into a website’s top-performing content, both across social media platforms and on the web in general, by looking at backlink data for specific pages.
The social visibility data gives you a spread of the networks that content is being shared on as well as overall visibility scores in tandem with total shares across networks for individual content pages.
As well as the search and social visibility data it also gives an overview insight into the backlink volumes for a website, giving details on the most popular pages on the website from across the web:
LinkResearchTools in a Nutshell
LinkResearchTools is a link intelligence suite with a range of tools for a number of different link analysis tasks.
There are lots of different tools within the suite: quick domain comparisons, link detox auditing for reviewing poor-quality links that might be hampering a website’s SEO, prospecting tools, and screening tools for vetting sites before you secure a link on them.
How it uses API Data
The suite pulls in data from a number of different sources, including:
- Google Webmaster Tools
- Google Analytics
In total it pulls in data from 24 external sources and then uses its own internal proprietary algorithms to amalgamate this data into a bigger picture.
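The amalgamation step can be sketched very simply. The example below (an illustrative assumption, not LRT's actual pipeline, with made-up source names) merges the linking domains reported by several external sources into one deduplicated picture, while preserving each source's individual contribution:

```python
def merge_link_sources(sources):
    """Amalgamate linking-domain data from multiple external sources.

    sources: mapping of source name -> set of linking domains that
    source reported. Returns the deduplicated combined count plus
    per-source counts, so you can see both the bigger picture and
    what each individual source contributed.
    """
    combined = set()
    per_source = {}
    for name, domains in sources.items():
        per_source[name] = len(domains)
        combined |= domains  # union: each domain counted once overall
    return {
        "unique_linking_domains": len(combined),
        "per_source": per_source,
    }
```

The value of combining sources is visible even at this toy scale: no single crawler sees every link, so the union is larger than any one source's view.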
Without API and externally sourced data, LinkResearchTools wouldn’t exist today as we know it. Whilst there are plenty of elements within the tool that are its own “property”, LRT’s excellence is borne from the data that its users plug into it, with that data then spun into a more usable form by the elements that LinkResearchTools brings to the table – whether by sorting that data with its own algorithms, evaluating it and giving “scores” based on proprietary metrics, or simply displaying summary data from across those sources in an easily digestible format.
An example of how they sort data and score it with their own algorithms can be seen below with a Quick Backlinks report.
LRT takes the data it has pulled in from other sources to give an indication of the power and trust of a domain using its own scoring system, but it also displays summaries of the amalgamated data to give breakdowns of anchor text, deep-link to home-page ratios, popularity by top-level domain and country, and much more.
In terms of summarising data in a clean, easy to consume way, the Quick Domain Comparison report gives an overview of the popularity of domains using this combined data:
This overview report illustrates how data from a number of sources can be displayed to give a quick overview of popularity.
One of the main activities we use LinkResearchTools for is backlink analysis, particularly for websites that have manual penalties or are suffering from algorithmic devaluations. Its detox reports use the data pulled in from a variety of sources to give an indication of a particular link’s quality.
From there, they help make your life a little easier by using a whois API to pull in contact details for the owners of each linking website, so you know where to reach out when trying to get problematic links removed.
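Under the hood, a WHOIS lookup like this is a very simple protocol (RFC 3912): open a TCP connection to a WHOIS server on port 43, send the domain name, and read back a plain-text response. The sketch below shows the query plus a minimal field extractor; it is an illustration of the mechanism, not LRT's implementation, and real registry responses vary in format:

```python
import socket


def whois_query(domain, server="whois.iana.org"):
    """Perform a raw WHOIS query per RFC 3912 and return the text reply."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        response = b""
        while chunk := sock.recv(4096):
            response += chunk
    return response.decode("utf-8", errors="replace")


def parse_whois_field(response_text, field):
    """Pull a single 'Field: value' line out of a WHOIS response.

    Returns None if the field isn't present; registries differ in
    which contact fields they expose.
    """
    prefix = field.lower() + ":"
    for line in response_text.splitlines():
        if line.lower().startswith(prefix):
            return line.split(":", 1)[1].strip()
    return None
```

A tool would then run `parse_whois_field(text, "Registrant Email")` (or similar, where available) to surface the outreach contact alongside each problematic link.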
We hope you’ve enjoyed this post and learned how tools can use multiple data sources to give a bigger picture than simply showing the smaller subset of data they collect on their own. We’d love to hear what collaborative data tools you like to use too.