A Guide To Schema Markup & Structured Data SEO Opportunities By Site Type

Structured data can help you to send the right signals to search engines about your business and content. But where do you start? Columnist Tony Edward has some suggestions.

SearchCap: Google Knowledge Graph, Schema Markup & Local SEO

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

Spammy Structured Markup Penalty & Recovery – Use Schema Markup With Caution!

Columnist Tony Edward notes that Google has begun issuing penalties for improper implementation of structured data markup. If you’re making use of structured data markup, here’s how you can avoid getting slammed.

SearchCap: Google Magazine Ad Format, India Investigates Google & Bing Encourages Schema Markup

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: in “With New GDN Ad Format, Text Ads Will Compete In Display Auctions,” Google unveiled a new ad format on the Google Display Network (GDN) called magazine ads…

SearchCap: New Google Penalty, Action Schema & Subscribe To Google Trends

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: in “Google Slaps Another Guest Blog Network: PostJoint,” Google’s Matt Cutts somewhat confirmed on Twitter that Google has taken action on another guest blogging…

Schema For User Actions Now Available

Schema.org has announced a new type of schema, named Actions, introduced with support from Google, Microsoft Bing, Yahoo and Yandex. Schema Actions are a way to communicate, via markup on your web page, the actions the page enables and how those actions can be invoked. Technically, Schema.org…

Scraping Schema Markup for Competitive Intelligence

Structured markup is crucial for e-commerce websites that want to stand out in the SERPs. Because e-commerce sites are generally built to scale, scraping all of their information is very easy. All it takes is a Screaming Frog crawl and OutWit Hub.

For dropshippers and affiliate sites, harvesting competitor data from within schema markup tags can be extremely useful. If you are selling the same products as your competitors, you can compare pricing, product descriptions, calls to action and special promotions (anything, really) and analyze how you stack up against them.

Before we can start, we need to figure out where products live on the competitor’s site. If your competitor has a clearly built-out information architecture, it shouldn’t be too tough. Target.com, for example, uses the /p/ directory for its products.

[Image: Target.com URL showing the /p/ product directory]

Step 1) Crawl and Collect Product Pages

To get only the pages that live under the /p/ directory, fire up Screaming Frog and, under Configuration > Include, add .*/p/.*

[Image: Screaming Frog include filter set to .*/p/.* to snag product pages]

Now your Screaming Frog export will only include product pages.
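
The include rule above is just a regular expression, so if you’d rather filter an exported URL list yourself, the same idea takes a few lines of Python. This is a minimal sketch, assuming a plain-text urls.txt with one URL per line (the filename is a placeholder):

```python
import re

# Same pattern used in the Screaming Frog include filter
product_pattern = re.compile(r".*/p/.*")

# Assumes a plain-text export with one URL per line (placeholder filename)
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

product_urls = [url for url in urls if product_pattern.match(url)]
print(f"{len(product_urls)} product pages found")
```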

So everyone can follow along and work with the same data, I’ve gone ahead and scraped all the laptops that are currently listed on the Target.com site, which you can get here:

 List of Target Laptops (09/10/2013)

Step 2) Analyze Structured Markup and On Page Elements

Take one of the product pages from your Screaming Frog export. For this example, we’ll use the Acer Aspire 11.6″ Touch Screen Laptop PC page. If you enter the URL into the Rich Snippet Testing Tool, you can see that Target is using a ton of structured markup on its product pages.

For this exercise, we’re going to scrape the following (there’s a quick code sketch of the idea after the list):

  • Price
  • SKU
  • Product Name
  • Battery Charge Life (non-schema element)
  • Call to action/Promotion (non-schema element)
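
To make those fields concrete: on pages that use schema.org microdata, values like the product name, price and SKU are usually exposed as itemprop attributes in the HTML. Here’s a minimal Python sketch using requests and BeautifulSoup that pulls them out; the exact attributes and page structure are assumptions, so verify them against what the Rich Snippet Testing Tool shows for your competitor’s pages. The non-schema elements (battery life, promotions) would need their own selectors.

```python
import requests
from bs4 import BeautifulSoup

def scrape_product(url):
    """Pull common schema.org Product microdata fields from a page.

    The itemprop names below (name, price, sku) are standard schema.org
    properties, but whether a given site uses them -- and where they sit
    in the page -- is an assumption you should verify first.
    """
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    def itemprop(prop):
        tag = soup.find(attrs={"itemprop": prop})
        if tag is None:
            return None
        # Prices often live in a content attribute rather than the visible text
        return tag.get("content") or tag.get_text(strip=True)

    return {
        "url": url,
        "name": itemprop("name"),
        "price": itemprop("price"),
        "sku": itemprop("sku"),
    }

# Example usage with a hypothetical product URL:
# print(scrape_product("https://www.target.com/p/example-laptop"))
```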

Step 3) Fire up OutWit Hub


OutWit Hub is a desktop scraper/data harvester. It costs $60 a year and is well worth it. OutWit can use cookies, so scraping behind a paywall or a password-protected site is a non-issue. Instead of having to use XPath to scrape data, OutWit Hub lets you highlight the source code and set markers to scrape everything that lies in between. If you are not a technical marketer and find yourself collecting a lot of data by hand, this is a good tool to have in your arsenal.

Step 4) Build Your Scraper

This may be intimidating at first, but it’s much more scalable than trying to use Excel or Google Docs to scrape thousands of data points.

In the right-hand menu, click on Scrapers. Enter the example Target URL. This will load the source code.

Click on the “New” button on the lower portion of the screen and name your scraper. I’m calling mine “Target Laptop Scraper.”

[Image: building a scraper in OutWit Hub]

In the search box, start entering the markup for the schema tags you want to scrape. Remember, this isn’t XPath; you don’t need to worry about the DOM. You only need to figure out what unique source code comes before the element (the schema tag) and what comes after it.

Extreme close-up!

[Image: close-up of the marker fields in the OutWit Hub scraper editor]

It will take some practice at first, but once you get the hang of it, it will only take a few minutes to set up a custom scraper.
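
If it helps to see the marker idea in code rather than in OutWit’s interface, here’s a minimal Python sketch of the same “grab whatever sits between a before-marker and an after-marker” concept. This is only an illustration of the idea, not OutWit Hub’s actual implementation, and the marker strings are made up:

```python
def scrape_between(html, before, after):
    """Return everything between the first occurrence of `before`
    and the next occurrence of `after` -- the same marker idea
    OutWit Hub uses, sketched in plain Python."""
    start = html.find(before)
    if start == -1:
        return None
    start += len(before)
    end = html.find(after, start)
    if end == -1:
        return None
    return html[start:end].strip()

# Hypothetical markers -- copy the real ones from the page source
html = '<span itemprop="price" content="379.99">$379.99</span>'
price = scrape_between(html, 'itemprop="price" content="', '"')
print(price)  # 379.99
```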

Step 5) Test Your Scraper

Once you’re done entering the markers for the data you want to collect, hit the Execute button and test your results. You should see something like this:

[Image: test results from the OutWit Hub scraper]

 

Step 6) Put the list of URLs into a .txt file and save it.

[Image: disks for saving]

Any of these storage devices or your local machine will do
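
If your product URLs are still sitting in the Screaming Frog CSV export, a short Python sketch can write them into the .txt file for you. Both filenames here are placeholders, and the URL column is assumed to be called “Address” (check your own export):

```python
import csv

# Placeholder filenames -- point these at your own export and output paths
with open("internal_html.csv", newline="", encoding="utf-8") as src, \
        open("product_urls.txt", "w", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    for row in reader:
        # "Address" is assumed to be the URL column in the crawl export;
        # verify this against your own file
        url = row.get("Address", "").strip()
        if "/p/" in url:
            dst.write(url + "\n")
```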

Step 7) Open the .txt file in OutWit Hub using the File menu

In the left-hand navigation, just under the main directory, there is a subdirectory called “Links.” Click on it. This is what you should see:

[Image: the list of URLs loaded under “Links” in OutWit Hub, ready to scrape]

Select all the data using Control+A and then right-click on the rows with the URLs.

Step 8) Fast Scrape!

[Image: fast-scraping schema data with OutWit Hub]

In the right-click menu, select Auto-Explore > Fast Scrape (Include Selected Data), then choose the scraper we just built.

Here’s a video of the last step in OutWit Hub.

Step 9) Bask in the glory of your competitor’s data

[Image: pricing data scraped from Target with OutWit Hub]

In the left-hand navigation, there is a category called “Data” with a subcategory called “Scraped.” In case you navigated away from it, that’s where all your data is stored. Just be careful not to load a new URL in OutWit Hub, or else your data will be written over and you will have to scrape all over again.

You can export your data to HTML, TXT, CSV, SQL or Excel. I generally go for an Excel export and use a VLOOKUP to combine the data with the original Screaming Frog crawl from Step 1.
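
If you’d rather skip the VLOOKUP, the same join can be done in a few lines of pandas. This is a hedged sketch; the filenames and column names are placeholders you’ll need to match to your own exports:

```python
import pandas as pd

# Placeholder filenames and column names -- adjust to your own exports
crawl = pd.read_csv("screaming_frog_crawl.csv")       # crawl from Step 1
scraped = pd.read_excel("outwit_scraped_data.xlsx")   # OutWit Hub export

# VLOOKUP equivalent: left-join the scraped fields onto the crawl by URL
combined = crawl.merge(scraped, left_on="Address", right_on="url", how="left")
combined.to_excel("combined_competitor_data.xlsx", index=False)
```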

Got any fun potential use cases?

Share them below in the comments!

Image source via Flickr user avargado

The post Scraping Schema Markup for Competitive Intelligence appeared first on SEOgadget.

