Guiding Google’s spiders
In the growing web content universe, retail site pages must compete with the content offered up by every other kind of web site to be found by search engines—one reason Google launched the e-commerce-specific Froogle. Now, Google has created another opportunity for retailers and others to make more of their content more accessible to its spiders.
A free, downloadable tool called a Sitemap generator, available in beta from Google, allows site operators to create a specially formatted file and place it on their web server. The file, called a Sitemap, tells Google's crawlers which pages are present on the site and which have changed recently. The goals are to shorten the time it takes for Google's crawlers to find and index content and to help ensure that content in the index is current.
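The Sitemap file itself is a small XML document listing each URL along with metadata such as its last-modified date. A minimal sketch of what such a generator might emit is below; the store URLs are hypothetical, and the schema namespace shown is the one from the published Sitemap protocol, which the exact beta-era tool may have versioned differently.

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical list of site URLs and their last-modified dates; a real
# generator would walk the site's file tree or product database instead.
pages = [
    ("http://www.example-store.com/", date(2005, 6, 1)),
    ("http://www.example-store.com/products?id=1234", date(2005, 6, 3)),
]

def build_sitemap(pages):
    """Return a Sitemap XML string listing each URL and when it last changed."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url, lastmod in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")   # escape &, <, > in URLs
        lines.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap(pages))
```

The point of the format is visible here: dynamically generated URLs with query strings, which crawlers often miss, are enumerated explicitly so the engine does not have to discover them by following links.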
“It’s like paid inclusion, without a price tag,” says Tim Kauffold, director of client services at search engine marketing firm OneUpWeb. “It’s a more proactive way for web sites to introduce their pages to Google.”
Sitemap seeks to make it easier for Google's crawlers to digest dynamically generated pages, which many sites, including retail sites, rely on. Search engine experts point out that it has been more difficult for crawlers to identify reference tags on dynamically generated pages (those built on the fly from a frequently changing product database, for example) than to find tags on static pages. As a result, such pages may never make it into an engine's index, and so cannot be considered for inclusion in search results.
“If you have a 300,000-page web site, it’s not unusual for Google to have only 25,000 or 50,000 pages of it,” says Frederick Marckini, CEO of search engine marketing company iProspect. “The primary content of most web sites is captured: the pages with the most links that are most popular with the Internet community. By having all of this extended content, the rest of the story is there.”
Though similar to Yahoo’s paid inclusion program (without the fee), there’s a key difference. A paid inclusion program such as Yahoo’s guarantees that the URLs fed into it are in the engine’s index for possible inclusion in search results. Sitemap promises no such guarantee. “It’s really not a guarantee of inclusion in the index, but there’s a higher likelihood of it,” says Marckini. He adds that the tool will likely benefit online marketers from a revenue perspective as it creates more opportunity for their content to be found, particularly on so-called “long tail” search queries, those that are longer or more obscure.
“Retailers should benefit because they are not all very well represented in Google, so this would represent an opportunity to increase revenue for retailers, depending on the retailer,” he adds.