SEO for local stores with national ecommerce websites

Why sell to just one city when you can sell to the whole country? Foot traffic is still a great way to offer services and sell products, but many local stores have gotten into the national market and want national results without hurting their local search rank.

Does optimizing for my local store negatively impact my national search rank?

No, but it is important to have dedicated pages for your local stores. If you have multiple stores, include a location finder with a page dedicated to each location. Share as much information as is relevant to that specific location, like its hours, local coupons, contact info, and directions. Additionally, don’t neglect your product or service on these local pages. A name like Andy’s Pizza Place already lines up with likely search terms, but if your name is Andy’s Place, emphasize pizza on the local pages in addition to the physical location.

Should I include my in-store inventory online?

Ideally, users would have the ability to identify the physical store location where a product is available in addition to online purchasing options. You don’t want to sacrifice local sales for the sake of online sales; provide the buyer with the best information and let them choose how they prefer to purchase. A product you offer may be available on 100+ online stores, but buying options in your local market are certainly more limited. “Available for pickup today” can be a compelling marketing tool that gives you a leg up on the Amazons of the ecommerce marketplace.

My location is in the metro area but outside of the city proper

Google does include distance in its ranking factors, but they also allow you to designate your service area in Google+. Using a fake address, changing the location of the pin in Google Maps, or using a PO Box will result in a penalty. You can add another legitimate address within the city where you meet clients or conduct business, even if it isn’t your headquarters.

Here are a few tips for your local page:

  • Add a few directions from local landmarks or populated areas
  • Add geo-tagged photos of your store location
  • Remove address and Google+ from product pages and focus on a single local landing page (per location)
  • Get quality links from authoritative local domains to your local landing page

Do reviews help national or local SEO?

You want to encourage reviews for your store location and your ecommerce products.  Positive reviews should be encouraged on any user-generated review site, i.e., Yelp, Yahoo, Zagat, etc., but keep in mind that, although these reviews may impact your search rank, they will not be visible in Google search results; those are now exclusively Google+ reviews. Encourage real reviews from real customers.  Google will filter spammy reviews and could potentially penalize you for fake ones.

Implementing the proper schema markup on your product pages allows search engines to identify the product name, description, and rating, and pull that data into search engine results.
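As a rough illustration, here is a minimal sketch in Python of what that markup could look like; the product name, description, and rating values are hypothetical placeholders, and the output is a JSON-LD block you would place in the product page’s HTML.

    import json

    # Hypothetical product data -- swap in your real name, description, and rating.
    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Margherita Take-and-Bake Pizza",
        "description": "Hand-tossed take-and-bake pizza, available for pickup today.",
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.7",
            "reviewCount": "132",
        },
    }

    # Emit a JSON-LD block to place inside a <script type="application/ld+json">
    # tag on the product page.
    print('<script type="application/ld+json">')
    print(json.dumps(product, indent=2))
    print('</script>')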

My local rank dropped after a Penguin update

After Penguin 2.0, many local businesses with low-quality links from directories were penalized.  Make sure your links and directory listings come from industry-relevant sources.

You should identify and remove the low-quality, irrelevant links and focus on sites that are related to your business and that host genuine user-generated reviews and content.

Summary

Organically ranking locally is much different than ranking nationally, but your efforts for both depend on quality, unique content onsite and real user-generated content offsite.  Masses of unrelated links and fake reviews that used to work now cause penalties.

Leverage your product: How to make sure it’s worth growing

Product demand

Sit down and ask yourself: are you answering the needs of the consumer? Would they miss you if you were no longer in the marketplace? If fewer than 40% of your customers would miss you, keep revising until your product answers a real market need.

One of the crazy facts about me as an entrepreneur is that I love trying out new ideas. I have a furniture site, an online flower shop, and some other eCommerce sites that I continuously test product, marketing, and conversion strategies on. One of the big takeaways from these tests is how important market demand is for the items you sell.

For this article I will be referencing my success with my online web design business, CreativeHaus. I will also give examples from my brick-and-mortar company, Fox and Jane Salon.

It has now become our mission that every product we work on must answer this question: how would you feel if you could no longer have this product? CreativeHaus really answered this question firsthand. We get so many clients who literally say they couldn’t have put their business online if it weren’t for our flexible prices and support. Our cancellation rates compared to our other services are extremely low because we answer such a demand. We never nickel-and-dime the client; we found that this was a retention problem for other web designers, whose clients would sign on for an initial design and then leave for someone else a year or two later.

Fox and Jane was an early adopter of this concept from a brick-and-mortar perspective. We slightly underpriced our high-end, boutique experience. All our stylists must have a minimum of 5 years’ experience and go through an intensive training process. Finally, we heard from clients that a business that reflects and supports the community was important. This answers the demand for quality, affordable services backed by an ethical mission to help local groups. Since its inception 3 years ago, we have grown to be the busiest salon in the NYC area and do close to $4 million a year in business.

Give your budget a fighting chance

Whenever we first start marketing efforts, we begin the uphill battle of balancing conversion and budget.

I quickly try to learn about the emotions of visitors so as not to waste PPC budget; in practice, this means A/B testing. I also meet with my sales team and office staff to figure out the exact concerns clients raise when they contact us. I try to subtly answer those concerns while visitors are on my sites, and I spend time making changes and addressing their objections to win conversions.
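When comparing landing-page variants, a simple significance check keeps you from shifting PPC budget based on noise. Below is a minimal sketch of a one-sided two-proportion z-test; the visit and conversion counts are hypothetical.

    from math import sqrt, erf

    def ab_test(conv_a, visits_a, conv_b, visits_b):
        """Does variant B convert better than variant A? (two-proportion z-test)"""
        p_a, p_b = conv_a / visits_a, conv_b / visits_b
        p_pool = (conv_a + conv_b) / (visits_a + visits_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
        z = (p_b - p_a) / se
        p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # one-sided, via the normal CDF
        return p_a, p_b, z, p_value

    # Hypothetical traffic from two PPC landing pages.
    p_a, p_b, z, p = ab_test(conv_a=48, visits_a=1000, conv_b=71, visits_b=1000)
    print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")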

Fix leaks

CreativeHaus did this after a bumpy few weeks with conversion tracking. We updated our hero image to show a website displayed on a mobile device, with clear messaging: no contract, all-inclusive. We also took away online ordering of the service; now a client fills out a form and gets to talk through the issues and the design they want. CreativeHaus now has a close rate of 60% and a conversion rate of 1:10 to 1:19, depending on whether I’m running pay-per-click.

With Fox and Jane, we have our prices clearly posted, references to Yelp reviews, a well-branded website, and the ability to book online. Our conversion rate is now between 1:6 and 1:8, depending on seasonality.

This has allowed us to leverage marketing spend and get a strong return on investment.

Increase ease of action

“Nothing ever comes easy” should not apply to converting on your website. You want the purchase to feel instant; Amazon has done this with 1-Click purchasing.
I typically have a contact form visible on every page of my site, and I try to shorten the eCommerce purchasing process with fast checkout options.

Fox and Jane supports this by directing clients to a location-specific page with appointment setting and the phone number they need to book. Many other location-specific salon sites lack this.

In all…

You have to find the conversion approach that works best for you. What sells paper plates may not convert for someone selling high-end rugs. Keep this in mind and be agile in your approach to updates and changes.

Single vs. Multiple Word Queries

As Google has evolved to better understand who its users are and what they want, SEO companies are continually pushed to reevaluate their onsite optimization strategies. With the search giant zeroing in on perfecting the “answer engine,” keyword strategy is now synonymous with providing the right content for users’ questions. Provide the right answers, get the clicks, get the traffic, and get the conversions.

So how do people use Google?

For most SEO companies, the data is already within reach. Google Webmaster Tools provides specific user queries for which your website’s URLs appear in the SERPs (Search Engine Results Pages), the number of impressions generated from these queries, and the number of user clicks per query. This can be supremely helpful for identifying new keywords and improving onsite content for higher clickthrough rates, thus resulting in more traffic to your site.

With all this available data, we decided to do a brief study of one website, a successful SaaS company. Focusing in on user search trends, we looked at the prevalence of short (aka head) vs. long tail keywords, and the success rates for each of them. We wanted to know which queries – long or short – were used most frequently, which generated the most impressions, and which resulted in clicks.

Comparing single vs. multiple word searches

Using one month of Webmaster Tools data from the SaaS company, we divided up some 2,057 queries based on the number of words in the searches, from one to ten or more. Below are the results.

Chart: queries, impressions, and clicks by query word count

The results proved two-pronged: clickthrough rates spiked at one-word queries (17%) and again at the other end of the spectrum with eight-word queries (39%). The latter lends credence to the “answer engine” model, wherein most of the queries were phrases or interrogative (e.g. “who, what, where”) searches.

Chart: long-tail clickthrough rates

We know that the length of a search query often reflects the user’s intention. They search a one- or two-word movie title if they want local movie showtimes or the name of an actor, but then turn around and compose an 8-word query if they’re trying to find that damn song from the soundtrack they just fell in love with. And while this may seem anecdotal, this phenomenon is at least backed in our study by the individual queries themselves. The single word searches are largely branded terms, while the 8, 9, and 10+ word searches are geared more toward technical specifications of the software in question.
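If you want to reproduce this kind of breakdown from your own data, here is a minimal sketch that buckets an exported query report by word count; the file name and column headers (“Query”, “Impressions”, “Clicks”) are assumptions about your export, so adjust them to match what Webmaster Tools actually gives you.

    import pandas as pd

    # Assumes a CSV export of search query data with Query, Impressions, and
    # Clicks columns; rename these to match your actual export.
    df = pd.read_csv("webmaster_tools_queries.csv")

    # Bucket each query by its word count, capping the top bucket at 10+.
    df["words"] = df["Query"].str.split().str.len().clip(upper=10)

    summary = df.groupby("words").agg(
        queries=("Query", "count"),
        impressions=("Impressions", "sum"),
        clicks=("Clicks", "sum"),
    )
    summary["ctr"] = summary["clicks"] / summary["impressions"]
    print(summary)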

How do we target multiple word keywords?

With this knowledge in mind, keyword strategy must align more with helping Google determine the search results that are right for each user. Instead of looking for keywords with the highest search volume or the “right amount” of competition, it might be prudent to envision the questions that users are asking — and then answer them with smart content. Rather than measuring the keyword density on your main pages, try to see how many questions those pages actually answer — whether your target market wants to know “who provides emergency plumbing” or “what is the knowledge graph.” Less Keyword Planner, more brainstorming sessions.

While this data is from one site alone, and therefore prone to a certain margin of error, it is certainly enough to get the discussion started. We are certain that, in order to increase organic traffic and conversions on your site, a long tail (answer-based) keyword strategy will prevail. This is further backed by other industry experts’ opinions in light of the Hummingbird update which was aimed at improving results for long-tail conversational queries.

Tracking Time: Eye-Opening Insights Into Productivity

Here at the haus, we are always looking for ways to improve our systems internally and the services that we provide to our clients. In an effort to streamline many of our internal processes and procedures, we recently conducted an internal audit of how employees were spending their time, in order to identify strengths, pain points, and areas of improvement or potential bottlenecks that could be keeping us from performing at our optimum potential.

While we are still poring over the data internally to better understand what the figures mean for us, there is no shortage of enthusiasm about a project that proved to be simultaneously painless and incredibly eye-opening. Below, we’ll go over the methodology and data analysis for a project like this, along with some key takeaways.

Methodology

This all came about when managers were discussing some of the best possible ways to improve our product, additional steps to make part of our process, and the additional team members that would be necessary to accomplish our goals. While these goals were relatively flexible, we unearthed one major gap: we didn’t have enough information on how these items would actually improve efficiency, because we weren’t exactly sure how time was being used on either a macro or a micro level.

In order to take a look at how team members spent most of their day, we did the following:

Start with the Job Description

We identified key areas that we found to be important components of the job description. These included things like email and phone correspondence, keyword research and competitor analysis, reporting, troubleshooting, development, content creation, and more. Our final list looked something like this:

  • Email

  • Phone

  • Reporting (SaaS)

  • Reporting (Manual)

  • Retention

  • Outreach

  • Scheduling and Assigning Content

  • Link Prospecting

  • Keyword Research

  • Competitor Research

  • Technical / Development Projects

  • Technical / Working with Developers

  • Technical / Optimization

  • Technical / Link Audits

  • Technical / Site Audits

  • Technical / Crawl Errors, 404s & Redirects, etc.

These represented key categories that could potentially illuminate how members of our SEO team could be working to serve their clientele. From email and phone correspondence, to more technical considerations like link audits, site audits, and crawl errors, there is a considerable amount of flexibility required in order to successfully complete the job.

There were several other important components of the job that we felt could have been included, but this represented a key list of core responsibilities that seemed to best encapsulate what each member of our team should be working on at a given time.

Creating A Google Form

In order to capture this kind of information, we needed a collection method that was as seamless and minimally invasive as possible. A form, rather than a spreadsheet, email, document, or project management system, offers the flexibility of a push-button submission. This reduced friction in the collection process and ensured that responses remained consistent across the entire staff, a critically important factor if you want to analyze data from a top-level perspective.

Collection Period

We collected data over the course of a two-week period. This allowed us to capture both the reporting side of the workload as well as the more technical and creative spaces within which we work. By limiting the data collection to two weeks, we got a significant enough response rate to notice some major trends without having to wait too long to uncover the results.

Data Analysis

Once we got the data back, it was definitely a challenge to find the best ways to pick apart data from such a large number of employees in a way that effectively gave a top-level vision of how time was being used across the categories above. Some key factors to keep in mind:

Days Will Be Different

Some days will be different than others, and in this case, that was incredibly noticeable. There were several categories that many employees would not fill in each day; they might devote, say, more time to technical concerns near the end of the week and more time to client communication and research near the beginning of the week. As a result, it was important to remember, when looking at the big numbers and putting together the final report, that certain team members simply didn’t do certain activities on a given day. Using the COUNTBLANK function in Excel/Google Sheets, we were able to identify the number of participants per category.

Averages

Looking at averages, to me, seemed the best way to ascertain major systemic changes that we could make. While looking at the total hours reported may provide good insights from a data sample, the idea was also to compare these against averages for each category to ensure that there were no alarming irregularities.

By looking at the total number of respondents per category in tandem with the total number of hours spent per category, we were able to obtain a better average and make adjustments to the final percentage numbers based on the number of respondents.
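As a rough sketch of that adjustment, here is how the per-category respondent counts and averages could be computed from the exported form responses; the CSV file name and category column names are placeholders for whatever your form actually produces.

    import pandas as pd

    # Assumes the Google Form responses exported to CSV, one row per person per
    # day, with one column of hours per category (blank when not done that day).
    responses = pd.read_csv("time_tracking_responses.csv")
    categories = ["Email", "Phone", "Reporting (SaaS)", "Reporting (Manual)"]  # etc.

    total_tracked = responses[categories].sum().sum()
    for cat in categories:
        hours = responses[cat]
        respondents = hours.notna().sum()   # the inverse of COUNTBLANK
        total = hours.sum()
        avg = total / respondents if respondents else 0
        print(f"{cat}: {respondents} entries, {total:.1f} h total, "
              f"{avg:.1f} h per entry, {total / total_tracked:.0%} of tracked time")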

Key Takeaways

As stated, while we’re still getting used to the story the data tells, some facts were alarming and are already effecting change within the haus. For instance, an automated product was taking just as much time as its manual counterpart, which suggests that switching back to a manually generated, but more custom and flexible, reporting interface will help.

We were also able to see that some areas we had initially expected to be major pain points shouldn’t be as much of a concern. For instance, while link audits always seem to take a significant amount of time, such a low number of people regularly engaged with a full-blown link audit that it was not as significant a time drain as we once thought. Similarly, while an automated reporting system was brought in to help streamline the process, the fact that the same amount of time went into both automated and manual reporting shifted the conversation around a more automated workflow.

This was a great, eye-opening exercise! We look forward to sharing some of the results with you — but first, we need to take a closer look at actionable steps to improve work life in the Haus.

 

10 Awesome Sites with Free Stock Photos.

We had a few run-ins with Getty Images recently, and man, was that a pain!

To explain: Getty and other companies are buying the rights to images and running scans across the internet to find infringement.  Unfortunately, if you are found using their photos, they skip the cease and desist and will request payment of anywhere from $250 up to thousands of dollars.

We have a strict guideline now at CreativeHaus to use images from iStock or other partners.

We have found a few great sites offering images under a “Creative Commons Zero” license.  This means you can use, copy, and modify the images without asking permission from the original owner.

Disclaimer: Make sure to check permissions on each site to make sure you are following the permissions they set.

AND NOW….10 AWESOME SITES WITH FREE STOCK PHOTOS: (Yep)

Death to the Stock Photo

Gratisography

Designers Pic

Magdeleine

Lock and Stock Photos

Little Visuals

Life of Pix

Je Shoots

Jay Mantri

ISO Republic

How to Optimize Keywords for Google: Or How I Learned to Stop Worrying and Love LSI

Recently, a post on the Moz blog ignited a particularly intriguing debate centered around Google’s famed list of the 200+ factors they use to rank results. Within the post, the author posited that Google has never relied on keyword density as a ranking factor. While this sparked a fiery debate within the comments section, it also ushered in an important conversation that search marketers should keep in mind, one that touches on the merits of looking at correlation vs. causation, and one that looks at the complexities of language as a looming variable in the world of search.

To answer the initial question: No, it is very unlikely that Google uses keyword density as a ranking factor. However, to say that keywords in content won’t influence your position in search is naive, at best. Descriptive keywords not only dictate the way in which bots and search engines process and index your site, but also the way in which the public at large talks about your product or service, playing a major role in search. However, the early days of search still seem to guide the strategies and tactics; exact-match keywords strategically dot a page, rampantly reinforcing the keywords for which you are attempting to rank.

Yet Google’s come a long way; from the very public introduction of the Hummingbird algorithm, to the publicly announced but less discussed addition of Ray Kurzweil to the Google Search team, and further explorations into AI, Google is becoming more fluid, adaptive, and intrinsically intelligent in how it understands and interacts with language. Today, we wanted to take a look at a few complex ways in which Google processes queries and indexes information. Term frequency, semantic distance, the evolution of Google’s understanding of pronouns, synonyms and natural variants, and co-citation and co-occurrence all govern how Google understands language on the Web.

Term Frequency and Inverse Document Frequency

While many may think that this is simply another word for keyword density, Google has made numerous references over the years to term frequency and inverse document frequency in applications for patents, as well as other documents. Term frequency and inverse document frequency focus less on keywords and how often they appear on a page, and more on the proportion of keywords to other lexical items within a document.

Expertly covered by Cyrus Shepard on the Moz blog, TF-IDF is a ratio that helps Google weigh the importance of particular keywords based on how often they appear on a page relative to how often they appear across the greater corpus of documents as a whole. Supported by Hummingbird, this allows Google to have a more complex understanding of the way in which natural language can support overarching topics from a top level. Using language in a way that’s natural, and in a way that resonates within your niche or industry, may be a better use of your time than trying to ensure your document includes your keywords a set number of times!
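As a toy sketch of the ratio described above (the tiny corpus and whitespace tokenization are purely illustrative, not how Google actually computes it), term frequency on the page is discounted by how many other documents in the corpus contain the term:

    import math

    def tf_idf(term, doc_tokens, corpus):
        """Term frequency in one document, discounted by how many documents
        in the corpus contain the term (inverse document frequency)."""
        tf = doc_tokens.count(term) / len(doc_tokens)
        docs_with_term = sum(1 for doc in corpus if term in doc)
        idf = math.log(len(corpus) / (1 + docs_with_term))
        return tf * idf

    corpus = [
        "dog photos and puppy pictures".split(),
        "stock photos of office desks".split(),
        "pictures of dogs playing fetch".split(),
    ]
    print(tf_idf("dog", corpus[0], corpus))     # distinctive term scores higher
    print(tf_idf("photos", corpus[0], corpus))  # common term is discounted toward zero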

Synonyms and Natural Variants

This goes without saying, but using synonyms and naturally occurring variants of your target keyword helps Google identify a natural match for the searcher. In the previously referenced Moz post, they use the example of “dog photos.” There’s a good chance that if a page is about dog photos, other related phrases will appear on it, including “pictures of dogs”, “dog pictures”, “puppy pics”, or “canine shots”. By ensuring that synonyms of your target keyword appear regularly, Google and other search engines are able to affirm the page’s intent and align it with that of the searcher by finding words with similar meanings that could potentially answer a user’s query.

Over 70% of searches rely on synonyms. According to Shepard, “To solve this problem, search engines possess vast corpuses of synonyms and close variants for billions of phrases, which allows them to match content to queries even when searchers use different words than your text.” Again, this is more incentive for marketers and webmasters alike to create copy that departs from a minimum requirement for keyword density, and instead rewards natural language that allows users to refer to their target keyword and other potential variations.

Co-citation and Co-occurrence

Related to the idea of synonyms and variants are the ideas of co-citation and co-occurrence. Bill Slawski, of SEO by the Sea, has stated that co-citation and co-occurrence are part and parcel of the Hummingbird algorithm, which uses co-citation to identify words that may be synonyms. The search engines rely on a corpus of linguistic rules and may even substitute a synonym into a query where co-citation and co-occurrence have determined a better match or a heightened probability of a better search result.

This also helps determine and parse out different search queries for words that may have multiple meanings; in the example above, “dog picture” is a very different search than “dog motion picture”. In a more extreme scenario, a “plant” could refer to a tree, a shrub, or a factory, while a “bank” may refer to an institution that lends money, a store of thoughts or memories, or the land along either side of a river. A “trunk” may refer to an article of furniture, a part of a tree, a car, or an elephant. Contextual clues within the content help parse out the intended meaning of the content on-site and ensure that Google serves a page that’s relevant to the searcher.

This is also playing a significant role in off-site optimization. While keyword-rich anchor text is still valuable, it is noticeably declining in importance due to concerns about spam. In a different piece, Rand Fishkin noted that queries for “cell phone ratings” regularly returned first-page results that didn’t even contain the word “ratings” in the title, and instead used “reviews” or “reports”. This is a highly competitive query, yet Google used co-occurrence from both on-site and off-site content to determine that these sites are more relevant than those that contain the exact keyword.

One benefit of looking at co-occurrence from the search engines’ point of view is that it is extremely hard to manipulate. It relies on a heavily updated corpus featuring an amalgamation of sources that talk about the keyword in a way that supports the surrounding co-occurring words or phrases. It is an incredible testament to the algorithm’s ability to understand and naturally parse how language intrinsically sounds. While Latent Semantic Indexing has been around since long before Google or modern search engines, co-occurrence is a part of the algorithm that works in much the same way, identifying relationships between phrases and lexical items to extract and assign meaning.

Artificial Intelligence

The growing ability to detect and extract meaning from seemingly unrelated pieces of text illustrates Google’s growing use of artificial intelligence to understand language. From drawing on a user’s personal search history to understand pronouns, as a recent Google patent demonstrates, Google continues to lean on the information available to make the search process more conversational and intuitive.

Similarly, with the appointment of Ray Kurzweil and the acquisition of DeepMind, Google continues to leverage some of the sharpest minds in artificial intelligence to truly understand and engage with a user’s language.

Language is an incredibly dynamic and fundamental component of society, and Google and other search engines continue to expand their indices to ensure that they provide the best experience possible. As a result, marketers need to forget about manipulating Google’s search results and instead engage with their community in their own voice. Worry less about keyword density, and look instead at how to present something in a way that is engaging and natural. Relevant, unique, and natural content, both on-site and within the online community, will help establish your position as an influencer and industry leader.

A Recipe for Creating an Optimized Homepage

Think of the home page as a book cover.  As we all know, first impressions are important. Inevitably, people are going to judge the “cover” before they decide to dig deeper into the site and actually read the content. Below are some tips to help make your site captivate your users!

We read left to right and top to bottom.  Put the most important content closer to the top left corner of the page. That could be the logo, the product offering, or an important service worth featuring.

Create something eye-catching above the fold of the page. When someone lands on the page, what’s seen at first glance needs to be enticing enough to create engagement and lead them to scroll down. Some simple ways to do this are using catchy slogans, hooking them with a new product offering, or discussing an intriguing topic. The catchier the page, the more likely users will stay on the site.

Speak in a tone that caters to your readers/users.  If your company targets professional businesses, be sure to write in a formal, informative manner. On the other hand, if the site caters to adolescents, be sure to use language that will be relatable.

Make the call to action obvious and place it near the top left of the page.  Because this placement is so commonly practiced, people naturally expect the call to action, such as a phone number or email, at the top left.  This makes it easy for users to find rather than frantically looking up and down the page for any point of contact.  The longer a user spends hunting for contact information, the less likely they are to convert, and you inevitably lose the conversion.

Don’t overstuff the homepage. A text- and image-heavy homepage can overwhelm users with hard-to-digest information, which can drive them away from the site, and it can also slow down page speed. People have an average attention span of about 7 seconds (even less with the younger generation); any longer than that and users may leave. Give just enough information to make them want to explore the site further.

The navigation bar should not be overwhelming. A rule of thumb is five top-level categories on the navigation bar. Create subcategories to add more without inundating it, and be sure to showcase the main categories rather than getting too specific on the bar.

Create a homepage that changes with current events.  Homepages have a reputation for being static; don’t be afraid to change up the homepage when applicable. For instance, information about an event the company is hosting, or a new product being launched, would be a great thing to highlight.

Your homepage is the face of your company/brand.  Make sure it captivates your users to learn more and explore the site.

 

Internet Marketing Weekly Roundup

First things first: in case you didn’t already know, we have a great brother and sister team here at SEOhaus. Although their technical savvy is a little limited, they provide us with a never-ending source of cuteness. Check out Tech Cocktail’s latest post featuring Lily and Marshall, along with other pets found at start-ups across the country.


Run an eCommerce store? Try these quick A/B tests to streamline your conversion process. Even if you aren’t selling online, these four tips from KISSmetrics are great to keep in mind.

HTTPS has been a hot topic in recent weeks when word spread like wildfire that going secure would improve rankings. But does it really? Daniel Cristo offers insight on Search Engine Land.

Moz released their newest Beginner’s Guide to Link Building. In an ever-changing landscape, it’s always useful to get a refresh on the basics and hone strategies that you may be a little rusty with.

Talk of the removal of the Carousel and changes to the Knowledge Graph also emerged this week. Desktop SERPs have been looking more and more like those on mobile. Ashley Zeckman tackles the topic on Search Engine Land.

What articles stood out to you this week, and what can we expect to see in the last week of August?

Mobilizing Your Internet Marketing

When I lived in Kenya during the summer of 2007, one thing that struck me was the number of people using cell phones for computing tasks. At a time when smartphones were just catching on (the first iPhone was released that June), it was common to see people using phones for just about everything. I was sporting a slick Razr that could store a dozen songs and not much else, so my laptop was everything. Things were different in Kenya. From the businesspeople in Nairobi to the people in villages hours away from the nearest major city, phones were used for everything from bank transfers to email. When I returned to Kenya in 2009 with my iPhone look-alike, the mobile market had expanded:

Web Device Use In Kenya

Things are going to be a lot different on my next visit. Desktop’s grasp on the market fell steadily until mobile flipped the script at the end of 2013, and mobile devices now account for over 70% of internet use in Kenya. Let’s take a look at the same data in the US:

Web Device Use in US

While many US users have been holding tightly to their desktops, there is no denying the mobile uprising. In Kenya, mobile devices are the primary – and usually the only – device for many Internet users, while US consumers tend to have an arsenal of devices. Even with a lower percentage of web visits completed on mobile devices in the US, the time spent on the Web using mobile devices surpassed desktop earlier this year.

Time Spent on Internet by Device

Mobile has shown that it is on the rise and is already a large part of internet marketing. If this is news to you, the time to act was yesterday. We have a lot to learn from developing countries such as Kenya, and we can paint a picture of our mobile future by looking at their past. Some of the biggest signs of the times are the ways in which Google has supported mobile devices.

Google Behind Mobile

Many online marketers have been resistant to giving mobile its due credit, but users are making it clear that mobile cannot be ignored. Desktop shows no signs of becoming irrelevant by any means; in fact, the amount of web use on desktop is as strong as ever, but mobile gains more momentum monthly and is taking up a larger percentage of the market. And why shouldn’t it? More and more, we seek instant gratification. If I’m walking down the street and want to get the lowdown on two different restaurants, I’m going to pull out my phone right then and there. When I’m sitting in Balboa Park and want to know more about the plants or a museum, I’m using my phone rather than waiting until I’m home. More than anything else: mobile is always convenient.

Although almost two years old, Google’s report on the multi-screen world provides some great information. For example, 40% of smartphone use was outside the home, compared to 31% on desktops. The 45-page report concludes with this: “Smartphones are the backbone of our daily media use. They are the devices most used throughout the day and serve as the most common starting point for activities across multiple screens. Going mobile has become a business imperative”. In recent years, and especially in the last few months, Google has taken action with this in mind.

As Internet marketers, one of the biggest changes we care about was the updated SERP, which signified a major step in the company’s shift to “mobile first.” Desktop SERPs are cleaner, while mobile results have similar data but presented much differently due to screen size. Dr. Pete wrote a great piece on the SERPs across devices, but the main takeaway is that above the fold real estate on mobile devices is severely limited, stressing the importance of ads, Local Listings, organic rank, and more. Since May, we have seen Webmaster Tools notifications for information regarding mobile devices, including smart phone redirects and smart phone crawl errors:

Google Webmaster Tools

Additionally, one of our favorite tools, PageSpeed Insights, reports both mobile and desktop web performance, with mobile shown first.

What This Means For You

As marketers, we need to ensure that our clients’ online presence is consistent across devices. How are they showing up in SERPs on desktop and mobile, and what strategies can we employ to maximize visibility on mobile? More importantly, consider how sites are viewed on phones and tablets. At the very least, sites need to be mobile-friendly; we highly recommend responsive design. A younger, wiser me may have predicted the rise of mobile several years ago during my time in Kenya, but you know what they say about hindsight. Don’t look back on this period wishing you had acted. Now is the time to recognize the pervasiveness of mobile.

Three Insights to Transform Your Twitter Game

Today we’re going to look at Twitter from a Bird’s-eye view.

The evolution of social media has changed the way most of us communicate on the web. Whether inspiring a generation of users to #hashtag #every #little #thing #they #do or empowering an Arab Spring, Twitter often finds itself at the center of some pretty big changes.

And how about those changes? One doesn’t have to Google very far to find comprehensive takes on social media analytics and what they mean for businesses. Don’t be overwhelmed! With the help of a piece by Fast Company, we’ve rounded up three big insights to help raise your Twitter game to the next level.

1. THERE ARE SIX DISTINCT COMMUNICATION NETWORKS ON TWITTER

This one’s a doozy. The Pew Research Center and the Social Media Research Foundation worked together to analyze thousands of conversations on Twitter and boiled the wide range of data down to six basic network archetypes. Polarized crowds are great at supporting one another but are a lot less likely to engage with users whose viewpoints differ from their own. So, for example, conservative Republican Twitter users tend to follow, tweet at, retweet, and favorite amongst themselves more often than with liberals. On the other end, tightly knit community clusters are a highly responsive group of Twitter users, usually grouped around a specific interest. They engage with each other to share, for example, baking tips, opinions on a TV show, funny links, or other tweets pertaining to their hobby or interest. Understanding where your brand fits among these types can help you tweak your strategy to reach new groups.

Image courtesy of the Pew Research Center.

2. YOU HAVE LESS THAN AN HOUR TO RESPOND ON TWITTER

A big part of Twitter’s allure is how posts are handled in real time. You can communicate with your followers immediately, creating a higher, more interactive level of customer service. Still, the effect of this is double-edged. Research conducted by Lithium Technologies suggests that patience is not much of a virtue on Twitter: the study found that consumers using the platform to connect with businesses or brands don’t like to wait long for a response to a tweet or direct message sent to the brand. A good rule of thumb is to make a conscious effort to respond within an hour of the original post. Keeping tabs on email alerts can make things much easier, and if you need something more comprehensive, there are tools (like Must Be Present) that alert social media managers when they’ve been engaged on Twitter.

3. LATE NIGHT IS THE BEST TIME FOR RETWEETS

While most people tend to follow their own tweeting schedule, like during the latest episode of Game of Thrones, there is hard data showing that late-night tweeting leads to more retweets from followers. Over 1.7 million tweets were analyzed by the folks at TrackMaven, and the results are not wholly surprising: after 10 PM and before 11 PM (ET) is the ideal time to post. Fast Company suggests you follow the “Late-Night Infomercial Effect” when posting. What does that mean exactly? It means posting when the noise level is lowest, for a greater chance of being heard. Be the voice that disrupts the silence. But please, keep the GoT spoilers to a minimum.

If you have any of your own social media stats that fuel your social strategy, feel free to share them in the comments!
