Category Archives: Analytics

Analytics Lead Tracking Solution – HubSpot

I hope this summer is treating you well! I have a treat for you regarding web lead tracking and email marketing nurturing. Over the last two months I’ve been working with some interesting software services. One in particular does what every marketer wants: it ties web campaigns to users’ contact details and streamlines communications through email marketing and lead nurturing. I’m working with a conservative budget, so these solutions work for a small to medium-sized company. The final piece I’m working on is tying the phone leads to the web leads and associating these different buckets at a campaign level, without massaging data and being a spreadsheet jockey. Understand that keeping costs down is essential to staying within budget when building the tracking architecture. If money and time weren’t an issue, I’d be using Eloqua. The tracking architecture I’m implementing and will be discussing is still in a testing phase, does not integrate with a CRM using an API, and does not require IT resources to implement. It’s the easy button for lead tracking and nurturing.

Many of my clients use Google Analytics for tracking lead conversions and sales. But what Google Analytics won’t give you is the name of the person submitting their contact details on a per-campaign level. Google considers this a violation of privacy, and I understand the reasoning. Many times, clients want to know the name of the lead who came in through a pay-per-click campaign so they can nurture that lead. As marketers we know that giving a contextual experience is the difference between acquiring a customer and pissing them off.

I’ve been working with a web lead tracking solution: HubSpot. HubSpot gives you what Google Analytics will not: first name, last name, email, phone number, and anything else you want to know, tied to a specific campaign. Within minutes you’re able to generate standard and custom fields to create a web form. Once you’re happy with the form, you cut and paste the JavaScript code into your web page and/or blog post and you’re done. It’s that easy and it works.

Some kicking features about HubSpot:

Once a lead comes in through a form, the lead details are sent to an email address or alias. The cool thing: if a lead has a Twitter, LinkedIn, or any other public social media profile, those profile links are presented in the lead details. Let’s take a moment and think about how this could change your sales experience. Imagine if a salesperson could browse a potential customer’s social media profiles before making that sales call. You know the potential client is interested, but now you know they love cats. Mind you, I don’t recommend the salesperson disclosing they’ve researched the client’s public social media profiles, but in conversation, the sales representative could mention how much he/she loves cats.

If a lead submits their information and then goes back to your website or blog, you are sent an alert: “Anne Haynes, one of your leads, is back on the website.” Now you have the creepy factor and can call them right away to answer any questions they might have about your business. I’ve experienced this before with salespeople, and while I don’t like it, if a lead continues to visit the website, it’s a sign they’re interested. And all this information is tracked in the lead details.

If a lead comes in through one form campaign, you can create lead nurturing campaigns and send automatic emails at timed intervals. This feature changes how marketers approach campaigns: sure, it’s easy to create a form, cut and paste it into a web page, and generate leads through pay-per-click, but managing the experience with automatic emails after a lead shows an interest creates a need for communication architectures. And it can become complex. As an example, I will use SEOINC, because I used to work there many years ago and they have increased their product offering to include pay-per-click and social media marketing. Now imagine I’m a potential client and I submit my information through the website with a form powered by HubSpot. The SEOINC folks created a nurturing campaign which kicks off an automated email about their pay-per-click product offering 20 days after I first submit my information. Twenty days after the pay-per-click email, I receive another automatic email about their social media product offering. It all happens automatically, without you being involved, so the communication architecture must be designed ahead of time.
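The timing logic behind a drip campaign like this can be sketched in a few lines. This is only an illustration of the scheduling concept, not HubSpot’s actual API; the step intervals and topics below are the hypothetical SEOINC example from above.

```python
from datetime import date, timedelta

# Hypothetical nurturing schedule: (days after signup, email topic).
# In HubSpot this is configured in the UI; the code just shows the timing math.
NURTURE_STEPS = [
    (20, "Pay-per-click product offering"),
    (40, "Social media product offering"),
]

def send_dates(signup: date):
    """Return the date each nurturing email goes out for one lead."""
    return [(signup + timedelta(days=d), topic) for d, topic in NURTURE_STEPS]

# A lead who signs up July 1 gets the PPC email July 21 and the
# social media email August 10 -- all without anyone lifting a finger.
for when, topic in send_dates(date(2009, 7, 1)):
    print(when.isoformat(), "->", topic)
```

The point of writing the schedule down like this, even on paper, is that the whole communication architecture has to exist before the first lead arrives.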

HubSpot has API integration with Salesforce and the company will work with you if you have an in house custom CRM solution.

HubSpot features that need some work:

When a lead submits his/her information incorrectly in a validated field, the user is sent to a generic-looking validation form. You are not able to apply CSS to the validation form – yet!

The automatic emails do not let you cut and paste HTML, so the design is limited. However, I’ve been told by my HubSpot consultant this feature is coming out within the next two weeks.

If you’ve built a nurturing program and the lead converts and signs up for other product offers all at once, you must turn off the nurturing program for that lead. This is a process you’d want to enforce anyway to know your lead-to-conversion ratio.

Summary:

So far, I really like HubSpot because it’s easy and takes the pain out of reporting. In my opinion, a marketer’s time should be spent optimizing campaigns and not sorting exported spreadsheet data. Don’t get me wrong, I know we need the reporting, but with HubSpot the data doesn’t need cleaning up.

Since I’ve been working with HubSpot, there isn’t one person there who doesn’t mention SEO to me: critiquing our current rankings on brand-only terms, trying to sell the idea of hosting our blog on their blog service. I was raised to know that installing blog software on your own server keeps the search value on the domain where it is housed, so I don’t recommend outsourcing your blog to HubSpot. But HubSpot is really good at tying contact details to a campaign. There is also a UTM link generator, so you can micromanage your landing page performance and other web activities.
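For anyone unfamiliar with UTM links, a generator like the one mentioned above just appends the standard Google Analytics campaign tags to a URL. Here is a minimal sketch (the URL and campaign values are made-up examples):

```python
from urllib.parse import urlencode

def utm_link(base_url, source, medium, campaign):
    """Build a campaign-tagged URL using the standard Google Analytics
    UTM query parameters, the way a UTM link generator does."""
    params = {
        "utm_source": source,    # where the traffic comes from, e.g. google
        "utm_medium": medium,    # the channel, e.g. cpc or email
        "utm_campaign": campaign # the campaign name you report on
    }
    return base_url + "?" + urlencode(params)

print(utm_link("http://example.com/landing", "google", "cpc", "summer-promo"))
# http://example.com/landing?utm_source=google&utm_medium=cpc&utm_campaign=summer-promo
```

Tag every paid link this way and your analytics package can break landing page performance out per campaign without any spreadsheet surgery.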

There is more to come as I continue down this path, but know my goal is to tie the phone leads to the web leads, and when I get that going, you will be the first to know!

Are you using a software service that does what HubSpot can do? I’m interested in knowing your solutions.

Understanding Your Web Analytics – Bad ROBOTs

by: Queen of Search
I’ve been reading the WebTrends data for one of my clients and I continually find robots everywhere in the hit results. I’m working with the engineer to create filters and custom reports, but it’s insane how much traffic is coming from robots. None of these visits to the site are real people. In order to tackle this problem, I’ve decided to update my education and conduct some Google research.

Some of these robots are designed to scrape email addresses and harvest them for spamming later. Many of the harvested addresses end up on rented email lists. If you’ve ever purchased an email list, you know there is a huge variance between prices. Now you know where the cheap list providers get their email addresses: from the bad bots.

Last month an article came out naming the six web robots responsible for 85% of the email spam.

Some robots are designed to copy entire sites. Simply put, these robots are never seen by the user, so they add no value.

There are a few ways to prevent bad bots from accessing your website: change your .htaccess file and/or architect a high-tech robots.txt file. Well-behaved web crawlers will read the robots.txt file and use it to know “where to go.” Then there are the devious web crawlers: crawlers that use the robots.txt file to find the “don’t crawl” directories and access them anyway.
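As a concrete sketch of the .htaccess approach, Apache can deny requests based on the User-Agent string. The bot names below are illustrative examples, and this uses the older Order/Deny syntax (pre-Apache 2.4), so check your server’s version before copying anything:

```apache
# Sketch: block requests whose User-Agent matches a known bad bot.
# "EmailSiphon" and "EmailCollector" are example names, not a complete list.
SetEnvIfNoCase User-Agent "EmailSiphon"    bad_bot
SetEnvIfNoCase User-Agent "EmailCollector" bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

The catch, of course, is that the worst bots lie about their User-Agent, so treat this as one filter among several, not a complete defense.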

I’ve heard horror stories about people creating the wrong type of robots.txt file. One person accidentally reversed the meaning of the robots.txt file: in other words, he/she entered all the directories that he/she wanted the search engine spiders to index.
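To make that horror story concrete: everything listed under Disallow is what compliant spiders must stay out of, not what you want indexed. The directory names here are made-up examples:

```
# Correct: keep compliant crawlers out of the private areas.
User-agent: *
Disallow: /admin/
Disallow: /private/

# The reversed mistake: listing the pages you WANT indexed under
# Disallow, which tells well-behaved spiders to skip all of them.
#
# User-agent: *
# Disallow: /products/
# Disallow: /blog/
```

One reversed file like this and your best content quietly drops out of the engines while the bad bots, which ignore robots.txt anyway, keep right on crawling.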

I found this old, but great resource when searching for solutions: How to keep bad robots, spiders and web crawlers away

If you continue to have problems, just write a letter to the spambots. Think Artificially.org wrote a funny blog post that’s a good read.

When conducting my research, I had to go to Wikipedia.org and check out their robots.txt commands. Notice how they speak to the bad web crawlers, telling them to slow down or they’re out. I also like the comment, “Friendly, low-speed bots are welcome viewing article pages, but not dynamically-generated pages please.”

When all is said and done: study, learn, and test to find out what works for you and your website.

PPC – Google and Your Analytics

by: Anne Haynes

There is a lot to be said when it comes to PPC: it’s not just clicks that need to be measured; data around bounce rates needs to be captured as well.

Landing pages and user experience are “the” thing with Google these days. You can’t just say, “Oh, you need a landing page with your PPC campaign.” Now you need a micro site: something that keeps the user on the site to get the most out of your efforts.

Let’s face it, Google Analytics happened for a reason: for the small boutique agencies that can’t afford to invest in enterprise analytics packages, their data is golden to Google. Google reads your Gmail account to serve up contextual AdSense ads while you are reading your email; why wouldn’t they read your Google Analytics account and tie it to your AdWords campaign?

Let’s say there is a performance issue with your hosting company. PPC traffic is sent to a landing page or micro site, and the page and/or site goes down for a few moments: how does this impact your cost per click? Once your site has been flagged as having a poor landing page or micro site, who is to say you can even make it back into a reduced cost-per-click environment?

I haven’t seen any data on any of my questions and I look forward to testing.

Do you have data on this matter?