All posts in “Google Analytics”


The Facebook Pixel: What It Is

You have most likely heard of a tracking pixel. It is a crucial tool you should be using to get the most out of your social ad budget.

Today I would like to explain what a Facebook pixel is, the many benefits of using one, and why you should already be using one.


What is the Facebook pixel?

So what the heck is a Facebook pixel? A Facebook pixel is a piece of code that you place on your site. In its simplest form, it helps you track conversions from Facebook ads and optimise those ads based on data. You can also build targeted audiences for future ads and remarket to qualified prospects who have already taken action on your website.

So how does it work? The pixel places cookies on the page to track users as they interact with your site and your Facebook ads.
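To make this concrete, here is a minimal sketch of installing the pixel on a WordPress site via your theme's functions.php (if your site isn't WordPress, paste the <script> part into your page template's <head> instead). YOUR_PIXEL_ID is a placeholder for the pixel ID from your Facebook Ads Manager:

// A sketch: print the Facebook pixel base code in the <head> of every page.
add_action( 'wp_head', function () {
    ?>
    <script>
    !function(f,b,e,v,n,t,s){if(f.fbq)return;n=f.fbq=function(){n.callMethod?
    n.callMethod.apply(n,arguments):n.queue.push(arguments)};if(!f._fbq)f._fbq=n;
    n.push=n;n.loaded=!0;n.version='2.0';n.queue=[];t=b.createElement(e);
    t.async=!0;t.src=v;s=b.getElementsByTagName(e)[0];
    s.parentNode.insertBefore(t,s)}(window,document,'script',
    'https://connect.facebook.net/en_US/fbevents.js');
    fbq('init', 'YOUR_PIXEL_ID'); // ties the hits to your ad account
    fbq('track', 'PageView');     // records a page view on every load
    </script>
    <?php
} );

Once the base code is in place, you can also log standard events such as Purchase, Lead or AddToCart on the relevant pages.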


What are the benefits of using a Facebook pixel?

There are several ways you can use the data collected by the Facebook pixel to refine your Facebook advertising.

Track conversions

One of the most basic things you can do with the Facebook pixel is monitor how people convert on your website after viewing your Facebook ad.

You can even track customers across all of their devices.

For example, you can see if people see your ads on mobile but switch to a desktop before making a purchase. This information can help you improve your ad strategy and better calculate your ROI (return on investment).

Remarket

The pixel code allows you to track data and use it to show targeted ads to people who have already visited your website; this is otherwise known as “remarketing”. You can get really specific here.

For example, you can show people an ad for the exact product that they abandoned in a shopping cart on your website.

This feature alone is why you should create a Facebook pixel right now, even if you’re not using Facebook ads yet. That way, you will have retargeting capabilities and historical data in place from your very first Facebook ad.

Create lookalike audiences

Facebook can use its targeting data to help you build what it calls a “Lookalike Audience”: a segment of people with similar likes, interests and demographics to the people already interacting with your site, helping you expand your potential customer base.

Run effective ads

Using a Facebook pixel can make your ads a lot more effective, both by improving the quality of the ads you run and by sharpening the targeting that controls who sees them.

In addition, you can use Facebook pixel data to ensure your ads are seen by the people most likely to take your desired action.

If you have any questions about your Facebook pixel, please feel free to comment and we will do our best to answer them.


The Beginner’s Guide To Removing Referral Spam In Google Analytics

Welcome to the Beginner’s Guide to Removing Referral Spam in Google Analytics.

In this guide, I will be teaching you how to remove or block referrer spam.
First, we will start with the basics.

 

What is Referrer Spam?

Referrer spam occurs when your site gets fake referral traffic from bots, and this fake traffic is then recorded by Google Analytics (GA).

 

What is a bot and what does it do?

A bot is a program developed to perform repetitive tasks with a high degree of accuracy and speed.

Bots are mostly used for indexing web pages, i.e. reading their contents; bots of this kind are called crawlers.

 

Good Bots:

Googlebot is an example of a good bot. Googlebot is used by Google to crawl and index pages on the internet; Google’s crawl bots visit web pages of all types every day, which is how Google keeps its search results so up to date.

Good bots obey a file called “robots.txt”, but bad bots don’t. Bad bots can create fake user accounts, send spam emails, steal email addresses and get around CAPTCHA codes.
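For reference, robots.txt is just a plain text file at the root of your site that tells bots what they may and may not crawl. A minimal sketch (the blocked path is only an example):

User-agent: *
Disallow: /wp-admin/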

 

Bad Bots:

Bad bots are mostly used in black hat techniques such as:

 

  • artificially increasing website traffic
  • click fraud
  • scraping websites
  • spreading malware (viruses)
  • harvesting email addresses

 

Bad bots use many methods to hide so that they can’t be detected by security software. They can pretend to be a web browser (like Chrome) or traffic coming from a legitimate website.

They send out HTTP requests to websites with a fake referrer header so as to avoid being detected as bots.

The fake referrer header contains the URL of the website the spammer wants to promote and/or build backlinks to.

When they do this, it is recorded in your server logs. Google treats this referrer value as a backlink, which influences the search engine ranking of the site being promoted.
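To picture what this looks like, here is a sketch of a raw HTTP request carrying a forged referrer header (the domains are made up; note that the header really is spelled “Referer” in HTTP):

GET / HTTP/1.1
Host: www.your-site.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)
Referer: http://site-the-spammer-wants-promoted.example/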

They can also hide from the bot filtering used by Google Analytics (GA), which is why you can end up seeing spam traffic in your GA ‘Referrals’ reports.

Most bots don’t use JavaScript, but some do. Bots that do use JavaScript show up as hits in GA reports and distort your traffic data and any session-based metric, such as bounce rate and conversion rate.

Bots that don’t use JavaScript

Bots that don’t use JavaScript (like Googlebot), on the other hand, do not mess up your GA data. However, their visits are still recorded in your server log files, they still consume your server resources and bandwidth, and they can even negatively affect your website’s performance.

If you can’t see a problem in your GA reports but your site is still acting funny, check out another article we have written on bots that don’t use JavaScript and how to defend against them.

 

Can It Get Any Worse? YES! It Can.

Botnets:

A botnet is a network of infected computers, spread across different IPs and countries and operating at different rates, all controlled by one source. The computers act like zombies, if you will, obeying a leader computer (the spammer). The bigger the network, the more IPs, which means you can’t defeat it simply by blocking IPs and limiting the request rate.

Botnets can also create dozens of fake referrer headers, and if they are using a VPN then IP blocking is useless. This means that if you block a spam referrer with a GA filter or your .htaccess file, there is no guarantee you have blocked it completely.

 

Infection Bots

Botnets recruit new computers onto their network by infecting them with malware. The machines become zombies of that botnet, most of the time without the end user even realising it.

 

Sad Truth:

If you decide to block botnets, you will most likely also block traffic coming from real people. Whatever you do, though, don’t click the links in your ‘Referrals’ reports, as they might be trying to infect your computer.

 

What You Can Do About it

Check Your Reports

Go to your Referrals report and sort it by bounce rate in descending order (you can also download it if you prefer). Look for referrers with a bounce rate of 100% and 40+ sessions; they are probably spam.

Bot Filtering

It’s definitely not foolproof, but try using GA’s bot filtering feature, which excludes hits from known bots and spiders (in your View Settings, tick “Exclude all hits from known bots and spiders”).

If You Can’t Identify It

If you still can’t identify a referrer, you might have to visit the site to check whether it is legitimate. Make sure you have anti-virus/anti-malware software installed on your computer before you visit any website you can’t identify.

List of Known Domains

I have put together a list of suspicious referrer domains, linked below. If a domain is on the list, chances are it is a spam referrer and you don’t need to visit the website to make sure.

Click Here To View The List (the links on this list are updated every so often)

Block them from appearing in your reports.

You can do this by adding a custom advanced filter on GA as shown below.
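As a rough sketch, the filter settings look something like this (the domains in the pattern are placeholders; chain the real spam domains together with the | character):

Filter Type:    Custom > Exclude
Filter Field:   Campaign Source
Filter Pattern: spam-domain-one\.com|spam-domain-two\.xyz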

Use a WAF

A Web Application Firewall (WAF) acts as a line of defence between your web server and the internet, and is probably the fastest way to sort the problem. Most WAF services also cache your site, so if the copy on your server goes down, your site will still function and be viewable.

Use Google Chrome

When you do have to visit suspicious sites, use Google Chrome. Chrome is quick to detect and warn about websites that deploy malware.

 

Block a referrer used by a bot

Go to your .htaccess file and add the following:

Example Below:

RewriteEngine On
Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*luxup\.ru [NC]
RewriteRule .* - [F]

 

This will block HTTP and HTTPS referrals from luxup.ru and its subdomains (the [F] flag sends back a 403 Forbidden response).

 

Block the IP address used by the spam bot

 

To block an IP, add the code below to your .htaccess file:

 

Order Deny,Allow
Deny from 234.45.12.33

 

Block the IP address range

 

If you are sure that a range of IPs is bad, then you can block the whole IP range.

 

Order Allow,Deny
Allow from all
Deny from 86.239.34.0/24

 

CIDR is a method for representing a range of IPs. Blocking by CIDR is better than blocking individual IPs, and it takes up less space in your config file.

In the example above, 86.239.34.0/24 is the CIDR range; it covers every address from 86.239.34.0 to 86.239.34.255 (256 addresses in total). Note that an IPv4 prefix can be at most /32, which denotes a single address.

 

Use custom alerts to monitor unusual spikes.

If you are using GA, you can set up custom alerts; this way you can quickly detect issues, fix them and minimise their impact.
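As a sketch, a useful alert might look something like this (field names as they appear on GA’s custom alert screen; the threshold is only a suggestion):

Alert name:    Referral traffic spike
Period:        Day
Alert me when: Sessions
Condition:     % increases by more than 50%
Compared to:   Same day in the previous week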

 

Little Tips:

 

  • Do not try to exclude referrer spam using the ‘Referral exclusion list’; this will not remove it (GA simply reclassifies those sessions as direct traffic).
  • Create a note/annotation on your charts in GA explaining what caused the unusual spike.

 

Important Note For PCs

Without the right protection (anti-virus/anti-malware) your machine could be in danger.

Important Note For Macs

Malware infections are less common on Macs, but you still need to be aware, as there are a few emerging threats (such as the iWorm botnet).

 

Keep OS X up to date and maybe invest in some protection (anti-virus/anti-malware) to be safe.

 


Website slow? CPU at 100%? Resolve the problem now!

About 2 years ago, MPH Creative were recommended to a new client that was experiencing very poor website performance. The site was slow when navigating or updating and was continually crashing; more often than not it would not load at all and would time out.

The Problem

The web agency they had in place appeared to be clueless: they didn’t know how to solve the problem, and communications eventually broke down completely. It was a risk for us to get involved, considering the previous agency had built the custom WordPress site as well as hosted it and still couldn’t resolve the problem. The company was in desperate need of a trustworthy agency, so, with some reservations, we agreed to help. Diagnosing a website that we haven’t built ourselves can be like finding a needle in a very large haystack.

The first task was to move the hosting away from the web agency. Once the site had moved, all appeared to operate normally: speed was good, we had no crashes and the client was happy – but surely it couldn’t be that simple?

After a month or two, the site was still running well and the problems were ultimately put down to poor hosting… but did we speak too soon?

Another month later, the old problems started to resurface: the site slowed and performance dropped rapidly, which eventually led to the site being down more than it was up.

The client and hosting company came back to us reporting the issues, so we went through the process of ensuring all the obvious things were updated. But again, other than a load of ridiculously large images uploaded to the system, nothing major jumped out. The plugins, WordPress version and theme were all updated and operated as they should.

We advised the hosting company to increase the specification of the server as the site did appear to have a decent level of traffic but nothing out of the norm. This appeared to do the job for another month or so, until we received a panicked call from the client that the site had once again gone down and showed an ‘error establishing database connection’.

The hosting company was less than helpful, and we were sure the site didn’t have any fundamental problems that would recur sporadically; if a site doesn’t work, it doesn’t work, and very rarely will a site keep developing issues for no apparent reason. Once again, we did everything we could on our side, and the client eventually lost patience with the lack of assistance from the hosting company, who just kept pointing the finger at us and repeating the same complaint: the CPU was at 100% capacity.

The client changed hosting companies yet again and, on changing, like magic all appeared calm and the site went back to normal, leaving us to believe this was again a hosting issue. I’m not a lover of hosting companies at the best of times; I hold the same opinion of them that many people hold of broadband and mobile phone companies. They promise you the world, but once you’ve signed on the dotted line and they have your money, they couldn’t care less.

Following the third change of hosting company, we had even less respite than before: the site crashed again and became almost inoperable.

We were sure the site itself wasn’t the cause, so we did some in-depth research online to see if others were having similar problems. We were amazed to find that this is a global problem, with hundreds of discussions, forum threads and cries for help from website owners, and the issues were exactly the same:

  • High CPU usage at 100% capacity
  • Site extremely slow
  • Site timing out
  • Unable to access WordPress admin
  • Hosting company providing no advice or help
  • Hosting company blaming the site build and telling the owner to check plugins

After reading many posts and applying possible solutions (none of which worked), we focused on outside influences. According to Google Analytics, traffic was steady, with nothing alarming that would pull the site down, especially as the site was on a dedicated, high-spec server. There was a fair bit of referral traffic pinging the site with quite a high bounce rate, so we blocked those IP addresses in the site code. This halved the GA numbers, but the site was still extremely slow.

 

(Chart: a DDoS attack and how it works, driving CPU usage to 100%)

 

The next step was to test whether the site was being attacked through more hidden, malicious methods. Accessing the site database, we applied a query to reveal all the traffic hitting the site. In an instant, the cause of the trouble was there in black and white: the site was being massively hit by a sustained, automated DDoS attack. Over just a few hours, the site had been hit over 45,000 times, which was crippling the CPU, causing the 100% overload and pulling the site down.

 

Why does this not show in Google Analytics?

The attack avoids being tracked because Google Analytics relies on JavaScript to perform its tracking; since these automated requests never execute that JavaScript, they do not show up in the GA metrics.

 

The Solution

From here we isolated the IP addresses that were hitting the site every second or so and applied a short piece of code to block them. And wham, bam, thank you, ma’am: the site was back and fully operational.

Now that we’d unearthed the problem, a concern dawned on us. Every time we’d changed hosting, the automated attack lost the IP address of our client’s site and the attacks ceased; but within months the attackers had located the site again and the attacks resumed.

This was obviously a malicious, targeted attack put in place to bring our client’s website down. For what reason, we do not know; they provide a positive and worthwhile service, so attacking such a company simply makes no sense.

If you want to check to see if your site is being targeted by DDoS attacks you can follow these steps:


 

Step 1

Connect to your website via FTP and locate the functions.php file. This is usually in wp-content/themes/*YOUR THEME NAME*/

Add the following code to the top of the file:
// Log every request: the visitor's user agent, IP address and requested URL.
// (The temp_log table itself is created in Step 2.)
global $wpdb;
$wpdb->query( $wpdb->prepare( "INSERT INTO temp_log
(user_agent, their_address, they_requested)
VALUES (%s, %s, %s)",
array( $_SERVER['HTTP_USER_AGENT'], $_SERVER['REMOTE_ADDR'], $_SERVER['REQUEST_URI'] ) ) );

Then save and re-upload.


 

Step 2

With the logging code in place, we left the log to build up enough information to provide a useful profile, then ran a query to extract the anomalous traffic.

Access the database; if you don’t have direct access, tools like phpMyAdmin are usually available. Once you’re in, click the “SQL” tab at the top and create the following table:

 

CREATE TABLE `temp_log` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `they_requested` VARCHAR(255) NOT NULL,
  `user_agent` VARCHAR(255) NOT NULL,
  `when_they_visited` TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `their_address` VARCHAR(50) NOT NULL,
  PRIMARY KEY (`id`)
);

 


 

Step 3

Next, click the “Query” tab and input the following:
SELECT * FROM temp_log ORDER BY id DESC;


 

Step 4

A list will appear showing each IP address and the times it hit the site. If you are being heavily attacked, there will be a clear culprit, as the same IP will appear many times, possibly every second or so.
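If the log has grown large, a quick way to surface the heavy hitters is to count requests per IP. A sketch, using the table created in Step 2:

SELECT their_address, COUNT(*) AS hits
FROM temp_log
GROUP BY their_address
ORDER BY hits DESC
LIMIT 20;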

 


 

Step 5

If the page is being visited by one of the suspect IPs we have identified, we need to halt execution. We want to do this fairly early on, so we add the following code to the top of the wp-config.php file:
// Replace the placeholders with the offending IPs from Step 4, each in quotes.
if ( in_array( $_SERVER['REMOTE_ADDR'], [ 'IP ADDRESS 1', 'IP ADDRESS 2' ] ) ) {
    header( 'HTTP/1.1 503 Service Unavailable' );
    die();
}

The IPs take the form of a comma-separated list, with each IP surrounded by quotes. The header informs the visitor of the nature of the response; in theory any status code could be supplied, but 503 (Service Unavailable) seemed appropriate. We then call die(), since program execution is terminating abnormally.


 

Result:

CPU usage should drop dramatically and site performance should vastly improve.

What if the DDoS attack starts again?
Apply the same process to block the attackers, and consider using a third-party buffer like Cloudflare (www.cloudflare.com) to block DDoS attacks. The entry-level package is free, so it is well worth giving this a go.

If you’re not sure how to apply the above steps, but you think this might be the problem with your site, feel free to get in touch and we will do our best to help.