Is Spam Winning?

I caught an interesting press release yesterday from SoftScan saying that, of all the email they scanned in June, 90.06% was classified as spam. Even worse, the peak spam rates are truly astonishing.

June also saw SoftScan record its highest ever percentage of email classified as spam when it stopped 96.55% of all email scanned, correctly identifying it as junk mail. Narrowly beating February’s record of 96.22%, the high level occurred during a weekend, when there is less legitimate business email. Spam levels on weekdays during June peaked at 91.36%.

The problem for us is simple: as the volume of spam grows, how do we continue to correctly separate out the increasingly small minority of valid email?
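To make the math concrete, here’s a back-of-the-envelope sketch in Python. The 96.55% spam rate is SoftScan’s June peak from the release above; the filter’s 99% catch rate and 1% false-positive rate are purely hypothetical numbers for illustration:

```python
# Back-of-the-envelope: why extreme spam ratios strain even good filters.
# The spam rate is SoftScan's June peak; the filter accuracy figures are
# hypothetical assumptions, not measurements of any real product.
spam_rate = 0.9655          # share of all mail that is spam (SoftScan, June peak)
catch_rate = 0.99           # assumed: fraction of spam correctly junked
false_positive_rate = 0.01  # assumed: fraction of real mail wrongly junked

ham_rate = 1 - spam_rate
inbox_ham = ham_rate * (1 - false_positive_rate)  # real mail that survives
inbox_spam = spam_rate * (1 - catch_rate)         # spam that slips through

print(f"Spam share of the inbox: {inbox_spam / (inbox_ham + inbox_spam):.1%}")
# -> about 22%: even a filter that is right 99% of the time leaves an
#    inbox where more than a fifth of the messages are junk.
```

At those volumes, the filter isn’t fighting a classification problem so much as a base-rate problem: the valid mail is becoming a rounding error in the stream.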

One trend I’ve been seeing suggests that a good percentage of the web traffic in our reports is bogus as well. I attribute this to scripted robots looking for contact forms, forum registration forms, and blog comments where they hope to post their messages unnoticed.
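If you run your own server logs, here’s a rough sketch of one way to spot those form-hunters: count how often each client POSTs to form endpoints. The endpoint paths and the threshold below are hypothetical; adjust them for your own site:

```python
# A rough sketch: flag IPs that POST to form URLs more often than a human
# plausibly would. Endpoint paths and the threshold are hypothetical.
import re
from collections import Counter

FORM_ENDPOINTS = ("/contact", "/register", "/wp-comments-post.php")
# Matches the request line in a combined-format access log:
# IP ... "METHOD PATH HTTP/x.x" ...
LINE_RE = re.compile(r'^(\S+) .*?"(\w+) (\S+) HTTP')

def form_posters(log_path, threshold=5):
    """Return IPs that hammer form endpoints with POST requests."""
    hits = Counter()
    with open(log_path) as f:
        for line in f:
            m = LINE_RE.match(line)
            if m and m.group(2) == "POST" and m.group(3).startswith(FORM_ENDPOINTS):
                hits[m.group(1)] += 1
    return {ip: count for ip, count in hits.items() if count > threshold}

# print(form_posters("access.log"))
```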

Here’s what got me thinking: I was looking at traffic stats for a local business. They don’t do any business outside of their local shops; there are no online product sales. On the surface, they had fairly good traffic. The problem appeared when I started to look at the geographic distribution of their visitors. Nearly half were coming from Asia, and on top of that, they had a huge depth of visit, almost as if they were reading every single page of the site…

The truth is, they probably are. They’re robots and they’ve got no business tearing up the bandwidth of the web.
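If you want to sanity-check your own reports, here’s a minimal sketch of the filter I applied by eye: flag sessions that pair an out-of-market origin with an implausible depth of visit. The CSV format, home market, and depth threshold are all assumptions for illustration, not features of any particular analytics package:

```python
# Minimal sketch: flag sessions that look like scripted crawlers rather
# than customers. All names and thresholds here are hypothetical.
import csv

HOME_MARKETS = {"US"}   # assumed: the only market the business serves
MAX_HUMAN_DEPTH = 30    # assumed: more pages per visit than a real reader hits

def suspicious_sessions(log_path):
    """Yield session IDs that combine an out-of-market origin with deep visits."""
    with open(log_path, newline="") as f:
        # expects columns: session_id, country, pages_viewed
        for row in csv.DictReader(f):
            out_of_market = row["country"] not in HOME_MARKETS
            too_deep = int(row["pages_viewed"]) > MAX_HUMAN_DEPTH
            if out_of_market and too_deep:
                yield row["session_id"]

# for sid in suspicious_sessions("visits.csv"):
#     print("likely bot:", sid)
```

It’s crude, but even a crude cut like this would likely have flagged half of that client’s “visitors” instantly.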

So my question now for all you analytics mavens out there is this: how much of your traffic is bogus?

Further Reading:

PDF Spam – Internet Security Blog – yet another new spam trend.

2 thoughts on “Is Spam Winning?”

  1. I’m still amazed at the sheer volume of spam and the scam nature of so much of it…

    You make a very good point about bots. Whether or not you can go deep into the data and really filter bots depends on your web analytics system. Even the best systems are quite challenged, requiring human expertise to really optimize. It’s a big problem, and if left unchecked or left to the “vendors,” it is more disruptive to a metrics system than any cookie-deletion issues.

    It’s very possible that 70% or more of web traffic is robotic, depending on the type of site you have. That said, the good tools do a decent job of filtering a good majority of it, and with smart humans, near-perfection is achievable. But they still get through!
