I caught an interesting press release yesterday from SoftScan that says that of all the email it scanned in June, 90.06% was classified as spam. Even worse, the peak spam rates are truly astonishing.
June also saw SoftScan record its highest-ever percentage of email classified as spam, when it stopped 96.55% of all scanned email, correctly identifying it as junk mail. Narrowly beating February’s record of 96.22%, the peak occurred over a weekend, when there is less legitimate business email. Weekday spam levels during June peaked at 91.36%.
The problem for us is simple: as the volume of spam grows, how do we continue to correctly separate out the shrinking minority of valid email?
One trend I’ve been seeing suggests that a good percentage of the web traffic in our reports is bogus as well. I attribute this to scripted robots looking for contact forms, forum registration forms, and blog comments where they hope to post their messages unnoticed.
Here’s what got me thinking: I was looking at traffic stats for a local business. They don’t do any business outside of their local shops, no online product sales, etc. On the surface, they had fairly good traffic. The problem appeared when I started to look at the geographic distribution of their visitors. Nearly half were coming from Asia, and on top of that, they had a huge depth of visit. Almost as if they were reading every single page of the site…
The truth is, they probably are. They’re robots and they’ve got no business tearing up the bandwidth of the web.
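For anyone who wants to eyeball their own data, the pattern above can be turned into a crude filter. This is just a minimal sketch with hypothetical field names and thresholds, not a real bot-detection method: it flags sessions that come from outside your expected audience regions *and* read far more pages than a typical human visitor.

```python
def flag_suspect_sessions(sessions, expected_regions, depth_threshold=20):
    """Return sessions that look automated: outside the expected
    regions AND with an implausibly deep visit (reading 'every page')."""
    suspects = []
    for s in sessions:
        out_of_region = s["region"] not in expected_regions
        too_deep = s["pages_viewed"] >= depth_threshold
        if out_of_region and too_deep:
            suspects.append(s)
    return suspects


# Hypothetical export from an analytics report: one record per session.
sessions = [
    {"ip": "10.0.0.1", "region": "US", "pages_viewed": 3},   # local, shallow: human
    {"ip": "10.0.0.2", "region": "CN", "pages_viewed": 45},  # far away, very deep: suspect
    {"ip": "10.0.0.3", "region": "CN", "pages_viewed": 2},   # far away but shallow: plausibly human
]

suspects = flag_suspect_sessions(sessions, expected_regions={"US"})
print([s["ip"] for s in suspects])  # only the deep, out-of-region visitor
```

It’s deliberately blunt: real crawlers vary their depth and route through local proxies, so this only catches the lazy ones. But even a filter this simple would have flagged the traffic I was looking at.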
So my question now for all you analytics mavens out there is this: How much of your traffic is bogus?