Category Archives: Security Research

Something smells funny at urlblacklist.com

Competition is great: it drives innovation and gives companies the incentive to keep improving their services and products. And in the spirit of competition, sometimes a little dirt gets slung. But outlining the failures of one of our competitors today serves more than our own interests. In this case, I believe it is in fact a public service. By this I mean spreading awareness, warning unsuspecting customers about what they are spending their money on, and, of course, convincing you, with a little sound reasoning, to purchase our services instead of the competition's.

There aren't many suppliers of domain blacklist data for web filtering platforms and applications; this was one of the primary reasons we created Squidblacklist.org in 2012. We knew it could be done better. A small handful of other providers remain, however. One of them is urlblacklist.com, and it is they whom we have chosen to single out for scrutiny in this blog post.

In summary, urlblacklist.com is a horrible service provider, one whose website was recently down for over three months during the 2016 calendar year. For a provider of services to paying clients, this is simply a disgrace. We have been monitoring urlblacklist.com, have watched multiple outages, and we know that their domain name recently changed hands, something they did not announce on their website, so we will do it for them here.

The website urlblacklist.com was down for nearly an entire month in October 2016. Then a second outage occurred, beginning on or around November 10 and lasting through December 30, 2016. Further scrutiny reveals that in November 2016 the owner of urlblacklist.com irresponsibly allowed his domain name to expire, causing that two-month outage, during which another company took the opportunity to purchase the domain name. It is now registered to an entity named "Dr Guardian," who one can only assume has taken ownership, reopened the website, and is actively processing payments from unsuspecting customers.

They also make no mention of the ownership change anywhere on their website, a courtesy any respectable business would extend to its existing clients and the general public. Instead of even acknowledging the change in domain ownership, they deceptively blame the outage on a billing issue with their registrar.

And I doubt anybody even knows who is really operating the website. The owner has never responded to emails and doesn't seem to care if his website goes down for months at a time. If you are a current urlblacklist.com subscriber, I suggest you seriously consider switching to Squidblacklist.org.

A brief visit to urlblacklist.com shows that the owner would like you to believe the second extended outage of 2016 was brief. The deception is evident in a recent "news" message claiming there was an outage "over Christmas vacation." Apparently Christmas vacation begins around November 10 and ends December 30.

One can look at the ancient and truly awful web design of urlblacklist.com, research its track record of unreliability, and safely draw some general conclusions about its owners and operators, none of them favorable. That same lack of integrity shows in the quality of their blacklists, or lack thereof, which is of course what really matters.

Urlblacklist.com is an aging website. Use their lists and monitor the daily changes, and you will know first hand that nearly 60% of the domains in their blacklists do not even resolve, a good indicator of very poor technology behind their update processes. It becomes clear rather quickly that they are pushing old, recycled domain data: some crude scripts systematically remove a set number of domains and later re-add them, rotating the data in and out in a way that gives the customer the illusion that updates are taking place.
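You can check a claim like this yourself. Here is a minimal Python sketch that measures what fraction of a blacklist's domains still resolve; the resolver is passed in as a callable so it can be swapped or stubbed, and the file name in the usage comment is only illustrative:

```python
import socket

def resolution_rate(domains, resolve=socket.gethostbyname):
    """Return the fraction of domains that resolve to an address."""
    ok = 0
    for domain in domains:
        try:
            resolve(domain)
            ok += 1
        except OSError:  # socket.gaierror subclasses OSError
            pass
    return ok / len(domains) if domains else 0.0

# Example: one domain per line in a local copy of a blacklist.
# with open("blacklist.txt") as f:
#     domains = [line.strip() for line in f if line.strip()]
# print(f"{resolution_rate(domains):.0%} of domains resolve")
```

Run against a provider's list day after day, a persistently low rate is a strong sign the data is being recycled rather than curated.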

Manual additions and removals appear to be performed occasionally by human hands. Beyond this, however, there is obviously zero innovation taking place at urlblacklist.com, which is evidently run by an incompetent individual who is doing the world a disservice by continuing to accept payments for what is, in our opinion, hardly passable even as a purely free service.

Shalla and any other websites referring people to urlblacklist.com are also doing the general public a disservice by linking to it. Anybody who has actually observed or used these lists should come to the same conclusion we have: urlblacklist.com is an unreliable provider of services that needs to go away.

Also note that shady clones of urlblacklist.com exist, registered to entirely different owners.
http://schwela.com is one of them.


If you like what we are doing here and want to support our efforts, please consider subscribing to download all of our blacklists.

Flat rate subscription. For full access to all of our works, select a membership option & subscribe today.



Select Payment Option




Problems & Solutions with Mikrotik RouterOS DNS Domain Blacklists.

Article by Benjamin E. Nichols http://www.squidblacklist.org
Introduction.

As a publisher of domain blacklist data, I thought it would be appropriate to share some recent challenges and the resulting experiences with Mikrotik RouterOS DNS domain name blacklists, especially considering that we charge a service fee for access to these blacklists. First, though, we would like to thank Mikrotik for their fine products and timely support, and we hope to continue to cooperate in the future. The following article describes recent issues, historical problems, and current fixes, patches, and workarounds for categorized domain blacklisting using the static DNS entry features of RouterOS.

Recently we had some issues with these lists that were directly related to four individual problems, all of which have been resolved.

1. Painfully Slow Import of Large Domain Blacklists ( Resolved )

Importing large domain blacklists has been an excruciatingly, absurdly slow process for many years, even on the most expensive high-end Mikrotik RouterOS devices. The general consensus, I believe, is that if one pays a premium, one anticipates premium performance.

This issue was recently resolved by Mikrotik with a patch included in the latest RouterOS release candidate as of October 2016. It was a very overdue fix and a welcome change that brings domain-based web filtering on standalone Mikrotik RouterOS devices that much closer to being practical for most people. We suggest you test the latest release candidate for yourself; the fix is included in 6.38rc15 (release candidate), available from http://www.mikrotik.com/download.

2. Recent Changes to RouterOS Static DNS ( Resolved )

Another issue we faced with static DNS entries in particular stems from a change made to RouterOS sometime toward the end of summer 2016. Mikrotik seems to have changed the way static DNS entries are handled by the OS, forcing us to change our format in order to remain compatible. This actually turned out to be a very good thing: it forced us to make dramatic improvements to our static DNS format, a change which, in and of itself, was admittedly also long overdue, and the new format is perfectly suited to RouterOS static DNS entries. We owe a debt of gratitude to the generous folks on Mikrotik's forums for helping us resolve those issues in a timely manner.

3. The 60 character limitation: ( Resolved )

Then we found out, after consultation with Mikrotik via a support ticket and email discussions with their staff, that RouterOS has a 60-character limitation that prevents domain names beyond a certain length from being loaded. We were getting the following message when loading blacklists: "error regex too compex".

Also note the misspelling of the word complex, "compex". (The spelling of complex has been fixed since our discussion with support staff.) We decided an easy workaround was to simply remove all domain names with more than 60 characters. While we don't like throwing away domain data, careful analysis of the removed data showed that most of these domains were junk, so it's not too big an issue, and the fix addressed the problem. (Mikrotik has informed us they do not plan to fix this any time soon, as the work involved is prohibitive at this time.)
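The workaround itself amounts to a simple length filter over the domain list. A hedged Python sketch of that pass (the 60-character limit comes from the RouterOS behavior described above; the function name is our own):

```python
MAX_LEN = 60  # RouterOS rejects longer names ("error regex too compex")

def drop_long_domains(domains, max_len=MAX_LEN):
    """Split a domain list into (kept, removed) by name length."""
    kept = [d for d in domains if len(d) <= max_len]
    removed = [d for d in domains if len(d) > max_len]
    return kept, removed

# Example usage on an in-memory list:
kept, removed = drop_long_domains(["example.com", "a" * 61 + ".com"])
```

Returning the removed entries as well makes it easy to audit exactly what is being thrown away before committing to the trimmed list.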

4. Static DNS Blacklist – Script Failure at “Error duplicate entry detected” ( RESOLVED )

Now, this is where we hit a roadblock with Mikrotik static DNS entries. Loading an individual static DNS blacklist from Squidblacklist.org into a RouterOS device works just fine, provided you have adequate resources: memory, storage, and CPU power (see our compatibility chart).

The problem is the way RouterOS handles duplicate entries, which causes the import process to abort. Allow me to elaborate. This is an unacceptable outcome because some domains will inevitably exist in multiple blacklists and/or blacklist categories; for example, a pornography website may also be malicious, in which case the domain name will be present in both the adult and the malicious blacklists. A network administrator may decide to load both of these blacklists, which should work. However, it won't. We have no way of predicting which combination of blacklists somebody will opt to load, so we cannot write code to blindly remove domains on the fly.

To expand on that: a solution is required for loading multiple blacklists with overlapping domain entries.

Here is the solution.

We add on-error={} to the end of each line; this seems to be a great workaround and has eliminated the issue.
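As a rough sketch of what a generated .rsc import with this workaround can look like, here is a small Python generator. The exact command syntax below (wrapping each add in `:do { ... } on-error={}` under `/ip dns static`) is our illustration of the workaround described above, not quoted from Mikrotik documentation, and the sink address is a placeholder:

```python
def to_static_dns_script(domains, sink_ip="192.0.2.1"):
    """Emit RouterOS static-DNS 'add' commands, one per domain.

    Each command is wrapped in :do { ... } on-error={} so that a
    duplicate-entry error skips only that line instead of aborting
    the whole import.
    """
    lines = ["/ip dns static"]
    for domain in domains:
        lines.append(
            f':do {{ add name="{domain}" address={sink_ip} }} on-error={{}}'
        )
    return "\n".join(lines)

# A duplicate entry on purpose; on-error={} lets the second one fail quietly.
script = to_static_dns_script(["ads.example", "ads.example"])
```

This is why overlapping category lists can now be loaded together: a domain already present simply triggers the empty error handler on its line.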


As a result of publishing this article and the work we have done here, and to show our appreciation for contributions, our ADS blacklist for blocking ads using Mikrotik RouterOS static DNS will now be free for everybody to download and use.

It can be downloaded at the following URL: http://www.squidblacklist.org/downloads/tik-dns-ads.rsc

A huge thank you to the developers, and to the volunteers who spent countless hours resolving issues, and creating a better future.

Thank you to Jonas Carlsson of remote24.se for contacting us with scripting support for the resolution of issue #4.

Thank you for your time, and we hope that making this information public will help somebody out there.

Respectfully,

Benjamin E. Nichols
http://www.squidblacklist.org




Blacklist Processing – Automated Domain Addition & Removal – A.D.A.R. System – Delay Pool Recheck Function

A little peek at some of what's going on behind the scenes.

One of the things we do when processing the data we publish is not only adding domains, but also removing unwanted or delisted domains that match certain criteria. It is never a good idea to simply delete that data, though. Instead, we place the removed domains, those flagged as no longer resolving or as being redirected to placeholders (common with parked or suspended domains), into delay pools. Once in a delay pool, the data is rechecked again and again; if a domain ever comes back online, it is added back to the blacklists and then passes through our filters again during the daily update process.
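As a rough illustration of the delay-pool idea described above (the class and method names here are our own invention for this sketch, not the actual A.D.A.R. code):

```python
class DelayPool:
    """Hold delisted domains and re-add them if they resolve again."""

    def __init__(self, resolves):
        # resolves: callable(domain) -> bool, injected so it can be
        # backed by real DNS lookups or stubbed for testing.
        self.resolves = resolves
        self.active = set()  # domains currently on the blacklist
        self.pool = set()    # delisted domains awaiting rechecks

    def process(self, domain):
        """Route a domain to the active blacklist or the delay pool."""
        if self.resolves(domain):
            self.active.add(domain)
            self.pool.discard(domain)
        else:
            self.pool.add(domain)
            self.active.discard(domain)

    def recheck(self):
        """Periodic pass: promote pooled domains that resolve again."""
        for domain in list(self.pool):
            self.process(domain)
```

The point of the pool is exactly what the paragraph describes: nothing is ever truly deleted, only parked until repeated rechecks prove it dead or bring it back.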

At Squidblacklist.org, we are working to bring you a higher class of blacklist through logic and innovation.




Case Study – Web Filtering & Blacklist Quality Put To The Test.

Web filtering is an important consideration for any enterprise: it is one of the best-known and most efficient front-line defenses against hacker attacks and malicious software. One of Squidblacklist.org's customers was using a solution from another vendor which had reached end of life and needed to be replaced.

The system had not been performing to the customer’s satisfaction – it had proven difficult to manage, was not cost-effective, and its limited reporting capacity required an additional application to fill in the gaps in its functionality.

The client carried out an independent evaluation and selected two blacklist providers for deployment on separate Internet links to test their varying degrees of filtering effectiveness. Filtering policies were created based on group membership rather than individual user rules as in the previous installation, and were integrated into Active Directory. This allowed existing support staff to grant Internet access by moving users into the relevant Active Directory groups rather than modifying the proxy server configuration.

Improved Web Filtering Performance

Not only did the new blacklists from Squidblacklist.org enhance the effectiveness of these appliances and the performance of web filtering for the customer, they also identified a number of websites which had previously been mistakenly blocked, as well as websites that should have been blocked but were not. The client was thus able to advise the relevant organizations, which included their customers and partner organizations, that their web filtering solutions had been compromised by poor-quality blacklists from websites like shalla "secure services" and urlblacklist. These issues were then easily resolved by converting to blacklists by Squidblacklist.org.

The enhanced blacklists also introduced Weaknetlabs Technology, which combines the best of conventional tools with new intelligent identification algorithms. ADR automatically tracks, adds, and removes domains, producing a higher class of blacklist than the first-generation blacklists from other providers. It also removes the inherent weaknesses of human-only classification, giving you the most up-to-date URL blocking and control.

The customer has since found that this new setup meets their requirements to a far higher degree than their previous one.




Study – Internet Filters Block Many Useful Sites


Teenagers who look to the Internet for health information as part of their “wired generation” birthright are blocked from many useful sites by antipornography filters that federal law requires in school and library computers, a new study has found.

The filtering programs tend to block references to sex and sex-related terms, like “safe sex,” “condoms,” “abortion,” “jock itch,” “gay” and “lesbian.” Although the software can be adjusted to allow access to most health-related Web sites, many schools and libraries ratchet up the software’s barriers to highest settings, the report said.

“A little bit of filtering is O.K., but more isn’t necessarily better,” said Vicky Rideout, vice president of the Henry J. Kaiser Family Foundation, which produced the report, to be published today in The Journal of the American Medical Association. “If they are set too high, they can be a serious obstacle to health information.”

The researchers found that filters set at the least restrictive level blocked an average of 1.4 percent of health sites; at the most restrictive level, filters blocked nearly 25 percent of health sites. The amount of pornography blocked, however, was fairly consistent: 87 percent at the least restrictive level, 91 percent at the most restrictive.

The programs blocked a much higher percentage of health sites devoted to safe-sex topics: 9 percent at the least restrictive level and 50 percent at the most restrictive. The blocked pages at high levels included The Journal of the American Medical Association’s site for women’s health and a page with online information from the Food and Drug Administration about clinical trials.

To the researchers, the results mean that a school or library that uses a less restrictive setting for Internet filters can lose very little of the protective effect of the filters, while minimizing the tendency of filters to block harmless and even valuable sites.

The report is the first major study of the effectiveness of filters to appear in a peer-reviewed scientific journal, and the first to look at the effectiveness of filters at various settings. Most previous studies have been produced by organizations with a strong point of view either favoring or opposing filters. The Kaiser Foundation is a nonprofit health research group. David Burt, an antipornography advocate who is a spokesman for the filtering company N2H2 , said he was pleased with the report, which he called “very thoughtful and well designed — they recognized it matters a lot how you configure a filter and set it up.”

But opponents of filtering requirements said the study showed the technology’s clumsiness.

“Filters are just fine for parents to use at home,” said Judith F. Krug, director of the Office for Intellectual Freedom at the American Library Association. “They are not appropriate for institutions that might be the only place where kids can get this information.”

“The importance of the First Amendment,” Ms. Krug said, “is that it provides us with the ability to govern ourselves, because it guarantees that you have the right to access information. The filters undercut that ability.”

Nancy Willard, an Oregon educator who has written student guides that emphasize personal responsibility in Internet surfing, called filtering a kind of censorship that, if performed by the schools directly, would be unconstitutional.

“These filtering companies are protecting all information about what they are blocking as confidential trade secrets,” Ms. Willard said. “This is nothing more than stealth censorship.”

The study was conducted for the foundation by University of Michigan researchers, who tested six leading Internet filtering programs. The researchers searched for information on 24 health topics, including breast cancer and birth control, and also for pornographic terms. They performed the tests at each of three settings. At the least restrictive setting, only pornography is supposed to be blocked; an intermediate setting also bars sites with nudity and other controversial material like illicit drugs. The most restrictive setting possible for each product may block sites in dozens of other categories.

The researchers then called 20 school districts and library systems around the United States to ask how they set their filters. Of the school systems, which teach a half million students over all, only one set its filters at the least restrictive level.

The issue of library filtering is making its way through the federal courts. Last month the Supreme Court agreed to hear a Bush administration defense of the Children’s Internet Protection Act, the federal law requiring schools and libraries to use filters on computers used by children or to lose technology money. A special panel of the United States Court of Appeals for the Third Circuit, in Philadelphia, struck down part of the law that applied to libraries as unconstitutional. Chief Judge Edward R. Becker wrote that filters were a “blunt instrument” for protecting children.




Study – Web Filtering in Schools


AASL Executive Summary

The American Association of School Librarians (AASL) conducted its national longitudinal survey, School Libraries Count!, between January 24 and March 4, 2012. The annual survey collected data on filtering in schools. Participants answered 14 questions ranging from whether or not their schools use filters, to the specific types of social media blocked at their schools.

This paper is an overview of the data that was collected. As the results show, filtering continues to be an important issue for most schools around the country. The data from School Libraries Count! suggests that many schools are going beyond the requirements set forth by the Federal Communications Commission (FCC) in its Child Internet Protection Act (CIPA).

AASL’s position views the social aspect of learning as important for students in the 21st century and much of the filtering software seems to discount that aspect.

Uses and Types of Filtering

When asked whether their schools or districts filter online content, 98% of the respondents said content is filtered. Specific types of filtering were also listed in the survey, encouraging respondents to check any filtering that applied at their schools. There were 4,299 responses with the following results:

94% (4,041) Use filtering software
87% (3,740) Have an acceptable use policy (AUP)
73% (3,138) Supervise the students while accessing the Internet
27% (1,174) Limit access to the Internet
8% (343) Allow student access to the Internet on a case-by-case basis


The data indicates that the majority of respondents do use filtering software, but also work through an AUP with students, or supervise student use of online content individually.

The next question identified types of filtering software and asked respondents to select those used at their schools. There were 4,039 total responses. The top three types of filtering software were:

70% (2,827) URL-based
60% (2,423) Keyword-based
47% (1,898) Blacklists

Who and What Gets Filtered

When respondents were asked if content for students is filtered by their school or by the district, 100% of the 4,299 respondents answered “Yes.” Respondents also indicated that in 73% of schools, all students are filtered at the same level.

When asked if the filters affect both students and staff, 88% of 3,783 respondents said filters are used for staff, and 56% of 2,119 respondents said the same level of filtering is applied to students and staff alike.

The top four filtered content areas in schools surveyed include:

Social networking sites (88%)
IM/online chatting (74%)
Gaming (69%)
Video Services (66%)

Additional filtered content includes personal e-mail accounts, peer-to-peer file sharing, and FTP sites. However, when asked whether they could request that sites be unblocked, 92% of the 3,961 respondents indicated they could, with the following turnaround times:

27% (1,069) Have the site unblocked in a few hours
35% (1,386) Have the block removed within one to two days
17% (673) Wait more than two days but less than a week
20% (792) Wait one week or more

The survey found that 68% of the decisions to unblock a site are made at the District level and only 17% of the decisions are made at the building level.

Bring Your Own Devices


The School Libraries Count! survey also asked which types of portable electronic devices students are allowed to bring to school. Respondents were able to select all that apply. The 4,299 responses revealed the following percentages for devices allowed:

E-readers (53%)
Cell phones (49%)
Laptops (39%)
MP3 Players (36%)
Netbooks (32%)

When students bring these items to school, 51% of 2,981 responses indicated there is a filter mechanism used for these devices.

When answering how students’ personal devices were filtered, the top five answers from 1,520 respondents were:

Through the use of the AUP (48%)
Logging on through the school network (47%)
Not having Internet connectivity (29%)
Using the discretion of the classroom teacher (28%)
Logging into a “guest” network (26%)

Impact of Filtering on Learning

The last filtering question discussed the impact that filtering has on the individual programs. Respondents were asked to select all that applied.

Of the 4,299 responses, 52% indicated that filtering impedes student research when completing keyword searches, 42% indicated that filtering discounts the social aspects of learning, and 25% stated that filtering impeded collaboration beyond in-person opportunities.

On the other hand, 50% indicated filtering decreased the number of potential distractions, 34% indicated filtering decreased the need for direct supervision, and 23% indicated that filtering allowed research curriculum to yield more appropriate results.

One trend revealed in the survey is that students are increasingly allowed to bring their own devices to school, but those devices are still subject to the filters. Many school librarians are reporting that true student research is being hindered by school filters, making this an issue that AASL will continue to address in the future.


Blacklists For Web Filtering Purposes.

Flat rate subscription. Select a membership option & subscribe.



Select Payment Option



  • You will be issued a username and password.
  • You will be granted access to our member area.
  • 5 Year Membership Option now available.
  • For lifetime membership options click here.
  • Contact us if you would like a pre-order invoice.

Disclaimer: All sales are final; we do not issue refunds. You may cancel your subscription at any time.

Should you consider investing in a Web Filter?



When you think of web filtering, what is the first visual that comes to mind? For some, it's pop-ups and notifications to update their antivirus. Today's web filtering capabilities have become more sophisticated, and so have the criminals attempting to infect systems.

According to one study, cybercrime costs have increased by 19 percent. Let’s break down three tips for protecting your business with web filtering.

1. Don’t be cheap with your security budget, invest in web filtering software

Similar to purchasing insurance for your business, a solid web filter provides an additional layer of security for your network. You should be shopping for software with keyword blocking, malware filtering, social media monitoring, redirect blocking, P2P blocking, BYOD support, user notifications, 24/7 software support, and custom ACL or blacklist rules for fine-tuning which sites are blocked, such as the blacklists offered by Squidblacklist.org.

You may also want to look into software that eliminates anonymous proxies or allows your organization to block specific sites related to gambling, gaming, streaming media or any site category that should not be accessed during work hours.
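For the ACL and blacklist features described above, a concrete (if minimal) example is Squid's `dstdomain` ACL, which is how domain blacklists like ours are typically loaded. The file path below is illustrative; adjust it to your own deployment:

```
# /etc/squid/squid.conf -- minimal sketch (paths are illustrative)
# Load blocked domains from a file, one domain per line, e.g. ".badsite.example"
acl blocked_sites dstdomain "/etc/squid/blacklists/blocked.acl"

# Deny requests to blacklisted domains before general allow rules
http_access deny blocked_sites
http_access allow localnet
```

With this in place, any request whose destination matches an entry in the file is denied before normal access rules apply.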

All of these features are limited when you use free web filtering services. For peace of mind, it's worth putting a little investment into your web filtering tools.

Additionally, the extra layer of coverage may prevent an internal, yet unintentional, threat from occurring because an employee landed on a bad website. In fact, according to a study by Kansas State University, roughly 60 to 80 percent of employee time online is spent surfing non-work-related websites. Essentially, the money you spend on web filtering software could pay off tenfold in productivity if you limit some commonly surfed websites.

2. Make web filtering required for all employee devices

This may seem like an obvious statement, but with today's flex scheduling and mobility, it's easy for older devices to get overlooked. Consider using a web filtering tool that allows you to deploy updates across multiple platforms.

The best filters allow you to manage these devices through a central dashboard and make updates, or see traffic, on an as-needed basis. While we aren’t encouraging a Big Brother mentality, it’s good to know you can see the whole picture and focus on a problem before it becomes a threat.

Another great feature of this type of filtering tool is disaster recovery. If your web filtering platform is located in a cloud environment you can access your dashboard anytime, anywhere. For chief technology officers on the go, this is also a productivity plus in the event of a potential threat.

3. Understand content filtering basics.

You may not be an IT manager or CTO, but that doesn't mean you should avoid learning how web filtering works. At its core, a web filter screens incoming traffic from the web and determines whether it should be displayed or blocked. It is also important to understand that web filters are not a replacement for quality antivirus and anti-malware software. Some web filters can block blacklisted malware domains and links, and others offer inline scanning for viruses and malware, but many malicious threats arrive via email or other attack vectors, such as insider threats. Ultimately, these threats are difficult to mitigate completely, so investing in a robust solution is critical.
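The screening decision described above can be sketched in a few lines of Python. This is a simplified illustration rather than any vendor's actual method, and the blacklist entries are hypothetical:

```python
# Minimal sketch of a web filter's core decision: is this host blacklisted?
BLACKLIST = {"badsite.example", "malware-host.example"}  # hypothetical entries

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname or any parent domain is blacklisted."""
    labels = hostname.lower().rstrip(".").split(".")
    # Check "a.b.c", then "b.c", then "c" against the blacklist.
    for i in range(len(labels)):
        if ".".join(labels[i:]) in BLACKLIST:
            return True
    return False

print(is_blocked("cdn.badsite.example"))  # blocked via its parent domain
print(is_blocked("example.org"))          # not listed, so allowed
```

Matching parent domains, not just exact hostnames, is what lets a single blacklist entry cover every subdomain a malicious operator spins up.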

It’s important to understand some of the terminology used in the first half of this article.

For example, blocking a redirect can prevent typosquatters from sending your search traffic to unrelated sites that host malware. A typosquatter registers a URL that closely resembles a known link, often differing by just one added or substituted letter. Many users do not pay attention to the domain names in search results and are not as cautious as they should be when clicking links.
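The idea of catching one-letter lookalikes can be illustrated with a small Python sketch that flags domains within edit distance 1 of a known domain (the known-domain list here is hypothetical):

```python
# Sketch: flag domains one edit away from a known domain (hypothetical list)
KNOWN = ["windowsupdates.com", "companyname.com"]

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def looks_like_typosquat(domain: str) -> bool:
    """True if domain nearly matches, but is not identical to, a known domain."""
    return any(0 < edit_distance(domain, k) <= 1 for k in KNOWN)

print(looks_like_typosquat("wind0wsupdates.com"))  # one substituted character
```

Real-world detection is fuzzier than this (homoglyphs, added hyphens, combosquatting), but edit distance captures the single-character trick described above.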

Another basic web filtering term is “anonymous proxies.” These are tools that facilitate anonymous traffic, making its source untraceable, and they can be, and often are, used to bypass web filters. Such proxies are also commonly used to mask malicious or otherwise criminal activity on the internet and to conceal the location of a specific threat.

If you find that many of the terms mentioned in your web filtering research are too complicated, we recommend using a search engine to do your homework. This approach is also handy when you’re researching other IT security tools such as backup, antivirus, operating systems, and more.

So there is little question that you need a web filtering tool. Now that you are equipped with the resources and understanding to purchase an effective solution, we recommend acting promptly. Your employees land on thousands of websites a day, and the liability falls on your organization to protect its assets, data, and network. Ultimately, it is your brand that could suffer from a security breach.

Strong categorized domain blacklists from Squidblacklist.org

A critical component of an effective content control strategy.


Flat rate subscription. Select a membership option & subscribe.



Select Payment Option



  • You will be issued a username and password.
  • You will be granted access to our member area.
  • 5 Year Membership Option now available.
  • For lifetime membership options click here.
  • Contact us if you would like a pre-order invoice.

Disclaimer: All sales are final; we do not issue refunds. You may cancel your subscription at any time.

Survey finds the largest single group of malicious domains, about one-third of the total, falls under the .biz TLD.


Domains and domain names are fundamental to the operation of the Internet. They provide a hierarchy of unique identifiers that guide traffic across the Web and identify websites, servers and other resources. However, in the form of malicious domains, they are a basic tool in the hands of cybercriminals.

As with other aspects of computer security, there are no silver bullets for protecting against malicious domains. However, understanding domain names can help firms and individual employees guard themselves against attacks.

Domain names form a hierarchy of domains and subdomains. For example, marketing.companyname.com is a subdomain of companyname.com. In turn, this is one of the many subdomains of the familiar top-level domain (TLD) com. It is typical to type a period in front of TLD names, as in .com, though the period is technically a separator, not part of the TLD itself.
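That hierarchy can be made concrete with a naive Python decomposition. Note that this is only a sketch: a real parser must consult the Public Suffix List to handle multi-label suffixes such as co.uk:

```python
# Naive decomposition of a hostname into its hierarchy of labels
def decompose(hostname: str):
    labels = hostname.lower().rstrip(".").split(".")
    tld = labels[-1]                # rightmost label is the TLD
    domain = ".".join(labels[-2:])  # registrable domain (naive two-label view)
    subdomains = labels[:-2]        # everything to the left is a subdomain
    return tld, domain, subdomains

print(decompose("marketing.companyname.com"))
# ('com', 'companyname.com', ['marketing'])
```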

Looking at the TLD Landscape

A recent IBM security intelligence and research report, “The .Bizness Behind Malicious Domain Names,” looks at malicious domains and their distribution across the overall domain name structure, particularly the various TLDs, such as .com, .net and .org. The report builds on research from IBM security partner CrowdStrike, which keeps track of ongoing malicious activity online.

CrowdStrike’s survey found that the largest single group of malicious domains, about one-third of the total, fall under the TLD .biz. This TLD was created specifically for business use in 2000 to alleviate overcrowding within the original .com TLD (which dates back to the 1980s).

It should be emphasized that most .biz websites are perfectly legitimate businesses. However, the difficulty of policing an entire global TLD has let cybercriminals register domain names that often mimic well-known, legitimate domains, such as the websites of major firms. Most other malicious domains fall under the long-established .org, .com and .net TLDs. Some have country-specific TLDs, often to either target victims in those countries or disguise their own origins.

Protecting Against Malicious Domains

The best protection against malicious domains is user awareness. For example, a domain name such as companyname.com.biz should trigger immediate suspicion. It is deceptively trying to masquerade as a subdomain of the .com TLD when, in fact, it is a subdomain of .biz.

Overly clever spellings, such as wind0wsupdates.com, should also raise a red flag. Unfortunately, all too many users have “domain blindness” and pay little or no attention to where they are actually going online. Moreover, mobile devices such as smartphones may hide address bars in order to conserve limited screen space.
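A filter can catch the companyname.com.biz trick mechanically: if a well-known TLD appears as any label other than the last, the name is trying to look like something it is not. A minimal Python sketch (the TLD set is an illustrative subset):

```python
# Sketch: flag hostnames that embed a well-known TLD as a non-final label,
# e.g. "companyname.com.biz" pretends to be under .com but really sits under .biz
COMMON_TLDS = {"com", "net", "org"}  # illustrative subset

def is_deceptive(hostname: str) -> bool:
    labels = hostname.lower().rstrip(".").split(".")
    # Any label before the last that matches a common TLD is suspicious.
    return any(label in COMMON_TLDS for label in labels[:-1])

print(is_deceptive("companyname.com.biz"))  # ".com" is not the real TLD here
print(is_deceptive("companyname.com"))      # a genuine .com domain
```

This heuristic can produce false positives on rare legitimate names, so treat a hit as cause for scrutiny rather than an automatic block.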

Firms and other organizations can use a brute-force method to protect against some malicious domains by blocking entire TLDs. If, for example, a company has no business partners with a .biz domain, it can bar all connections to .biz. Individual exceptions can then be whitelisted.
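The TLD-blocking approach with whitelisted exceptions can be sketched in Python as follows (the blocked TLDs and whitelist entries are hypothetical):

```python
# Sketch of brute-force TLD blocking with whitelisted exceptions
BLOCKED_TLDS = {"biz"}              # hypothetical: TLDs with no business need
WHITELIST = {"trusted-partner.biz"} # hypothetical exception

def allow(hostname: str) -> bool:
    """Permit a connection unless its TLD is blocked and it isn't whitelisted."""
    host = hostname.lower().rstrip(".")
    # Whitelisted domains (and their subdomains) bypass the TLD block.
    if host in WHITELIST or any(host.endswith("." + w) for w in WHITELIST):
        return True
    return host.split(".")[-1] not in BLOCKED_TLDS

print(allow("shady.biz"))            # denied: whole TLD is blocked
print(allow("trusted-partner.biz"))  # permitted: whitelisted exception
```

Checking the whitelist before the TLD rule is what makes the exceptions work; reversing the order would deny the whitelisted partner along with everything else.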

However, this is not practical for TLDs such as .com or .org. Along with encouraging user awareness, the best protection is provided by a security partner that can provide up-to-date listings of malicious domains to avoid.


Subscribe Today – Paypal or Credit Card Accepted.

Flat rate subscription. Select a membership option & subscribe.



Select Payment Option



  • You will be issued a username and password.
  • You will be granted access to our member area.
  • 5 Year Membership Option now available.
  • For lifetime membership options click here.
  • Contact us if you would like a pre-order invoice.

Disclaimer: All sales are final; we do not issue refunds. You may cancel your subscription at any time.