Frequently Asked Questions
Our honest opinion about our tools, services and everything else
Is there a restriction on how many websites I can scan?
Are there any restrictions on how many pictures you can edit with Pixelmator or Photoshop? No! We don't have such silly restrictions either.
On top of that, our tools work differently from what you are used to. Tests and data reside inside your very own browser and never reach our servers. In other words, your browser is doing the hard work. Therefore, you can use the tools that we offer just like any other desktop software - unlimited.
Can I schedule monthly or per-week scans for my websites?
At the moment we do not offer a scheduling mechanism, for a range of reasons. Mainly, we are concerned about causing damage by throwing the "kitchen sink" at your production web application on a schedule. However, we are running a private beta of Remote, an automated, hosted solution designed to do this sort of thing. If you are interested, just get in touch.
What is the cost?
The cost varies depending on what you want. If you are just interested in scanning, then the Scanner is the cheapest option. However, you may want to add more tools to diversify your tests. The Classic Pack incorporates all kinds of tools at a discounted price.
The reality of the situation is that the more tools you throw at a problem, the better the outcome, because different tools are good at different things and every tool has blind spots that you may not be aware of.
Is there a trial period?
Yes there is! In fact, all subscriptions start as a trial and roll into a paid plan after the trial period ends. We won't charge you a penny until the end of the trial period. If you cancel after this date, you will have had a month's worth of goodness with our tools, but you will be charged.
Does it cover OWASP Top 10?
OWASP Top 10 is a "mythical beast", because some of the items on the list cannot be trivially tested in an automated or even semi-automated fashion. That does not apply just to us, but to all web application security vendors. So, to say that we support OWASP Top 10 is really to say "we support OWASP Top 10, but only for the issues that are automatable and testable". However, with a bit of skill and education you can test for any kind of vulnerability, even beyond what OWASP Top 10 has to offer. And you can use our tools to do that.
If you are running a team of 10 or more people we can even come and do training for you. Get in touch for details.
Does it cover malware detection of websites?
Do not buy services from vendors that do malware detection for websites! They are cheats. Period. Although malware hosted on your system could result in your website getting blacklisted, it is not as if the malware actually runs on your website. If it does run, you have a bigger problem than getting onto a blacklist, which, by the way, is easily rectifiable. Stay away from anyone who offers malware detection as their number one feature. It won't work for you. Web security is not the same as desktop security.
That being said, at the moment we do not offer such silly services, and if we ever start doing so, it is likely that we will do it for free.
What about the false-positive rate?
Some vendors claim that they have a "zero false-positive rate". Don't trust them. If you are still not convinced, then contact us and we will prove why. False positives are simply a fact of life, and although we try hard to minimise their rate, you cannot get away from this problem 100%.
Our strategy is slightly different, though. Instead of creating a "one size fits all" type of solution, which is subject to a high false-positive rate, we have engineered many small, domain-specific tools that are really good at what they do, and therefore the false-positive rate is small and sometimes non-existent. Our tools are a bit like the apps you will find on Apple's and Android's app stores - they are simple, easy to use and very good at what they do. In other words, they are domain-specific, and that gives them certain leverage over other, non-domain-specific tools.
We think that the conversation about the false-positive rate is way overestimated. For example, if a tool does not produce any results at all, the false-positive rate is certainly zero, but the success rate could be perceived as zero too.