This article is an explanation of what security through obscurity is, why it's terrible as your only defense, and some of the ways you might be using it in your webapps. If you do pentesting, it should also give you some tips on where to look in webapps for examples of poor security you can exploit.

Security through obscurity is the reliance on secrecy and on confusing attackers instead of building proper controls to keep them out. Say you're a teenager again, and you've got a particular folder of files that you'd rather your parents didn't find. You probably hid this folder behind a whole bunch of other folders and named it something boring, and you might have felt very confident knowing that there's no reason your parents would ever look in the "homework" directory. This is security through obscurity. It might work for a while, but the moment anyone checks the "Frequent Files" section of Windows Explorer, your secret's out. A much better bet would have been to password-protect your files.

In computing, security through obscurity is used more commonly than you'd expect. Here are some of the most harmful examples I've seen.

First, robots.txt. Robots.txt is a file located at the root of your domain, e.g., /robots.txt. It's used to tell search engines such as Google not to crawl certain sections of your website. That's all it does - it prevents search engines from crawling those pages, but it doesn't ward away hackers. Checking for a robots.txt file is one of the first things a malicious person will do - and where do you think they're going next when they see you've told Google not to crawl "super-secret-passwords/"? Instead, if you want a page to stay out of search results, add a noindex metatag to the page. Better yet, if anyone other than you shouldn't see a page, make sure it's behind a secure login page, and consider IP-restricting it as well if you don't move around too much.

Next, concealing Wordpress. Some Wordpress websites try to hide the fact that they're running Wordpress. Common ways of doing this include removing Wordpress's readme.html file and renaming folders such as wp-admin. While these might deter a novice attacker, any hacker worth their internet connection will be able to figure out that you're running Wordpress just by checking your CSS. The alternative? Honestly, don't bother too much. Hiding the fact that you're using Wordpress isn't nearly as important as keeping your Wordpress core and plugins updated. If you have a Wordpress website, try running wpscan on it to see if there are any glaring vulnerabilities you should fix.

Then there's hiding things in subdomains. Say there's a part of your website that you want to hide - maybe some insecure code that you still need to test, or some admin controls. One of the ways you might do this is by stowing it away in a subdomain. This is fine, as long as it isn't your only method of security: if the subdomain requires a secure login and is IP-restricted, you're a-okay. But just putting your insecure code behind a random subdomain with no other controls is a terrible idea.

Finally, homemade cryptography. Surely, given how long all those popular hashing algorithms have been around, they must be insecure by now, right? Maybe it's better to make your own. No - unless you SERIOUSLY know what you're doing, don't try to make your own encryption or hashing algorithm. If you do, it's likely to contain serious mistakes that you might be overlooking. The current popular algorithms have been properly vetted by the security community and are much more secure than anything you could make on your own.

The same goes for your data: sure, maybe renaming your "User" SQL database column to something more esoteric will make it a little bit harder for attackers to perform SQLi, but it won't stop a determined attacker, and it's no substitute for handling input properly.
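To make the robots.txt point concrete, here's a sketch of what such a file looks like. The "super-secret-passwords/" path comes from the example above; the other path is invented for illustration - and, again, listing a path here advertises it to anyone who fetches the file:

```
User-agent: *
Disallow: /super-secret-passwords/
Disallow: /old-admin/
```

If the goal is simply to keep a page out of search results, the noindex metatag mentioned above goes in the page's <head> instead:

```html
<meta name="robots" content="noindex">
```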
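As an illustration of the wpscan suggestion above, a typical scan might be invoked like this. The URL is a placeholder, and you should check `wpscan --help` for the current options on your version:

```
# Basic scan of a site you own or are authorized to test
wpscan --url https://example.com

# Also enumerate vulnerable plugins (vulnerability data needs an API token
# from wpscan.com; token shown here is a placeholder)
wpscan --url https://example.com --enumerate vp --api-token YOUR_TOKEN
```

Only run this against sites you have permission to test.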
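Instead of inventing your own hashing scheme, lean on a vetted construction. Here's a minimal sketch using PBKDF2-HMAC from Python's standard library; the iteration count and salt size are illustrative choices, so check current guidance (e.g. OWASP's password storage recommendations) before using anything like this in production:

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative; follow current published guidance

def hash_password(password, salt=None):
    """Return (salt, digest) for storing; a fresh random salt per password."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

The point isn't this exact snippet - it's that the hard cryptographic work is delegated to primitives the security community has already vetted.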
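On the SQLi point: renaming a column is obscurity, while the actual control is parameterizing your queries so user input is never interpreted as SQL. A self-contained sketch with Python's built-in sqlite3 (the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE User (name TEXT, password TEXT)")
conn.execute("INSERT INTO User VALUES ('alice', 's3cret')")

malicious = "' OR '1'='1"

# Vulnerable: string concatenation lets the attacker rewrite the query,
# no matter what the table or column is called.
vulnerable = conn.execute(
    "SELECT name FROM User WHERE password = '" + malicious + "'"
).fetchall()
print(vulnerable)  # [('alice',)] - injection succeeded

# Safe: with a placeholder, the driver treats the input strictly as data.
safe = conn.execute(
    "SELECT name FROM User WHERE password = ?", (malicious,)
).fetchall()
print(safe)  # [] - injection neutralized
```

The esoteric column name changes nothing here; the placeholder does.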