How to prevent XSS (Cross Site Scripting) whilst allowing HTML input


Solution 1

Microsoft have produced their own anti-XSS library, Microsoft Anti-Cross Site Scripting Library V4.0:

The Microsoft Anti-Cross Site Scripting Library V4.0 (AntiXSS V4.0) is an encoding library designed to help developers protect their ASP.NET web-based applications from XSS attacks. It differs from most encoding libraries in that it uses the white-listing technique -- sometimes referred to as the principle of inclusions -- to provide protection against XSS attacks. This approach works by first defining a valid or allowable set of characters, and encodes anything outside this set (invalid characters or potential attacks). The white-listing approach provides several advantages over other encoding schemes. New features in this version of the Microsoft Anti-Cross Site Scripting Library include:

  • A customizable safe list for HTML and XML encoding
  • Performance improvements
  • Support for Medium Trust ASP.NET applications
  • HTML Named Entity Support
  • Invalid Unicode detection
  • Improved Surrogate Character Support for HTML and XML encoding
  • LDAP Encoding Improvements
  • application/x-www-form-urlencoded encoding support

It uses a whitelist approach, encoding everything outside an allowed character set, to neutralize potential XSS content.
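The encoding principle AntiXSS is described as using can be sketched in a few lines. This is an illustrative Python sketch of whitelist encoding, not the library itself; the question concerns C#, and the `whitelist_encode` name and the exact safe set here are assumptions for the example:

```python
def whitelist_encode(text: str) -> str:
    """Whitelist encoding: leave only characters from a known-safe set
    untouched and turn everything else into a numeric HTML entity."""
    safe = set("abcdefghijklmnopqrstuvwxyz"
               "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
               "0123456789 .,")
    return "".join(c if c in safe else "&#%d;" % ord(c) for c in text)

print(whitelist_encode("<script>alert(1)</script>"))
# -> &#60;script&#62;alert&#40;1&#41;&#60;&#47;script&#62;
```

Because anything not explicitly on the safe list is encoded, markup characters the defender never thought of can still never reach the browser raw. Note that this encodes input for safe display; it does not preserve allowed HTML, which is what the sanitizing approaches below address.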


Solution 2

Peter, I'd like to introduce you to two concepts in security:

Blacklisting - Disallow things you know are bad.

Whitelisting - Allow things you know are good.

While both have their uses, blacklisting is insecure by design.

What you are asking for is, in fact, blacklisting. If there is an alternative to <script> that you did not anticipate (such as <img src="bad" onerror="hack()"/>), you will not be able to avoid the issue.

Whitelisting, on the other hand, allows you to specify the exact conditions you are allowing.

For example, you would have the following rules:

  • allow only these tags: b, i, u, img
  • allow only these attributes: src, href, style

That is just the theory. In practice, you must parse the HTML accordingly, hence the need for a proper HTML parser.
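The two rules above can be turned into a sanitizer by walking the parsed markup and dropping anything not on the lists. Here is a minimal Python sketch using the standard-library `html.parser` (the `WhitelistSanitizer` and `sanitize` names are invented for the example; the question itself concerns C#, and in production you would use one of the vetted libraries mentioned in the other answers rather than rolling your own):

```python
from html import escape
from html.parser import HTMLParser

ALLOWED_TAGS = {"b", "i", "u", "img"}
ALLOWED_ATTRS = {"src", "href", "style"}

class WhitelistSanitizer(HTMLParser):
    """Rebuilds the document keeping only whitelisted tags and
    attributes; everything else (including every on* event handler)
    is dropped rather than escaped."""

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.parts = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            return  # drop the tag itself; inner text is kept as text
        kept = ['%s="%s"' % (name, escape(value or "", quote=True))
                for name, value in attrs if name in ALLOWED_ATTRS]
        self.parts.append("<%s%s>" % (tag, (" " + " ".join(kept)) if kept else ""))

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.parts.append("</%s>" % tag)

    def handle_data(self, data):
        self.parts.append(escape(data))  # re-escape all text content

def sanitize(dirty: str) -> str:
    p = WhitelistSanitizer()
    p.feed(dirty)
    p.close()
    return "".join(p.parts)

print(sanitize('<b onclick="hack()">bold</b> <script>alert(1)</script>'))
```

Even this sketch is incomplete: allowing `src`, `href` and `style` through unvalidated is still risky, so the kept attribute values need the URL-scheme checks described in Solution 4.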

Solution 3

If you want to allow some HTML but not all, you should use something like OWASP AntiSamy, which lets you build a whitelist policy specifying which tags and attributes you allow.

HTMLPurifier might also be an alternative.

It is of key importance that this is a whitelist approach: new attributes and events are added to HTML5 all the time, so any blacklist would fail within a short time, and knowing all the "bad" attributes is difficult in any case.

Edit: Oh, and regex is a poor fit here. HTML comes in lots of different formats: tags can be unclosed, attributes can appear with or without quotes (single or double), and there can be line breaks and all kinds of whitespace within the tags, to name a few issues. I would rely on a well-tested library like the ones mentioned above.
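To illustrate why a regex blacklist fails, here is a naive `<script>`-stripping attempt and an input that walks straight past it (a Python sketch for illustration only; `naive_strip` is a hypothetical function, not something from the libraries mentioned above):

```python
import re

def naive_strip(markup: str) -> str:
    """A blacklist attempt: remove <script>...</script> blocks and
    nothing else. Do NOT use this; it exists to show the failure."""
    return re.sub(r"(?is)<script\b.*?</script\s*>", "", markup)

# The obvious case is caught...
print(naive_strip("<script>alert(1)</script>"))   # -> (empty string)
# ...but an event-handler payload sails straight through:
print(naive_strip('<img src=x onerror=alert(1)>'))
```

The second payload contains no `<script>` tag at all, so no amount of tuning this regex helps; only a whitelist of allowed tags and attributes catches it.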

Solution 4

Regular expressions are the wrong tool for the job; you need a real HTML parser or things will turn bad. You need to parse the HTML string and then remove all elements and attributes except the allowed ones (the whitelist approach; blacklists are inherently insecure). You can take the lists used by Mozilla as a starting point. There you also have a list of attributes that take URL values; you need to verify that these are either relative URLs or use an allowed protocol (typically only http:/https:/ftp:, and in particular not javascript: or data:). Once you have removed everything that isn't allowed, you serialize your data back to HTML. Now you have something that is safe to insert on your web page.
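The URL check described here might be sketched as follows (an illustrative Python sketch; the `url_is_safe` name and the exact scheme list are assumptions for the example):

```python
import re
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https", "ftp"}

def url_is_safe(url: str) -> bool:
    """Accept relative URLs and a short whitelist of schemes."""
    # Browsers ignore ASCII control characters inside URLs, so strip
    # them before inspecting the scheme (defeats "java\tscript:" tricks).
    cleaned = re.sub(r"[\x00-\x20]+", "", url)
    scheme = urlparse(cleaned).scheme.lower()
    # An empty scheme means a relative URL, which is fine.
    return scheme == "" or scheme in ALLOWED_SCHEMES

print(url_is_safe("https://example.com/pic.png"))  # True
print(url_is_safe("/images/pic.png"))              # True
print(url_is_safe("javascript:alert(1)"))          # False
print(url_is_safe("data:text/html,<script>"))      # False
```

Checking the scheme against a whitelist (rather than rejecting known-bad schemes) follows the same principle as the tag and attribute filtering: anything not explicitly allowed is refused.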

Author: Peter Bridger

Technical Architect at Zone (https://www.thisiszone.com/), instilling the benefits of TDD and SOLID within the teams I run and using Scrum and a Continuous Delivery pipeline to continuously deliver high-quality software.

Updated on December 05, 2020

Comments

  • Peter Bridger over 3 years

    I have a website that allows users to enter HTML through a TinyMCE rich editor control. Its purpose is to allow users to format text using HTML.

    This user-entered content is then output to other users of the system.

    However, this means someone could insert JavaScript into the HTML in order to perform an XSS attack on other users of the system.

    What is the best way to filter out JavaScript code from an HTML string?

    Performing a Regular Expression check for <SCRIPT> tags is a good start, but an evildoer could still attach JavaScript to the onclick attribute of a tag.

    Is there a fool-proof way to strip out all JavaScript code, whilst leaving the rest of the HTML untouched?

    For my particular implementation, I'm using C#.

  • Peter Bridger over 12 years
    It does seem that a full HTML parser is the only bulletproof solution. I'm going to look into using majestic12.co.uk/projects/html_parser.php
  • Dunhamzzz over 12 years
    This doesn't protect you against even half of the stuff listed here: ha.ckers.org/xss.html. A lot of the <head> hacks don't even need javascript:
  • Wladimir Palant over 12 years
    Right, so you downvote an answer simply because you didn't take the time to read it... Of course it does protect against these vectors - that's why I recommended using an HTML parser. Once you parse HTML and serialize it properly all the invalid HTML input issues go away "automatically". And removing all elements and attributes that aren't explicitly allowed is good enough to make it safe. Taking care of javascript: is only the last step. What do you think how HTML Purifier works?
  • Dunhamzzz over 12 years
    You are suggesting the OP roll his own HTML sanitization, which is suicidal in this day and age.
  • Dunhamzzz over 12 years
    You can put javascript in src and href.
  • Christian over 12 years
    @Dunhamzzz - That's another rule, concerning tag content. I talked about tags and their attributes, not content. The point is, whereas href/src are useful, onclick is not.
  • Wladimir Palant over 12 years
    No, I am explaining how a solution works. If the OP reads this and decides to use an existing solution - great. But understanding what it does is still crucial; using it like an incantation that will magically fix your issues will likely lead to security issues again. Not to mention that your suggested solution only works for PHP, but the OP is using ASP.net/C#.
  • Dunhamzzz over 12 years
    Ahh, the C# was added after my answer.
  • YodasMyDad over 11 years
    Just in case anyone reads this and just for the record: NONE of the above work in medium trust if you want to use the Safe HTML methods.
  • Reactgular almost 10 years
    FYI: None of the above are in development anymore, and I've read comments elsewhere that AntiXSS is not well implemented.
  • René over 2 years
    I found a library that looks good and is still maintained. This might be interesting for everyone seeing this question and answer: github.com/mganss/HtmlSanitizer