Why exactly is eval evil?


Solution 1

There are several reasons why one should not use EVAL.

The main reason for beginners is: you don't need it.

Example (assuming Common Lisp):

EVALuate an expression with different operators:

(let ((ops '(+ *)))
  (dolist (op ops)
    (print (eval (list op 1 2 3)))))

That's better written as:

(let ((ops '(+ *)))
  (dolist (op ops)
    (print (funcall op 1 2 3))))

There are lots of examples where beginners learning Lisp think they need EVAL, but they don't need it - since expressions are evaluated anyway, and the function to call can itself be computed and then FUNCALLed or APPLYed. Most of the time the use of EVAL shows a lack of understanding of the evaluator.

It is the same problem with macros. Beginners often write macros where they should write functions, not understanding what macros are really for and not seeing that a function already does the job.

EVAL is often the wrong tool for the job, and reaching for it often indicates that the beginner does not understand the usual Lisp evaluation rules.

If you think you need EVAL, then check if something like FUNCALL, REDUCE or APPLY could be used instead.

  • FUNCALL - call a function with arguments: (funcall '+ 1 2 3)
  • REDUCE - combine the elements of a list with a binary function: (reduce '+ '(1 2 3))
  • APPLY - call a function with a list as the arguments: (apply '+ '(1 2 3))
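
For the operator-list example above, APPLY and REDUCE work just as well as FUNCALL when the arguments already sit in a list; a small sketch:

(let ((ops '(+ *)))
  (dolist (op ops)
    (print (apply op '(1 2 3)))      ; arguments supplied as a list
    (print (reduce op '(1 2 3)))))   ; OP folded over the list pairwise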

Q: do I really need EVAL, or does the compiler/evaluator already do what I really want?

The main reasons to avoid EVAL for slightly more advanced users:

  • you want to make sure that your code is compiled, because the compiler can check the code for many problems and generate faster code - sometimes MUCH MUCH MUCH faster (that's a factor of 1000 ;-) )

  • code that is constructed at run time and then EVALuated can't be compiled as early as possible (see the sketch after this list)

  • eval of arbitrary user input opens up security problems

  • some use of evaluation with EVAL can happen at the wrong time and create build problems
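
To illustrate the first two points: when run-time-constructed code has to be executed more than once, it can at least be compiled once up front instead of being re-EVALuated each time; a minimal sketch (the constructed LAMBDA form here is just a placeholder):

(let* ((form '(lambda (x) (* x x)))   ; code constructed at run time
       (fn (compile nil form)))       ; compiled (and checked) once
  (print (funcall fn 12))             ; => 144
  (print (funcall fn 13)))            ; => 169, no EVAL per call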

To explain the last point with a simplified example:

(defmacro foo (a b)
  (list (if (eql a 3) 'sin 'cos) b))

So I may want to write a macro that, based on its first parameter, uses either SIN or COS.

(foo 3 4) expands to (sin 4) and (foo 1 4) expands to (cos 4).

Now we may have:

(foo (+ 2 1) 4)

This does not give the desired result.
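
MACROEXPAND-1 makes the problem visible (for the FOO defined above): the macro sees the unevaluated form (+ 2 1), which is not EQL to the number 3, so it picks COS.

(macroexpand-1 '(foo (+ 2 1) 4))
; => (COS 4), not the intended (SIN 4)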

One may then be tempted to repair the macro FOO by EVALuating the argument:

(defmacro foo (a b)
  (list (if (eql (eval a) 3) 'sin 'cos) b))

(foo (+ 2 1) 4)

But then this still does not work:

(defun bar (a b)
  (foo a b))

The value of the variable is just not known at compile time.
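
Inside BAR the macro sees only the symbol A, not a value, so (eval a) either signals an unbound-variable error or picks up some unrelated global binding. The usual fix is not more EVAL but a plain function that makes the decision at run time; a minimal sketch, replacing the macro with a function of the same name:

(defun foo (a b)
  ;; the test now runs at run time, when the value of A is known
  (if (eql a 3)
      (sin b)
      (cos b)))

(defun bar (a b)
  (foo a b))        ; works: FOO is now an ordinary function call

(bar (+ 2 1) 4)     ; => same result as (sin 4)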

A general and important reason to avoid EVAL: it is often used for ugly hacks.

Solution 2

eval (in any language) is not evil in the same way that a chainsaw is not evil. It is a tool. It happens to be a powerful tool that, when misused, can sever limbs and eviscerate (metaphorically speaking), but the same can be said for many tools in a programmer's toolbox including:

  • goto and friends
  • lock-based threading
  • continuations
  • macros (hygienic or otherwise)
  • pointers
  • restartable exceptions
  • self-modifying code
  • ...and a cast of thousands.

If you find yourself having to use any of these powerful, potentially dangerous tools, ask yourself three times "why?" in a chain. For example:

"Why do I have to use eval?" "Because of foo." "Why is foo necessary?" "Because ..."

If you get to the end of that chain and the tool still looks like it's the right thing to do, then do it. Document the Hell out of it. Test the Hell out of it. Double-check correctness and security over and over and over again. But do it.

Solution 3

Eval is fine, as long as you know EXACTLY what is going into it. Any user input that goes into it MUST be checked and validated first. If you don't know how to be 100% sure, then don't do it.

Basically, a user can type in any code in the language in question, and it will execute. You can imagine for yourself how much damage that can do.
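
As a concrete illustration in Common Lisp, here is a hypothetical sketch that only accepts plain arithmetic expressions: the input is read with *READ-EVAL* bound to NIL (so the #. read macro cannot run code while reading), and the resulting form is checked against a whitelist before it ever reaches EVAL.

(defun safe-arith-p (form)
  ;; only numbers and nested calls to +, -, * and / are allowed
  (or (numberp form)
      (and (consp form)
           (member (first form) '(+ - * /))
           (every #'safe-arith-p (rest form)))))

(defun eval-user-arithmetic (string)
  (let* ((*read-eval* nil)                    ; no read-time evaluation
         (form (read-from-string string)))
    (if (safe-arith-p form)
        (eval form)
        (error "Not a plain arithmetic expression: ~S" form))))

(eval-user-arithmetic "(+ 1 (* 2 3))")        ; => 7
;; (eval-user-arithmetic "(delete-file \"x\")") signals an error instead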

Solution 4

"When should I use eval?" might be a better question.

The short answer is "when your program is intended to write another program at runtime, and then run it". Genetic programming is an example of a situation where it likely makes sense to use eval.
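
As a tiny hypothetical sketch of that "program writes a program, then runs it" situation in Common Lisp: a random arithmetic expression is built as ordinary list data and then executed. (Even here, once the generated code is run repeatedly, COMPILE plus FUNCALL is usually preferable to EVAL.)

(defun random-expression (depth)
  ;; build a random arithmetic S-expression as plain list data
  (if (or (zerop depth) (< (random 1.0) 0.3))
      (random 10)
      (list (nth (random 3) '(+ - *))
            (random-expression (1- depth))
            (random-expression (1- depth)))))

(let ((candidate (random-expression 3)))
  (format t "~S => ~S~%" candidate (eval candidate)))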

Solution 5

IMO, this question is not specific to LISP. Here is an answer to the same question for PHP, and it applies to LISP, Ruby, and any other language that has an eval:

The main problems with eval() are:

  • Potentially unsafe input. Passing an untrusted parameter is a way to fail. It is often not a trivial task to make sure that a parameter (or part of it) is fully trusted.
  • Trickiness. Using eval() makes code clever, and therefore more difficult to follow. To quote Brian Kernighan: "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."

In actual use, the main problem with eval() is really just one:

  • inexperienced developers who use it without enough consideration.

Taken from here.

I think the trickiness piece is an amazing point. The obsession with code golf and concise code has always resulted in "clever" code (for which evals are a great tool). But you should write your code for readability, IMO, not to demonstrate that you're a smarty and not to save paper (you won't be printing it anyway).

Then in LISP there's also a problem related to the environment in which eval is run (in Common Lisp, for example, EVAL uses the current dynamic environment and the null lexical environment, so the evaluated code can reach anything global), so untrusted code could get access to more than you intend; this problem seems to be common to other languages anyway.

Comments

  • Jay
    Jay almost 2 years

    I know that Lisp and Scheme programmers usually say that eval should be avoided unless strictly necessary. I’ve seen the same recommendation for several programming languages, but I’ve not yet seen a list of clear arguments against the use of eval. Where can I find an account of the potential problems of using eval?

    For example, I know the problems of GOTO in procedural programming (makes programs unreadable and hard to maintain, makes security problems hard to find, etc), but I’ve never seen the arguments against eval.

    Interestingly, the same arguments against GOTO should be valid against continuations, but I see that Schemers, for example, won’t say that continuations are "evil" -- you should just be careful when using them. They’re much more likely to frown upon code using eval than upon code using continuations (as far as I can see -- I could be wrong).

  • Jay
    Jay about 14 years
    So if I'm actually generating S-expressions based on user input using an algorithm that won't directly copy user input, and if that's easier and clearer in some specific situation than using macros or other techniques, then I suppose there's nothing "evil" about it? In other words, the only problems with eval are the same with SQL queries and other techniques that use user input directly?
  • Ken
    Ken about 14 years
    What does that rule have to do with GOTO? Is there any feature in any programming language with which you can't make a mess?
  • Tor Valamo
    Tor Valamo about 14 years
    The reason it's called "evil" is because doing it wrong is so much worse than doing other things wrong. And as we know, newbies will do stuff wrong.
  • Jay
    Jay about 14 years
    Thanks -- that's what I heard of eval before ("ask yourself why"), but I had never yet heard or read what the potential problems are. I see now from the answers here what they are (security and performance problems).
  • JUST MY correct OPINION
    JUST MY correct OPINION about 14 years
    And code readability. Eval can totally screw the flow of code and render it incomprehensible.
  • stesch
    stesch about 14 years
    @Ken: There is no GOTO rule, hence the quotation marks in my answer. There's just a dogma for people who are afraid to think for themselves. Same for eval. I remember speeding up some Perl script dramatically by using eval. It's one tool in your toolbox. Newbies often use eval when other language constructs are easier/better. But avoiding it completely just to be cool and please dogmatic people?
  • Jay
    Jay about 14 years
    Thanks! I just didn't understand the last point (evaluation at the wrong time?) -- could you elaborate a bit please?
  • Daniel Earwicker
    Daniel Earwicker about 14 years
    +1 as this is the real answer - people fall back on eval simply because they don't know there's a specific language or library feature to do what they want to do. Similar example from JS: I want to get a property from an object using a dynamic name, so I write: eval("obj.+" + propName) when I could have written obj[propName].
  • sepp2k
    sepp2k about 14 years
    I wouldn't say that code must be validated before evaling it in all circumstances. When implementing a simple REPL for example, you would probably just feed the input into eval unchecked and that wouldn't be a problem (of course when writing a web-based REPL you'd need a sandbox, but that's not the case for normal CLI-REPLs that run on the user's system).
  • Jay
    Jay about 14 years
    I see what you mean now, Rainer! Thanks!
  • Tor Valamo
    Tor Valamo about 14 years
    Like I said, you have to know exactly what happens when you feed what you feed into the eval. If that means "it will execute some commands within the limits of the sandbox", then that's what it means. ;)
  • asveikau
    asveikau over 13 years
    I don't understand why "lock-based threading" [sic] is in your list. There are forms of concurrency that don't involve locks, and problems with locks are generally well known, but I've never heard anyone describe using locks as "evil".
  • JUST MY correct OPINION
    JUST MY correct OPINION over 13 years
    asveikau: Lock-based threading is notoriously difficult to get right (I'd guess that 99.44% of production code using locks is bad). It doesn't compose. It is prone to turning your "multi-threaded" code into serial code. (Correcting for this just renders the code slow and bloated instead.) There are good alternatives to lock-based threading, like STM or actor models, that make the use of it in anything but the lowest-level code evil.
  • Jay
    Jay over 13 years
    I understand that the use of global env is true for both Common Lisp and Scheme; is it also true for Clojure?
  • Hello71
    Hello71 over 13 years
    @Daniel: "obj.+"? Last I checked, + isn't valid when using dot-references in JS.
  • claj
    claj over 12 years
    @Daniel probably meant eval("obj." + propName) which should work as expected.
  • asm
    asm about 12 years
    +1, I agree with @Daniel that this is the real problem. I work on a 20 year old Lisp application and I've seen very experienced Lisp programmers use eval when it's unnecessary (and more complicated than doing it correctly).
  • csl
    csl over 10 years
    In Scheme (at least for R7RS, perhaps also for R6RS) you must pass an environment to eval.
  • Throw Away Account
    Throw Away Account about 9 years
    The "evil input" problem with EVAL only affects non-Lisp languages, because in those languages, eval() typically takes a string argument, and the user's input is typically spliced in. The user can include a quote in their input and escape into the generated code. But in Lisp, EVAL's argument is not a string, and user input cannot escape into the code unless you're absolutely reckless (as in you parsed the input with READ-FROM-STRING to create an S-expression, which you then include in the EVAL code without quoting it. If you do quote it, there's no way to escape the quote).
  • artemonster
    artemonster over 8 years
    You may want to check out the kernel language ;)
  • Chocksmith
    Chocksmith about 8 years
    Just a comment to remind you that eval is extremely useful in AI applications where the program is the data.
  • Darren Ringer
    Darren Ringer about 8 years
    Hey, who're you calling an ugly hack!?
  • Kevin Kostlan
    Kevin Kostlan almost 8 years
    When TO use eval: The user needs to enter code in an interactive application. The workaround with builtin libraries is unusually obscure/verbose. When TO use macros: Performance (domain-specific compilation) once bottlenecks are identified, 3rd party library wrapper code generation. These are rare, maybe a total of <5% of the times it's used.
  • Rainer Joswig
    Rainer Joswig almost 8 years
    @KevinKostlan: I would be VERY VERY VERY careful about using EVAL in interactive applications: it creates a HUGE security problem when the user can execute arbitrary code by using EVAL. Especially in a situation when the interactive app runs on your side and the user is someone else somewhere...
  • Kevin Kostlan
    Kevin Kostlan almost 8 years
    @Rainer Joswig: Yes, if you have the server run client commands it had better be VERY well sandboxed, which means do NOT "roll your own" malicious code checker. If it's client code running on the client, is it a severe threat as long as the client knows that any code run gets all the privileges of the application?
  • Rainer Joswig
    Rainer Joswig almost 8 years
    @KevinKostlan: depends on the client app. Sometimes I would not want an 'end user' to peek and poke into the platform, or let other programs/services try to exploit such a 'feature', which could be a security problem. Basically injected code could do a lot of interesting things, like looking at the disk and sending the findings to some other place...
  • Loïc Faure-Lacroix
    Loïc Faure-Lacroix over 7 years
    @TorValamo ever heard of jail break?
  • szymanowski
    szymanowski about 6 years
    the "why chain" :) be sure to stop after 3 steps it can hurt.
  • ocodo
    ocodo about 6 years
    Probably worth adding on things like multiple inheritance, and meta-object-protocol since we're in a CL question.
  • Francois Bourgeois
    Francois Bourgeois over 3 years
    thank you for pointing out the other evil things, that you should never use :-)
  • Marc
    Marc over 3 years
    Perfect answer.
  • Will Ness
    Will Ness over 3 years
    in that case, why eval if we can compile and then funcall?