Yet in Firebug,

    var a = 10;
    delete a;
    console.log("this is a: " + a);

consistently returned:

    ReferenceError: a is not defined (Fool!)2
How could this be happening?
If I reworked the code and ran it as a script in a web page, it gave me "this is a: 10", as expected.
After a bit of research (thank you, Google), I found the answer. Ironically, it was in a blog entry about another book by the same author that was now puzzling me...
The solution is well worth reading, but I'll see if I can summarize it:
- Variables are in fact properties with a "don't delete" flag set.
- Variables that would otherwise be global, but are declared in code executed by an eval call, are in fact deletable (the flag isn't set)!
- Those deletable variables are added to the calling context's variable object, for want of a better place to put them.
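The difference is easy to probe directly. Here is a minimal sketch (the helper name probeDelete is mine, not from the post; it must run in non-strict mode, e.g. a classic script or a browser console, because strict mode forbids delete on plain variable names):

    // Compare a normally declared variable with one introduced by eval.
    function probeDelete() {
        var normal = 10;          // declared binding: "don't delete" flag set
        eval('var evaled = 20;'); // eval-introduced binding: flag not set
        return {
            normalDeleted: delete normal, // false - the variable survives
            evaledDeleted: delete evaled  // true  - the binding is removed
        };
    }

    console.log(probeDelete()); // { normalDeleted: false, evaledDeleted: true }
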
Of course this answer poses another question: why are global variables that are in code executed in an eval context deletable?
I think that the following code might suggest an answer:
    (function test() {
        eval('var wtf = "wtf";');
        console.log(wtf);        // wtf
        console.log(delete wtf); // true
    })();
When you evaluate code you will possibly have variables added to your calling context that you might not want there. By marking them as deletable, you have a chance to clean them up. Now to work out how you can do this automatically...
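One possible approach is a wrapper around eval that snapshots the global object's keys, evaluates the code, and then deletes any keys the evaluated code added. This is my own sketch, not something from the original post: the helper name cleanEval is invented, and it only catches variables that an indirect eval adds to the global object.

    // Hypothetical helper: because eval-introduced globals lack the
    // "don't delete" flag, we can sweep them away after evaluation.
    function cleanEval(code) {
        var before = Object.keys(globalThis);
        var result = (0, eval)(code); // indirect eval: runs in global scope
        Object.keys(globalThis).forEach(function (key) {
            if (before.indexOf(key) === -1) {
                delete globalThis[key]; // succeeds for eval-created vars
            }
        });
        return result;
    }

    console.log(cleanEval('var leaked = "oops"; leaked.length')); // 4
    console.log(typeof leaked); // "undefined" - the global was cleaned up
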
Footnotes
1 Page 12, JavaScript Patterns by Stoyan Stefanov, September 2010: First Edition.
2 Fool! wasn't really part of the message: that is just how I felt when I saw the error.