Topic: The best javascript engine ever (Page 1 of 1)

 
_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-05-2006 12:44

Sounds promising, uh?....

A couple of weeks ago, I learnt how to make a compiler.
Which is very cool and "quite" simple.

Let me explain "quite"; it needs a bit of background:
compilation takes a lexer and a parser.

The lexer tokenizes a script into recognizable elements.
The parser verifies that the tokens compose an understandable pattern, e.g. that they obey a grammar, aka a syntax: ever heard a compiler complain about a syntax error? That's the parser getting mad at your code.

Existing tools like yacc or bison, and lex or flex, can help build such compilers.
Basically, what has to be defined is the language (or set of lexemes), in the form of a regular expression for each token,
and the grammar, in the form of composition (substitution) rules.
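Purely for illustration, a toy version of both parts can even be sketched in javascript itself. Everything here (the tiny arithmetic language, the token types, the function names) is made up for the example:

code:
// Toy lexer: turns a string into tokens using one regular expression per token type.
function tokenize(src) {
    var rules = [
        { type: 'number', re: /^\d+/ },
        { type: 'op',     re: /^[+\-*\/]/ },
        { type: 'space',  re: /^\s+/ }
    ];
    var tokens = [];
    while (src.length > 0) {
        var matched = false;
        for (var i = 0; i < rules.length; i++) {
            var m = src.match(rules[i].re);
            if (m) {
                if (rules[i].type !== 'space') tokens.push({ type: rules[i].type, text: m[0] });
                src = src.substring(m[0].length);
                matched = true;
                break;
            }
        }
        if (!matched) throw new Error('lexical error near: ' + src);
    }
    return tokens;
}

// Toy parser: checks the grammar "number (op number)*", i.e. tokens must alternate.
function parse(tokens) {
    for (var i = 0; i < tokens.length; i++) {
        var expected = (i % 2 === 0) ? 'number' : 'op';
        if (tokens[i].type !== expected) throw new Error('syntax error at token ' + i);
    }
    if (tokens.length % 2 === 0) throw new Error('syntax error: expression ends with an operator');
    return tokens;
}

// parse(tokenize("1 + 2 * 3")) passes; parse(tokenize("1 + + 3")) throws a syntax error.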

....Under this new light, I've spotted the cause of the memory problem Scott had a few weeks ago.
If you read my answers (and I hope Scott did), you already know that the way he accessed his DOM elements caused circular references, i.e. objects referring to objects that refer back to them,
and that these circular references were leaving some objects undestroyed when they were no longer in use.
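For anyone who missed that thread, a minimal sketch of the kind of circular reference I mean (the element id and the property names are invented for the example):

code:
// A DOM element and a plain wrapper object pointing at each other.
var node = document.getElementById('panel');   // hypothetical element
var wrapper = { element: node };                // the wrapper refers to the element...
node.wrapper = wrapper;                         // ...and the element refers back.

// Even after both variables are cleared, each object still holds a reference
// to the other, so a collector that only counts references never sees either
// of them as garbage.
node = null;
wrapper = null;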

This is due to the built-in javascript parsers of current browsers.

Processing recursive syntax rules, like circular references typically are, is very tedious: it's easy to process them to a fixed nth level, but it's difficult to make a generic syntax rule
that encompasses all possible levels of recursion of such references.

I am wondering, and this is 100% theory, whether it is "possible" at all. If it is, it means browsers' js engines can be optimized a lot...

and it opens up a way to build an Open Source js engine, for example based on SpiderMonkey, but made 100 times better than the existing ones.
A js engine customized by the loons who make the best javascripts, e.g.

You. Me. Us. The Asylum dhtml gurus.

liorean
Bipolar (III) Inmate

From: Umeå, Sweden
Insane since: Sep 2004

posted 06-06-2006 01:56

Heh. Welcome to my world.

To burst your bubble though:

  • JavaScript syntax has a few similarities with C and Perl. One of these is that there is no way to build a JavaScript engine that separates lexical syntax scanning (the lexer/scanner/tokeniser part, depending on what literature you're reading) from the grammatical syntax scanning (parsing, reduction, or a dozen or so other terms). JavaScript lexical syntax happens to be dependent on the grammatical syntax. As such, you cannot simply use tools such as bison and flex, because bison would need as its working material the tokens returned from flex, but flex would need to get data back from bison about what types of tokens are allowed in the current grammar context before it could give it further tokens (see the slash example after this list). There's a saying in the Perl world: "Only perl can parse Perl". While not strictly true, I hope you can see how the impossibility of separating scanner and parser makes for a much more difficult task. JavaScript is much easier to deal with than Perl on this point, though not entirely simple.

  • The parsing phase builds an AST from source code. This AST may (as in SpiderMonkey), or may not (as in JScript (1)), be translated into some type of byte code, depending on which engine you're looking at. Anyway, my point is that the parsing is over once the AST has been completed. The execution has yet to start, though. The memory leaks are not taking place in the parsing phase. They are not even taking place in the execution phase (2). These leaks you're talking about are located in the data, in all likelihood tied to a less than perfect garbage collection.

  • Garbage collection is the problem. JavaScript is a garbage collected language. But the object model of the underlying host isn't necessarily controlled by the same garbage collector.



    Microsoft COM uses a reference counting garbage collection scheme, and as such fails to collect garbage cycles. Mozilla XPCOM is very similar (IIRC it's binary compatible), but has (recently added, in fact) a garbage cycle detector to eliminate the problem of garbage cycles. Gecko - in contrast to Trident - has interfaces used to prevent garbage cycles in many of the cases very frequently hit from script, such as the DOM. Trident's memory leaks could be addressed in a similar way. I don't have enough knowledge to say anything about how these things are dealt with in the WebKit and Presto rendering engines, but I can note that WebKit is rather leaky, while Presto is the least leaky engine of them all.
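To make the first point above concrete, here is the classic slash example, as a small self-contained sketch:

code:
// The same "/" character, read two different ways:
var g = 2, i = 4, x = 10;
var a = x /g/i;              // division twice: (x / g) / i, i.e. 1.25
var b = "log".match(/g/i);   // a regular expression literal: ["g"]
// A standalone tokenizer cannot tell which token the "/" starts without
// knowing whether the parser currently expects an operator or an operand.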



In other words: The scripting engines are not the problem. The host object models are.

If you want to see an execution time leak, i.e. a leak in an actual scripting engine, try the following in SpiderMonkey.
Warning: Will consume as close to 100% CPU as it can get and will eventually crash due to running out of memory, but may take a while doing so if you have a lot of RAM

code:
// Sets the array length to the largest value an array length can hold (2^32 - 1)...
var a = [];
a.length = 0xffffffff;
// ...then pushes two more values past that limit; per the warning above,
// SpiderMonkey at the time would spin at full CPU and eat memory until it crashed.
a.push(0xffffffff, 0x100000000);



That's how one of those may look. I can't actually think of a memory leak in the parsing phase, but that doesn't mean there aren't any.





1. Not sure about this, but I haven't heard anything indicating it would use a byte code engine. Maybe I should ask Eric Lippert about it.

2. Execution time leakage is almost always accompanied by either
- 100% activity and continually rising memory consumption
- or with the application becoming unresponsive due to deadlocks.

--
var Liorean = {
abode: "http://web-graphics.com/",
profile: "http://codingforums.com/member.php?u=5798"};

(Edited by liorean on 06-06-2006 02:15)

_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-07-2006 10:15

Ok, thanks for the pointers, but I have to understand it all, and maybe burst your bubble back about what happened to Scott some time ago

1) Why not a semi-compiled engine? I am taking flex and bison as "academic examples" and tools to support a js parser's development, but why not a Java-like engine? What in the js model prevents that, exactly?
I'd be tempted to say safety, among other things, but I really, really don't see how a semi-compiler would differ from a Java JIT machine or ActionScript in the Shockwave Flash component: those things exist
and they do work in browsers.

Why would the system gc miss out on elements?

In other words:

quote:

Garbage collection is the problem. JavaScript is a garbage collected language. But the object model of the underlying host isn't necessarily controlled by the same garbage collector.



How does this impact the proper functioning of the garbage collector?

>> in all likelihood tied to a less than perfect garbage collection.

I've posted many things in that thread, which apparently ended up unread, and I don't really have time to dig them up.
The leaks he pointed out are classical runtime leaks: memory that got out of the js interpreter's control, objects that simply "lost their referrer" but kept their memory space.

From several accounts and docs on the internet, from several sources including youngpup, this is caused by circular references being "misprocessed", and this is a syntactic limitation
of many parsers and interpreters afaik, in terms of "recursive syntactic rules".

Basically, what happened in his code is that he created objects referring to objects referring to... and the chain got "broken"
at some point when he deleted them "in the wrong order", leaving some objects unreferenced and therefore undeletable by the gc, or anyone else for that matter.

I even had a sample bison syntax which caused exactly this issue in my papers somewhere: by creating circular references and deleting elements in the wrong order
inside the code, you could get very similar behaviors to the ones being discussed.

One fact about this: there are js practices that prevent these kinds of memory management issues.
So the higher level syntax has a direct impact on those, regardless of the gc model or behavior.


...I certainly don't understand how what you said applies to the gc behavior and impacts it, but I understand the issue
described by Scott is caused by circular object references, and I understand I saw the same thing occur in other parsing tools because of syntactic limitations
(which, of course, don't cause the leaks at parsing time, but "prepare" the objects to be incorrectly treated and deleted).

_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-07-2006 10:34

Allelujah! Here is the wonderful paper (fascinating stuff) that ties both ends, answers my question, and clarifies things.
Read the part about closures, then memory leaks:

quote:

http://simon.incutio.com/slides/2006/etech/javascript/js-reintroduction-notes.html



The origin of the issue is syntactic, somehow: js allows for syntactic mistakes, in that it allows closures which can easily produce circular references.
FF's powerful gc system tries to resolve this, but it's already acting as a workaround.... In other words, classical gc will indeed fail at collecting those references,
while some other approaches will succeed.

Problem? Well, the origin, indeed, is syntactic: js's syntax allows these errors to be made. On purpose.

Rhmmmm... there has to be a better way.

poi
Paranoid (IV) Inmate

From: Norway
Insane since: Jun 2002

posted 06-07-2006 12:28

[quickie] Using closures is normal and useful. The fact that vanilla closures lead to memory leaks is not a surprise, but it's possible to do Leak Free Javascript Closures. [/quickie]
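A minimal sketch of the idea behind such leak-free closures; the element, handler and function names here are invented, and this is just one of the known patterns:

code:
// Leaky variant (in refcounted hosts like old IE): the closure captures "el",
// so the element and its handler end up referencing each other.
function attachLeaky(el) {
    el.onclick = function () {
        el.style.color = 'red';
    };
}

// Leak-free variant: the handler reaches the element through "this" at call
// time, and the local reference is dropped, so no script-side reference back
// to the element survives and no cycle is created.
function paint() {
    this.style.color = 'red';
}
function attachSafe(el) {
    el.onclick = paint;
    el = null;
}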

Tyberius Prime
Maniac (V) Mad Scientist with Finglongers

From: Germany
Insane since: Sep 2001

posted 06-07-2006 13:02

Or, pragmatically, use the dojo toolkit, which fixes the closure-as-event-handler problem while at the same time being cross-browser.

_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-07-2006 13:42

I'd opt for the dojo thing, but the fact that scripters are reaching the limits of their favourite language shows a design issue.

I mean, closures and their existence are not the issue, as I said above (100% agree with poi).

But the high-level abstraction should prevent scripters from compromising the underlying memory allocation; one should not have to worry about memory allocation when scripting. It doesn't make much sense
to have to, since you have no real control, does it?

I strongly think this shows that js should be redesigned somehow, either to become something more strict and more powerful (the javascript 2 initiative), or something that doesn't trade
flexibility for reliability and doesn't force syntax-specific best practices.

Something smells in the concept of javascripters being able to cause leaks.

liorean
Bipolar (III) Inmate

From: Umeå, Sweden
Insane since: Sep 2004

posted 06-07-2006 17:40
quote:
_Mauro said:
Why not a semi-compiled engine? I am taking flex and bison as "academic examples" and tools to support a js parser's development, but why not a Java-like engine? What in the js model prevents that, exactly? I'd be tempted to say safety, among other things, but I really, really don't see how a semi-compiler would differ from a Java JIT machine or ActionScript in the Shockwave Flash component: those things exist and they do work in browsers.

Why would the system gc miss out on elements?

I'm not sure you really understand what you're talking about here, because you're conflating two separate things - executable program and data.

Parsing generates an AST. The AST is either interpreted or compiled. If the AST is compiled to byte code, then the byte code is compiled (often JITted) into machine code at run time. An interpreter is generally lower performance, but reduces the time needed to go from source code to executing the program. A byte code engine is generally higher performance, but takes more time to go from source code to executing the program. There are JavaScript engines of both the interpreted and the byte code compiled kind. These types of memory leaks are present in either case.
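To make the distinction concrete, here is a hypothetical miniature of that pipeline, with the AST written out by hand instead of being produced by a parser:

code:
// The tree a parser might emit for the source text "1 + 2 * 3".
var ast = {
    type: 'add',
    left:  { type: 'number', value: 1 },
    right: {
        type: 'multiply',
        left:  { type: 'number', value: 2 },
        right: { type: 'number', value: 3 }
    }
};

// A tree-walking interpreter: execution is a separate phase that only starts
// once the tree exists. A byte code engine would instead translate the same
// tree into instructions and run those.
function evaluate(node) {
    switch (node.type) {
        case 'number':   return node.value;
        case 'add':      return evaluate(node.left) + evaluate(node.right);
        case 'multiply': return evaluate(node.left) * evaluate(node.right);
    }
}
// evaluate(ast) returns 7.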

But a language engine has more parts than the executable program. The language engine must also hook up a host environment. And that's where almost all the memory leak problems in browsers appear. Memory leaks generally don't happen in the JavaScript engine controlled object space. That object space is governed by the JavaScript garbage collector, and that garbage collector is quite good at getting all circular references in all of the browser hosted JavaScript engines.

But to do anything useful in the language you need to use the host environment, and here's where the problem comes in. Generally the browser host is coded in C/C++, Obj-C or Java, and except for some Java browsers it is not governed by the same garbage collector. So, the language and its executable program and its internal object space are not the problem. The garbage collection mechanism of the host object space is the problem. Or rather, making sure that garbage collection works in both of the object spaces and can cross the object space border in both directions during a collection. Mozilla developers are working on a unified system to allow all the different garbage collectors to interact, allowing different garbage collection schemes to work by using a common interface. That way, SpiderMonkey can do its arena garbage collection, XPCOM can do its refcounting (now with a cycle detector!), Python can use whatever garbage collection system that engine uses, etc., and all the environments can detect cycles crossing the object space borders.

Java browsers, .NET browsers etc. could choose to go the other way - they could choose to write the entire host to be governed by the same garbage collector. It would still not fix the problem with garbage collection when crossing the object space border (say into the Flash plugin), but it would drastically reduce the number of border crossings that must be done.

But most browsers are not written for a one-GC-to-rule-them-all run time engine, are they? It's not the JavaScript engine that the problem lies with. It's the host that needs to be fixed.


quote:
/snip/
How does this impact the proper functioning of the garbage collector?

As explained above - because the garbage collectors need to be able to detect cycles when references cross into other object spaces, both cycles within that other object space and cycles crossing the border more than once.

quote:
I've posted many things in that thread, which apparently ended up unread, and I don't really have time to dig them up. The leaks he pointed out are classical runtime leaks: memory that got out of the js interpreter's control, objects that simply "lost their referrer" but kept their memory space. From several accounts and docs on the internet, from several sources including youngpup, this is caused by circular references being "misprocessed", and this is a syntactic limitation of many parsers and interpreters afaik, in terms of "recursive syntactic rules".

This is what I think you're misunderstanding. The circular references have nothing at all to do with the parsing of the JavaScript and "recursive syntax rules". They aren't even located in the executable program resulting from the parsing of the source. They're all located in the data. And the failures to collect them are because the garbage collector sometimes fails to detect cycles when crossing object spaces, or because one of the object spaces doesn't have any cycle detection at all. Microsoft COM is refcounted, and doesn't have a garbage cycle detector/dead object detector. Most C/C++ code in the browser hosts uses manual memory management, and here too failure to detect cycles can happen. And crossing over into plugins or extensions, even if the object space on both sides is perfectly free of garbage cycles, can introduce memory leaks because the extension or plugin and the browser host fail to communicate across the border.

quote:
Basically, what happened in his code is that he created objects referring to objects referring to... and the chain got "broken" at some point when he deleted them "in the wrong order", leaving some objects unreferenced and therefore undeletable by the gc, or anyone else for that matter.

Yeah, but what does that have to do with the parsing? The parser doesn't know anything about objects, references etc. The parser shouldn't have to know, either. If a language doesn't have explicit memory management, and JavaScript is one of those languages, then memory management is the sole responsibility of the object spaces. And even in languages that have explicit memory management, the parser shouldn't have to care, because then the memory management is the responsibility of the executable program.

quote:
I even had a sample bison syntax which caused exactly this issue in my papers somewhere: by creating circular references and deleting elements in the wrong order inside the code, you could get very similar behaviors to the ones being discussed. One fact about this: there are js practices that prevent these kinds of memory management issues.

If you look at what those practices amount to, you'll see it boils down to a few rules:

- Basic memory management by refcounting means that cycles will not be collected. Not at all.
- That means an object may never be allowed to contain a reference to itself, or to any other object that contains a reference chain in which the first object can be found.
- So, the solution is either to manually remove cycles (which you then need to keep track of manually whenever you introduce one)
- Or to use a non-cyclic approach, such as storing all references in a global array and storing an index into the array instead of the actual reference.

Those are workarounds in the executable program around deficiencies in the object spaces. To fix the problem, you need to fix the garbage collection mechanism in those object spaces. The problem may still remain, however, if the garbage collectors don't communicate across object space borders, because even if they detect cycles internally, they will fail to detect cycles going through the other object spaces.
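For completeness, a rough sketch of the last workaround in the list above, with invented names: handlers keep an index into a global table instead of a direct reference, and the table is emptied in one place.

code:
// Global table of host objects; script code passes around indexes, not references.
var registry = [];

function registerElement(el) {
    registry.push(el);
    return registry.length - 1;
}

function makeHandler(index) {
    return function () {
        registry[index].style.color = 'red';   // look the element up on demand
    };
}

// Break every script->element reference in one sweep, e.g. on page unload,
// so nothing is left for a refcount-only collector to miss.
function releaseAll() {
    for (var i = 0; i < registry.length; i++) registry[i] = null;
    registry.length = 0;
}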

quote:
So the higher level syntax has a direct impact on those, regardless of the gc model or behavior. ... I certainly don't understand how what you said applies to the gc behavior and impacts it, but I understand the issue described by Scott is caused by circular object references, and I understand I saw the same thing occur in other parsing tools because of syntactic limitations (which, of course, don't cause the leaks at parsing time, but "prepare" the objects to be incorrectly treated and deleted).

Well, the problem is really disconnected from the parser, that's what I'm saying.

The parser is only responsible for turning source code into an AST that eventually becomes an executable program. Objects may, or may not, be created by that executable program, but that has nothing to do with turning source code into executable program.
Object lifetimes are handled in the memory management system, whether that is manual allocation/deallocation, manual refcounting, automatic refcounting or some more advanced type of garbage collection scheme.
And since the memory management of browsers is fragmented into several different object spaces, each governed by a separate garbage collector, memory leaks appear if any of those object spaces either fails to detect cycles internally or fails to communicate sufficiently with the other garbage collectors.

--
var Liorean = {
abode: "http://web-graphics.com/",
profile: "http://codingforums.com/member.php?u=5798"};

(Edited by liorean on 06-07-2006 17:45)

_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-07-2006 18:35

Ok, thanks for trying to clarify, but I'll play the moron / devil's advocate to the fullest extent.
I'd be glad if I had answers to the following questions:

- how come those issues don't arise with other semi-interpreted / semi-compiled languages like Java (the garbage collection model of Java is very close to javascript's, afaik)? I have applets that can run for days.

- does it make sense design-wise to have a scripting language without memory control be subject to huge memory management issues?

- what is your background? You said welcome to my world. Where / what is your world?

liorean
Bipolar (III) Inmate

From: Umeå, Sweden
Insane since: Sep 2004

posted 06-07-2006 19:45
quote:

_Mauro said:
how come those issues don't arise with other semi-interpreted / semi-compiled languages like Java (the garbage collection model of Java is very close to javascript's, afaik)? I have applets that can run for days.

Well, how many of the objects you're using in those applets come from another environment, such as COM? Most Java programs don't even touch the outside world, all objects used are native Java constructs and interfaces, and everything resides in the same object space - governed by the Java garbage collector. But how well do you think Java would handle it if half the objects used in the program were COM objects written in C/C++ or even Visual Basic, as is the case in most web scripting in iew?

quote:
does it make sense design-wise to have a scripting language without memory control be subject to huge memory management issues?

Well, no. But that's the thing I think you're not realising - the language and the browser are separate things.
The memory leak isn't in the JavaScript engine, the memory leak is in the browser host.
It's actually quite hard to find a memory leak in the language engine itself.

The host environment is badly designed, because JScript didn't even exist when COM was designed and COM was never built to work with a language as powerful as JScript. COM was built for VB and never had to deal with things like closures. Iew is using an object model that isn't advanced enough to support the main scripting engine. Had COM been mark-and-sweep garbage collected - or even just had a garbage cycle detector - then you wouldn't see even a tenth of the memory leaks you can find today.


quote:
what is your background? You said welcome to my world. Where / what is your world?

Last two years, I've been studying parsing and compiler building and tried to build parsers for various things, among them JavaScript and CSS, in more than one language. Most recently I've spent my evenings on writing a real, full-fledged parser for selectors in JavaScript to implement the Selectors API. (Which is practically complete, by the way. I'm writing on the code generator at the moment.)

--
var Liorean = {
abode: "http://web-graphics.com/",
profile: "http://codingforums.com/member.php?u=5798"};

(Edited by liorean on 06-07-2006 19:47)

_Mauro
Maniac (V) Inmate

From:
Insane since: Jul 2005

posted 06-08-2006 01:05

K. Don't get me wrong, I very much appreciate a debate and being contradicted, I am merely paying you respect by playing
my part as the devil's advocate you know (and I do know you're good, as if I had to say that).

So....

quote:

Java would handle it if half the objects used in the program were COM objects written in C/C++ or even Visual Basic



Well, astoundingly well, which puzzles me a bit.
For instance, make sure you have a recent JVM, and google "jogl", then hit the demos, any of the demos, especially the one which integrates several openGL contexts in Java swing subwindows.
Granted, the c components / Java components talk is controlled by the VM, but... what you have there is a Java class which will call the right dynamic library depending on the OS,
and will talk through it to the gfx hardware using OpenGL 1.5 specs: it's stable as hell.

Ok... just checked the demos and they are webstart applications, but anything, and trust my Java expertise on this one, any Java app could virtually be turned into an applet,
and would in many cases behave very closely to how it would have done as a standalone webstart app. Not to mention I've already used OpenGL contexts inside applets myself.

And to be precise, and maybe to give you a bit of a challenge and interesting paths to explore, there is no real "delete" operator or the like in Java. In fact, destruction
is almost never explicit in Java (ok, variables can be explicitly nulled out, the system gc can be tickled, etc., but no real destruction occurs).

liorean
Bipolar (III) Inmate

From: Umeå, Sweden
Insane since: Sep 2004

posted 06-08-2006 03:02
quote:

_Mauro said:
Well, astoundingly well, which puzzles me a bit.
For instance, make sure you have a recent JVM, and google "jogl", then hit the demos, any of the demos, especially the one which integrates several openGL contexts in Java swing subwindows. Granted, the c components / Java components talk is controlled by the VM, but... what you have there is a Java class which will call the right dynamic library depending on the OS, and will talk through it to the gfx hardware using OpenGL 1.5 specs: it's stable as hell.

I don't doubt that, I've heard people with far more experience of graphics handling than I say JOGL is really good. (Graphics never were my thing...)

JOGL is controlled by the Java garbage collector, but performs manual C style memory management for the OpenGL API, right? At least that's how I've gotten it explained to me.

Now, I'm not saying COM couldn't be handled through a similar API in Java - and I think Mozilla has this problem mostly solved for Java interaction with XPCOM.
However, the difference between JOGL in Java and COM in JScript is that there's no compatibility layer in between in the latter case. There's no wrapping of the COM object handling in a native object. If you attach a JScript event listener on a COM Element object, then you are adding a Function object from the JScript object space as a direct property on the Element object from the COM object space.
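Spelled out as code, that situation looks roughly like this (the element id is invented):

code:
var button = document.getElementById('go');   // Element: lives in the COM object space
button.onclick = function () {                // Function: lives in the JScript object space
    button.value = 'clicked';                 // the closure keeps a reference to the element
};
// COM side:     Element ---(onclick property)---> Function
// JScript side: Function ---(closure scope)-----> Element
// The cycle crosses the object space border, which is exactly where a
// refcounting collector without cycle detection gives up.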

As I understand it - and I admit I may very well have this wrong - in the JOGL case only the JOGL library directly interfaces with the OpenGL memory handling. All objects you as a user of that library handle are controlled by the Java garbage collector. Since the crossing of the object space border is so narrow, only the library code has to actually deal with that object space and thus the memory management is localised and references are easy to keep track of.

But how do you deal with it when every other object you're handling is governed by another garbage collector, when that garbage collector uses refcounting and lacks cycle detector? How do you deal with it when it's not the Java DOM you're dealing with but an external DOM implementation? Again, I'm not saying Java couldn't deal with it better than JScript does today, I'm just saying that the solution to the problem is largely the same for both languages. Either you eliminate the object space crossings, or you make sure garbage collection with cycle detection works in all object spaces as well as across the border.

Either one-GC-to-rule-them-all or many-GCs-all-cooperating. What you have in iew today is the many-GCs-not-fully-cooperating situation. If Microsoft ported iew to .NET, then they could go with the one-GC-to-rule-them-all solution, at the cost of backwards compatibility. Or they could just continue on with the current many-GCs-not-fully-cooperating solution and keep backwards compatibility, fixing host objects as they go by adding cycle detection in the iew hosted COM environment. Eventually they'll have fixed the entire browser host, then all you need to keep a watch out for is non-browser-host COM objects.

quote:
And to be precise, and maybe to give you a bit of a challenge and interesting paths to explore, there is no real "delete" operator or the like in Java. In fact, destruction is almost never explicit in Java (ok, variables can be explicitly nulled out, the system gc can be tickled, etc., but no real destruction occurs).

Yeah, it's one of the problems with the C# and Java memory management systems as I see it. Timely destruction, even if you have to make it explicit, should, if you ask me, be absolutely required of any exception based language. It's what makes the RAII design pattern possible.

It's maybe not necessary for a language like JavaScript that doesn't even deal with threading, I/O, OS API calls or database integration in the most common host.
But it does make for much more elegant error handling and rollback systems for those types of features than if the language lacks timely destruction.



Oh, and in case you hadn't noted it - the delete keyword in JavaScript has nothing to do with the same keyword in C/C++ either. The only thing it does is remove a property from an object. It might not even do that: in the JScript engine I believe all it does is set the property to undefined and prevent enumeration by flagging the property as deleted.
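A quick illustration of what the specified behaviour amounts to:

code:
var obj = { answer: 42 };
delete obj.answer;          // in a compliant engine: removes the property binding, evaluates to true
// ('answer' in obj) is now false, but the object itself is untouched; its
// memory is only reclaimed later, when the garbage collector finds no
// remaining references. Nothing like C++'s delete, which destroys the
// object immediately.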

--
var Liorean = {
abode: "http://codingforums.com/",
profile: "http://codingforums.com/member.php?u=5798"};

DizzZ
Obsessive-Compulsive (I) Inmate

From: Dnipropetrovsk ;)
Insane since: Jan 2006

posted 09-02-2006 11:42

Look at Flash Player 9 (I know that it is not JS, but the engine is very, very similar).
Those guys rewrote the garbage collector and increased the speed many times!
It's a pity Mozilla and MS are not doing that...
But I hear that MS is already trying to make their engine better (according to their site they are already using a modified engine in IE7 RC, the last release candidate).

About a JS compiler (or semi-compiler): sorry, I think it can't be real. The best improvement for JS I see is JIT (I believe it is possible to make a JIT for JS). JS is a young language; engines and other tools will evolve.
-----------------------------------------------------------------------------------------
var pleaseVisit=window.open("http://trickyscripter.com","mySite","")

(Edited by DizzZ on 09-02-2006 11:52)

poi
Paranoid (IV) Inmate

From: Norway
Insane since: Jun 2002

posted 09-02-2006 12:06

JS compilers are a reality.
Having strong typing, as planned in JS2.0, might help the developers of JS engines to improve performance.

DizzZ
Obsessive-Compulsive (I) Inmate

From: Dnipropetrovsk, Ukraine
Insane since: Jan 2006

posted 09-02-2006 15:43

AS3.0 already has strong typing. It greatly improves the speed (many times over), but it is still an interpreter.
What processor instructions would a JS compiler generate? x86? SPARC?
I think the JIT approach is the best in the case of web technologies: compile just before running the code and use the most advanced features of the client's computer.

---------------------------------------------------------------------------------------------------------------
var pleaseVisit=window.open("http://trickyscripter.com","mySite","")

poi
Paranoid (IV) Inmate

From: Norway
Insane since: Jun 2002

posted 09-02-2006 16:06

I don't know about AS3.0, but I've played a little bit with Flash 8 and FLASM, and the AS is transformed into some assembly-like language and compiled into some virtual machine (op)code. The same can be/is done with some JavaScript engines, and it's much faster than parsing and interpreting the JS code on the fly.

Anyway, it's good to know that strong typing helped improve the performance of AS3.0, since JS2.0 is going in that direction.



(Edited by poi on 09-02-2006 16:10)

DizzZ
Obsessive-Compulsive (I) Inmate

From: Dnipropetrovsk, Ukraine
Insane since: Jan 2006

posted 09-02-2006 18:37

AS3.0 is just ActionScript 3.0.
AS1.0 is very similar to JS, but w/o RegExps and some other features.
AS2.0 is similar to JS2.0, but also w/o some features.
Actually, AS2.0 code compiles into exactly the same opcode as AS1.0.
All the types in AS2.0 were used only for error checking at compile time (and there was no speed increase).
AS3.0 is still at the beta stage but is already used by some people.
It uses a different virtual machine (which is faster itself) and uses REAL strong typing.

Flash Player is now like the Java VM or the .NET CLR.
And I believe that the next version of JS must also be VM based and use bytecode.
I think it must be an interpreter with a JIT (like in Java).

---------------------------------------------------------------------------------------------------------------
var pleaseVisit=window.open("http://trickyscripter.com","mySite","")


