I am using this bit of code to reformat some large ajax responseText into proper binary data. It works, albeit slowly.
The data that I am working with can be as large as 8-10 megs.
I need to get this code to be as efficient as possible. How would loop unrolling or Duff's device be applied to this code while still keeping my binary data intact, or does anyone see anything that could be changed to help increase its speed?
var ff = [];
var mx = text.length;
var scc = String.fromCharCode;
for (var z = 0; z < mx; z++) {
    ff[z] = scc(text.charCodeAt(z) & 255);
}
var b = ff.join("");
this.fp = b;
return b;
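For reference, this is roughly the kind of unrolled loop I had in mind (a sketch only; I haven't measured whether it is actually faster, and the block size of 8 is arbitrary):

var scc = String.fromCharCode;
var mx = text.length;
var ff = new Array(mx);
var z = 0;
// Duff's-device style: peel off the remainder first, then run the
// rest of the work in unrolled blocks of 8.
var n = mx % 8;
while (n--) {
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
}
n = (mx - z) / 8;
while (n--) {
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
    ff[z] = scc(text.charCodeAt(z) & 255); z++;
}
var b = ff.join("");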
Thanks Pat
Your time hog isn't the loop. It's this line:
ff[z] = scc(text.charCodeAt(z) & 255);
Are you incrementally growing ff? That will be a pig, guaranteed. If you just run it under the debugger and pause it, I bet you will see it in the process of growing ff. Pre-allocate.
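A minimal sketch of what I mean; note that whether new Array(n) actually reserves storage up front is engine-dependent:

// Pre-size the array so the engine isn't repeatedly regrowing it
// inside the loop. Sketch only; behavior varies by engine.
var ff = new Array(mx);
for (var z = 0; z < mx; z++) {
    ff[z] = scc(text.charCodeAt(z) & 255);
}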
Convert the data to a JSON array on the server. 8-10 megabytes will take a long time to parse even with a native JSON engine, and I'm not sure why a JS application needs 8-10 megs of data in it anyway. If you are downloading it to the client's device, convert it on the server to the format they expect and just link to it; they can download and process it themselves.
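For illustration, a minimal client-side sketch, assuming a hypothetical /data.json endpoint where the server has already serialized the bytes as a JSON array of numbers:

var xhr = new XMLHttpRequest();
xhr.open("GET", "/data.json", true); // hypothetical endpoint
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // Native JSON parsing; yields e.g. [137, 80, 78, ...]
        var bytes = JSON.parse(xhr.responseText);
        // Work with the numeric byte values directly; no charCodeAt pass.
    }
};
xhr.send();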