How to load large local files using JavaScript?
Is there a way to handle very large files (2 GB and over) locally in JavaScript without crashing the browser?



*I am aware of the input tag and the FileReader API, but it seems there is nothing like Node.js streams.
Tags: javascript, file, blob, local, filereader
asked yesterday by Damien Golding

Comments:

  • What exactly are you planning on doing with this file? Just curious. – SiddAjmera, yesterday

  • This is mainly hypothetical for future development at the moment, but the reason I thought of asking this question was because of wanting to extract data from a dump of my Google data. Obviously, any kind of functionality I would use this for is to extract a smaller amount of data at once, such as "find all dates when I am at a specific location". – Damien Golding, yesterday

  • Any reason why you would be asking JS to strain under a job that server-side would find trivial? – lucas, yesterday

  • @lucas No network upload, so it reduces data costs and can be done in slow network environments. Users can share the processing burden instead of the server owner bearing the load of everyone using the service. Also, I don't like uploading private data to an external server. – Damien Golding, yesterday

  • You can offload your long-running task (large file handling in this case) to a Web Worker (developer.mozilla.org/en-US/docs/Web/API/Worker). The worker runs in another thread (context), so it won't block your main thread (the browser UI). – meteorzeroo, yesterday
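The Web Worker suggestion in that last comment could be sketched roughly as below. This is a hypothetical sketch, not code from the thread: the worker source is inlined via a Blob URL purely to keep the example self-contained, and the 16 MiB chunk size and function names are illustrative. File objects are structured-cloneable, so posting one to a worker does not copy the file's bytes.

```javascript
// Hypothetical sketch: hand the File object to a worker so the chunked
// scan happens off the main thread and the page stays responsive.

const workerSource = `
  self.onmessage = async (e) => {
    const file = e.data;
    const chunkSize = 16 * 1024 * 1024; // 16 MiB per read (illustrative)
    let bytesRead = 0;
    for (let start = 0; start < file.size; start += chunkSize) {
      const buf = await new Promise((resolve) => {
        const r = new FileReader(); // FileReader is available in workers
        r.onload = () => resolve(r.result);
        r.readAsArrayBuffer(file.slice(start, start + chunkSize));
      });
      bytesRead += buf.byteLength;
      // ...scan buf here, e.g. search for the dates/locations mentioned above...
    }
    self.postMessage({ bytesRead });
  };
`;

function scanFileInWorker(file, onResult) {
  const url = URL.createObjectURL(new Blob([workerSource]));
  const worker = new Worker(url);
  worker.onmessage = (e) => {
    onResult(e.data);        // worker reports its summary when finished
    worker.terminate();
    URL.revokeObjectURL(url);
  };
  worker.postMessage(file);  // File is cloned by reference, not by bytes
}
```

Usage would look like `scanFileInWorker(input.files[0], (r) => console.log(r.bytesRead))`, where `input` is a file `<input>` element.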
1 Answer
FileReader lets you read the contents of files asynchronously. For a large file (2 GB in your case), you can read one fixed-size chunk of the file into memory at a time with FileReader.readAsArrayBuffer(), so the whole file is never resident at once and the browser won't crash; this blog is a good example.
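A minimal sketch of this chunked approach (the helper and callback names are illustrative, not from the answer): Blob.slice() produces a lightweight view of a byte range without loading it, and readAsArrayBuffer() is then called on one small slice at a time.

```javascript
// Pure helper: compute [start, end) byte ranges for each chunk.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// Read `file` chunk by chunk; calls onChunk(arrayBuffer, index) per chunk.
// Only one chunk's ArrayBuffer is alive at a time, so memory stays bounded.
function readInChunks(file, chunkSize, onChunk, onDone) {
  const ranges = chunkRanges(file.size, chunkSize);
  let index = 0;

  function readNext() {
    if (index >= ranges.length) {
      onDone();
      return;
    }
    const [start, end] = ranges[index];
    const reader = new FileReader();
    reader.onload = () => {
      onChunk(reader.result, index); // reader.result is an ArrayBuffer
      index += 1;
      readNext();                    // only now is the next chunk loaded
    };
    // Blob.slice() is the key: it creates a view without reading any bytes.
    reader.readAsArrayBuffer(file.slice(start, end));
  }
  readNext();
}
```

Usage might look like `readInChunks(input.files[0], 16 * 1024 * 1024, (buf, i) => { /* scan buf */ }, () => console.log("done"))`, with `input` being a file `<input>` element.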
answered 16 hours ago by meteorzeroo (accepted)

Comments:

  • The important part here is the slice() method of Blob. readAsArrayBuffer will just crash the browser the same if you pass a too-big Blob to it. – Kaiido, 16 hours ago