How to load large local files using JavaScript?
Is there a way to handle very large files (2 GB and over) locally in JavaScript without crashing the browser?
*I am aware of the input tag and the FileReader API, but there seems to be nothing like Node.js streams.
Tags: javascript, file, blob, local, filereader
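Worth noting: modern browsers and Node 18+ expose Blob.prototype.stream(), which returns a ReadableStream, arguably the closest web analogue of the Node.js streams the question asks about. A minimal sketch, assuming the stream is async-iterable (true in Node and Firefox; other engines may need an explicit getReader()/read() loop):

```javascript
// Count the bytes in a Blob/File without ever holding the whole file in memory.
// blob.stream() yields Uint8Array chunks lazily, much like a Node readable stream.
async function byteLength(blob) {
  let total = 0;
  // Async iteration over a ReadableStream works in Node and Firefox;
  // elsewhere, fall back to blob.stream().getReader() and a read() loop.
  for await (const chunk of blob.stream()) {
    total += chunk.byteLength; // process each chunk here instead of buffering it
  }
  return total;
}
```

In a page you would feed it a File from an input, e.g. `byteLength(fileInput.files[0])`, where `fileInput` is a hypothetical `<input type="file">` element.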
What exactly are you planning on doing with this file? Just curious.
– SiddAjmera
yesterday
This is mainly hypothetical for future development at the moment, but the reason I thought of asking this question was because of wanting to extract data from a dump of my Google data. Obviously, any kind of functionality I would use this for is to extract a smaller amount of data at once such as "find all dates when I am at specific location".
– Damien Golding
yesterday
Any reason why you would be asking js to strain under a job that serverside would find trivial?
– lucas
yesterday
@lucas No network upload so reduces data costs + can be done on slow network environments. Users can share the processing burden instead of having the server owner burden the load of everyone using the service. Don't like uploading private data to external server.
– Damien Golding
yesterday
You can offload your long-running task (large-file handling in this case) to Web Workers (developer.mozilla.org/en-US/docs/Web/API/Worker). The worker runs in another thread (context), so it won't block your main thread (the browser UI).
– meteorzeroo
yesterday
asked yesterday by Damien Golding
1 Answer
FileReader lets you read the contents of a file asynchronously. For a large file (2 GB in your case), you can use FileReader.readAsArrayBuffer() to read one chunk of the file into memory at a time, so it won't crash the browser; this blog is a good example.
The important part here is the slice() method of Blob. readAsArrayBuffer will crash the browser just the same if you pass it a Blob that is too big.
– Kaiido
16 hours ago
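To make the chunking concrete, here is a minimal sketch of the slice-and-read loop Kaiido describes. It uses the promise-based Blob.prototype.arrayBuffer() in place of FileReader.readAsArrayBuffer() (same idea, less callback plumbing); the `chunkSize` and `onChunk` names are illustrative, not from the answer:

```javascript
// Read a Blob/File piece by piece: only about `chunkSize` bytes are in memory at once.
async function readInChunks(blob, chunkSize, onChunk) {
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    // slice() is cheap: it creates a view on the file, it does not copy bytes yet
    const piece = blob.slice(offset, offset + chunkSize);
    const buffer = await piece.arrayBuffer(); // the actual read happens here
    onChunk(buffer, offset);
  }
}
```

With a plain FileReader you would run the same loop but wrap each readAsArrayBuffer call in a promise that resolves in the reader's onload handler.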
answered 16 hours ago by meteorzeroo (accepted)