
Large file question

  • Thread starter William Roberts

William Roberts

Guest
I have a file with over 150,000 lines, each with 3 fields (comma-delimited). Right now I'm reading the entire file into an array, but I feel like this isn't the smartest approach. Is there a better way to read a large file? Thanks for the support.
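For context, a typical read-into-array loop might look like the sketch below. The filename and the use of string_split() are assumptions (string_split() requires a recent GameMaker runtime; on older versions you'd split the line manually):

```gml
// Sketch: reading a comma-delimited file line by line into an array.
// "data.csv" is a placeholder filename.
var _file = file_text_open_read("data.csv");
var _rows = [];
while (!file_text_eof(_file))
{
    var _line = file_text_read_string(_file); // read the current line
    file_text_readln(_file);                  // advance to the next line
    array_push(_rows, string_split(_line, ","));
}
file_text_close(_file);
```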

Will
 

FrostyCat

Redemption Seeker
All those 150k lines will be read anyway, so you probably won't find any room for optimization in the reading loop. If you do want to optimize this, look into how the data is structured after you read it. For example, if you will be indexing it like a dictionary, then reading into a map is much smarter than reading into an array.
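A rough sketch of that idea, assuming the first field of each row is a unique key (the filename and field layout are placeholders):

```gml
// Sketch: read rows into a ds_map keyed on the first field,
// so later lookups don't have to scan 150k array entries.
var _file = file_text_open_read("data.csv");
var _map = ds_map_create();
while (!file_text_eof(_file))
{
    var _line = file_text_read_string(_file);
    file_text_readln(_file);
    var _fields = string_split(_line, ",");
    // key on field 0, keep the other two fields as the value
    ds_map_add(_map, _fields[0], [_fields[1], _fields[2]]);
}
file_text_close(_file);

// Later: direct lookup by key instead of a linear search
var _row = _map[? "some_key"];
```

Remember to call ds_map_destroy() on the map when you're done with it, since ds_* structures aren't garbage-collected.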
 

dannyjenn

Guest
If the file doesn't need to be human-readable, then maybe you could store the data in a binary buffer and use buffer_read() instead of parsing a text file? Just a thought.
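One way that could look, as a sketch: convert the CSV once into a saved buffer, then load and read it back with buffer_read(). The filenames and the choice of buffer_string for all three fields are assumptions; you'd pick types to match the real data:

```gml
// One-time conversion (run in a converter script):
var _buff = buffer_create(1024, buffer_grow, 1);
// ...for each row of the CSV, write its three fields:
buffer_write(_buff, buffer_string, "field1");
buffer_write(_buff, buffer_string, "field2");
buffer_write(_buff, buffer_string, "field3");
buffer_save(_buff, "data.bin");
buffer_delete(_buff);

// At runtime, read it back without any text parsing:
var _load = buffer_load("data.bin");
var _a = buffer_read(_load, buffer_string);
var _b = buffer_read(_load, buffer_string);
var _c = buffer_read(_load, buffer_string);
buffer_delete(_load);
```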
 

Pfap

Member
To expand on what @dannyjenn said, if you use buffers you could also load the file asynchronously with buffer_load_async(), so the game doesn't block while it loads, although I haven't really gotten it to work myself. The manual says it's basically the only way to load files on HTML5. Otherwise, efficiency when reading a file doesn't matter too much, since the game will just block until the full file is read, which is what @FrostyCat was pointing out. Once all that data is in memory, how you use it is what will matter.
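For reference, the usual pattern pairs buffer_load_async() with the Async Save/Load event; a minimal sketch, with a placeholder filename:

```gml
// Create event: start the async load into a pre-created buffer.
load_buffer = buffer_create(1024, buffer_grow, 1);
load_id = buffer_load_async(load_buffer, "data.bin", 0, -1);

// Async - Save/Load event: fires when the load finishes.
if (async_load[? "id"] == load_id)
{
    if (async_load[? "status"])
    {
        // load_buffer is now filled; parse it here
    }
    else
    {
        show_debug_message("Async load failed");
    }
}
```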
 