Matlab text file precision
This was a fun one to figure out. Maybe it will help someone else.
The background: I'm running a heat budget model which steps through seven years of weather data in 10-minute time steps, so there's a bit over 368,000 time steps. One of the outputs is a text file with the elapsed time since the start, and associated temperature predictions at each time step, totaling 368,208 lines.
The problem: I open old timeseries text files to plot up some data, and discover that my time points are screwy. Once I get past the 99,999th time point, instead of incrementing in 10 minute steps, I get data like this:
100,000
100,000
100,000
100,000
100,000
100,100
100,100
100,100
100,100
etc.
I was writing these text files using dlmwrite('filename.txt', big_huge_array, ' '). Unfortunately, it turns out that dlmwrite's default precision is 5 significant digits, so any elapsed time of 100,000 or more gets rounded and nearby time points collapse into the same value. The fix is simple enough: just increase the precision when writing the text file (note that to pass 'precision' you use the name-value form, with the delimiter given via 'delimiter'):
dlmwrite('filename.txt', big_huge_array, 'delimiter', ' ', 'precision', 8)
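To see why the truncation kicks in right at the 100,000 mark: 5 significant digits is equivalent to the C-style format '%.5g', which can't distinguish values that differ only in the sixth digit. A quick sketch of the effect, in Python rather than MATLAB (the specific time values here are made up for illustration):

```python
# 5 significant digits ('%.5g') vs 8 ('%.8g'), as in
# dlmwrite's default precision vs 'precision', 8.
for t in (99998.0, 99999.0, 100000.0, 100004.0, 100010.0):
    print(f"{t:10.1f} -> {t:.5g} (5 digits) vs {t:.8g} (8 digits)")

# Below 100,000 every value is distinct at 5 significant digits,
# but 100000.0 and 100004.0 both come out as 1e+05.
```

With 8 significant digits, each 10-minute step stays distinct well past the seven-year mark.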