I am wondering how the preprocessed bin data was converted to the 28x28 NumPy data.
In the NumPy data, I see that each location has a value between 0 and 255 (not just 0 or 1). How was this value arrived at? I thought the original stroke data contained only x,y coordinates, and that for each such x,y coordinate we would set that pixel to 1 — but this does not seem to be the case. Is there a pointer to an algorithm for converting the stroke data to the numpy array?
The bin data is basically a rendered image of the drawings. The values range from 0 to 255 because that is the standard way to store 8-bit grayscale data.
So if you scale the x,y coordinates to a 28x28 canvas and render the strokes as lines onto an image, you should get something similar.
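To illustrate the idea, here is a minimal sketch of rendering stroke data into a 28x28 uint8 array using Pillow. This is not the exact pipeline used to produce the released bitmaps (the actual preprocessing may also align and crop the drawing to its bounding box); the point is that rendering at a higher resolution and downscaling with an anti-aliasing filter naturally produces intermediate gray values between 0 and 255, which is why the arrays are not just 0/1. The `strokes_to_28x28` helper and its parameters are hypothetical names for this example.

```python
import numpy as np
from PIL import Image, ImageDraw

def strokes_to_28x28(strokes, canvas_size=256, line_width=8):
    """Render stroke data to a 28x28 grayscale array (hypothetical helper).

    strokes: list of (xs, ys) pairs, with coordinates assumed to lie
             in [0, canvas_size - 1].
    """
    # Draw on a larger 8-bit grayscale canvas first (black background).
    img = Image.new("L", (canvas_size, canvas_size), 0)
    draw = ImageDraw.Draw(img)
    for xs, ys in strokes:
        # Connect consecutive points of each stroke with white lines.
        draw.line(list(zip(xs, ys)), fill=255, width=line_width)
    # Downscale with an anti-aliasing filter; resampling blends stroke
    # edges with the background, yielding values strictly between 0 and 255.
    img = img.resize((28, 28), Image.LANCZOS)
    return np.asarray(img, dtype=np.uint8)

# Example: a single diagonal stroke across the canvas.
arr = strokes_to_28x28([([0, 255], [0, 255])])
print(arr.shape)  # (28, 28)
```

Rendering directly at 28x28 without anti-aliasing would instead give you the binary 0/255 image the question expected.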