File optimization
I had a professor mention something to us in class today, and I want to try out a little experiment.
Basically I have a formula f(Z) for any integer Z. If f(Z) is true, output one value; if it's false, output a different one. Do this for every Z in a range [x, y], where x and y are any two ints.
Two questions I wanted to ask here:
1) Is there any way I could output less than an entire ASCII character for each check? I really only need one bit for each value of Z, not a whole byte. I'm on Windows, if that makes a difference.
2) Let's say I open a file with some kind of open() call. If I then seek to the first byte, the 10th byte, or the 1000000th byte, is there any difference in speed?
Thanks!
