I create a file in the console. I try to write to it. I get a file, but it has no content. 0 Bytes.
Is there some sort of permissions issue I should be aware of?
testfile1= ('testfile.txt', 'w')
testfile1.write("this is a test")
Perhaps there's just a delay of several minutes... Now the files I've created both have content in them (but not the latest content).
EDIT: Judging from your post in the Dropbox thread, you might be writing your file to your Dropbox folder, which I hadn't realised. I wouldn't expect it to make much difference if both your processes are running on PA, but if you're creating a file on PA and checking it on another host which has the same Dropbox share then you can expect a significant delay, anywhere from seconds to minutes. This is because Dropbox does background synchronisation as opposed to being a proper networked filesystem. The PA admins did recently make some changes which should improve performance a little, but Dropbox itself is by no means instant.
I also note that you're using the Dashboard -> Files tab to view your file. Personally I've also observed a short delay in updates here (something like 10-15 seconds), so I would recommend opening a bash shell to view your files using the cat command.
Firstly, the code snippet above isn't opening a file - you're just assigning a 2-tuple to testfile1 and then calling the write() method on it, which will fail with "AttributeError: 'tuple' object has no attribute 'write'". I assume you meant:
testfile1 = open('testfile.txt', 'w')
testfile1.write("this is a test")
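For reference, running the snippet exactly as you posted it fails at the second line - a quick sketch of what actually happens:

testfile1 = ('testfile.txt', 'w')   # just builds a 2-tuple; no file is opened
testfile1.write("this is a test")   # AttributeError: 'tuple' object has no attribute 'write'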
In future please try to paste the exact code you're running - in this case it was fairly easy to figure out what you meant, but sometimes the precise details of what you're doing can be critical in working out the problem.
To answer your actual question, are you writing the file in one console and then reading it in another? If so, be aware that Python (and potentially the underlying libc library) performs buffered IO. That means when you perform a write() call like that, the data hasn't necessarily actually gone into the underlying file - it's sitting in memory. This is done for performance reasons - if you perform multiple write() calls then they can all be stacked up in memory, which is fast, and then sent out to disk, which is slow, all in one go. So, if you view it from another process before the data has been pushed out to disk then it might look empty.
If you need to force all the buffered data to be written to the disk right now then you can use the flush() method on the file:
testfile1 = open("testfile.txt", "w")
testfile1.write("this is a test")
testfile1.flush()
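If you then read the file back from a second console, the text should be visible straight away - a minimal sketch of the check, assuming the same filename:

with open("testfile.txt", "r") as readback:
    print(readback.read())   # prints "this is a test" once the writer has flushed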
You can also just close the file, which performs an implicit flush. If you terminate your Python session or script then all the objects are deleted and the destructor of the file object will cause an implicit close(), which itself performs the flush as I've just mentioned. Hence, the contents of the file will be updated once you close your console or terminate your script.
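For example, a minimal sketch using an explicit close():

testfile1 = open("testfile.txt", "w")
testfile1.write("this is a test")
testfile1.close()   # flushes any buffered data, then closes the file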
One easy way of making sure you close a file is to use the with keyword:
with open("testfile.txt", "w") as testfile1:
    testfile1.write("this is a test")
This will automatically close the file again after the with block has finished. This is useful to remember as it's really easy to forget to close files, and I generally find it cleaner than just relying on the file object's destructor to do the work.
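As a quick illustration, the file object reports itself as closed once the with block exits:

with open("testfile.txt", "w") as testfile1:
    testfile1.write("this is a test")
print(testfile1.closed)   # True - the block closed (and flushed) it for us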
Finally, if you need every single write() operation to go straight to the disk, consider using os.open() and os.write() instead, which give you access to the raw OS file descriptor. Be aware that you have to use these with care - they just deal with file descriptors as integers so there's no implicit close (except for the one done by the OS when your Python process terminates). Personally I wouldn't recommend these unless you really think you need them - there's nothing wrong with using Python file objects and performing flush() whenever required.
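If you do decide you need them, a minimal sketch looks something like this (the flags shown are just one reasonable choice, not the only one):

import os

# Raw OS file descriptor API - bypasses Python's own buffering entirely.
fd = os.open("testfile.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
os.write(fd, b"this is a test")   # note: os.write() wants bytes
os.close(fd)                      # no implicit close - you have to do this yourself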
Oh and as an aside, PA uses NFS for file storage so I use the term "written to disk" somewhat loosely - the principles are identical, however. There may be some small delay in seeing changes to a file between different hosts, but since all your processes are likely running on the same host at PA then you're unlikely to see this. In any case, any such delay should be in the order of milliseconds so you shouldn't notice it by manual inspection anyway.
+1 to everything Cartroo says :-)
@mroswell -- if you could show us the exact code you're running then it would help us give you a more precise answer about what's going on. But my guess is that you're not flushing or closing the file.