To write a simple function, just use yield:
def read_in_chunks(file_object, chunk_size=1024):
    """Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1k."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

with open('really_big_file.dat') as f:
    for piece in read_in_chunks(f):
        process_data(piece)
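For illustration, here is a minimal sketch of how the generator might be used in practice, assuming the goal is to process a large file chunk by chunk without loading it into memory; the hashing use case and sha256_of_file are not from the original (process_data above is the original's placeholder):

import hashlib

def sha256_of_file(path, chunk_size=1024):
    # Hypothetical helper: hash a large file incrementally using read_in_chunks.
    digest = hashlib.sha256()
    # Open in binary mode so read() returns at most chunk_size bytes per call.
    with open(path, 'rb') as f:
        for piece in read_in_chunks(f, chunk_size):
            digest.update(piece)
    return digest.hexdigest()

print(sha256_of_file('really_big_file.dat'))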
Another option is to use iter with a helper function:
f = open('really_big_file.dat')

def read1k():
    return f.read(1024)

for piece in iter(read1k, ''):
    process_data(piece)
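Note that in Python 3, a file opened in binary mode returns bytes, so the sentinel passed to iter should be b'' rather than ''. A minimal sketch of an assumed variant that uses functools.partial in place of the named helper (process_data remains the original's placeholder):

from functools import partial

with open('really_big_file.dat', 'rb') as f:
    # iter(callable, sentinel) keeps calling f.read(1024) until it returns b'' (EOF).
    for piece in iter(partial(f.read, 1024), b''):
        process_data(piece)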
If the file is line-based, the file object itself is already a lazy generator of lines:

for line in open('really_big_file.dat'):
    process_data(line)
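When iterating line by line it is still advisable to wrap the file in a with block so it is closed deterministically; a minimal sketch (process_data remains a placeholder):

with open('really_big_file.dat') as f:
    # The file object yields one line at a time, so memory use stays bounded
    # even for very large files.
    for line in f:
        process_data(line)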