Is there a generator version of `string.split()` in Python?


It is highly probable that `re.finditer` uses fairly minimal memory overhead, which makes it a good basis for a generator-style `split()`:

import re

def split_iter(string):
    return (x.group(0) for x in re.finditer(r"[A-Za-z']+", string))

Demo:

>>> list(split_iter("A programmer's RegEx test."))
['A', "programmer's", 'RegEx', 'test']

Edit: I have just confirmed that this takes constant memory in Python 3.2.1, assuming my testing methodology was correct. I created a very large string (about 1 GB), then iterated through the iterable with a `for` loop (NOT a list comprehension, which would have materialized the results and used extra memory). This did not result in noticeable memory growth; if there was any growth, it was far, far smaller than the 1 GB string.
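If you want behavior closer to `str.split(sep)` — splitting on a delimiter pattern rather than matching tokens — the same lazy idea still works. Below is a sketch (not part of the original answer; `split_on` is a hypothetical helper name): it walks the separator matches with `re.finditer` and yields the slices between them, so no intermediate list is ever built.

```python
import re

def split_on(string, sep=r"\s+"):
    # Lazily yield the substrings between matches of the separator
    # regex, mimicking re.split() without building a list.
    start = 0
    for m in re.finditer(sep, string):
        yield string[start:m.start()]
        start = m.end()
    # Trailing piece after the last separator (may be empty,
    # matching re.split()'s behavior).
    yield string[start:]
```

Note that, like `re.split()`, this yields empty strings for leading or consecutive separators unless the pattern absorbs them (e.g. `\s+`).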


