Numpy to TFrecords: Is there a simpler way to handle batch input from tfrecords?


The whole process is simplified by using the Dataset API. Here are the two parts:

(1) Convert the numpy array to tfrecords.
(2, 3, 4) Read the tfrecords to generate batches.

1. Create tfrecords from a numpy array:
    import numpy as np
    import tensorflow as tf

    def npy_to_tfrecords(X, y, output_file):
        # X is a np.array of per-example data, e.g. shape (num_examples, ...)
        # y is a np.array of integer labels (0/1), one per example
        # Write records to a tfrecords file
        writer = tf.python_io.TFRecordWriter(output_file)

        # Loop through all the examples you want to write
        for i in range(X.shape[0]):
            # Feature contains a map of string to feature proto objects
            feature = {}
            feature['X'] = tf.train.Feature(float_list=tf.train.FloatList(value=X[i].flatten()))
            feature['y'] = tf.train.Feature(int64_list=tf.train.Int64List(value=[int(y[i])]))

            # Construct the Example proto object
            example = tf.train.Example(features=tf.train.Features(feature=feature))

            # Serialize the example to a string
            serialized = example.SerializeToString()

            # Write the serialized object to disk
            writer.write(serialized)

        writer.close()
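As a quick, hypothetical usage example (the array shapes and file name below are only illustrative), the function above could be called like this:

    # Hypothetical example data: 100 examples of 10x10 floats with binary labels
    X = np.random.randn(100, 10, 10).astype(np.float32)
    y = np.random.randint(0, 2, size=(100,))
    npy_to_tfrecords(X, y, "file1.tfrecord")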
2. Read the tfrecords using the Dataset API (tensorflow >= 1.2):
    # Creates a dataset that reads all of the examples from filenames.
    filenames = ["file1.tfrecord", "file2.tfrecord", ..."fileN.tfrecord"]
    dataset = tf.contrib.data.TFRecordDataset(filenames)
    # For version 1.5 and above use tf.data.TFRecordDataset

    def _parse_function(example_proto):
        # shape_of_npy_array is the per-example shape of X written in step 1
        keys_to_features = {'X': tf.FixedLenFeature((shape_of_npy_array), tf.float32),
                            'y': tf.FixedLenFeature((), tf.int64, default_value=0)}
        parsed_features = tf.parse_single_example(example_proto, keys_to_features)
        return parsed_features['X'], parsed_features['y']

    # Parse the record into tensors.
    dataset = dataset.map(_parse_function)

    # Shuffle the dataset
    dataset = dataset.shuffle(buffer_size=10000)

    # Repeat the input indefinitely
    dataset = dataset.repeat()

    # Generate batches
    dataset = dataset.batch(batch_size)

    # Create a one-shot iterator
    iterator = dataset.make_one_shot_iterator()

    # Get batch X and y
    X, y = iterator.get_next()
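The X and y tensors returned by iterator.get_next() can be fed directly into a model. As a rough sketch (assuming the graph above has been built and batch_size and shape_of_npy_array filled in), the batches can be pulled in a plain TF 1.x session loop:

    # Minimal sketch: pull a few batches from the one-shot iterator defined above
    with tf.Session() as sess:
        for step in range(10):
            X_batch, y_batch = sess.run([X, y])
            print(step, X_batch.shape, y_batch.shape)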


