
python - Tensorflow: applying an imported graph operation to each element of a 2d tensor

There are questions that answer parts of my question, but I can't connect the pieces together.

Suppose I have a graph that operates on a 1d array of just 2 elements:

input = tf.placeholder(tf.float32, [2], name="input")

I want to build a graph that can receive an arbitrarily long 2d array of such elements and run the first graph on each row:

 x = tf.placeholder(tf.float32, [None, 2], name = 'x')

I know how to import the first graph (tf.import_graph_def) and how to run some operation on an array using tf.map_fn.

But how can I combine the two?
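
For context, here is a minimal standalone tf.map_fn sketch (not from the question; the per-row function is just an illustrative sum standing in for the imported graph):

import tensorflow as tf

# Minimal tf.map_fn sketch (TF 1.x graph mode): apply a per-row function
# to every row of a [None, 2] placeholder.
x = tf.placeholder(tf.float32, [None, 2], name='x')

def per_row(el):
    # el has shape [2]; return one value per row
    return el[0] + el[1]

result = tf.map_fn(per_row, x)

with tf.Session() as sess:
    print(sess.run(result, feed_dict={x: [[1, 2], [3, 4], [5, 6]]}))
    # -> [ 3.  7. 11.]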

For each run of the network I need to pass it a different input.

But the input mapping is done inside tf.import_graph_def.

Should I do the import each time, inside the function called in the loop?

That sounds wrong...

The code below works, but I believe there is a better way:

with tf.Graph().as_default() as g_1:
    input = tf.placeholder(tf.float32, [2], name="input")
    y = tf.add(input[0], input[1])
    output = tf.identity(y, name="output")

gdef_1 = g_1.as_graph_def()

tf.reset_default_graph()
with tf.Graph().as_default() as g_combined:
    x = tf.placeholder(tf.float32, [None, 2], name = 'x')

    def calc_z(el):
        y, = tf.import_graph_def(gdef_1, input_map={"input:0": el},
                               return_elements=["output:0"])
        return y

    final_result = tf.map_fn(calc_z, x)

    init = tf.global_variables_initializer()

with tf.Session(graph=g_combined) as sess:
    # For tensorboard
    # run it as tensorboard --logdir=graphs
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    # Run the initializer
    sess.run(init)
    print(sess.run([final_result], feed_dict = {x:[[1,2],[3,4],[5,6]]}))
    writer.close()

Update: I tried to achieve the same result while keeping the imported graph trainable, but failed to do so.

The return_elements argument to import_meta_graph seems to be simply ignored, and only the saver is returned.

Then a call to restore fails with the error:

Tensor Tensor("map/while/save/Const:0", shape=(), dtype=string) may not be fed

I'm using the code below:

tf.reset_default_graph()
xx = tf.placeholder(tf.float32, [2], name="xx")
yy = tf.add(xx[0], xx[1])
yy = tf.identity(yy, name = 'yy')
# need at least 1 variable to save the graph
_ = tf.Variable(initial_value='fake_variable')

config = tf.ConfigProto(log_device_placement=False)
config.gpu_options.allow_growth = True

with tf.Session(config=config) as sess:    
    saver = tf.train.Saver()
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "./model_ex2")

tf.reset_default_graph()
with tf.Session() as sess:
    x = tf.placeholder(tf.float32, [None, 2], name = 'x')

    def calc_z(el):
#         saver, yy  = tf.train.import_meta_graph("./model_ex2.meta", 
#                                            input_map={"xx:0": el}, return_elements=["yy:0"])
#         saver.restore(sess, "./model_ex2")
#         return yy
        # return_elements argument seems to be ignored and only the saver is returned.
        saver = tf.train.import_meta_graph("./model_ex2.meta", 
                                           input_map={"xx:0": el})
        saver.restore(sess, "./model_ex2")
        return yy

    final_result = tf.map_fn(calc_z, x)

init = tf.global_variables_initializer()
with tf.Session(config=config) as sess:
    sess.run(init)
    print(sess.run([final_result], feed_dict={x: [[1, 2], [3, 4], [5, 6]]}))
Asked by Moshe Kravchik


1 Reply


Your current solution is actually already good.

The graph is imported only once, when g_combined is constructed, not once per element of x, so it already does what you want.
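
If you want to convince yourself of that, one quick check (an illustrative sketch, assuming the g_combined / calc_z construction from your first snippet, and assuming map_fn's usual while-loop name scoping) is to inspect the operations of g_combined after construction; the imported subgraph appears once:

# The imported ops show up a single time under the map_fn while-loop scope,
# with names roughly like 'map/while/import/output' -- not once per fed row.
import_ops = [op.name for op in g_combined.get_operations() if 'import' in op.name]
print(import_ops)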

If you have a metagraph instead, it should work similarly with tf.train.import_meta_graph, since input_map and return_elements should also be usable with it (note, however, that this function also returns the imported saver).

Alternatively, you can import the metagraph into a separate graph, freeze it (e.g. using tf.graph_util.convert_variables_to_constants) and then import the resulting graph def into the final graph:

import tensorflow as tf

meta_graph_path = ...
meta_graph_save_path = ...
with tf.Graph().as_default() as g_meta_import, tf.Session() as sess:
    saver = tf.train.import_meta_graph(meta_graph_path)
    saver.restore(sess, meta_graph_save_path)
    frozen_graph = tf.graph_util.convert_variables_to_constants(
        sess, tf.get_default_graph().as_graph_def(), ['output'])

with tf.Graph().as_default() as g_combined:
    x = tf.placeholder(tf.float32, [None, 2], name = 'x')
    def calc_z(el):
        y, = tf.import_graph_def(frozen_graph, input_map={'input:0': el},
                                 return_elements=['output:0'])
        return y
    final_result = tf.map_fn(calc_z, x)
    init = tf.global_variables_initializer()
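
Running the combined graph then looks the same as in your first snippet (a minimal sketch; the init op only matters if g_combined itself contains variables):

with tf.Session(graph=g_combined) as sess:
    sess.run(init)
    print(sess.run(final_result, feed_dict={x: [[1, 2], [3, 4], [5, 6]]}))
    # for a toy add graph with matching 'input'/'output' node names,
    # this would print [ 3.  7. 11.]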

The only catch with this solution is that the imported part will obviously be frozen and not trainable.

