model_export¶
export_context¶
- enter_export_mode(mode, export_ctx=None)[source]¶
Enter model export mode; the graph is built according to mode. A usage sketch follows the parameter list below.
- Parameters:
mode (ExportMode) – Export mode; one of ExportMode.DISTRIBUTED or ExportMode.STANDALONE
export_ctx (ExportContext, optional) – The model export context
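For orientation, a minimal usage sketch is shown below. The module path model_export.export_context is assumed from this page's layout, and the with-statement usage is an assumption about how the mode is scoped; treat it as illustrative rather than as the library's documented API.

```python
# Sketch only: the module path and the context-manager usage are assumptions.
from model_export.export_context import ExportContext, ExportMode, enter_export_mode

ctx = ExportContext()  # optional; passing export_ctx=None falls back to a default context

# Assumption: enter_export_mode scopes graph construction under the chosen mode.
# If your version exposes it as a plain call, invoke it before building the graph.
with enter_export_mode(ExportMode.STANDALONE, export_ctx=ctx):
    ...  # build the serving graph here; it is constructed according to `mode`
```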
saved_model_exporters¶
- class StandaloneExporter(model_fn, model_dir, export_dir_base, shared_embedding=False, warmup_file=None)[source]¶
Saved model exporter for standalone (single-machine) mode
- Parameters:
model_fn – A model_fn compatible with tf.estimator; it takes (features, mode, config) as arguments and returns an EstimatorSpec
model_dir – Directory where checkpoints are saved
export_dir_base – Target path for the exported saved_model
shared_embedding – Whether to reuse the embedding files in the checkpoint; if False, the embedding files are copied into the saved_model, which may slow down the export
warmup_file – Warmup file; see https://www.tensorflow.org/tfx/serving/saved_model_warmup
- export_saved_model(serving_input_receiver_fn, checkpoint_path=None, global_step=None)[source]¶
Export the saved_model; a usage sketch follows the parameter list.
- Parameters:
serving_input_receiver_fn – A function that returns a tf.estimator.export.ServingInputReceiver, used to map serving requests to model inputs
checkpoint_path – Optional checkpoint path; if empty, tf.train.latest_checkpoint(self._model_dir) is used
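A hedged end-to-end sketch for the standalone case follows. Only the StandaloneExporter constructor and export_saved_model signatures come from this page; the module path, the placeholder model_fn, the serving_input_receiver_fn, and the paths are illustrative assumptions.

```python
import tensorflow as tf

# Assumed module path, inferred from this page's layout.
from model_export.saved_model_exporters import StandaloneExporter


def model_fn(features, mode, config):
    # Placeholder model_fn (PREDICT path only); a real one also handles TRAIN/EVAL.
    logits = tf.compat.v1.layers.dense(features["x"], units=1)
    return tf.estimator.EstimatorSpec(mode=mode, predictions={"score": logits})


def serving_input_receiver_fn():
    # Map serving requests onto the model's input tensors.
    x = tf.compat.v1.placeholder(tf.float32, shape=[None, 8], name="x")
    return tf.estimator.export.ServingInputReceiver(
        features={"x": x}, receiver_tensors={"x": x})


exporter = StandaloneExporter(
    model_fn=model_fn,
    model_dir="/tmp/ckpt_dir",        # directory that holds the checkpoints
    export_dir_base="/tmp/exported",  # where the saved_model is written
    shared_embedding=False)           # copy embedding files into the saved_model

# checkpoint_path is None, so tf.train.latest_checkpoint(model_dir) is used.
exporter.export_saved_model(serving_input_receiver_fn)
```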
- class DistributedExporter(model_fn, model_dir, export_dir_base, shared_embedding=False, warmup_file=None, dense_only=False, allow_gpu=False, with_remote_gpu=False, clear_entry_devices=False, include_graphs=None, global_step_as_timestamp=False)[source]¶
Distributed model exporter
- Parameters:
model_fn – A model_fn compatible with tf.estimator; it takes (features, mode, config) as arguments and returns an EstimatorSpec
model_dir – Directory where checkpoints are saved
export_dir_base – Target path for the exported saved_model
shared_embedding – Whether to reuse the embedding files in the checkpoint; if False, the embedding files are copied into the saved_model, which may slow down the export
warmup_file – Warmup file; see https://www.tensorflow.org/tfx/serving/saved_model_warmup
include_graphs – If not None, only export saved_models for the graphs listed in include_graphs; otherwise, export all graphs in the export context
global_step_as_timestamp – Whether to use the global_step as the export folder name; useful for parallel export in sync_training
- export_saved_model(serving_input_receiver_fn, checkpoint_path=None, global_step=None)[source]¶
Export the saved_model; a usage sketch follows the parameter list.
- Parameters:
serving_input_receiver_fn – A function that returns a tf.estimator.export.ServingInputReceiver, used to map serving requests to model inputs
checkpoint_path – Optional checkpoint path; if empty, tf.train.latest_checkpoint(self._model_dir) is used
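For the distributed case, a similar sketch is shown below, reusing the placeholder model_fn and serving_input_receiver_fn from the standalone example above. The keyword arguments mirror the parameters documented here; the module path and the concrete values are assumptions.

```python
# Assumed module path, inferred from this page's layout.
from model_export.saved_model_exporters import DistributedExporter

exporter = DistributedExporter(
    model_fn=model_fn,                    # tf.estimator-compatible model_fn, as above
    model_dir="/tmp/ckpt_dir",
    export_dir_base="/tmp/exported_dist",
    shared_embedding=True,                # reuse the embedding files from the checkpoint
    include_graphs=None,                  # None: export every graph in the export context
    global_step_as_timestamp=True)        # name export folders by global_step

# The latest checkpoint is used when checkpoint_path is None.
exporter.export_saved_model(serving_input_receiver_fn)
```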