Running the YOLOv5 Example from RKNN-Toolkit2 on Luckfox Pico Max (adb USB Simulation)
1: Download rknn-toolkit2
git clone https://github.com/rockchip-linux/rknn-toolkit2
2: Modify test.py in the examples/onnx/yolov5 directory
# pre-process config
print('--> Config model')
rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]], target_platform='rv1106')  # was: target_platform='rk3566'
# Init runtime environment
print('--> Init runtime environment')
# ret = rknn.init_runtime()
ret = rknn.init_runtime(target='rv1106', device_id='bd547ee6900c058b')  # device_id is the serial reported by `adb devices`
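For reference, below is a minimal sketch of how these two changes fit into the example script. The surrounding calls (load_onnx, build, export_rknn) follow the stock rknn-toolkit2 YOLOv5 example; the file names and the device serial are assumptions taken from that example and from the board used here, and will differ on your setup:

```python
from rknn.api import RKNN

ONNX_MODEL = 'yolov5s.onnx'   # model shipped with the example (assumed name)
RKNN_MODEL = 'yolov5s.rknn'
DATASET = './dataset.txt'     # quantization calibration list from the example

rknn = RKNN(verbose=True)

# Target the RV1106 NPU instead of the default rk3566
print('--> Config model')
rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]],
            target_platform='rv1106')

# Load, quantize and export as in the unmodified example
rknn.load_onnx(model=ONNX_MODEL)
rknn.build(do_quantization=True, dataset=DATASET)
rknn.export_rknn(RKNN_MODEL)

# Connect to the board over adb instead of the PC simulator;
# device_id is the serial printed by `adb devices`
print('--> Init runtime environment')
ret = rknn.init_runtime(target='rv1106', device_id='bd547ee6900c058b')
if ret != 0:
    print('Init runtime environment failed!')
    exit(ret)
```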
3: adb push rknn_server and its dependent library to the board
The RKNPU runtime library used on RV1103/RV1106 is librknnmrt.so, paired with the 32-bit (armhf-uclibc) rknn_server. The startup steps are:
- adb push all files under RV1106/Linux/rknn_server/armhf-uclibc/usr/bin to the board's /oem/usr/bin directory
- adb push RV1106/Linux/librknn_api/armhf-uclibc/librknnmrt.so to the board's /oem/usr/lib directory
- In the board's serial terminal, run:
chmod +x /oem/usr/bin/rknn_server
chmod +x /oem/usr/bin/start_rknn.sh
chmod +x /oem/usr/bin/restart_rknn.sh
restart_rknn.sh
4: Run the YOLOv5 Python code to simulate over the adb connection
(RKNN-Toolkit2) ubuntu@ubuntu:~/Downloads/rknn-toolkit2-master/rknn-toolkit2/examples/onnx/yolov5$ python3 test.py
W __init__: rknn-toolkit2 version: 1.6.0+81f21f4d
--> Config model
done
--> Loading model
W load_onnx: It is recommended onnx opset 19, but your onnx model opset is 12!
W load_onnx: Model converted from pytorch, 'opset_version' should be set 19 in torch.onnx.export for successful convert!
Loading : 100%|█████████████████████████████████████████████████| 125/125 [00:00<00:00, 6242.27it/s]
done
--> Building model
I base_optimize ...
I base_optimize done.
I
I fold_constant ...
I fold_constant done.
I
I correct_ops ...
I correct_ops done.
I
I fuse_ops ...
I fuse_ops done.
I
I sparse_weight ...
I sparse_weight done.
I
GraphPreparing : 100%|██████████████████████████████████████████| 149/149 [00:00<00:00, 1538.03it/s]
Quantizating : 100%|██████████████████████████████████████████████| 149/149 [00:01<00:00, 85.94it/s]
I
I quant_optimizer ...
I quant_optimizer results:
I adjust_tanh_sigmoid: ['Sigmoid_146', 'Sigmoid_148', 'Sigmoid_150']
I adjust_relu: ['Relu_144', 'Relu_141', 'Relu_139', 'Relu_137', 'Relu_135', 'Relu_132', 'Relu_130', 'Relu_127', 'Relu_125', 'Relu_123', 'Relu_121', 'Relu_118', 'Relu_116', 'Relu_113', 'Relu_111', 'Relu_109', 'Relu_107', 'Relu_102', 'Relu_100', 'Relu_97', 'Relu_95', 'Relu_93', 'Relu_91', 'Relu_86', 'Relu_84', 'Relu_75', 'Relu_73', 'Relu_70', 'Relu_67', 'Relu_65', 'Relu_63', 'Relu_61', 'Relu_59', 'Relu_56', 'Relu_53', 'Relu_51', 'Relu_48', 'Relu_46', 'Relu_43', 'Relu_41', 'Relu_39', 'Relu_37', 'Relu_35', 'Relu_32', 'Relu_29', 'Relu_27', 'Relu_24', 'Relu_22', 'Relu_20', 'Relu_18', 'Relu_16', 'Relu_13', 'Relu_10', 'Relu_8', 'Relu_6', 'Relu_4', 'Relu_2']
I adjust_no_change_node: ['MaxPool_81', 'MaxPool_80']
I quant_optimizer done.
I
W build: The default input dtype of 'images' is changed from 'float32' to 'int8' in rknn model for performance!Please take care of this change when deploy rknn model with Runtime API!
W build: The default output dtype of 'output' is changed from 'float32' to 'int8' in rknn model for performance!Please take care of this change when deploy rknn model with Runtime API!
W build: The default output dtype of '283' is changed from 'float32' to 'int8' in rknn model for performance!Please take care of this change when deploy rknn model with Runtime API!
W build: The default output dtype of '285' is changed from 'float32' to 'int8' in rknn model for performance!Please take care of this change when deploy rknn model with Runtime API!
I rknn building ...
I RKNN: [11:24:23.838] compress = 0, conv_eltwise_activation_fuse = 1, global_fuse = 1, multi-core-model-mode = 7, output_optimize = 1,enable_argb_group=0 ,layout_match = 1, pipeline_fuse = 0
I RKNN: librknnc version: 1.6.0 (585b3edcf@2023-12-11T07:42:56)
D RKNN: [11:24:24.052] RKNN is invoked
W RKNN: [11:24:24.721] Model initializer tensor data is empty, name: 219
W RKNN: [11:24:24.721] Model initializer tensor data is empty, name: 238
D RKNN: [11:24:24.748] >>>>>> start: rknn::RKNNExtractCustomOpAttrs
D RKNN: [11:24:24.749] <<<<<<<< end: rknn::RKNNExtractCustomOpAttrs
D RKNN: [11:24:24.749] >>>>>> start: rknn::RKNNSetOpTargetPass
D RKNN: [11:24:24.749] <<<<<<<< end: rknn::RKNNSetOpTargetPass
D RKNN: [11:24:24.749] >>>>>> start: rknn::RKNNBindNorm
D RKNN: [11:24:24.750] <<<<<<<< end: rknn::RKNNBindNorm
D RKNN: [11:24:24.750] >>>>>> start: rknn::RKNNAddFirstConv
D RKNN: [11:24:24.751] <<<<<<<< end: rknn::RKNNAddFirstConv
D RKNN: [11:24:24.751] >>>>>> start: rknn::RKNNEliminateQATDataConvert
D RKNN: [11:24:24.752] <<<<<<<< end: rknn::RKNNEliminateQATDataConvert
D RKNN: [11:24:24.752] >>>>>> start: rknn::RKNNTileGroupConv
D RKNN: [11:24:24.752] <<<<<<<< end: rknn::RKNNTileGroupConv
D RKNN: [11:24:24.752] >>>>>> start: rknn::RKNNTileFcBatchFuse
D RKNN: [11:24:24.752] <<<<<<<< end: rknn::RKNNTileFcBatchFuse
D RKNN: [11:24:24.752] >>>>>> start: rknn::RKNNAddConvBias
D RKNN: [11:24:24.754] <<<<<<<< end: rknn::RKNNAddConvBias
D RKNN: [11:24:24.754] >>>>>> start: rknn::RKNNTileChannel
D RKNN: [11:24:24.754] <<<<<<<< end: rknn::RKNNTileChannel
D RKNN: [11:24:24.754] >>>>>> start: rknn::RKNNPerChannelPrep
D RKNN: [11:24:24.754] <<<<<<<< end: rknn::RKNNPerChannelPrep
D RKNN: [11:24:24.754] >>>>>> start: rknn::RKNNBnQuant
D RKNN: [11:24:24.754] <<<<<<<< end: rknn::RKNNBnQuant
D RKNN: [11:24:24.754] >>>>>> start: rknn::RKNNFuseOptimizerPass
D RKNN: [11:24:24.866] <<<<<<<< end: rknn::RKNNFuseOptimizerPass
D RKNN: [11:24:24.866] >>>>>> start: rknn::RKNNTurnAutoPad
D RKNN: [11:24:24.866] <<<<<<<< end: rknn::RKNNTurnAutoPad
D RKNN: [11:24:24.866] >>>>>> start: rknn::RKNNInitRNNConst
D RKNN: [11:24:24.866] <<<<<<<< end: rknn::RKNNInitRNNConst
D RKNN: [11:24:24.866] >>>>>> start: rknn::RKNNInitCastConst
D RKNN: [11:24:24.866] <<<<<<<< end: rknn::RKNNInitCastConst
D RKNN: [11:24:24.866] >>>>>> start: rknn::RKNNMultiSurfacePass
D RKNN: [11:24:24.866] <<<<<<<< end: rknn::RKNNMultiSurfacePass
D RKNN: [11:24:24.866] >>>>>> start: rknn::RKNNReplaceConstantTensorPass
D RKNN: [11:24:24.867] <<<<<<<< end: rknn::RKNNReplaceConstantTensorPass
D RKNN: [11:24:24.867] >>>>>> start: OpEmit
D RKNN: [11:24:24.867] <<<<<<<< end: OpEmit
D RKNN: [11:24:24.867] >>>>>> start: rknn::RKNNLayoutMatchPass
I RKNN: [11:24:24.867] AppointLayout: t->setNativeLayout(64), tname:[128]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[131]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[133]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[142]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[137]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[140]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[143]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[145]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[147]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[161]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[151]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[156]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[159]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[162]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[166]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[185]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[170]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[175]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[180]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[183]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[186]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[190]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[199]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[194]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[197]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[200]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[202]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[204]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[205]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[206]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[207]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[208]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[209]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[210]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[211]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[213]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[221]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[223]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[229]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[225]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[227]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[230]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[232]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[240]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[242]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[248]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[244]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[246]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[249]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[251]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[253]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[254]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[256]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[262]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[258]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[260]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[263]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[265]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[267]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[268]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[270]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[276]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[272]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[274]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[277]
I RKNN: [11:24:24.868] AppointLayout: t->setNativeLayout(64), tname:[279]
D RKNN: [11:24:24.868] <<<<<<<< end: rknn::RKNNLayoutMatchPass
D RKNN: [11:24:24.868] >>>>>> start: rknn::RKNNAddSecondaryNode
D RKNN: [11:24:24.869] <<<<<<<< end: rknn::RKNNAddSecondaryNode
D RKNN: [11:24:24.869] >>>>>> start: OpEmit
D RKNN: [11:24:24.883] <<<<<<<< end: OpEmit
D RKNN: [11:24:24.883] >>>>>> start: rknn::RKNNProfileAnalysisPass
D RKNN: [11:24:24.884] <<<<<<<< end: rknn::RKNNProfileAnalysisPass
D RKNN: [11:24:24.888] >>>>>> start: rknn::RKNNOperatorIdGenPass
D RKNN: [11:24:24.888] <<<<<<<< end: rknn::RKNNOperatorIdGenPass
D RKNN: [11:24:24.888] >>>>>> start: rknn::RKNNWeightTransposePass
W RKNN: [11:24:25.219] Warning: Tensor 289 need paramter qtype, type is set to float16 by default!
W RKNN: [11:24:25.219] Warning: Tensor 219 need paramter qtype, type is set to float16 by default!
W RKNN: [11:24:25.220] Warning: Tensor 290 need paramter qtype, type is set to float16 by default!
W RKNN: [11:24:25.220] Warning: Tensor 238 need paramter qtype, type is set to float16 by default!
D RKNN: [11:24:25.220] <<<<<<<< end: rknn::RKNNWeightTransposePass
D RKNN: [11:24:25.220] >>>>>> start: rknn::RKNNCPUWeightTransposePass
D RKNN: [11:24:25.220] <<<<<<<< end: rknn::RKNNCPUWeightTransposePass
D RKNN: [11:24:25.220] >>>>>> start: rknn::RKNNModelBuildPass
D RKNN: [11:24:25.844] RKNNModelBuildPass: [Statistics]
D RKNN: [11:24:25.844] total_regcfg_size : 266816
D RKNN: [11:24:25.844] total_diff_regcfg_size: 164280
D RKNN: [11:24:25.844] <<<<<<<< end: rknn::RKNNModelBuildPass
D RKNN: [11:24:25.844] >>>>>> start: rknn::RKNNModelRegCmdbuildPass
D RKNN: [11:24:25.846] ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.846] Network Layer Information Table
D RKNN: [11:24:25.846] ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.846] ID OpType DataType Target InputShape OutputShape DDRCycles NPUCycles MaxCycles TaskNumber RW(KB) FullName
D RKNN: [11:24:25.846] ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.846] 0 InputOperator INT8 CPU \ (1,3,640,640) 0 0 0 0/0 0 InputOperator:images
D RKNN: [11:24:25.846] 1 Conv INT8 NPU (1,3,640,640),(12,3,2,2),(12) (1,12,320,320) 465543 409600 465543 8/0 1200 Conv:Conv_0
D RKNN: [11:24:25.846] 2 ConvRelu INT8 NPU (1,12,320,320),(32,12,3,3),(32) (1,32,320,320) 798774 1843200 1843200 8/0 1604 Conv:Conv_1
D RKNN: [11:24:25.846] 3 ConvRelu INT8 NPU (1,32,320,320),(64,32,3,3),(64) (1,64,160,160) 801060 921600 921600 16/0 3218 Conv:Conv_3
D RKNN: [11:24:25.846] 4 ConvRelu INT8 NPU (1,64,160,160),(32,64,1,1),(32) (1,32,160,160) 399366 204800 399366 8/0 1602 Conv:Conv_5
D RKNN: [11:24:25.846] 5 ConvRelu INT8 NPU (1,64,160,160),(32,64,1,1),(32) (1,32,160,160) 399366 204800 399366 8/0 1602 Conv:Conv_12
D RKNN: [11:24:25.846] 6 ConvRelu INT8 NPU (1,32,160,160),(32,32,1,1),(32) (1,32,160,160) 266203 204800 266203 4/0 801 Conv:Conv_7
D RKNN: [11:24:25.846] 7 ConvReluAdd INT8 NPU (1,32,160,160),(32,32,3,3),(32),... (1,32,160,160) 400530 460800 460800 4/0 1609 Conv:Conv_9
D RKNN: [11:24:25.846] 8 Concat INT8 NPU (1,32,160,160),(1,32,160,160) (1,64,160,160) 531990 0 531990 2/0 1600 Concat:Concat_14
D RKNN: [11:24:25.846] 9 ConvRelu INT8 NPU (1,64,160,160),(64,64,1,1),(64) (1,64,160,160) 532738 409600 532738 8/0 1604 Conv:Conv_15
D RKNN: [11:24:25.846] 10 ConvRelu INT8 NPU (1,64,160,160),(128,64,3,3),(128) (1,128,80,80) 411128 921600 921600 9/0 1673 Conv:Conv_17
D RKNN: [11:24:25.846] 11 ConvRelu INT8 NPU (1,128,80,80),(64,128,1,1),(64) (1,64,80,80) 200910 102400 200910 4/0 808 Conv:Conv_19
D RKNN: [11:24:25.846] 12 ConvRelu INT8 NPU (1,128,80,80),(64,128,1,1),(64) (1,64,80,80) 200910 102400 200910 4/0 808 Conv:Conv_31
D RKNN: [11:24:25.846] 13 ConvRelu INT8 NPU (1,64,80,80),(64,64,1,1),(64) (1,64,80,80) 133746 102400 133746 2/0 404 Conv:Conv_21
D RKNN: [11:24:25.846] 14 ConvReluAdd INT8 NPU (1,64,80,80),(64,64,3,3),(64),... (1,64,80,80) 205564 460800 460800 3/0 836 Conv:Conv_23
D RKNN: [11:24:25.846] 15 ConvRelu INT8 NPU (1,64,80,80),(64,64,1,1),(64) (1,64,80,80) 133746 102400 133746 2/0 404 Conv:Conv_26
D RKNN: [11:24:25.846] 16 ConvReluAdd INT8 NPU (1,64,80,80),(64,64,3,3),(64),... (1,64,80,80) 205564 460800 460800 3/0 836 Conv:Conv_28
D RKNN: [11:24:25.846] 17 Concat INT8 NPU (1,64,80,80),(1,64,80,80) (1,128,80,80) 265995 0 265995 2/0 800 Concat:Concat_33
D RKNN: [11:24:25.846] 18 ConvRelu INT8 NPU (1,128,80,80),(128,128,1,1),(128) (1,128,80,80) 268821 204800 268821 4/0 817 Conv:Conv_34
D RKNN: [11:24:25.846] 19 ConvRelu INT8 NPU (1,128,80,80),(256,128,3,3),(256) (1,256,40,40) 247708 921600 921600 5/0 1090 Conv:Conv_36
D RKNN: [11:24:25.846] 20 ConvRelu INT8 NPU (1,256,40,40),(128,256,1,1),(128) (1,128,40,40) 105235 102400 105235 2/0 433 Conv:Conv_38
D RKNN: [11:24:25.846] 21 ConvRelu INT8 NPU (1,256,40,40),(128,256,1,1),(128) (1,128,40,40) 105235 102400 105235 2/0 433 Conv:Conv_55
D RKNN: [11:24:25.846] 22 ConvRelu INT8 NPU (1,128,40,40),(128,128,1,1),(128) (1,128,40,40) 69325 51200 69325 1/0 217 Conv:Conv_40
D RKNN: [11:24:25.846] 23 ConvReluAdd INT8 NPU (1,128,40,40),(128,128,3,3),(128),... (1,128,40,40) 123854 460800 460800 2/0 545 Conv:Conv_42
D RKNN: [11:24:25.846] 24 ConvRelu INT8 NPU (1,128,40,40),(128,128,1,1),(128) (1,128,40,40) 69325 51200 69325 1/0 217 Conv:Conv_45
D RKNN: [11:24:25.846] 25 ConvReluAdd INT8 NPU (1,128,40,40),(128,128,3,3),(128),... (1,128,40,40) 123854 460800 460800 2/0 545 Conv:Conv_47
D RKNN: [11:24:25.846] 26 ConvRelu INT8 NPU (1,128,40,40),(128,128,1,1),(128) (1,128,40,40) 69325 51200 69325 1/0 217 Conv:Conv_50
D RKNN: [11:24:25.846] 27 ConvReluAdd INT8 NPU (1,128,40,40),(128,128,3,3),(128),... (1,128,40,40) 123854 460800 460800 2/0 545 Conv:Conv_52
D RKNN: [11:24:25.846] 28 Concat INT8 NPU (1,128,40,40),(1,128,40,40) (1,256,40,40) 132998 0 132998 2/0 400 Concat:Concat_57
D RKNN: [11:24:25.846] 29 ConvRelu INT8 NPU (1,256,40,40),(256,256,1,1),(256) (1,256,40,40) 143970 204800 204800 3/0 466 Conv:Conv_58
D RKNN: [11:24:25.846] 30 ConvRelu INT8 NPU (1,256,40,40),(512,256,3,3),(512) (1,512,20,20) 291930 921600 921600 3/0 1556 Conv:Conv_60
D RKNN: [11:24:25.846] 31 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_62
D RKNN: [11:24:25.846] 32 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_69
D RKNN: [11:24:25.846] 33 ConvRelu INT8 NPU (1,256,20,20),(256,256,1,1),(256) (1,256,20,20) 44222 51200 51200 1/0 166 Conv:Conv_64
D RKNN: [11:24:25.846] 34 ConvReluAdd INT8 NPU (1,256,20,20),(256,256,3,3),(256),... (1,256,20,20) 145965 460800 460800 1/0 778 Conv:Conv_66
D RKNN: [11:24:25.846] 35 Concat INT8 NPU (1,256,20,20),(1,256,20,20) (1,512,20,20) 66499 0 66499 2/0 200 Concat:Concat_71
D RKNN: [11:24:25.846] 36 ConvRelu INT8 NPU (1,512,20,20),(512,512,1,1),(512) (1,512,20,20) 109723 204800 204800 2/0 460 Conv:Conv_72
D RKNN: [11:24:25.846] 37 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_74
D RKNN: [11:24:25.846] 38 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_76
D RKNN: [11:24:25.846] 39 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_77
D RKNN: [11:24:25.846] 40 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_78
D RKNN: [11:24:25.846] 41 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_79
D RKNN: [11:24:25.846] 42 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_80
D RKNN: [11:24:25.846] 43 MaxPool INT8 NPU (1,256,20,20) (1,256,20,20) 33250 0 33250 1/0 100 MaxPool:MaxPool_81
D RKNN: [11:24:25.846] 44 Concat INT8 NPU (1,256,20,20),(1,256,20,20),... (1,1024,20,20) 132998 0 132998 4/0 400 Concat:Concat_82
D RKNN: [11:24:25.846] 45 ConvRelu INT8 NPU (1,1024,20,20),(512,1024,1,1),(512) (1,512,20,20) 185532 409600 409600 3/0 916 Conv:Conv_83
D RKNN: [11:24:25.846] 46 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_85
D RKNN: [11:24:25.846] 47 Resize INT8 NPU (1,256,20,20),(0),(4) (1,256,40,40) 83126 0 83126 16/0 100 Resize:Resize_88
D RKNN: [11:24:25.846] 48 Concat INT8 NPU (1,256,40,40),(1,256,40,40) (1,512,40,40) 265995 0 265995 2/0 800 Concat:Concat_89
D RKNN: [11:24:25.846] 49 ConvRelu INT8 NPU (1,512,40,40),(128,512,1,1),(128) (1,128,40,40) 177053 204800 204800 5/0 865 Conv:Conv_90
D RKNN: [11:24:25.846] 50 ConvRelu INT8 NPU (1,512,40,40),(128,512,1,1),(128) (1,128,40,40) 177053 204800 204800 5/0 865 Conv:Conv_96
D RKNN: [11:24:25.846] 51 ConvRelu INT8 NPU (1,128,40,40),(128,128,1,1),(128) (1,128,40,40) 69325 51200 69325 1/0 217 Conv:Conv_92
D RKNN: [11:24:25.846] 52 ConvRelu INT8 NPU (1,128,40,40),(128,128,3,3),(128) (1,128,40,40) 90605 460800 460800 2/0 345 Conv:Conv_94
D RKNN: [11:24:25.846] 53 Concat INT8 NPU (1,128,40,40),(1,128,40,40) (1,256,40,40) 132998 0 132998 2/0 400 Concat:Concat_98
D RKNN: [11:24:25.846] 54 ConvRelu INT8 NPU (1,256,40,40),(256,256,1,1),(256) (1,256,40,40) 143970 204800 204800 3/0 466 Conv:Conv_99
D RKNN: [11:24:25.846] 55 ConvRelu INT8 NPU (1,256,40,40),(128,256,1,1),(128) (1,128,40,40) 105235 102400 105235 2/0 433 Conv:Conv_101
D RKNN: [11:24:25.846] 56 Resize INT8 NPU (1,128,40,40),(0),(4) (1,128,80,80) 166250 0 166250 8/0 200 Resize:Resize_104
D RKNN: [11:24:25.846] 57 Concat INT8 NPU (1,128,80,80),(1,128,80,80) (1,256,80,80) 531990 0 531990 2/0 1600 Concat:Concat_105
D RKNN: [11:24:25.846] 58 ConvRelu INT8 NPU (1,256,80,80),(64,256,1,1),(64) (1,64,80,80) 335237 204800 335237 8/0 1616 Conv:Conv_106
D RKNN: [11:24:25.846] 59 ConvRelu INT8 NPU (1,256,80,80),(64,256,1,1),(64) (1,64,80,80) 335237 204800 335237 8/0 1616 Conv:Conv_112
D RKNN: [11:24:25.846] 60 ConvRelu INT8 NPU (1,64,80,80),(64,64,1,1),(64) (1,64,80,80) 133746 102400 133746 2/0 404 Conv:Conv_108
D RKNN: [11:24:25.846] 61 ConvRelu INT8 NPU (1,64,80,80),(64,64,3,3),(64) (1,64,80,80) 139066 460800 460800 3/0 436 Conv:Conv_110
D RKNN: [11:24:25.846] 62 Concat INT8 NPU (1,64,80,80),(1,64,80,80) (1,128,80,80) 265995 0 265995 2/0 800 Concat:Concat_114
D RKNN: [11:24:25.846] 63 ConvRelu INT8 NPU (1,128,80,80),(128,128,1,1),(128) (1,128,80,80) 268821 204800 268821 4/0 817 Conv:Conv_115
D RKNN: [11:24:25.846] 64 ConvRelu INT8 NPU (1,128,80,80),(128,128,3,3),(128) (1,128,40,40) 190353 460800 460800 5/0 945 Conv:Conv_117
D RKNN: [11:24:25.846] 65 Concat INT8 NPU (1,128,40,40),(1,128,40,40) (1,256,40,40) 132998 0 132998 2/0 400 Concat:Concat_119
D RKNN: [11:24:25.846] 66 ConvSigmoid INT8 NPU (1,128,80,80),(255,128,1,1),(255) (1,255,80,80) 404624 409600 409600 5/1 833 Conv:Conv_145
D RKNN: [11:24:25.846] 67 ConvRelu INT8 NPU (1,256,40,40),(128,256,1,1),(128) (1,128,40,40) 105235 102400 105235 2/0 433 Conv:Conv_120
D RKNN: [11:24:25.846] 68 ConvRelu INT8 NPU (1,256,40,40),(128,256,1,1),(128) (1,128,40,40) 105235 102400 105235 2/0 433 Conv:Conv_126
D RKNN: [11:24:25.846] 69 ConvRelu INT8 NPU (1,128,40,40),(128,128,1,1),(128) (1,128,40,40) 69325 51200 69325 1/0 217 Conv:Conv_122
D RKNN: [11:24:25.846] 70 ConvRelu INT8 NPU (1,128,40,40),(128,128,3,3),(128) (1,128,40,40) 90605 460800 460800 2/0 345 Conv:Conv_124
D RKNN: [11:24:25.846] 71 Concat INT8 NPU (1,128,40,40),(1,128,40,40) (1,256,40,40) 132998 0 132998 2/0 400 Concat:Concat_128
D RKNN: [11:24:25.846] 72 ConvRelu INT8 NPU (1,256,40,40),(256,256,1,1),(256) (1,256,40,40) 143970 204800 204800 3/0 466 Conv:Conv_129
D RKNN: [11:24:25.846] 73 ConvRelu INT8 NPU (1,256,40,40),(256,256,3,3),(256) (1,256,20,20) 179214 460800 460800 3/0 978 Conv:Conv_131
D RKNN: [11:24:25.846] 74 Concat INT8 NPU (1,256,20,20),(1,256,20,20) (1,512,20,20) 66499 0 66499 2/0 200 Concat:Concat_133
D RKNN: [11:24:25.846] 75 ConvSigmoid INT8 NPU (1,256,40,40),(255,256,1,1),(255) (1,255,40,40) 143929 204800 204800 4/1 465 Conv:Conv_147
D RKNN: [11:24:25.846] 76 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_134
D RKNN: [11:24:25.846] 77 ConvRelu INT8 NPU (1,512,20,20),(256,512,1,1),(256) (1,256,20,20) 71487 102400 102400 2/0 330 Conv:Conv_140
D RKNN: [11:24:25.846] 78 ConvRelu INT8 NPU (1,256,20,20),(256,256,1,1),(256) (1,256,20,20) 44222 51200 51200 1/0 166 Conv:Conv_136
D RKNN: [11:24:25.846] 79 ConvRelu INT8 NPU (1,256,20,20),(256,256,3,3),(256) (1,256,20,20) 129340 460800 460800 1/0 678 Conv:Conv_138
D RKNN: [11:24:25.846] 80 Concat INT8 NPU (1,256,20,20),(1,256,20,20) (1,512,20,20) 66499 0 66499 2/0 200 Concat:Concat_142
D RKNN: [11:24:25.846] 81 ConvRelu INT8 NPU (1,512,20,20),(512,512,1,1),(512) (1,512,20,20) 109723 204800 204800 2/0 460 Conv:Conv_143
D RKNN: [11:24:25.846] 82 ConvSigmoid INT8 NPU (1,512,20,20),(255,512,1,1),(255) (1,255,20,20) 71403 102400 102400 3/1 329 Conv:Conv_149
D RKNN: [11:24:25.846] 83 OutputOperator INT8 NPU (1,255,80,80),(1,80,80,256) \ 531990 0 531990 21/0 3200 OutputOperator:output
D RKNN: [11:24:25.846] 84 OutputOperator INT8 NPU (1,255,40,40),(1,40,40,256) \ 146298 0 146298 8/0 880 OutputOperator:283
D RKNN: [11:24:25.846] 85 OutputOperator INT8 NPU (1,255,20,20),(1,20,20,256) \ 43225 0 43225 4/0 260 OutputOperator:285
D RKNN: [11:24:25.846] ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.847] <<<<<<<< end: rknn::RKNNModelRegCmdbuildPass
D RKNN: [11:24:25.848] >>>>>> start: rknn::RKNNFlatcModelBuildPass
D RKNN: [11:24:25.868] Export Mini RKNN model to /tmp/tmpmzghee_t/dumps/torch-jit-export.mini.rknn
D RKNN: [11:24:25.871] >>>>>> end: rknn::RKNNFlatcModelBuildPass
D RKNN: [11:24:25.872] >>>>>> start: rknn::RKNNMemStatisticsPass
D RKNN: [11:24:25.873] --------------------------------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.873] Feature Tensor Information Table
D RKNN: [11:24:25.873] ----------------------------------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.873] ID User Tensor DataType DataFormat OrigShape NativeShape | [Start End) Size
D RKNN: [11:24:25.873] ----------------------------------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.873] 1 Conv images INT8 NC1HWC2 (1,3,640,640) (1,1,640,640,3) | 0x007297c0 0x008557c0 0x0012c000
D RKNN: [11:24:25.873] 2 ConvRelu 128 INT8 NC1HWC2 (1,12,320,320) (1,1,320,320,16) | 0x008557c0 0x009e57c0 0x00190000
D RKNN: [11:24:25.873] 3 ConvRelu 131 INT8 NC1HWC2 (1,32,320,320) (1,2,320,320,16) | 0x009e57c0 0x00d057c0 0x00320000
D RKNN: [11:24:25.873] 4 ConvRelu 133 INT8 NC1HWC2 (1,64,160,160) (1,4,160,160,16) | 0x007297c0 0x008b97c0 0x00190000
D RKNN: [11:24:25.873] 5 ConvRelu 133 INT8 NC1HWC2 (1,64,160,160) (1,4,160,160,16) | 0x007297c0 0x008b97c0 0x00190000
D RKNN: [11:24:25.873] 6 ConvRelu 135 INT8 NC1HWC2 (1,32,160,160) (1,2,160,160,16) | 0x008b97c0 0x009817c0 0x000c8000
D RKNN: [11:24:25.873] 7 ConvReluAdd 137 INT8 NC1HWC2 (1,32,160,160) (1,2,160,160,16) | 0x007297c0 0x007f17c0 0x000c8000
D RKNN: [11:24:25.873] 7 ConvReluAdd 135 INT8 NC1HWC2 (1,32,160,160) (1,2,160,160,16) | 0x008b97c0 0x009817c0 0x000c8000
D RKNN: [11:24:25.873] 8 Concat 140 INT8 NC1HWC2 (1,32,160,160) (1,2,160,160,16) | 0x007f17c0 0x008b97c0 0x000c8000
D RKNN: [11:24:25.873] 8 Concat 142 INT8 NC1HWC2 (1,32,160,160) (1,2,160,160,16) | 0x009817c0 0x00a497c0 0x000c8000
D RKNN: [11:24:25.873] 9 ConvRelu 143 INT8 NC1HWC2 (1,64,160,160) (1,4,160,160,16) | 0x00a497c0 0x00bd97c0 0x00190000
D RKNN: [11:24:25.873] 10 ConvRelu 145 INT8 NC1HWC2 (1,64,160,160) (1,4,160,160,16) | 0x007297c0 0x008b97c0 0x00190000
D RKNN: [11:24:25.873] 11 ConvRelu 147 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x008b97c0 0x009817c0 0x000c8000
D RKNN: [11:24:25.873] 12 ConvRelu 147 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x008b97c0 0x009817c0 0x000c8000
D RKNN: [11:24:25.873] 13 ConvRelu 149 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 14 ConvReluAdd 151 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 14 ConvReluAdd 149 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 15 ConvRelu 154 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x008557c0 0x008b97c0 0x00064000
D RKNN: [11:24:25.873] 16 ConvReluAdd 156 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 16 ConvReluAdd 154 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x008557c0 0x008b97c0 0x00064000
D RKNN: [11:24:25.873] 17 Concat 159 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 17 Concat 161 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x0078d7c0 0x007f17c0 0x00064000
D RKNN: [11:24:25.873] 18 ConvRelu 162 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x008557c0 0x0091d7c0 0x000c8000
D RKNN: [11:24:25.873] 19 ConvRelu 164 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x007297c0 0x007f17c0 0x000c8000
D RKNN: [11:24:25.873] 20 ConvRelu 166 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 21 ConvRelu 166 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 22 ConvRelu 168 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 23 ConvReluAdd 170 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 23 ConvReluAdd 168 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 24 ConvRelu 173 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008237c0 0x008557c0 0x00032000
D RKNN: [11:24:25.873] 25 ConvReluAdd 175 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 25 ConvReluAdd 173 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008237c0 0x008557c0 0x00032000
D RKNN: [11:24:25.873] 26 ConvRelu 178 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 27 ConvReluAdd 180 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 27 ConvReluAdd 178 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 28 Concat 183 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008237c0 0x008557c0 0x00032000
D RKNN: [11:24:25.873] 28 Concat 185 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008877c0 0x008b97c0 0x00032000
D RKNN: [11:24:25.873] 29 ConvRelu 186 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008b97c0 0x0091d7c0 0x00064000
D RKNN: [11:24:25.873] 30 ConvRelu 188 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 31 ConvRelu 190 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 32 ConvRelu 190 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 33 ConvRelu 192 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 34 ConvReluAdd 194 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008557c0 0x0086e7c0 0x00019000
D RKNN: [11:24:25.873] 34 ConvReluAdd 192 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 35 Concat 197 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x0086e7c0 0x008877c0 0x00019000
D RKNN: [11:24:25.873] 35 Concat 199 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008a07c0 0x008b97c0 0x00019000
D RKNN: [11:24:25.873] 36 ConvRelu 200 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008b97c0 0x008eb7c0 0x00032000
D RKNN: [11:24:25.873] 37 ConvRelu 202 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 38 MaxPool 204 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 39 MaxPool 205 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008557c0 0x0086e7c0 0x00019000
D RKNN: [11:24:25.873] 40 MaxPool 206 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x0086e7c0 0x008877c0 0x00019000
D RKNN: [11:24:25.873] 41 MaxPool 207 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008557c0 0x0086e7c0 0x00019000
D RKNN: [11:24:25.873] 42 MaxPool 208 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008a07c0 0x008b97c0 0x00019000
D RKNN: [11:24:25.873] 43 MaxPool 209 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008557c0 0x0086e7c0 0x00019000
D RKNN: [11:24:25.873] 44 Concat 204 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 44 Concat 206 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x0086e7c0 0x008877c0 0x00019000
D RKNN: [11:24:25.873] 44 Concat 208 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008a07c0 0x008b97c0 0x00019000
D RKNN: [11:24:25.873] 44 Concat 210 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008b97c0 0x008d27c0 0x00019000
D RKNN: [11:24:25.873] 45 ConvRelu 211 INT8 NC1HWC2 (1,1024,20,20) (1,64,20,20,16) | 0x008d27c0 0x009367c0 0x00064000
D RKNN: [11:24:25.873] 46 ConvRelu 213 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008557c0 0x008877c0 0x00032000
D RKNN: [11:24:25.873] 47 Resize 215 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 48 Concat 220 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008a07c0 0x009047c0 0x00064000
D RKNN: [11:24:25.873] 48 Concat 188 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 49 ConvRelu 221 INT8 NC1HWC2 (1,512,40,40) (1,32,40,40,16) | 0x009047c0 0x009cc7c0 0x000c8000
D RKNN: [11:24:25.873] 50 ConvRelu 221 INT8 NC1HWC2 (1,512,40,40) (1,32,40,40,16) | 0x009047c0 0x009cc7c0 0x000c8000
D RKNN: [11:24:25.873] 51 ConvRelu 223 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008a07c0 0x008d27c0 0x00032000
D RKNN: [11:24:25.873] 52 ConvRelu 225 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 53 Concat 227 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008a07c0 0x008d27c0 0x00032000
D RKNN: [11:24:25.873] 53 Concat 229 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008d27c0 0x009047c0 0x00032000
D RKNN: [11:24:25.873] 54 ConvRelu 230 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 55 ConvRelu 232 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008a07c0 0x009047c0 0x00064000
D RKNN: [11:24:25.873] 56 Resize 234 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 57 Concat 239 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x008a07c0 0x009687c0 0x000c8000
D RKNN: [11:24:25.873] 57 Concat 164 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x007297c0 0x007f17c0 0x000c8000
D RKNN: [11:24:25.873] 58 ConvRelu 240 INT8 NC1HWC2 (1,256,80,80) (1,16,80,80,16) | 0x009687c0 0x00af87c0 0x00190000
D RKNN: [11:24:25.873] 59 ConvRelu 240 INT8 NC1HWC2 (1,256,80,80) (1,16,80,80,16) | 0x009687c0 0x00af87c0 0x00190000
D RKNN: [11:24:25.873] 60 ConvRelu 242 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x008237c0 0x008877c0 0x00064000
D RKNN: [11:24:25.873] 61 ConvRelu 244 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x0078d7c0 0x007f17c0 0x00064000
D RKNN: [11:24:25.873] 62 Concat 246 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x008237c0 0x008877c0 0x00064000
D RKNN: [11:24:25.873] 62 Concat 248 INT8 NC1HWC2 (1,64,80,80) (1,4,80,80,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 63 ConvRelu 249 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x008a07c0 0x009687c0 0x000c8000
D RKNN: [11:24:25.873] 64 ConvRelu 251 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x007297c0 0x007f17c0 0x000c8000
D RKNN: [11:24:25.873] 65 Concat 253 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008237c0 0x008557c0 0x00032000
D RKNN: [11:24:25.873] 65 Concat 234 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 66 ConvSigmoid 251 INT8 NC1HWC2 (1,128,80,80) (1,8,80,80,16) | 0x007297c0 0x007f17c0 0x000c8000
D RKNN: [11:24:25.873] 67 ConvRelu 254 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008a07c0 0x009047c0 0x00064000
D RKNN: [11:24:25.873] 68 ConvRelu 254 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008a07c0 0x009047c0 0x00064000
D RKNN: [11:24:25.873] 69 ConvRelu 256 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007297c0 0x0075b7c0 0x00032000
D RKNN: [11:24:25.873] 70 ConvRelu 258 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x008a07c0 0x008d27c0 0x00032000
D RKNN: [11:24:25.873] 71 Concat 260 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x007297c0 0x0075b7c0 0x00032000
D RKNN: [11:24:25.873] 71 Concat 262 INT8 NC1HWC2 (1,128,40,40) (1,8,40,40,16) | 0x0075b7c0 0x0078d7c0 0x00032000
D RKNN: [11:24:25.873] 72 ConvRelu 263 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x008a07c0 0x009047c0 0x00064000
D RKNN: [11:24:25.873] 73 ConvRelu 265 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 74 Concat 267 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008a07c0 0x008b97c0 0x00019000
D RKNN: [11:24:25.873] 74 Concat 215 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008877c0 0x008a07c0 0x00019000
D RKNN: [11:24:25.873] 75 ConvSigmoid 265 INT8 NC1HWC2 (1,256,40,40) (1,16,40,40,16) | 0x007297c0 0x0078d7c0 0x00064000
D RKNN: [11:24:25.873] 76 ConvRelu 268 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008b97c0 0x008eb7c0 0x00032000
D RKNN: [11:24:25.873] 77 ConvRelu 268 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x008b97c0 0x008eb7c0 0x00032000
D RKNN: [11:24:25.873] 78 ConvRelu 270 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x008eb7c0 0x009047c0 0x00019000
D RKNN: [11:24:25.873] 79 ConvRelu 272 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x007427c0 0x0075b7c0 0x00019000
D RKNN: [11:24:25.873] 80 Concat 274 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x0075b7c0 0x007747c0 0x00019000
D RKNN: [11:24:25.873] 80 Concat 276 INT8 NC1HWC2 (1,256,20,20) (1,16,20,20,16) | 0x007297c0 0x007427c0 0x00019000
D RKNN: [11:24:25.873] 81 ConvRelu 277 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x007f17c0 0x008237c0 0x00032000
D RKNN: [11:24:25.873] 82 ConvSigmoid 279 INT8 NC1HWC2 (1,512,20,20) (1,32,20,20,16) | 0x007297c0 0x0075b7c0 0x00032000
D RKNN: [11:24:25.873] 83 OutputOperator output INT8 NC1HWC2 (1,255,80,80) (1,16,80,80,16) | 0x009047c0 0x00a947c0 0x00190000
D RKNN: [11:24:25.873] 83 OutputOperator output_exSecondary0 INT8 NC1HWC2 (1,80,80,256) (1,5,80,256,16) | 0x00a947c0 0x00c247c0 0x00190000
D RKNN: [11:24:25.873] 83 OutputOperator output_exSecondary INT8 NHWC (1,80,80,255) (1,80,80,255) | 0x00c247c0 0x00db2ec0 0x0018e700
D RKNN: [11:24:25.873] 84 OutputOperator 283 INT8 NC1HWC2 (1,255,40,40) (1,16,40,40,16) | 0x0078d7c0 0x007f17c0 0x00064000
D RKNN: [11:24:25.873] 84 OutputOperator 283_exSecondary0 INT8 NC1HWC2 (1,40,40,256) (1,2,40,256,16) | 0x007f17c0 0x008557c0 0x00064000
D RKNN: [11:24:25.873] 84 OutputOperator 283_exSecondary INT8 NHWC (1,40,40,255) (1,40,40,255) | 0x008557c0 0x008b9180 0x000639c0
D RKNN: [11:24:25.873] 85 OutputOperator 285 INT8 NC1HWC2 (1,255,20,20) (1,16,20,20,16) | 0x0075b7c0 0x007747c0 0x00019000
D RKNN: [11:24:25.873] 85 OutputOperator 285_exSecondary0 INT8 NC1HWC2 (1,20,20,256) (1,1,20,256,16) | 0x007747c0 0x0078d7c0 0x00019000
D RKNN: [11:24:25.873] 85 OutputOperator 285_exSecondary INT8 NHWC (1,20,20,255) (1,20,20,255) | 0x007297c0 0x00742630 0x00018e70
D RKNN: [11:24:25.873] ----------------------------------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.873] --------------------------------------------------------------------------------------------------------
D RKNN: [11:24:25.873] Const Tensor Information Table
D RKNN: [11:24:25.873] ----------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.873] ID User Tensor DataType OrigShape | [Start End) Size
D RKNN: [11:24:25.873] ----------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.873] 1 Conv model.0.convsp.weight INT8 (12,3,2,2) | 0x00000000 0x000000c0 0x000000c0
D RKNN: [11:24:25.873] 1 Conv model.0.convsp.weight_bias_0 INT32 (12) | 0x006f6140*0x006f61c0 0x00000080
D RKNN: [11:24:25.873] 2 ConvRelu 287 INT8 (32,12,3,3) | 0x006f4d40 0x006f5f40 0x00001200
D RKNN: [11:24:25.873] 2 ConvRelu 288 INT32 (32) | 0x006f5f40 0x006f6040 0x00000100
D RKNN: [11:24:25.873] 3 ConvRelu model.1.conv.weight INT8 (64,32,3,3) | 0x000000c0 0x000048c0 0x00004800
D RKNN: [11:24:25.873] 3 ConvRelu model.1.conv.bias INT32 (64) | 0x000048c0 0x00004ac0 0x00000200
D RKNN: [11:24:25.873] 4 ConvRelu model.2.cv1.conv.weight INT8 (32,64,1,1) | 0x00004ac0 0x000052c0 0x00000800
D RKNN: [11:24:25.873] 4 ConvRelu model.2.cv1.conv.bias INT32 (32) | 0x000052c0 0x000053c0 0x00000100
D RKNN: [11:24:25.873] 5 ConvRelu model.2.cv2.conv.weight INT8 (32,64,1,1) | 0x000053c0 0x00005bc0 0x00000800
D RKNN: [11:24:25.873] 5 ConvRelu model.2.cv2.conv.bias INT32 (32) | 0x00005bc0 0x00005cc0 0x00000100
D RKNN: [11:24:25.873] 6 ConvRelu model.2.m.0.cv1.conv.weight INT8 (32,32,1,1) | 0x00006ec0 0x000072c0 0x00000400
D RKNN: [11:24:25.873] 6 ConvRelu model.2.m.0.cv1.conv.bias INT32 (32) | 0x000072c0 0x000073c0 0x00000100
D RKNN: [11:24:25.873] 7 ConvReluAdd model.2.m.0.cv2.conv.weight INT8 (32,32,3,3) | 0x000073c0 0x000097c0 0x00002400
D RKNN: [11:24:25.873] 7 ConvReluAdd model.2.m.0.cv2.conv.bias INT32 (32) | 0x000097c0 0x000098c0 0x00000100
D RKNN: [11:24:25.873] 9 ConvRelu model.2.cv3.conv.weight INT8 (64,64,1,1) | 0x00005cc0 0x00006cc0 0x00001000
D RKNN: [11:24:25.873] 9 ConvRelu model.2.cv3.conv.bias INT32 (64) | 0x00006cc0 0x00006ec0 0x00000200
D RKNN: [11:24:25.873] 10 ConvRelu model.3.conv.weight INT8 (128,64,3,3) | 0x000098c0 0x0001b8c0 0x00012000
D RKNN: [11:24:25.873] 10 ConvRelu model.3.conv.bias INT32 (128) | 0x0001b8c0 0x0001bcc0 0x00000400
D RKNN: [11:24:25.873] 11 ConvRelu model.4.cv1.conv.weight INT8 (64,128,1,1) | 0x0001bcc0 0x0001dcc0 0x00002000
D RKNN: [11:24:25.873] 11 ConvRelu model.4.cv1.conv.bias INT32 (64) | 0x0001dcc0 0x0001dec0 0x00000200
D RKNN: [11:24:25.873] 12 ConvRelu model.4.cv2.conv.weight INT8 (64,128,1,1) | 0x0001dec0 0x0001fec0 0x00002000
D RKNN: [11:24:25.873] 12 ConvRelu model.4.cv2.conv.bias INT32 (64) | 0x0001fec0 0x000200c0 0x00000200
D RKNN: [11:24:25.873] 13 ConvRelu model.4.m.0.cv1.conv.weight INT8 (64,64,1,1) | 0x000244c0 0x000254c0 0x00001000
D RKNN: [11:24:25.873] 13 ConvRelu model.4.m.0.cv1.conv.bias INT32 (64) | 0x000254c0 0x000256c0 0x00000200
D RKNN: [11:24:25.873] 14 ConvReluAdd model.4.m.0.cv2.conv.weight INT8 (64,64,3,3) | 0x000256c0 0x0002e6c0 0x00009000
D RKNN: [11:24:25.873] 14 ConvReluAdd model.4.m.0.cv2.conv.bias INT32 (64) | 0x0002e6c0 0x0002e8c0 0x00000200
D RKNN: [11:24:25.873] 15 ConvRelu model.4.m.1.cv1.conv.weight INT8 (64,64,1,1) | 0x0002e8c0 0x0002f8c0 0x00001000
D RKNN: [11:24:25.873] 15 ConvRelu model.4.m.1.cv1.conv.bias INT32 (64) | 0x0002f8c0 0x0002fac0 0x00000200
D RKNN: [11:24:25.873] 16 ConvReluAdd model.4.m.1.cv2.conv.weight INT8 (64,64,3,3) | 0x0002fac0 0x00038ac0 0x00009000
D RKNN: [11:24:25.873] 16 ConvReluAdd model.4.m.1.cv2.conv.bias INT32 (64) | 0x00038ac0 0x00038cc0 0x00000200
D RKNN: [11:24:25.873] 18 ConvRelu model.4.cv3.conv.weight INT8 (128,128,1,1) | 0x000200c0 0x000240c0 0x00004000
D RKNN: [11:24:25.873] 18 ConvRelu model.4.cv3.conv.bias INT32 (128) | 0x000240c0 0x000244c0 0x00000400
D RKNN: [11:24:25.873] 19 ConvRelu model.5.conv.weight INT8 (256,128,3,3) | 0x00038cc0 0x00080cc0 0x00048000
D RKNN: [11:24:25.873] 19 ConvRelu model.5.conv.bias INT32 (256) | 0x00080cc0 0x000814c0 0x00000800
D RKNN: [11:24:25.873] 20 ConvRelu model.6.cv1.conv.weight INT8 (128,256,1,1) | 0x000814c0 0x000894c0 0x00008000
D RKNN: [11:24:25.873] 20 ConvRelu model.6.cv1.conv.bias INT32 (128) | 0x000894c0 0x000898c0 0x00000400
D RKNN: [11:24:25.873] 21 ConvRelu model.6.cv2.conv.weight INT8 (128,256,1,1) | 0x000898c0 0x000918c0 0x00008000
D RKNN: [11:24:25.873] 21 ConvRelu model.6.cv2.conv.bias INT32 (128) | 0x000918c0 0x00091cc0 0x00000400
D RKNN: [11:24:25.873] 22 ConvRelu model.6.m.0.cv1.conv.weight INT8 (128,128,1,1) | 0x000a24c0 0x000a64c0 0x00004000
D RKNN: [11:24:25.873] 22 ConvRelu model.6.m.0.cv1.conv.bias INT32 (128) | 0x000a64c0 0x000a68c0 0x00000400
D RKNN: [11:24:25.873] 23 ConvReluAdd model.6.m.0.cv2.conv.weight INT8 (128,128,3,3) | 0x000a68c0 0x000ca8c0 0x00024000
D RKNN: [11:24:25.873] 23 ConvReluAdd model.6.m.0.cv2.conv.bias INT32 (128) | 0x000ca8c0 0x000cacc0 0x00000400
D RKNN: [11:24:25.873] 24 ConvRelu model.6.m.1.cv1.conv.weight INT8 (128,128,1,1) | 0x000cacc0 0x000cecc0 0x00004000
D RKNN: [11:24:25.873] 24 ConvRelu model.6.m.1.cv1.conv.bias INT32 (128) | 0x000cecc0 0x000cf0c0 0x00000400
D RKNN: [11:24:25.873] 25 ConvReluAdd model.6.m.1.cv2.conv.weight INT8 (128,128,3,3) | 0x000cf0c0 0x000f30c0 0x00024000
D RKNN: [11:24:25.873] 25 ConvReluAdd model.6.m.1.cv2.conv.bias INT32 (128) | 0x000f30c0 0x000f34c0 0x00000400
D RKNN: [11:24:25.873] 26 ConvRelu model.6.m.2.cv1.conv.weight INT8 (128,128,1,1) | 0x000f34c0 0x000f74c0 0x00004000
D RKNN: [11:24:25.873] 26 ConvRelu model.6.m.2.cv1.conv.bias INT32 (128) | 0x000f74c0 0x000f78c0 0x00000400
D RKNN: [11:24:25.873] 27 ConvReluAdd model.6.m.2.cv2.conv.weight INT8 (128,128,3,3) | 0x000f78c0 0x0011b8c0 0x00024000
D RKNN: [11:24:25.873] 27 ConvReluAdd model.6.m.2.cv2.conv.bias INT32 (128) | 0x0011b8c0 0x0011bcc0 0x00000400
D RKNN: [11:24:25.873] 29 ConvRelu model.6.cv3.conv.weight INT8 (256,256,1,1) | 0x00091cc0 0x000a1cc0 0x00010000
D RKNN: [11:24:25.873] 29 ConvRelu model.6.cv3.conv.bias INT32 (256) | 0x000a1cc0 0x000a24c0 0x00000800
D RKNN: [11:24:25.873] 30 ConvRelu model.7.conv.weight INT8 (512,256,3,3) | 0x0011bcc0 0x0023bcc0 0x00120000
D RKNN: [11:24:25.873] 30 ConvRelu model.7.conv.bias INT32 (512) | 0x0023bcc0 0x0023ccc0 0x00001000
D RKNN: [11:24:25.873] 31 ConvRelu model.8.cv1.conv.weight INT8 (256,512,1,1) | 0x0023ccc0 0x0025ccc0 0x00020000
D RKNN: [11:24:25.873] 31 ConvRelu model.8.cv1.conv.bias INT32 (256) | 0x0025ccc0 0x0025d4c0 0x00000800
D RKNN: [11:24:25.873] 32 ConvRelu model.8.cv2.conv.weight INT8 (256,512,1,1) | 0x0025d4c0 0x0027d4c0 0x00020000
D RKNN: [11:24:25.873] 32 ConvRelu model.8.cv2.conv.bias INT32 (256) | 0x0027d4c0 0x0027dcc0 0x00000800
D RKNN: [11:24:25.873] 33 ConvRelu model.8.m.0.cv1.conv.weight INT8 (256,256,1,1) | 0x002becc0 0x002cecc0 0x00010000
D RKNN: [11:24:25.873] 33 ConvRelu model.8.m.0.cv1.conv.bias INT32 (256) | 0x002cecc0 0x002cf4c0 0x00000800
D RKNN: [11:24:25.873] 34 ConvReluAdd model.8.m.0.cv2.conv.weight INT8 (256,256,3,3) | 0x002cf4c0 0x0035f4c0 0x00090000
D RKNN: [11:24:25.873] 34 ConvReluAdd model.8.m.0.cv2.conv.bias INT32 (256) | 0x0035f4c0 0x0035fcc0 0x00000800
D RKNN: [11:24:25.873] 36 ConvRelu model.8.cv3.conv.weight INT8 (512,512,1,1) | 0x0027dcc0 0x002bdcc0 0x00040000
D RKNN: [11:24:25.873] 36 ConvRelu model.8.cv3.conv.bias INT32 (512) | 0x002bdcc0 0x002becc0 0x00001000
D RKNN: [11:24:25.873] 37 ConvRelu model.9.cv1.conv.weight INT8 (256,512,1,1) | 0x0035fcc0 0x0037fcc0 0x00020000
D RKNN: [11:24:25.873] 37 ConvRelu model.9.cv1.conv.bias INT32 (256) | 0x0037fcc0 0x003804c0 0x00000800
D RKNN: [11:24:25.873] 45 ConvRelu model.9.cv2.conv.weight INT8 (512,1024,1,1) | 0x003804c0 0x004004c0 0x00080000
D RKNN: [11:24:25.873] 45 ConvRelu model.9.cv2.conv.bias INT32 (512) | 0x004004c0 0x004014c0 0x00001000
D RKNN: [11:24:25.873] 46 ConvRelu model.10.conv.weight INT8 (256,512,1,1) | 0x004014c0 0x004214c0 0x00020000
D RKNN: [11:24:25.873] 46 ConvRelu model.10.conv.bias INT32 (256) | 0x004214c0 0x00421cc0 0x00000800
D RKNN: [11:24:25.873] 47 Resize 219 FLOAT (0) | 0x00000000 0x00000000 0x00000000
D RKNN: [11:24:25.873] 47 Resize 289 FLOAT (4) | 0x006f6040 0x006f60c0 0x00000080
D RKNN: [11:24:25.873] 49 ConvRelu model.13.cv1.conv.weight INT8 (128,512,1,1) | 0x00421cc0 0x00431cc0 0x00010000
D RKNN: [11:24:25.873] 49 ConvRelu model.13.cv1.conv.bias INT32 (128) | 0x00431cc0 0x004320c0 0x00000400
D RKNN: [11:24:25.873] 50 ConvRelu model.13.cv2.conv.weight INT8 (128,512,1,1) | 0x004320c0 0x004420c0 0x00010000
D RKNN: [11:24:25.873] 50 ConvRelu model.13.cv2.conv.bias INT32 (128) | 0x004420c0 0x004424c0 0x00000400
D RKNN: [11:24:25.873] 51 ConvRelu model.13.m.0.cv1.conv.weight INT8 (128,128,1,1) | 0x00452cc0 0x00456cc0 0x00004000
D RKNN: [11:24:25.873] 51 ConvRelu model.13.m.0.cv1.conv.bias INT32 (128) | 0x00456cc0 0x004570c0 0x00000400
D RKNN: [11:24:25.873] 52 ConvRelu model.13.m.0.cv2.conv.weight INT8 (128,128,3,3) | 0x004570c0 0x0047b0c0 0x00024000
D RKNN: [11:24:25.873] 52 ConvRelu model.13.m.0.cv2.conv.bias INT32 (128) | 0x0047b0c0 0x0047b4c0 0x00000400
D RKNN: [11:24:25.873] 54 ConvRelu model.13.cv3.conv.weight INT8 (256,256,1,1) | 0x004424c0 0x004524c0 0x00010000
D RKNN: [11:24:25.873] 54 ConvRelu model.13.cv3.conv.bias INT32 (256) | 0x004524c0 0x00452cc0 0x00000800
D RKNN: [11:24:25.873] 55 ConvRelu model.14.conv.weight INT8 (128,256,1,1) | 0x0047b4c0 0x004834c0 0x00008000
D RKNN: [11:24:25.873] 55 ConvRelu model.14.conv.bias INT32 (128) | 0x004834c0 0x004838c0 0x00000400
D RKNN: [11:24:25.873] 56 Resize 238 FLOAT (0) | 0x00000000 0x00000000 0x00000000
D RKNN: [11:24:25.873] 56 Resize 290 FLOAT (4) | 0x006f60c0 0x006f6140 0x00000080
D RKNN: [11:24:25.873] 58 ConvRelu model.17.cv1.conv.weight INT8 (64,256,1,1) | 0x004838c0 0x004878c0 0x00004000
D RKNN: [11:24:25.873] 58 ConvRelu model.17.cv1.conv.bias INT32 (64) | 0x004878c0 0x00487ac0 0x00000200
D RKNN: [11:24:25.873] 59 ConvRelu model.17.cv2.conv.weight INT8 (64,256,1,1) | 0x00487ac0 0x0048bac0 0x00004000
D RKNN: [11:24:25.873] 59 ConvRelu model.17.cv2.conv.bias INT32 (64) | 0x0048bac0 0x0048bcc0 0x00000200
D RKNN: [11:24:25.873] 60 ConvRelu model.17.m.0.cv1.conv.weight INT8 (64,64,1,1) | 0x004900c0 0x004910c0 0x00001000
D RKNN: [11:24:25.873] 60 ConvRelu model.17.m.0.cv1.conv.bias INT32 (64) | 0x004910c0 0x004912c0 0x00000200
D RKNN: [11:24:25.873] 61 ConvRelu model.17.m.0.cv2.conv.weight INT8 (64,64,3,3) | 0x004912c0 0x0049a2c0 0x00009000
D RKNN: [11:24:25.873] 61 ConvRelu model.17.m.0.cv2.conv.bias INT32 (64) | 0x0049a2c0 0x0049a4c0 0x00000200
D RKNN: [11:24:25.873] 63 ConvRelu model.17.cv3.conv.weight INT8 (128,128,1,1) | 0x0048bcc0 0x0048fcc0 0x00004000
D RKNN: [11:24:25.873] 63 ConvRelu model.17.cv3.conv.bias INT32 (128) | 0x0048fcc0 0x004900c0 0x00000400
D RKNN: [11:24:25.873] 64 ConvRelu model.18.conv.weight INT8 (128,128,3,3) | 0x0049a4c0 0x004be4c0 0x00024000
D RKNN: [11:24:25.873] 64 ConvRelu model.18.conv.bias INT32 (128) | 0x004be4c0 0x004be8c0 0x00000400
D RKNN: [11:24:25.873] 66 ConvSigmoid model.24.m.0.weight INT8 (255,128,1,1) | 0x006bb8c0 0x006c3840 0x00007f80
D RKNN: [11:24:25.873] 66 ConvSigmoid model.24.m.0.bias INT32 (255) | 0x006c3840 0x006c4040 0x00000800
D RKNN: [11:24:25.873] 67 ConvRelu model.20.cv1.conv.weight INT8 (128,256,1,1) | 0x004be8c0 0x004c68c0 0x00008000
D RKNN: [11:24:25.873] 67 ConvRelu model.20.cv1.conv.bias INT32 (128) | 0x004c68c0 0x004c6cc0 0x00000400
D RKNN: [11:24:25.873] 68 ConvRelu model.20.cv2.conv.weight INT8 (128,256,1,1) | 0x004c6cc0 0x004cecc0 0x00008000
D RKNN: [11:24:25.873] 68 ConvRelu model.20.cv2.conv.bias INT32 (128) | 0x004cecc0 0x004cf0c0 0x00000400
D RKNN: [11:24:25.873] 69 ConvRelu model.20.m.0.cv1.conv.weight INT8 (128,128,1,1) | 0x004df8c0 0x004e38c0 0x00004000
D RKNN: [11:24:25.873] 69 ConvRelu model.20.m.0.cv1.conv.bias INT32 (128) | 0x004e38c0 0x004e3cc0 0x00000400
D RKNN: [11:24:25.873] 70 ConvRelu model.20.m.0.cv2.conv.weight INT8 (128,128,3,3) | 0x004e3cc0 0x00507cc0 0x00024000
D RKNN: [11:24:25.873] 70 ConvRelu model.20.m.0.cv2.conv.bias INT32 (128) | 0x00507cc0 0x005080c0 0x00000400
D RKNN: [11:24:25.873] 72 ConvRelu model.20.cv3.conv.weight INT8 (256,256,1,1) | 0x004cf0c0 0x004df0c0 0x00010000
D RKNN: [11:24:25.873] 72 ConvRelu model.20.cv3.conv.bias INT32 (256) | 0x004df0c0 0x004df8c0 0x00000800
D RKNN: [11:24:25.873] 73 ConvRelu model.21.conv.weight INT8 (256,256,3,3) | 0x005080c0 0x005980c0 0x00090000
D RKNN: [11:24:25.873] 73 ConvRelu model.21.conv.bias INT32 (256) | 0x005980c0 0x005988c0 0x00000800
D RKNN: [11:24:25.873] 75 ConvSigmoid model.24.m.1.weight INT8 (255,256,1,1) | 0x006c4040 0x006d3f40 0x0000ff00
D RKNN: [11:24:25.873] 75 ConvSigmoid model.24.m.1.bias INT32 (255) | 0x006d3f40 0x006d4740 0x00000800
D RKNN: [11:24:25.873] 76 ConvRelu model.23.cv1.conv.weight INT8 (256,512,1,1) | 0x005988c0 0x005b88c0 0x00020000
D RKNN: [11:24:25.873] 76 ConvRelu model.23.cv1.conv.bias INT32 (256) | 0x005b88c0 0x005b90c0 0x00000800
D RKNN: [11:24:25.873] 77 ConvRelu model.23.cv2.conv.weight INT8 (256,512,1,1) | 0x005b90c0 0x005d90c0 0x00020000
D RKNN: [11:24:25.873] 77 ConvRelu model.23.cv2.conv.bias INT32 (256) | 0x005d90c0 0x005d98c0 0x00000800
D RKNN: [11:24:25.873] 78 ConvRelu model.23.m.0.cv1.conv.weight INT8 (256,256,1,1) | 0x0061a8c0 0x0062a8c0 0x00010000
D RKNN: [11:24:25.873] 78 ConvRelu model.23.m.0.cv1.conv.bias INT32 (256) | 0x0062a8c0 0x0062b0c0 0x00000800
D RKNN: [11:24:25.873] 79 ConvRelu model.23.m.0.cv2.conv.weight INT8 (256,256,3,3) | 0x0062b0c0 0x006bb0c0 0x00090000
D RKNN: [11:24:25.873] 79 ConvRelu model.23.m.0.cv2.conv.bias INT32 (256) | 0x006bb0c0 0x006bb8c0 0x00000800
D RKNN: [11:24:25.873] 81 ConvRelu model.23.cv3.conv.weight INT8 (512,512,1,1) | 0x005d98c0 0x006198c0 0x00040000
D RKNN: [11:24:25.873] 81 ConvRelu model.23.cv3.conv.bias INT32 (512) | 0x006198c0 0x0061a8c0 0x00001000
D RKNN: [11:24:25.873] 82 ConvSigmoid model.24.m.2.weight INT8 (255,512,1,1) | 0x006d4740 0x006f4540 0x0001fe00
D RKNN: [11:24:25.873] 82 ConvSigmoid model.24.m.2.bias INT32 (255) | 0x006f4540 0x006f4d40 0x00000800
D RKNN: [11:24:25.873] ----------------------------------------------------------------------+---------------------------------
D RKNN: [11:24:25.878] ----------------------------------------
D RKNN: [11:24:25.879] Total Internal Memory Size: 6693.75KB
D RKNN: [11:24:25.879] Total Weight Memory Size: 7128.44KB
D RKNN: [11:24:25.879] ----------------------------------------
D RKNN: [11:24:25.879] <<<<<<<< end: rknn::RKNNMemStatisticsPass
I rknn buiding done.
done
--> Export rknn model
done
--> Init runtime environment
I target set by user is: rv1106
I Check RV1106 board npu runtime version
I Starting ntp or adb, target is RV1106
I Device [bd547ee6900c058b] not found in ntb device list.
I Start adb...
I Connect to Device success!
I NPUTransfer: Starting NPU Transfer Client, Transfer version 2.1.0 (b5861e7@2020-11-23T11:50:36)
D NPUTransfer: Transfer spec = local:transfer_proxy
D NPUTransfer: Transfer interface successfully opened, fd = 3
D RKNNAPI: ==============================================
D RKNNAPI: RKNN VERSION:
D RKNNAPI: API: 1.6.0 (535b468 build@2023-12-11T09:05:46)
D RKNNAPI: DRV: rknn_server: 1.6.0 (535b468 build@2023-12-11T17:05:28)
D RKNNAPI: DRV: rknnrt: 1.6.0 (9a7b5d24c@2023-12-13T17:33:10)
D RKNNAPI: ==============================================
D RKNNAPI: Input tensors:
D RKNNAPI: index=0, name=images, n_dims=4, dims=[1, 640, 640, 3], n_elems=1228800, size=1228800, w_stride = 0, size_with_stride = 0, fmt=NHWC, type=UINT8, qnt_type=AFFINE, zp=-128, scale=0.003922
D RKNNAPI: Output tensors:
D RKNNAPI: index=0, name=output, n_dims=4, dims=[1, 255, 80, 80], n_elems=1632000, size=1632000, w_stride = 0, size_with_stride = 0, fmt=NCHW, type=INT8, qnt_type=AFFINE, zp=-128, scale=0.003860
D RKNNAPI: index=1, name=283, n_dims=4, dims=[1, 255, 40, 40], n_elems=408000, size=408000, w_stride = 0, size_with_stride = 0, fmt=NCHW, type=INT8, qnt_type=AFFINE, zp=-128, scale=0.003922
D RKNNAPI: index=2, name=285, n_dims=4, dims=[1, 255, 20, 20], n_elems=102000, size=102000, w_stride = 0, size_with_stride = 0, fmt=NCHW, type=INT8, qnt_type=AFFINE, zp=-128, scale=0.003915
done
--> Running model
done
class        score      xmin, ymin, xmax, ymax
--------------------------------------------------
person       0.884     [  209,  244,  286,  506]
person       0.868     [  478,  238,  559,  526]
person       0.825     [  110,  238,  230,  534]
person       0.339     [   79,  354,  122,  516]
bus          0.705     [   94,  129,  553,  468]
Save results to result.jpg!
D NPUTransfer: Transfer client closed, fd = 3
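Note on deployment: as flagged during the build, the model's input and outputs were switched to int8/uint8 for performance, and the runtime log above prints the affine quantization parameters (zp, scale) for each tensor. When raw int8 buffers are read through the Runtime API on the board, they can be converted back to float with (q - zp) * scale. Below is a minimal NumPy sketch using the parameters printed for output index 0 above; the buffer is a placeholder standing in for the real Runtime API output:

```python
import numpy as np

def dequantize(q, zero_point, scale):
    """Affine INT8 -> float32 conversion: x = (q - zp) * scale."""
    return (q.astype(np.float32) - zero_point) * scale

# Parameters from the runtime log for output index 0: zp=-128, scale=0.003860.
raw = np.zeros((1, 255, 80, 80), dtype=np.int8)   # placeholder buffer
feat = dequantize(raw, zero_point=-128, scale=0.003860)
print(feat.dtype, feat.shape)   # float32 (1, 255, 80, 80)
```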
media 的常用场景与示例 1. 基本概念2. 常用场景2.1 不同屏幕宽度的布局调整2.2 隐藏或显示元素2.3 字体大小调整2.4 图片大小调整2.5 高度调整2.6 颜色调整2.7 鼠标悬停效果 3. 常用示例3.1 基本响应式布局3.2 隐藏侧边栏3.3 字体大小和图片大小 4. 总结 在现代网页设计中&…...
深入浅出:基于SpringBoot和JWT的后端鉴权系统设计与实现
文章目录 什么是鉴权系统定义与作用主要组成部分工作原理常用技术和框架 基于SpringBoot JWT的鉴权系统设计与实现指南前言技术对比令牌技术JWT令牌实现全流程1. **依赖引入**2. **JWT 工具类**3. **JWT 拦截器(Interceptor)** 4. **拦截器注册**5. **登…...
怎麼利用靜態ISP住宅代理在指紋流覽器中管理社媒帳號?
靜態ISP住宅代理是一種基於真實住宅IP的代理服務。這類代理IP通常由互聯網服務提供商(ISP)分配,具有非常高的真實性,與普通數據中心代理相比,更不容易被平臺檢測到為“虛假IP”或“代理IP”,靜態ISP住宅代理…...
DeepSeek掘金——SpringBoot 调用 DeepSeek API 快速实现应用开发
Spring Boot 实现 DeepSeek API 调用 1. 项目依赖 在 pom.xml 中添加以下依赖: <dependencies><dependency><groupId>org.springframework.boot</groupId><artifactId>spring-boot-starter-webflux</artifactId></dependency>&l…...
解决本地模拟IP的DHCP冲突问题
解决 DHCP 冲突导致的多 IP 绑定失效问题 前言 续接上一篇在本机上模拟IP地址。 在实际操作中,如果本机原有 IP(如 192.168.2.7)是通过 DHCP 自动获取的,直接添加新 IP(如 10.0.11.11)可能会导致 DHCP 服…...
Git LFS介绍(Large File Storage)大文件扩展,将大文件存储在外部存储,仓库中只记录文件的元数据(大文件的指针,类似一个小的占位符文件)
文章目录 LFS的功能?如何使用LFS?将大文件存储在外部系统是什么意思?具体是如何运作的?为什么要这样做? 对开发者的影响?1. **性能和效率**2. **协作体验**3. **版本管理差异**4. **额外的工具和配置** LFS…...
数据中心储能蓄电池状态监测管理系统 组成架构介绍
安科瑞刘鸿鹏 摘要 随着数据中心对供电可靠性要求的提高,蓄电池储能系统成为关键的后备电源。本文探讨了蓄电池监测系统在数据中心储能系统中的重要性,分析了ABAT系列蓄电池在线监测系统的功能、技术特点及其应用优势。通过蓄电池监测系统的实施&#…...
三甲医院网络架构与安全建设实战
一、设计目标 实现医疗业务网/卫生专网/互联网三网隔离 满足等保2.0三级合规要求 保障PACS影像系统低时延传输 实现医疗物联网统一接入管控 二、全网拓扑架构 三、网络分区与安全设计 IP/VLAN规划表 核心业务配置(华为CE6865) interface 100G…...
如何在 React 中测试高阶组件?
在 React 中测试高阶组件可以采用多种策略,以下是常见的测试方法: 1. 测试高阶组件返回的组件 高阶组件本身是一个函数,它返回一个新的组件。因此,可以通过测试这个返回的组件来间接测试高阶组件的功能。通常使用 Jest 作为测试…...
INA219电流、电压、功率测量芯片应用
INA219电流、电压、功率测量芯片应用 简述芯片引脚应用电路寄存器驱动代码 简述 INA219是一款由德州仪器(Texas Instruments)生产的高精度电流/功率监测芯片,广泛应用于电池监控、电源管理等需要精确电流和功率测量的应用中。该芯片通…...
深入解析设计模式之工厂模式
深入解析设计模式之工厂模式 在软件开发的复杂体系中,设计模式作为解决常见问题的有效方案,为开发者提供了强大的工具。工厂模式作为一种广泛应用的创建型设计模式,专注于对象的创建过程,通过巧妙的设计,将对象的创建…...
ollama修改监听ip: 0.0.0.0
确认Ollama绑定IP地址 默认情况下,Ollama可能仅监听本地回环地址(127.0.0.1)。要允许外部访问,需将其配置为监听所有IP(0.0.0.0)或指定IP(如10…19)。 修改启动命令(推荐…...
.NET MVC实现电影票管理
.NET MVC(Model-View-Controller)是微软推出的基于 Model-View-Controller 设计模式的 Web 应用框架,属于 ASP.NET Core 的重要组成部分。其核心目标是通过清晰的分层架构实现 高内聚、低耦合 的开发模式,适用于构建可扩展的企业级…...
FPGA DSP:Vivado 中带有 DDS 的 FIR 滤波器
本文使用 DDS 生成三个信号,并在 Vivado 中实现低通滤波器。低通滤波器将滤除相关信号。 介绍 用DDS生成三个信号,并在Vivado中实现低通滤波器。低通滤波器将滤除较快的信号。 本文分为几个主要部分: 信号生成:展示如何使用DDS&am…...
大数据组件(四)快速入门实时数据湖存储系统Apache Paimon(2)
Paimon的下载及安装,并且了解了主键表的引擎以及changelog-producer的含义参考: 大数据组件(四)快速入门实时数据湖存储系统Apache Paimon(1) 利用Paimon表做lookup join,集成mysql cdc等参考: 大数据组件(四)快速入门实时数据…...
vue3父子组件props传值,defineprops怎么用?(组合式)
目录 1.基础用法 2.使用解构赋值的方式定义props 3.使用toRefs的方式解构props (1).通过ref响应式变量,修改对象本身不会触发响应式 1.基础用法 父组件通过在子组件上绑定子组件中定义的props(:props“”)传递数据给子组件 <!-- 父组件…...
Linux /etc/fstab文件详解:自动挂载配置指南(中英双语)
Linux /etc/fstab 文件详解:自动挂载配置指南 在 Linux 系统中,/etc/fstab(File System Table)是一个至关重要的配置文件,它用于定义系统开机时自动挂载的文件系统。如果你想让磁盘分区、远程存储(如 NFS&…...
Test the complete case
Test the complete case python写的一段 由pytest测试框架/allure报告框架/parameters数据驱动组成的完整案例代码 目录结构 project/ ├── test_cases/ │ ├── __init__.py │ └── test_math_operations.py # 测试用例 ├── test_data/ │ └── math_dat…...
装win10系统提示“windows无法安装到这个磁盘,选中的磁盘采用GPT分区形式”解决方法
问题描述 我们在u盘安装原版win10 iso镜像时,发现在选择硬盘时提示了“windows无法安装到这个磁盘,选中的磁盘采用GPT分区形式”,直接导致了无法继续安装下去。出现这种情况要怎么解决呢? 原因分析: 当您在安装Windows操作系统…...
【pytest-jira】自动化用例结合jira初版集成思路
【pytest】编写自动化测试用例命名规范README 【python】连接Jira获取token以及jira对象 【python】解析自动化脚本文件并按照测试周期存储记录 【python】向Jira推送自动化用例执行成功 【python】向Jira测试计划下,附件中增加html测试报告 以下内容主要是介绍jira…...
PHP 会话(Session)实现用户登陆功能
Cookie是一种在客户端和服务器之间传递数据的机制。它是由服务器发送给客户端的小型文本文件,保存在客户端的浏览器中。每当浏览器向同一服务器发送请求时,它会自动将相关的Cookie信息包含在请求中,以便服务器可以使用这些信息来提供个性化的…...
大模型安全问题详解(攻击技术、红队测试与安全漏洞)
文章目录 大模型攻击技术提示注入攻击(Prompt Injection)数据投毒攻击(Data Poisoning)模型克隆攻击(Model Cloning)拒绝服务攻击(DoS)和拒绝钱包攻击(DoW)插…...
【愚公系列】《鸿蒙原生应用开发从零基础到多实战》002-TypeScript 类型系统详解
标题详情作者简介愚公搬代码头衔华为云特约编辑,华为云云享专家,华为开发者专家,华为产品云测专家,CSDN博客专家,CSDN商业化专家,阿里云专家博主,阿里云签约作者,腾讯云优秀博主&…...
C# 将非托管Dll嵌入exe中(一种实现方法)
一、环境准备 电脑系统:Windows 10 专业版 20H2 IDE:Microsoft Visual Studio Professional 2022 (64 位) - Current 版本 17.11.4 其他: 二、测试目的 将基于C++创建DLL库,封装到C#生成的exe中。 一般C++创建的库,在C#中使用,都是采用DllImport导入的,且要求库处…...