
AIP Open Seminar - Shared screen with speaker view
Gang Niu
01:08:20
Can the equivalence between tensor networks and neural networks take batch normalization into account?
chao li
01:10:40
As far as I know, there is still no study that brings BN into the context of tensor theory.
Sourav Banerjee(Pandua Block TMYC President)
01:10:45
How to efficiently estimate the tensor rank for Tucker compression?
chao li
01:11:50
Do you mean Tucker compression?
Gang Niu
01:12:33
thanks!
Sourav Banerjee(Pandua Block TMYC President)
01:12:34
Tucker compression means Tucker-tensor-decomposition-based compression, be it compression of NNs or other multiway data.
Sourav Banerjee(Pandua Block TMYC President)
01:13:56
Any Python or MATLAB toolbox available for tensor networks?
Yivan Zhang
01:14:01
Is it easier to define and study properties like invariance/equivariance based on tensor networks?
minghou
01:19:03
Does tensorization have explainable properties for data other than images?
tlu_pc
01:44:10
Can the methods also be applied to other fields besides brain data?
Kenji Harada
02:12:24
Is there another method, other than GA, to optimize the topology of the TN?
minghou
02:12:34
Is the learned structure unique, or does it differ wildly on each run for those images?
Liu Pengyu
02:14:05
Is it possible to add constraints to obtain specific tensor network structures (e.g., with or without cycles)?
Qibin Zhao
02:33:42
What is the main assumption of DIP? Is it general for any image?
Yuning Qiu
02:34:40
Why does DIP suffer performance degradation when the iterations increase?
Huidong Jiang
02:38:16
Thank you!
Yuning Qiu
02:38:20
Thanks!
Namgil Lee
02:38:23
Thank you for the great seminar!