Can TensorFlow take the gradient of the matrix 2-norm?
The matrix norm we normally use in TensorFlow is the Frobenius norm, which is easy to compute and easy to interpret, e.g., from a Bayesian view. But in many cases it is the largest singular value that matters. Is it possible to optimize that in TensorFlow? It depends on whether TensorFlow can take the gradient with respect to the matrix 2-norm.
tensorflow autodiff
asked Nov 15 '18 at 23:35 by ArtificiallyIntelligence
1 Answer
Actually, the spectral norm is equal to the largest singular value. To get this value you can use TensorFlow's tf.linalg.svd.
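A minimal sketch of what this looks like, assuming TensorFlow 2.x eager mode: tf.linalg.svd has a registered gradient, so the largest singular value can be differentiated directly with a GradientTape.

```python
import tensorflow as tf

# Minimal sketch (TF 2.x): differentiate the spectral norm, i.e. the
# largest singular value, through tf.linalg.svd, which has a gradient.
A = tf.Variable([[3.0, 0.0], [0.0, 1.0]])

with tf.GradientTape() as tape:
    s = tf.linalg.svd(A, compute_uv=False)  # singular values, descending
    spectral_norm = s[0]                    # sigma_max

grad = tape.gradient(spectral_norm, A)
# Mathematically d(sigma_max)/dA = u1 v1^T, which for this A is [[1, 0], [0, 0]].
```

Note that the SVD gradient is only well defined where the largest singular value is simple (non-degenerate); at a crossing of singular values the spectral norm is not differentiable.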
Yes, I know, but I mean how to take the gradient of that, so I can change the entries of the matrix to reduce the max singular value. Frankly speaking, I cannot imagine an explicit, non-recursive formula for the gradient of a matrix's largest singular value.
– ArtificiallyIntelligence
Nov 16 '18 at 19:26
@Shaowu There is an explicit form. As long as you do not break the computational graph, I don't see a reason why this shouldn't work. It's been done.
– Lucas Farias
Nov 16 '18 at 20:00
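For reference, the explicit form is d(sigma_max)/dA = u1 v1^T, where u1 and v1 are the leading left and right singular vectors (valid when sigma_max is simple). A small NumPy sketch checking this formula against a central finite-difference approximation:

```python
import numpy as np

# Sketch: verify d(sigma_max)/dA = u1 v1^T with central finite differences.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
analytic = np.outer(U[:, 0], Vt[0, :])  # u1 v1^T

eps = 1e-6
numeric = np.zeros_like(A)
for i in range(A.shape[0]):
    for j in range(A.shape[1]):
        E = np.zeros_like(A)
        E[i, j] = eps
        # np.linalg.norm(., 2) on a matrix is the spectral norm
        numeric[i, j] = (np.linalg.norm(A + E, 2) - np.linalg.norm(A - E, 2)) / (2 * eps)
```

The two gradients agree to finite-difference accuracy for a generic matrix, whose top singular value is almost surely simple.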
Thanks for the information. The explicit form requires computing an SVD, and the SVD is not that explicit in the numerical sense: it requires solving an eigenvalue problem to get the first singular vector, which corresponds to an iterative scheme nested inside backpropagation. I would be surprised if TensorFlow could do that by default.
– ArtificiallyIntelligence
Nov 16 '18 at 21:24
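The iterative scheme mentioned here is essentially power iteration. In practice (e.g. spectral normalization of GAN weights) one often unrolls a few differentiable power-iteration steps instead of backpropagating through a full SVD. A hedged NumPy sketch of the estimator itself:

```python
import numpy as np

# Sketch: estimate sigma_max by power iteration on A^T A. Each step is a
# matrix-vector product and a normalization, both differentiable, so a
# fixed unrolled number of steps is backprop-friendly in any autodiff
# framework; no exact eigensolver is needed inside the graph.
def spectral_norm_power_iter(A, n_iter=50):
    v = np.ones(A.shape[1]) / np.sqrt(A.shape[1])
    for _ in range(n_iter):
        u = A @ v
        u /= np.linalg.norm(u)
        v = A.T @ u
        v /= np.linalg.norm(v)
    return u @ A @ v  # Rayleigh-quotient estimate of sigma_max

A = np.array([[3.0, 0.0], [0.0, 1.0]])
```

Convergence is geometric with ratio (sigma_2 / sigma_1)^2 per step, so a handful of iterations usually suffices when the top singular value is well separated.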
answered Nov 16 '18 at 10:56 by Lucas Farias