Changkun's Blog

Science and art, life in between.


Don't Talk: The Moment You Do, You Expose Yourself

Published at: 2017-07-07   |   Reading: 267 words ~1min

At dinner this week I ran into someone who claimed to be very interested in machine learning and to have been working on it all along.

Fine. Since you were so confident, I casually asked a few questions:

  1. What's the difference between L1 and L2 regularization?
  2. What's the difference between a kernel function and a basis function?
  3. What are the benefits of using the Rectified Linear Unit (ReLU) instead of the Sigmoid function?
  4. Explain the relationship between Maxout and ReLU.
  5. Does dropout still work in theory when the activation function is not linear? Why?
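As an aside, two of these questions can be made concrete in a few lines of code. The sketch below (my own illustration, not part of the original post) shows why ReLU helps with vanishing gradients compared to Sigmoid (question 3), and that a two-piece Maxout unit with one piece fixed at zero reduces exactly to ReLU (question 4):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # derivative of sigmoid: s(x) * (1 - s(x)), at most 0.25
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    return max(0.0, x)

def relu_grad(x):
    # subgradient of ReLU: 1 for x > 0, 0 otherwise
    return 1.0 if x > 0 else 0.0

def maxout(x, pieces):
    # maxout unit: max over affine pieces (w_i * x + b_i)
    return max(w * x + b for w, b in pieces)

xs = [-10.0, -1.0, 0.5, 10.0]

# Sigmoid gradients vanish for large |x|; ReLU keeps gradient 1 for x > 0.
print([round(sigmoid_grad(x), 6) for x in xs])
print([relu_grad(x) for x in xs])  # [0.0, 0.0, 1.0, 1.0]

# Maxout with pieces (w=1, b=0) and (w=0, b=0) is exactly max(x, 0) = ReLU.
print(all(maxout(x, [(1.0, 0.0), (0.0, 0.0)]) == relu(x) for x in xs))  # True
```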

These questions really are basic. If you claim to do machine learning and deep learning but can't answer them, or can't answer them convincingly,

then the most I can conclude is that your accomplishment in machine learning amounts to running a few demos, knowing some basic concepts, and treating the whole thing as a "black box".

All I can say is: with a shaky foundation, if you can't even get the basics straight, it's really hard to go far down this road.

Seriously, don't talk. Don't pose in front of me; the moment you open your mouth, your level shows.

#Essays# #Machine Learning# #Deep Learning#
  • Author: Changkun Ou
  • Link: https://changkun.de/blog/posts/do-not-talk/
  • License: All articles in this blog are licensed under CC BY-NC-ND 4.0 unless stated otherwise.