1D Convolutional Neural Networks for Time Series Modeling - Nathan Janos, Jeff Roach

  • Published Aug 18, 2024
  • PyData LA 2018
    This talk describes an experimental approach to time series modeling using 1D convolution filter layers in a neural network architecture. The approach was developed at System1 for forecasting the marketplace value of online advertising categories. (A minimal illustrative sketch of such a model appears after this description.)
    Slides - www.slideshare...
    ---
    www.pydata.org
    PyData is an educational program of NumFOCUS, a 501(c)(3) non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.
    PyData conferences aim to be accessible and community-driven, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.
    00:00 Welcome!
    00:10 Help us add timestamps or captions to this video! See the description for details.
    Want to help add timestamps to our YouTube videos to help with discoverability? Find out more here: github.com/num...
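
    The talk's architecture is not fully specified above, so here is a minimal, hypothetical sketch of a 1D-convolutional forecaster of the kind the abstract describes, written in PyTorch. The layer sizes, window length, and the choice of framework are all illustrative assumptions, not details taken from the talk.

    import torch
    import torch.nn as nn

    class Conv1DForecaster(nn.Module):
        """Minimal 1D-CNN for univariate time series forecasting.

        Input:  (batch, 1, window)  -- sliding window of past values
        Output: (batch, 1)          -- one-step-ahead forecast
        All sizes are illustrative, not taken from the talk.
        """
        def __init__(self, window: int = 64):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=5, padding=2),   # local patterns over 5 time steps
                nn.ReLU(),
                nn.MaxPool1d(2),                              # halve the temporal resolution
                nn.Conv1d(16, 32, kernel_size=3, padding=1),  # higher-level temporal features
                nn.ReLU(),
                nn.MaxPool1d(2),
            )
            self.head = nn.Linear(32 * (window // 4), 1)      # regression head

        def forward(self, x):
            h = self.features(x)
            return self.head(h.flatten(1))

    model = Conv1DForecaster(window=64)
    x = torch.randn(8, 1, 64)   # batch of 8 windows, 64 time steps each
    print(model(x).shape)       # torch.Size([8, 1])

    Sliding windows of the series go in as single-channel 1D signals; the two pooled convolution stages play the role of the "1D convolution filter layers" the abstract mentions, and a linear head produces the forecast.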

Comments • 4

  • @Dr.R.K.Verma_ABST
    @Dr.R.K.Verma_ABST 1 year ago +1

    At 7:08, one layer of filters has n(n+1)/2 = 6(6+1)/2 = 6·7/2 = 21 weights;
    otherwise, one layer of filters would have n(n-1)/2 = 6(6-1)/2 = 6·5/2 = 15.
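
    The slide under discussion is not visible here, but the arithmetic in this comment and the one below is easy to check: n(n+1)/2 counts unordered pairs of n filters when a filter may pair with itself, and n(n-1)/2 counts pairs of distinct filters. A quick Python check with n = 6, as in the comment:

    from itertools import combinations, combinations_with_replacement

    n = 6  # number of filters, per the comment

    # unordered pairs where a filter may pair with itself: n*(n+1)/2
    with_self = len(list(combinations_with_replacement(range(n), 2)))
    # unordered pairs of distinct filters: n*(n-1)/2
    distinct = len(list(combinations(range(n), 2)))

    print(with_self, n * (n + 1) // 2)   # 21 21
    print(distinct, n * (n - 1) // 2)    # 15 15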

  • @zxynj
    @zxynj 2 years ago +4

    At 7:30, I think one layer is 21 weights.

    • @Jeff-qw2de
      @Jeff-qw2de 1 year ago

      I think so too. Lately, the academic community in computing seems to confuse people and offer no explanation.

  • @alexanderskusnov5119
    @alexanderskusnov5119 1 year ago

    Why did you change the order: pool, ReLU instead of ReLU, pool?
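
    A note on this question (an editorial check, not the speakers' answer): for max pooling specifically, the two orders give identical results, because ReLU is monotone non-decreasing and therefore commutes with a maximum; pooling first is simply cheaper, since ReLU then touches fewer values. A quick NumPy check:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)

    def relu(v):
        return np.maximum(v, 0.0)

    def maxpool(v, k=2):
        # non-overlapping max pooling over windows of length k
        return v.reshape(-1, k).max(axis=1)

    # ReLU is monotone, so relu(max(a, b)) == max(relu(a), relu(b))
    assert np.allclose(relu(maxpool(x)), maxpool(relu(x)))

    The equivalence holds for max pooling but not for average pooling, where relu(mean(a, b)) can differ from mean(relu(a), relu(b)).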