
Deep Learning

Up-sampling

  • Up-sampling (for making the input or feature map larger)
    • Why do we need it?
      • Upsampling features
        • RGB Image inputs to Segmentation Output (e.g. Segnet Architecture)
    • How?
      • By resize
        • Nearest Neighbor Interpolation
        • Bilinear Interpolation
      • By transposed convolution
        • Let the neural network learn how to upscale the features.
    • Upsampling through Resize (Interpolation)
      • 1D
        • 1D Nearest-Neighbor
        • Linear
          • Given f(1) and f(2), f(1.6) = (0.6)·f(2) + (0.4)·f(1)
        • Cubic
      • 2D
        • 2D Nearest-Neighbor
          • Uses the value of the nearest pixel
          • Fast to compute
          • (−) Edges become jagged, and the result looks low-resolution.
        • Bilinear
          • Interpolation of interpolation
          • Given four points forming a rectangle, the value at any point inside the rectangle can be estimated
        • Bicubic
      • No learnable weights, and lower-quality results
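The resize-based methods above can be sketched in a few lines of NumPy. This is a toy illustration of the post's 1D linear example and 2D nearest-neighbor upsampling, not a framework API; the function names are made up:

```python
import numpy as np

def linear_interp(f, x):
    """1D linear interpolation of f (indexed from 1) at fractional position x."""
    lo = int(np.floor(x))          # nearest known sample below x
    frac = x - lo                  # distance past that sample
    return (1 - frac) * f[lo - 1] + frac * f[lo]

f = [10.0, 20.0]                   # f(1) = 10, f(2) = 20
value = linear_interp(f, 1.6)      # 0.6*f(2) + 0.4*f(1) ≈ 16

def nn_upsample(img, factor):
    """2D nearest-neighbor upsampling: repeat each pixel factor× along both axes."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

img = np.array([[1, 2],
                [3, 4]])
up = nn_upsample(img, 2)           # 2x2 -> 4x4, each pixel becomes a 2x2 block
```

Note how neither function has any parameters to learn, which is exactly the limitation the next section addresses.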
    • Upsampling through Transposed Convolution
      • When the original convolution has the form [1, 2, 3] * [ [w_1a, w_1b], [w_2a, w_2b], [w_3a, w_3b] ] = [a, b],
      • the transposed convolution has the form [a, b] * [ [w_1a, w_2a, w_3a], [w_1b, w_2b, w_3b] ] = [1, 2, 3] (same shape as the original input)
      • The transposed convolution output contains copies of the filter weighted by the input, summed where they overlap in the output.
      • Need to crop one pixel from the output to make the output exactly 2× the input (input a, b -> filter x, y, z)
      • Convolution: out = (n + 2p − k)/s + 1; transposed convolution: out = (out − 1)·s + k − 2p = n // stride s and padding p play the reverse roles
      • Has learnable convolution weights
      • Checkerboard artifact patterns (high-frequency noise)
        • Caused by overlapping convolution windows
        • Appear as regular patterns
        • To remedy those patterns
          • First, resize the image using nearest-neighbor or bilinear interpolation
          • Then apply a convolutional layer
          • i.e. replace one transposed convolution with resize + convolution
      • Many names
        • Transposed convolution (recommended)
        • Upsampling
        • Deconvolution (strictly speaking, a different operation)
        • Fractionally-strided convolution
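The [1, 2, 3] * filter example and the two size formulas above can be checked with a small NumPy sketch. This is a toy 1D version with no padding (p = 0), not a framework API; the function names are made up:

```python
import numpy as np

def conv1d_valid(x, w, stride=1):
    """Ordinary 'valid' convolution: output length (n - k)//s + 1 (p = 0)."""
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(0, n - k + 1, stride)])

def transposed_conv1d(x, w, stride=1):
    """Transposed convolution: output length (n - 1)*s + k (p = 0).
    Each input element stamps a weighted copy of the filter into the
    output at offset i*s; overlapping copies are summed."""
    n, k = len(x), len(w)
    out = np.zeros((n - 1) * stride + k)
    for i, v in enumerate(x):
        out[i * stride:i * stride + k] += v * np.asarray(w, dtype=float)
    return out

x = [1.0, 2.0, 3.0]
w = [1.0, 0.5]
y = conv1d_valid(x, w)          # length (3 - 2)//1 + 1 = 2 -> [2.0, 3.5]
z = transposed_conv1d(y, w)     # length (2 - 1)*1 + 2 = 3, same as len(x)
```

With stride > 1 the stamped filter copies overlap unevenly whenever the kernel size is not divisible by the stride, which is the source of the checkerboard pattern mentioned above; the resize-then-convolve remedy sidesteps this by keeping the convolution stride at 1.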
