Browsing by Author "Flores Maza, Carlos Bladimir"

Desarrollo de una herramienta para la evaluación objetiva de la calidad de video empleando redes neuronales [Development of a tool for the objective assessment of video quality using neural networks]
    (Universidad de Cuenca, 2024-09-23) Flores Maza, Carlos Bladimir; González Martínez, Santiago Renán
Video quality assessment is essential for determining the appropriate compression of a video file or its real-time transmission. The developed tool allows for the selection of quality (Quantization Parameter (QP)), temporal (Frames Per Second (FPS)), and spatial (bit-rate) scalability parameters, comparing traditional metrics such as Peak Signal-to-Noise Ratio (PSNR) and the Structural Similarity Index (SSIM) with the Learned Perceptual Image Patch Similarity (LPIPS) metric, which uses neural networks (VGG, AlexNet, and SqueezeNet). The experiments included a two-phase subjective evaluation. In the first phase, participants evaluated videos encoded with the same scalability parameter, establishing a relationship between visual perception and the metrics. Results showed that an "excellent" subjective evaluation corresponded to a PSNR of 44.2 dB, an SSIM of 0.99, and an LPIPS of 0.0 with AlexNet, 0.01 with SqueezeNet, and 0.02 with VGG for quality scalability. In the second phase, different participant groups evaluated the same videos with various scalability parameters, preferring quality scalability at high parameters and spatial scalability at intermediate parameters. Additional experiments validated the metrics against human perception by applying distortions such as blurring, Poisson noise, and salt-and-pepper noise. Results indicated that LPIPS is more sensitive to human perception, with percentage values starting from 73.64%, compared to SSIM (-24.9%) and PSNR (-14.17%). The main contribution of this work is the development of a tool that facilitates research and learning in video quality assessment through both objective and subjective approaches.
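
The abstract itself contains no code; as a rough illustration of the per-frame comparison it describes, the sketch below scores a distorted frame against its reference with PSNR, SSIM, and LPIPS (AlexNet backbone), using the lpips and scikit-image Python packages. The function names and the synthetic test frames are assumptions made for illustration and are not taken from the thesis or its tool.

import numpy as np
import torch
import lpips                                   # pip install lpips
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# LPIPS model with the AlexNet backbone; 'vgg' and 'squeeze' are the other
# backbones the abstract mentions.
_lpips_alex = lpips.LPIPS(net='alex')

def _to_lpips_tensor(frame: np.ndarray) -> torch.Tensor:
    """Convert an H x W x 3 uint8 frame to the (1, 3, H, W) float tensor in [-1, 1] that LPIPS expects."""
    return torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 127.5 - 1.0

def compare_frames(reference: np.ndarray, distorted: np.ndarray) -> dict:
    """Score one distorted frame against its reference with PSNR, SSIM, and LPIPS (illustrative helper)."""
    psnr = peak_signal_noise_ratio(reference, distorted, data_range=255)
    ssim = structural_similarity(reference, distorted, channel_axis=-1, data_range=255)
    with torch.no_grad():
        lpips_score = _lpips_alex(_to_lpips_tensor(reference), _to_lpips_tensor(distorted)).item()
    return {"PSNR_dB": psnr, "SSIM": ssim, "LPIPS": lpips_score}

if __name__ == "__main__":
    # Synthetic frames stand in for decoded frames from a reference and a re-encoded stream.
    ref = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
    noise = np.random.randint(-10, 11, ref.shape)
    dist = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
    print(compare_frames(ref, dist))

Lower PSNR/SSIM and higher LPIPS indicate stronger perceived degradation; in the thesis, per-frame scores like these are related to the subjective ratings collected in the two evaluation phases.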
