Novel Intra-Field Deinterlacing Algorithm Using Trilateral Filtering Interpolation


Novel Intra-field Deinterlacing Algorithm Using Trilateral Filtering Interpolation

International Conference on Electronics, Biomedical Engineering and its Applications (ICEBEA'2012), Jan. 7-8, 2012, Dubai

Xiangdong Chen and Jechang Jeong

(Xiangdong Chen is with the Department of Electronics & Computer Engineering, Hanyang University, Seoul, 133-791, Korea; phone: +82-2-2220-4370; fax: +82-2-2293-8877; e-mail: [email protected]. Jechang Jeong is with the Department of Electronics & Computer Engineering, Hanyang University, Seoul, 133-791, Korea; e-mail: [email protected].)

Abstract—This paper proposes an efficient intra-field deinterlacing algorithm using a trilateral filtering interpolation method with outstanding visual quality. Conventional edge-based deinterlacing algorithms produce unsatisfactory visual results because they estimate the edge direction incorrectly or take only a limited number of edge directions into consideration. Moreover, to make edge estimation accurate, existing deinterlacing algorithms try to exhaust all possible edge directions under complex preset conditions, which unavoidably increases the computational burden while still producing pinniform or blur artifacts in edge and complex regions. To avoid these problems, the proposed algorithm introduces a trilateral filtering interpolation that exploits the correlation of six adjacent pixels by measuring the spatial closeness, intensity similarity, and local gradient among them. Experimental results show that the proposed algorithm provides satisfactory performance in terms of both objective and subjective image quality. Moreover, it exploits only the local spatial similarity among the neighboring pixels, without complex preset conditions, and is therefore easier to implement than most existing algorithms.

Keywords—Bilateral filter, Deinterlacing, Edge-preserving, Trilateral filter.

I. INTRODUCTION

The international industrial standard for interlaced scanning has been widely applied in existing TV broadcasting standards such as NTSC, PAL, and SECAM. In interlaced scanning, fields containing half the samples of the original image (that is, only the even or only the odd lines of a frame) are scanned and displayed sequentially. The goal of interlaced scanning is to achieve a tradeoff between frame rate and transmission bandwidth requirements [1]. However, because of interlaced scanning, current display systems such as HDTV, LCD, and 3DTV suffer from objectionable visual artifacts such as interline flicker, line crawling, and field aliasing. Progressive scanning is preferred because interlacing reduces the vertical display resolution and causes twitter effects when displaying pictures with high vertical frequency [2]-[3]. Thus, various methods have been presented to reduce these artifacts in digital display devices. The process of converting interlaced fields into progressive frames is called deinterlacing.

Conventional deinterlacing methods interpolate the missing pixels in two ways: (1) intra-field interpolation and (2) inter-field interpolation. Inter-field interpolation methods usually use temporal motion information to estimate the motion state of objects and then interpolate pixels along the motion directions. To interpolate properly, accurate motion estimation is essential, and a complex motion estimation algorithm is needed to refine the motion information, at high computational cost. In particular, when fast and irregular motion exists, it is hard to estimate the motion information, and wrong motion information may be obtained; performing deinterlacing with wrong motion information cannot avoid artifacts. Inter-field interpolation methods also need intra-field interpolation to improve image quality, so in this paper we focus on the intra-field interpolation method. Because intra-field deinterlacing methods have a lower computational burden than inter-field methods and use only the current field, they are more suitable for real-time applications.

Many intra-field interpolation methods have been proposed, including line average (LA) and directional spatial interpolation. Edge-directional interpolation algorithms such as ELA (edge-based LA) [4], EELA (efficient ELA) [5], and M-ELA (modified ELA) [6] interpolate a missing line linearly along the direction between adjacent pixels that has the highest correlation. However, these directional interpolation techniques perform poorly when the edge direction is estimated wrongly, or when only limited direction models are available in high-spatial-frequency regions or on near-horizontal edges.

One of the well-known direction-oriented methods is the edge-based line average (ELA) [4] algorithm. This method considers correlations among the six neighboring pixels in the upper and lower lines around the pixel to be interpolated; its advantage is that it exhibits high performance with a small computational load. However, ELA produces artifacts when the edge direction is incorrectly estimated. Moreover, ELA suffers from image degradation due to the limited set of candidate edge directions it considers: only three directions, namely vertical, diagonal, and anti-diagonal. To alleviate the disadvantages of ELA, many improved edge-based algorithms have been proposed, such as efficient ELA (EELA) [5], modified ELA (MELA) [6], the low-complexity interpolation method for deinterlacing (LCID) [7], fine directional deinterlacing (FDD) [8], and FDIF deinterlacing [9].

Among these methods, FDIF deinterlacing has outstanding performance, since it combines an adaptive distance-weighting scheme with a fixed directional interpolation filter based on MELA. It uses a six-tap fixed-coefficient sinc interpolation filter to achieve highly accurate interpolation along the edge estimated by MELA. Although FDIF has high PSNR performance, it still yields jagged artifacts on small-angle edges because of the limited set of edge directions taken into consideration. Since these algorithms consider more candidate edge directions and more accurate edge-judgment conditions than ELA, they achieve better objective and subjective performance than ELA, but they still yield pinniform-like noise in complex or textured regions and flicker on small-angle edges. To reduce these issues, we apply a trilateral filtering interpolator that interpolates each missing pixel by taking into consideration the closeness among the neighboring pixels and the intensity similarity among them, and also the local pixel-gradient correlation. Since the local pixel gradient implies the edge information, we do not need to estimate edge directions, and the problems discussed previously are avoided. The proposed algorithm also has the merits of low complexity and good visual quality.

The remainder of the paper is organized as follows. The conventional bilateral filter and the trilateral filtering method are briefly introduced in Section 2. The proposed deinterlacing algorithm based on trilateral filtering interpolation is explained in Section 3, and experimental results are presented to evaluate the performance of the proposed method in Section 4. Finally, conclusions are presented in Section 5.

II. CONVENTIONAL BILATERAL FILTER AND TRILATERAL FILTER

A. The Bilateral Filter

A bilateral filter is a nonlinear filter that depends on the underlying image data and smooths images while preserving edges [10]. Bilateral filtering can be regarded as an extended version of Gaussian low-pass (smoothing) filtering. Its weighting functions for the spatial and intensity components are defined respectively as

    W^S_{x0,y0}(x, y) = exp( -|(x, y) - (x0, y0)|^2 / (2 σ_S^2) )    (2)

and

    W^R_{x0,y0}(x, y) = exp( -|I(x, y) - I(x0, y0)|^2 / (2 σ_R^2) )    (3)

where I(·, ·) is the intensity value at the given position. The ensemble weight in the bilateral filter is then the product of (2) and (3):

    W_{x0,y0}(x, y) = W^S_{x0,y0}(x, y) W^R_{x0,y0}(x, y)    (4)

In practice, each pixel is filtered using normalized weights as

    Ĩ(x0, y0) = Σ_{(x,y)} W_{x0,y0}(x, y) I(x, y) / Σ_{(x,y)} W_{x0,y0}(x, y)    (5)

where Ĩ(x0, y0) is the filtered image at location (x0, y0). The parameters σ_S and σ_R are used to adjust the influence of W^S and W^R, respectively; they can be treated as rough thresholds for identifying pixels sufficiently close or similar to the pixel being filtered. Therefore, compared to the conventional Gaussian filter, the bilateral filter can effectively separate the textural and structural information of the image. However, even though the bilateral filter is widely used, no theoretical method has been established to determine the optimal σ_S and σ_R, so these parameters are generally selected empirically.

Bilateral filtering takes all the neighboring pixels into consideration, which gives it good performance in image denoising, because it makes full use of the spatial closeness and intensity similarity of the neighboring pixels. However, one of the main limitations of bilateral filtering is that the range-filter coefficients rely heavily on the actual pixel intensity values: the filter does not take into account any regional characteristics, which may themselves be influenced by noise, potentially resulting in smoothed texture regions and fuzzy boundaries when denoising, as shown in [11]. Motivated by this, we present a novel framework
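The bilateral weighting of Eqs. (2)-(5) can be sketched in Python as follows. This is a minimal single-pixel illustration, not the paper's implementation; the neighborhood radius and the function and parameter names are our own.

```python
import numpy as np

def bilateral_filter_pixel(img, x0, y0, radius, sigma_s, sigma_r):
    """Filter one pixel of a 2-D grayscale image with bilateral weights."""
    h, w = img.shape
    num = 0.0
    den = 0.0
    for y in range(max(0, y0 - radius), min(h, y0 + radius + 1)):
        for x in range(max(0, x0 - radius), min(w, x0 + radius + 1)):
            # Eq. (2): spatial-closeness weight
            ws = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma_s ** 2))
            # Eq. (3): intensity-similarity (range) weight
            wr = np.exp(-((img[y, x] - img[y0, x0]) ** 2) / (2 * sigma_r ** 2))
            # Eq. (4): ensemble weight is the product of the two
            wgt = ws * wr
            num += wgt * img[y, x]
            den += wgt
    # Eq. (5): normalized weighted average
    return num / den
```

On a flat region the filter returns the input value unchanged, while across a strong step edge the range weight of Eq. (3) suppresses contributions from the far side, which is the edge-preserving behavior the text describes.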
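For comparison, the classical edge-based line average (ELA) interpolation discussed in the Introduction can be sketched as below. This is a simplified illustration of the standard three-direction scheme (diagonal, vertical, anti-diagonal); the function and variable names are ours.

```python
import numpy as np

def ela_interpolate(upper, lower):
    """Interpolate one missing line from the scan lines above and below.

    For each position, the three classical candidate directions are
    compared and the pixel pair with the smallest absolute difference
    (i.e., the highest correlation) is averaged.
    """
    n = len(upper)
    out = np.empty(n, dtype=float)
    for j in range(n):
        jl, jr = max(j - 1, 0), min(j + 1, n - 1)  # clamp at the borders
        candidates = [
            (abs(upper[jl] - lower[jr]), (upper[jl] + lower[jr]) / 2.0),  # diagonal
            (abs(upper[j] - lower[j]), (upper[j] + lower[j]) / 2.0),      # vertical
            (abs(upper[jr] - lower[jl]), (upper[jr] + lower[jl]) / 2.0),  # anti-diagonal
        ]
        out[j] = min(candidates, key=lambda c: c[0])[1]
    return out
```

When the minimum-difference direction is chosen wrongly, the interpolated value comes from an unrelated pixel pair, which is exactly the artifact source the paper attributes to ELA.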
Recommended publications
  • A Review and Comparison on Different Video Deinterlacing Methodologies
    International Journal of Research, ISSN NO: 2236-6124. A Review and Comparison on Different Video Deinterlacing Methodologies. 1Boyapati Bharathidevi, 2Kurangi Mary Sujana, 3Ashok kumar Balijepalli; 1,2,3 Asst. Professor, Universal College of Engg & Technology, Perecherla, Guntur, AP, India-522438; [email protected], [email protected], [email protected]. Abstract — Video deinterlacing is a key technique in digital video processing, particularly with the widespread usage of LCD and plasma TVs. Interlacing is a widely used technique, for television broadcast and video recording, to double the perceived frame rate without increasing the bandwidth. But it presents annoying visual artifacts, such as flickering and silhouette "serration," during the playback. Existing state-of-the-art deinterlacing methods either ignore the temporal information to provide real-time performance but lower visual quality, or estimate the motion for better deinterlacing but with a trade-off of higher computational cost. The question 'to interlace or not to interlace' divides the TV and the PC communities. A proper answer requires a common understanding of what is possible nowadays in deinterlacing video signals. This paper outlines the most relevant methods, and provides a relative comparison. Interlaced videos are generally preferred in video broadcast and transmission systems as they reduce the amount of data to be broadcast. Transmission of interlaced videos was widely popular in various television broadcasting systems such as NTSC [2], PAL [3], and SECAM. Many broadcasting agencies made huge profits with interlaced videos. Video acquiring systems on many occasions naturally acquire interlaced video, and since this also proved to be an efficient way, the popularity of interlaced videos escalated.
  • High Frame-Rate Television
    Research White Paper WHP 169 September 2008 High Frame-Rate Television M Armstrong, D Flynn, M Hammond, S Jolly, R Salmon BRITISH BROADCASTING CORPORATION BBC Research White Paper WHP 169 High Frame-Rate Television M Armstrong, D Flynn, M Hammond, S Jolly, R Salmon Abstract The frame and field rates that have been used for television since the 1930s cause problems for motion portrayal, which are increasingly evident on the large, high-resolution television displays that are now common. In this paper we report on a programme of experimental work that successfully demonstrated the advantages of higher frame rate capture and display as a means of improving the quality of television systems of all spatial resolutions. We identify additional benefits from the use of high frame-rate capture for the production of programmes to be viewed using conventional televisions. We suggest ways to mitigate some of the production and distribution issues that high frame-rate television implies. This document was originally published in the proceedings of the IBC2008 conference. Additional key words: static, dynamic, compression, shuttering, temporal White Papers are distributed freely on request. Authorisation of the Head of Broadcast/FM Research is required for publication. © BBC 2008. All rights reserved. Except as provided below, no part of this document may be reproduced in any material form (including photocopying or storing it in any medium by electronic means) without the prior written permission of BBC Future Media & Technology except in accordance with the provisions of the (UK) Copyright, Designs and Patents Act 1988. The BBC grants permission to individuals and organisations to make copies of the entire document (including this copyright notice) for their own internal use.
  • Alchemist File - Understanding Cadence
    GV File: Understanding Cadence (Alchemist File - Understanding Cadence). Version History: 27/08/2015 v1.0 J Metcalf, document originated (1st proposal); 09/09/2015 v1.1 J Metcalf, rebranding to Alchemist File; 19/01/2016 v1.2 G Emerson, completion of rebrand; 07/10/2016 v1.3 J Metcalf, updated for additional cadence controls added in V2.2.3.2; 12/10/2016 v1.4 J Metcalf, added Table of Terminology; 11/12/2018 v1.5 J Metcalf, rebrand for GV and update for V4.***; 16/07/2019 v1.6 J Metcalf, minor additions & corrections; 05/03/2021 v1.7 J Metcalf, rebrand; 06/09/2021 v1.8 J Metcalf, add User Case (case 9). Version Number: 1.8. © 2021 GV. Table of Contents: 1. Introduction; 2. Alchemist File Input Cadence controls; 2.1 Input / Source Scan - Scan Type; 2.1.1 Incorrect Metadata; 2.1.2 Psf Video sources; 2.2 Input / Source Scan - Field order ...
  • System and Method for Block-Based Per-Pixel Correction for Film-Based Sources
    US 20090067508A1 (19) United States (12) Patent Application Publication (10) Pub. No.: US 2009/0067508 A1; (75) Inventor: Wals; (43) Pub. Date: Mar. 12, 2009. (54) SYSTEM AND METHOD FOR BLOCK-BASED PER-PIXEL CORRECTION FOR FILM-BASED SOURCES. (60) Provisional application No. 60/971,662, filed on Sep. 12, 2007. (73) Assignee: Broadcom Corporation, Irvine, CA (US). (21) Appl. No.: 12/105,664. (22) Filed: Apr. 18, 2008. (52) U.S. Cl. 375/240.24; 375/E07.076. (51) Int. Cl. H04N 11/02 (2006.01). Correspondence Address: Law Office of Duane S. Kobayashi, P.O. Box 4160, Leesburg, VA 20177 (US). (57) ABSTRACT: A system and method for block-based per-pixel correction for film-based sources. The appearance of mixed film/video can be improved through an adaptive selection of normal deinterlaced video relative to inverse telecine video. This adaptive selection process is based on pixel difference measures of sub-blocks within defined blocks of pixels. [Figure: source film frames, the interlaced 3:2 video derived from them, and deinterlaced frames reconstructed using reverse 3:2 pulldown of the missing field.]
  • High-Quality Spatial Interpolation of Interlaced Video
    High-Quality Spatial Interpolation of Interlaced Video. Alexey Lukin, Laboratory of Mathematical Methods of Image Processing, Department of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia; [email protected]. Abstract: Deinterlacing is the process of converting interlaced-scan video sequences into progressive-scan format. It involves interpolating missing lines of video data. This paper presents a new algorithm of spatial interpolation that can be used as a part of more complex motion-adaptive or motion-compensated deinterlacing. It is based on edge-directional interpolation, but adds several features to improve quality and robustness: spatial averaging of directional derivatives, "soft" mixing of interpolation directions, and use of several interpolation iterations. High quality of the proposed algorithm is demonstrated by visual comparison and PSNR measurements. Keywords: deinterlacing, edge-directional interpolation, intra-field interpolation. 1 INTRODUCTION. Interlaced scan (or interlacing) is a technique invented in the 1930s to improve smoothness of motion in video without increasing the bandwidth. It separates a video frame into 2 fields consisting of odd and even raster lines. Fields are updated on a screen in alternating manner, which permits updating them twice as fast as when progressive scan is used, allowing capturing motion twice as often. Interlaced scan is still used in most television systems, including certain HDTV broadcast standards. However, many television and computer displays nowadays are progressive; ...able, it is estimated from the video sequence. In this paper, a new high-quality method of spatial interpolation of video frames is suggested. [Figure 1: a: "Bob" deinterlacing (line averaging); b: "weave" deinterlacing (field insertion).]
  • Deinterlacing Network for Early Interlaced Videos
    Rethinking deinterlacing for early interlaced videos. Yang Zhao, Wei Jia, Ronggang Wang. Abstract—In recent years, high-definition restoration of early videos has received much attention. Real-world interlaced videos usually contain various degradations mixed with interlacing artifacts, such as noises and compression artifacts. Unfortunately, traditional deinterlacing methods only focus on the inverse process of interlacing scanning, and cannot remove these complex and complicated artifacts. Hence, this paper proposes an image deinterlacing network (DIN), which is specifically designed for joint removal of interlacing mixed with other artifacts. The DIN is composed of two stages, i.e., a cooperative vertical interpolation stage for splitting and fully using the information of adjacent fields, and a field-merging stage to perceive movements and suppress ghost artifacts. Experimental results demonstrate the effectiveness of the proposed DIN on both synthetic and real-world test sets. [Fig. 1: Illustration of the interlaced scanning mechanism.] Index Terms—deinterlacing, early videos, interlacing artifacts. I. INTRODUCTION. Interlacing artifacts are commonly observed in many early videos, which are caused by interlaced scanning in early television systems, e.g., NTSC, PAL, and SECAM. As shown in Fig. 1, the odd lines and even lines of an interlaced frame are scanned from two different half-frames, i.e., the odd/top/first field and the even/bottom/second field. The goal of this paper is to specifically design an effective deinterlacing network for the joint restoration tasks of interlaced frames. The traditional interlaced scanning mechanism can be defined as Y = S(X1, X2), where Y denotes the interlaced frame, S(·) is the interlaced scanning function, and X1, X2 denote the odd and even fields. Traditional deinterlacing methods focus on the reversed process of S(·), which can be roughly divided into four categories, i.e., temporal inter-
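The scanning model Y = S(X1, X2) described in the excerpt above can be illustrated with a short sketch. This is our own minimal illustration, assuming the usual row-weaving definition of interlaced scanning; the names are not from the paper.

```python
import numpy as np

def interlace(field_odd, field_even):
    """Y = S(X1, X2): weave two half-frames into one interlaced frame.

    The odd/top/first field fills rows 0, 2, 4, ... and the
    even/bottom/second field fills rows 1, 3, 5, ...
    """
    h, w = field_odd.shape
    frame = np.empty((2 * h, w), dtype=field_odd.dtype)
    frame[0::2] = field_odd    # odd/top/first field
    frame[1::2] = field_even   # even/bottom/second field
    return frame
```

Deinterlacing is then the (ill-posed) inverse: recovering two full frames from Y when the two fields were captured at different time instants.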
  • Exporting from Adobe Premiere for DMDS
    EXPORTING FROM ADOBE PREMIERE FOR DMDS Open Adobe Premiere and create a NEW PROJECT. Specify a project location and name, then press OK to continue. On the next screen, a list of available presets appears. CONFIGURING THE SEQUENCE HIGH DEFINITION For a high definition project, expand the XDCAM HD422 folder. Based on the configuration of your source footage, select one of the following: To create an interlaced sequence, expand the 1080i folder, and select the XDCAM HD422 1080i30 (60i) preset. To create a progressive sequence, expand the 1080p folder, and select the XDCAM HD422 1080p30 preset. Once you've selected the correct preset and checked your settings, click OK to continue. STANDARD DEFINITION If youʼre working with standard definition footage, youʼll need to select an SD sequence preset. Expand the DV - NTSC folder. Select Standard 48kHz. Click on the GENERAL tab and verify the FIELDS category. This should match your source footage. Select either UPPER or LOWER FIELD FIRST for interlaced footage, or NO FIELDS (PROGRESSIVE SCAN) for progressive footage. Once you've selected the correct preset and checked your settings, click OK to continue. PREPARE THE SEQUENCE Import your source video and drag it into the sequence that you have just created. FINAL CHECKS MONO / STEREO On the timeline, check the audio tracks. You will either have stereo paired tracks, indicated by this icon: as well as the L / R indicators, or separate mono tracks: In either case, you must ensure that your audio will be stereo. Open the audio mixer and play the sequence. Watch the audio meters.
  • A Guide to Standard and High-Definition Digital Video Measurements
    Primer: A Guide to Standard and High-Definition Digital Video Measurements. 3G, Dual Link and ANC Data Information. Table of Contents: In The Beginning; Traditional television; The "New" Digital Television; Numbers describing an analog world; Component digital video; Moving Forward from Analog to Digital; The RGB component signal; Gamma correction; Gamma correction is more than correction for CRT response; Conversion of R'G'B' into luma and color difference; The Digital Video Interface; 601 sampling; The parallel digital interface; The serial digital interface (SDI); High-definition video builds on standard definition principles; Timing and Synchronization; Analog video timing; Horizontal timing; Vertical timing; Analog high-definition component video parameters; Digital Studio Scanning Formats; Segmented frame production formats; Digital Studio Synchronization and Timing; Ancillary data; Video Measurements; Monitoring and measuring tools; Monitoring digital and analog signal; Assessment of video signal degradation; Video amplitude; Signal amplitude; Frequency response; Group delay; Non-linear effects; Differential gain; Differential phase; Digital System Testing; Stress testing; Cable-length stress testing; SDI check field; In-service testing; Eye-pattern testing; Jitter testing; SDI status display; Cable-length measurements; Timing between video sources; Intrachannel timing of component signals; Waveform method; Timing using the Tektronix Lightning display; Bowtie method; Operating a Digital Television System.
  • TSG95 PAL/NTSC Signal Generator
    TSG 95 PAL/NTSC Signal Generator. The TSG 95 Signal Generator provides a powerful combination of test signals, ID capabilities, and other features, making it a must for the TV engineer's toolbox or workbench, for system installation and setup, and for verifying PAL, NTSC, or Japan NTSC operating television transmitter automatic standards correction systems. Features: full set of test signals for system installation and setup; stereo audio outputs with L/R identification; video character ID for circuit identification; battery or AC operation. Test Signals: the TSG 95 provides 20 user-selected test signals in PAL, 20 in NTSC, and 21 in zero-setup Japan NTSC, including: NTC7 Composite, NTC7 Combination, FCC Composite, Cable Multiburst, Cable Sweep, Sin(x)/x, 75% Color Bars, 100% Color Bars, 75% Bars over Red, 100% Bars over Red, Convergence, Pluge, Safe Area, Green Field, Blue Field, Red Field, 100% Flat Field, 50% Flat Field, 0% Flat Field, Multiburst, 60% Reduced Line Sweep, 5 Step Gray Scale, 4.43 MHz Modulated 5 Step, Field Square Wave, Bounce, 0 IRE No Burst, 50 IRE Flat Field, and a Matrix of NTC7 Composite, NTC7 Combination, Color Bars, and Sin(x)/x. Vertical Interval Test Signals (VITS) may be included on Flat Field and Matrix test signals in single-standard configurations. The TSG 95 may be configured for multiple signal standards by selecting up to 26 signals as a User Signal Set. Tektronix is the worldwide leader supplying test equipment for the entire range of video and audio signal applications.
  • Real-Time Deep Video Deinterlacing
    Real-time Deep Video Deinterlacing. HAICHAO ZHU, XUETING LIU, XIANGYU MAO, TIEN-TSIN WONG, The Chinese University of Hong Kong. [Fig. 1 ("Soccer", "Leaves"): (a) Input interlaced frames. (b) Deinterlaced results generated by SRCNN [4] re-trained with our dataset. (c) Blown-ups from (b) and (d) respectively. (d) Deinterlaced results generated by our method.] The classical super-resolution method SRCNN reconstructs each frame based on a single field and has large information loss. It also follows the conventional translation-invariant assumption, which does not hold for the deinterlacing problem. Therefore, it inevitably generates blurry edges and artifacts, especially around sharp boundaries. In contrast, our method can circumvent this issue and reconstruct frames with higher visual quality and reconstruction accuracy. Interlacing is a widely used technique, for television broadcast and video recording, to double the perceived frame rate without increasing the bandwidth. But it presents annoying visual artifacts, such as flickering and silhouette "serration," during the playback. Existing state-of-the-art deinterlacing methods either ignore the temporal information to provide real-time performance but lower visual quality, or estimate the motion for better dein- ...captured for the following frame (Fig. 2(a), lower). It basically trades the frame resolution for the frame rate, in order to double the perceived frame rate without increasing the bandwidth. Unfortunately, since the two half frames are captured in different time instances, there are significant visual artifacts such as line flickering and "serration" on the silhouette of moving objects (Fig.
  • Guide to the Use of the ATSC Digital Television Standard, Including Corrigendum No
    Doc. A/54A 4 December 2003 Corrigendum No. 1 dated 20 December 2006 Recommended Practice: Guide to the Use of the ATSC Digital Television Standard, including Corrigendum No. 1 Advanced Television Systems Committee 1750 K Street, N.W. Suite 1200 Washington, D.C. 20006 www.atsc.org ATSC Guide to Use of the ATSC DTV Standard 4 December 2003 The Advanced Television Systems Committee, Inc., is an international, non-profit organization developing voluntary standards for digital television. The ATSC member organizations represent the broadcast, broadcast equipment, motion picture, consumer electronics, computer, cable, satellite, and semiconductor industries. Specifically, ATSC is working to coordinate television standards among different communications media focusing on digital television, interactive systems, and broadband multimedia communications. ATSC is also developing digital television implementation strategies and presenting educational seminars on the ATSC standards. ATSC was formed in 1982 by the member organizations of the Joint Committee on InterSociety Coordination (JCIC): the Electronic Industries Association (EIA), the Institute of Electrical and Electronic Engineers (IEEE), the National Association of Broadcasters (NAB), the National Cable and Telecommunications Association (NCTA), and the Society of Motion Picture and Television Engineers (SMPTE). Currently, there are approximately 160 members representing the broadcast, broadcast equipment, motion picture, consumer electronics, computer, cable, satellite, and semiconductor
  • IDENTIFYING TOP/BOTTOM FIELD in INTERLACED VIDEO Raja Subramanian and Sanjeev Retna Mistral Solutions Pvt Ltd., India
    IDENTIFYING TOP/BOTTOM FIELD IN INTERLACED VIDEO. Raja Subramanian and Sanjeev Retna, Mistral Solutions Pvt Ltd., India. ABSTRACT: This paper elaborates an approach that can be adopted to determine top/bottom fields in an interlaced video. Knowing the top and bottom field is important if the video is de-interlaced using Field Combination, Weaving + Bob, Discard, and other algorithms based on motion detection. Determining the field information helps to re-construct the frame with lesser artifacts. This approach can be used if the top/bottom field information is not provided by the video decoder chip. INTRODUCTION: Interlaced video has been in use for more than 50 years. When dealing with interlaced video, de-interlacing algorithms are essential to remove any interlacing artifacts. There are many de-interlacing algorithms available for NTSC/PAL interlaced video. For low-end systems (systems with less processing capability), the following approaches can be considered. If the field order is unknown (or not provided by the video decoder chip), then the Discard or Bob algorithm is the efficient method to de-interlace. Comparison of algorithms: (1) Weaving / Field Combine: combine the top and bottom field to form a single frame; stationary quality: same as the source; moving quality: artifacts due to time delay between top and bottom field. (2) Discard: discard the top or bottom field, resize (double/line copy); stationary quality: stair-case artifacts due to resize; moving quality: stair-case artifacts due to resize/line copy. (3) BOB + Weaving: use the previous top or bottom field data to construct the frame at doubled FPS; stationary quality: same as the source; moving quality: close to the source, lesser artifacts than all the above. The captured frame shown below depicts the stair-case artifacts of the discard algorithm. [Figure 1: Discard.]
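The weave, discard, and bob strategies compared in the excerpt above can be sketched as follows. This is a minimal illustration under the usual definitions of these algorithms; the function names and the single-channel array representation are our own assumptions.

```python
import numpy as np

def weave(top, bottom):
    """Field Combination: merge both fields into one full frame.

    Same as the source when stationary; combing artifacts appear on
    motion because the fields were captured at different times.
    """
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def discard(field):
    """Discard: keep one field and line-double it.

    Cheap, but produces the stair-case artifacts the excerpt describes.
    """
    return np.repeat(field, 2, axis=0)

def bob(top, bottom):
    """Bob: line-double each field in turn, doubling the frame rate."""
    return [discard(top), discard(bottom)]
```

Knowing which field is top and which is bottom matters here: swapping them in weave() or bob() reverses the line order and produces the juddering artifacts the paper's field-identification approach is meant to prevent.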