
US 20090067508 A1

(19) United States
(12) Patent Application Publication — Wals
(10) Pub. No.: US 2009/0067508 A1
(43) Pub. Date: Mar. 12, 2009

(54) SYSTEM AND METHOD FOR BLOCK-BASED PER-PIXEL CORRECTION FOR FILM-BASED SOURCES
(75) Inventor: Wals
Correspondence Address: LAW OFFICE OF DUANE S. KOBAYASHI, P.O. Box 4160, Leesburg, VA 20177 (US)
(73) Assignee: Broadcom Corporation, Irvine, CA (US)
(21) Appl. No.: 12/105,664
(22) Filed: Apr. 18, 2008

Related U.S. Application Data
(60) Provisional application No. 60/971,662, filed on Sep. 12, 2007.

Publication Classification
(51) Int. Cl.: H04N 11/02 (2006.01)
(52) U.S. Cl.: 375/240.24; 375/E07.076

(57) ABSTRACT
A system and method for block-based per-pixel correction for film-based sources. The appearance of mixed film/video can be improved through an adaptive selection of normal deinterlaced video relative to inverse telecine video. This adaptive selection process is based on pixel difference measures of sub-blocks within defined blocks of pixels.

[Cover figure (FIG. 1): source film frames (Frame 1 through Frame 6), the interlaced 3:2 video field phase, and the frames deinterlaced using reverse 3:2, indicating the source (FWD, BWD, or AVG) of each missing field. Drawing sheets 1 and 2 of 4 appear here in the original.]

SYSTEM AND METHOD FOR BLOCK-BASED PER-PIXEL CORRECTION FOR FILM-BASED SOURCES

[0001] This application claims priority to provisional application No. 60/971,662, filed Sep. 12, 2007, which is incorporated by reference herein, in its entirety, for all purposes.

BACKGROUND

[0002] 1. Field of the Invention

[0003] The present invention relates generally to television signal decoding and, more particularly, to a system and method for block-based per-pixel correction for film-based sources.

[0004] 2. Introduction

[0005] Modern high-resolution and high-definition televisions require a method for displaying lower-resolution field-based content (e.g., SDTV, 1080i) in a high-quality way. Some types of field-based content can be created through the conversion of a progressive source (such as film) using a process commonly referred to as telecine.

[0006] To illustrate the telecine process, consider for example the process of converting 24 frame/s film to 59.94 Hz NTSC video. NTSC is made up of fields, which are made up of every other horizontal scanline. One field contains all the odd-numbered scanlines, while the next field contains all the even-numbered scanlines. A frame is then the combination of two consecutive fields viewed at the same time. This process is called interlacing.

[0007] The conversion of 24 frame/s film to 29.97 frame/s NTSC interlaced video is known as 3:2 pulldown. In this conversion, there are approximately four frames of film for every five frames of NTSC video. These four frames are "stretched" into five by exploiting the interlaced nature of NTSC video. To illustrate this relation, consider the 3:2 pulldown of FIG. 1.

[0008] As illustrated, for every NTSC frame there are actually two fields, one for the odd-numbered lines of the image, designated as the top field (TF), and one for the even-numbered lines, designated as the bottom field (BF). There are, therefore, ten NTSC fields for every four film frames. To accommodate this "stretching," the telecine process alternately places the first film frame across two fields, the second frame across three fields, the third frame across two fields, and the fourth frame across three fields. The cycle repeats itself completely after four film frames have been converted. As illustrated, the TF of the second film frame and the BF of the fourth film frame are duplicate copies.
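The 2-3-2-3 field mapping described in paragraphs [0007] and [0008] can be made concrete with a short sketch. The following Python is illustrative only and is not taken from the patent; the frame labels and function name are assumptions for the example.

```python
# Illustrative sketch (not from the patent): enumerate the 3:2 pulldown
# cadence that spreads 4 film frames (A, B, C, D) across 10 NTSC fields.
# TF = top field (odd lines), BF = bottom field (even lines).

def pulldown_3_2(film_frames):
    """Map 4 progressive film frames onto 10 interlaced fields (2-3-2-3)."""
    assert len(film_frames) == 4
    a, b, c, d = film_frames
    # Each entry is (source frame, field parity); the repeated fields are
    # the duplicates noted in [0008] (TF of frame 2, BF of frame 4).
    return [
        (a, "TF"), (a, "BF"),             # frame 1 -> 2 fields
        (b, "TF"), (b, "BF"), (b, "TF"),  # frame 2 -> 3 fields (TF repeats)
        (c, "BF"), (c, "TF"),             # frame 3 -> 2 fields
        (d, "BF"), (d, "TF"), (d, "BF"),  # frame 4 -> 3 fields (BF repeats)
    ]

fields = pulldown_3_2(["A", "B", "C", "D"])
print(fields)  # ten NTSC fields for every four film frames
```

Listing the ten fields this way also shows why detecting the repeat field (discussed later in the description) is a natural way to locate the "3" phases of the cadence.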
[0009] The particular pulldown method that is applied is based on the type of film and the type of display format (e.g., NTSC, PAL, etc.). As such, various other pulldown methods such as 2:2, 5:5, 6:4, 8:7, etc. can be used for a given application.

[0010] To display field-based content that is generated by a particular telecine process on a modern television display, it is therefore necessary to recover the original progressive source content such as film. This source can be reconstructed from the field-based content using an inverse telecine process. In this inverse telecine process, it is critical that the correct cadence (e.g., 2:2, 3:2, 5:5, 6:4, 8:7, etc.) of the field-based content be identified so that the correct fields can be weaved together to produce the original progressive source.

[0011] Unfortunately, many films that have undergone telecine have been subsequently edited using video equipment that ignores the original film cadence. The result is "mixed" content where normal video is overlaid on top of film that has undergone a telecine process (e.g., 3:2). If inverse telecine is performed on such video, the video-mode parts appear to be "weaved" as well, which is visually unappealing.

[0012] One solution to this problem is to turn off the inverse telecine process for certain portions of the screen and use a normal video-mode deinterlacer for those cases. What is needed therefore is an efficient mechanism for detecting the regions that contain video-mode content and appropriately switching to video mode for those regions.

SUMMARY

[0013] A system and/or method for block-based per-pixel correction for film-based sources, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0014] In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:

[0015] FIG. 1 illustrates an example of a telecine process.

[0016] FIG. 2 illustrates an example of sub-blocks within a block of pixels.

[0017] FIG. 3 illustrates an embodiment of a processing block in a motion-adaptive deinterlacer (MAD) that weaves fields together.

[0018] FIG. 4 illustrates a flowchart of a process of the present invention.

DETAILED DESCRIPTION

[0019] Various embodiments of the invention are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the invention.

[0020] When normal video is overlaid on top of film that has undergone a telecine process (e.g., 3:2), objectionable artifacts can occur when an inverse telecine process is performed. For example, titles (or tickers) that move or fade may be laid over film material. When this mixed content is passed to an intelligent MPEG encoder, the encoder can sometimes pick up the strong 3:2 cadence of the underlying film material, resulting in a poor display of the moving text.

[0021] In accordance with the present invention, an adaptive block-based per-pixel mechanism is provided that can turn off the inverse telecine process for certain portions of the screen and use a normal video-mode deinterlacer instead. In this manner, the normal video-mode content would not undergo the inverse telecine process, which can produce an objectionable display of the video-mode content.
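The selection described in paragraph [0021] can be sketched as choosing, per region, between the inverse-telecine (weave) output and the normal video-mode deinterlacer output. The sketch below is a simplified, block-granularity illustration only: this excerpt does not detail how the correction is refined per pixel within a flagged block, and the array names, block size, and flag layout are assumptions, not the patent's implementation.

```python
# Hedged sketch of the adaptive selection idea in [0021]: where a block is
# flagged as containing repeat-field (video-mode) motion, take the normal
# video-mode deinterlacer output; elsewhere keep the inverse-telecine output.
import numpy as np

BLOCK = 16  # assumed block size in pixels


def correct_frame(weave_out, video_mode_out, rf_motion):
    """Choose per region between two deinterlaced frames.

    weave_out, video_mode_out: HxW luma arrays for the same output frame
    (inverse-telecine weave vs. normal video-mode deinterlacing).
    rf_motion: (H//BLOCK) x (W//BLOCK) boolean map of per-block motion flags.
    """
    out = weave_out.copy()
    h, w = out.shape
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            if rf_motion[by, bx]:
                ys, xs = by * BLOCK, bx * BLOCK
                # Replace the flagged block with video-mode output so the
                # overlaid video content is not "weaved".
                out[ys:ys + BLOCK, xs:xs + BLOCK] = \
                    video_mode_out[ys:ys + BLOCK, xs:xs + BLOCK]
    return out
```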
[0022] One of the difficulties of per-pixel correction is the determination of which regions contain video-mode content. In the present invention, it is recognized that the differences between the repeat fields can be used to determine the areas on the screen that are suspected to contain video-mode content.

[0023] In the example of a 3:2 cadence, such as that illustrated in FIG. 1, the repeat field occurs when the same field is utilized twice. This happens during the '3' part of the 3:2 cadence. In the example of FIG. 1, the repeat fields occur in the top fields (TFs) of Frame 2 and the bottom fields (BFs) of Frame 4.

[0024] In the present invention, the detection of those ...

[0032] The sbdiff1 values for each of the vertically-oriented sub-blocks and the sbdiff2 values for each of the horizontally-oriented sub-blocks are compared to a threshold to determine if per-pixel correction should be enabled for that particular block. If any of the sbdiff1 or sbdiff2 values exceed the threshold, then a flag rf_motion(bx,by) is set for that block, indicating the presence of significant repeat field motion.

[0033] As would be appreciated, the specific number of vertically-oriented and horizontally-oriented sub-blocks chosen can be implementation dependent. Also, the same number of sub-blocks need not be used. If that is the case, different thresholds may be used in the vertical and horizontal analysis.

[0034] In one embodiment, a histogram of the values of sbdiff1 and sbdiff2 is created ...
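A minimal sketch of the block-level test in paragraphs [0032]-[0033] follows. The names sbdiff1, sbdiff2, and rf_motion(bx, by) come from the text above, but this excerpt does not define the difference measure, the block and sub-block geometry, or the threshold; the sum-of-absolute-differences between the two occurrences of a repeat field, the 16x16 block with four-way splits, and the threshold value are all assumptions made for illustration.

```python
# Hedged sketch, assuming: repeat-field motion is measured as a sum of
# absolute differences between the two occurrences of the repeat field,
# over vertically- and horizontally-oriented sub-blocks of each block.
import numpy as np

BLOCK = 16    # assumed block size
N_VERT = 4    # assumed number of vertically-oriented sub-blocks
N_HORZ = 4    # assumed number of horizontally-oriented sub-blocks
THRESH = 64   # assumed motion threshold


def repeat_field_motion(field_a, field_b):
    """Flag blocks whose repeat-field differences exceed the threshold.

    field_a, field_b: two occurrences of the same (repeat) field, HxW luma.
    Returns a boolean map indexed (by, bx), i.e. rf_motion.
    """
    diff = np.abs(field_a.astype(np.int32) - field_b.astype(np.int32))
    h, w = diff.shape
    rf_motion = np.zeros((h // BLOCK, w // BLOCK), dtype=bool)
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            blk = diff[by * BLOCK:(by + 1) * BLOCK,
                       bx * BLOCK:(bx + 1) * BLOCK]
            # Vertically-oriented sub-blocks: split the block into column strips.
            sbdiff1 = [s.sum() for s in np.array_split(blk, N_VERT, axis=1)]
            # Horizontally-oriented sub-blocks: split the block into row strips.
            sbdiff2 = [s.sum() for s in np.array_split(blk, N_HORZ, axis=0)]
            if max(sbdiff1 + sbdiff2) > THRESH:
                rf_motion[by, bx] = True  # significant repeat-field motion
    return rf_motion
```

Splitting each block both ways, as described in [0033], lets thin horizontal or vertical video-mode features (such as a moving ticker crossing a block edge) exceed the threshold even when the difference averaged over the whole block would not.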