NTT Technical Review, May 2015, Vol. 13, No. 5

Feature Articles: Movable and Deployable ICT Resource Unit—Architecture for Quickly Responding to Unexpected ICT Demand

A Media Storage and Transmission System Using MDRUs

Takayuki Nakachi, Sun Yong Kim, Atsushi Okada, and Tatsuya Fujii

Abstract

Developing information and communication technology (ICT) networks and services that are resilient to disasters is a critical task. NTT Network Innovation Laboratories is promoting innovative research and development on a transportable ICT node called the Movable and Deployable ICT Resource Unit (MDRU), which is designed to promptly recover ICT services after a disaster. In this article, we introduce a media storage and transmission system based on the use of MDRUs. It efficiently provides information such as text and image/video data to people in disaster areas in circumstances when ICT resources are constrained, such as when network bandwidth is limited or storage failures occur. We demonstrated its feasibility in a field trial.

Keywords: layered coding, distributed storage, image/video transmission

1. Introduction

The Great East Japan Earthquake of March 11, 2011 damaged or completely destroyed many information and communication technology (ICT) resources, which demonstrated that the development of resilient ICT networks and services is a critical issue demanding urgent attention. Immediately after a disaster, it is important to rapidly share accurate information. In addition to text data, image/video data are important in disaster recovery. Image/video content can instantly provide detailed information to people in disaster-stricken areas. Examples of information that may need to be transmitted or exchanged after a disaster are shown in Fig. 1.

A portable media storage and transmission system can provide such information to people working in disaster relief offices and to evacuees at evacuation centers and in their own homes. Our media storage and transmission system is based on the specially designed Movable and Deployable ICT Resource Unit, which we refer to as an MDRU. MDRUs will be transported immediately to a disaster area to provide recovery of ICT services. The media storage and transmission system must transmit information to the greatest extent possible within the capacity restrictions posed by MDRUs. The proposed system is implemented by the following steps.

(1) Transport and activation of stand-alone MDRUs to disaster areas

An MDRU with a cache server is quickly transported to a disaster area. The cache server runs the software for the media storage and transmission system. Layered storage consisting of random-access memory, solid-state drive (SSD)*1, and hard disk drive (HDD)*2 devices is installed in the cache server. The MDRU forms a wireless access network, with or without wired network connections, to reach user terminals such as laptop personal computers (PCs), smartphones, and tablet devices. People within 500 m or so of the MDRU can view information such as text and image/video data stored in the cache server and/or upload data to the cache server.

*1 SSD: A data storage device that uses integrated circuit assemblies as memory to store data persistently. Compared with electromechanical disks such as HDDs, SSDs are typically more resistant to physical shock and have lower access times and latency.
*2 HDD: A data storage device that uses rapidly rotating disks coated with magnetic material.
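The three-tier layered storage in step (1) can be pictured as a small cache-server abstraction that places objects in memory, on SSD, or on HDD. The Python sketch below is a minimal illustration only; the class name and the mount points /mnt/ssd and /mnt/hdd are assumptions and are not taken from the MDRU software described in the article.

import os

class LayeredStorage:
    """Minimal model of the MDRU cache server's three storage tiers.

    Tier 0: random-access memory (fast, small, volatile)
    Tier 1: SSD (hypothetical mount point /mnt/ssd)
    Tier 2: HDD (hypothetical mount point /mnt/hdd)
    """

    def __init__(self, ssd_dir="/mnt/ssd", hdd_dir="/mnt/hdd"):
        self.memory = {}          # tier 0: in-process RAM cache
        self.ssd_dir = ssd_dir    # tier 1
        self.hdd_dir = hdd_dir    # tier 2

    def put(self, name, data, tier):
        """Store a byte string in the requested tier (0=RAM, 1=SSD, 2=HDD)."""
        if tier == 0:
            self.memory[name] = data
        else:
            base = self.ssd_dir if tier == 1 else self.hdd_dir
            os.makedirs(base, exist_ok=True)
            with open(os.path.join(base, name), "wb") as f:
                f.write(data)

    def get(self, name):
        """Look up a stored object, checking the fastest tier first."""
        if name in self.memory:
            return self.memory[name]
        for base in (self.ssd_dir, self.hdd_dir):
            path = os.path.join(base, name)
            if os.path.exists(path):
                with open(path, "rb") as f:
                    return f.read()
        return None

A real cache server would also track the capacity of each tier and evict objects from the faster tiers; that bookkeeping is omitted here.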
HDDs remain the dominant medium due to advantages in price per unit of storage and recording capacity.

Fig. 1. Information needed in disaster areas: (1) photos and videos taken by individuals with smartphones, tablet computers, etc.; (2) evacuation lists and handwritten information stored as images (images/video can convey information instantly); (3) surveillance camera images and videos used for observing infrastructure such as arterial roads, airports, and harbor facilities. The MDRU at an evacuation center covers an area of about 500-m radius for image/video sharing and viewing.

Fig. 2. Operation of multiple MDRUs in local disaster areas: MDRUs equipped with cache servers and distributed storage distribute and share uploaded images/video, with transmission to viewing devices (desktop computers, smartphones) based on available bandwidth.

(2) Operation of multiple MDRUs in local disaster areas

As many MDRUs as needed are deployed to the disaster area. Each MDRU has interworking storage devices that form a distributed storage system along with the cache server. Data uploaded to each MDRU can be backed up to and shared among the multiple MDRUs (Fig. 2).

(3) Operation of multiple MDRUs connected to a wide area network

MDRUs will be connected to a local datacenter and a wide area network. Information can then be widely shared via the local datacenter. People in the disaster area can watch news coverage originating outside the disaster area and share information about other areas.

The media storage and transmission system is based on two technologies. One is disaster resilient layered data transmission [1], and the other is networked distributed storage [2]. The former transfers information efficiently to work within the MDRU’s limited network bandwidth and power supply. The latter recovers data lost through 1) network congestion or disconnection or 2) storage failures caused by accidental MDRU power-off or MDRU transport. More details of these technologies are given below.
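The article states that data uploaded to one MDRU can be backed up to and shared among the other MDRUs, but it does not describe the transfer mechanism. As a rough illustration, the sketch below replicates an uploaded object to the other reachable MDRUs; the HTTP PUT endpoint /backup/<name>, the peer URLs, and the use of the third-party requests library are assumptions made only for this example.

import requests

def replicate_upload(name, data, peer_mdru_urls, timeout=5.0):
    """Copy an object uploaded to this MDRU to the other deployed MDRUs.

    peer_mdru_urls: base URLs of the other MDRUs' cache servers, e.g.
    ["http://mdru-2.local:8080", "http://mdru-3.local:8080"] (hypothetical).
    Returns the list of peers that acknowledged the backup.
    """
    acked = []
    for base in peer_mdru_urls:
        try:
            # Hypothetical backup endpoint; the real MDRU software is not described in the article.
            r = requests.put(f"{base}/backup/{name}", data=data, timeout=timeout)
            if r.ok:
                acked.append(base)
        except requests.RequestException:
            # A peer may be unreachable (congestion, disconnection, power-off, transport);
            # the copies that do succeed still provide redundancy.
            pass
    return acked

Redundancy of this kind is what allows the system to recover data after the loss events listed above; the article's networked distributed storage technology [2] additionally uses coded redundant data rather than simple copies.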
2. Disaster resilient layered data transmission

The proposed disaster resilient layered data transmission system offers image/video services to people in disaster-affected areas through MDRUs. The disaster resilient image/video transmission scheme is designed around three technologies: (1) layered image/video coding, (2) mapping of the layered code stream to the layered storage systems installed in the MDRUs, and (3) transmission of layered code streams according to network bandwidth (Fig. 3).

Fig. 3. Layered data transmission: (1) layered image/video coding produces priority layers from PL0 (high priority, low resolution) to PL2 (low priority, high resolution); (2) the layers are mapped to layered storage, with memory (small capacity, high reliability) holding PL0, SSD holding PL1, and HDD (large capacity, low reliability) holding PL2; (3) the layered code stream is transmitted.

(1) Layered image/video coding

We utilize the JPEG 2000 standard [3] created by the Joint Photographic Experts Group (JPEG) for layered image/video coding. Input images are decomposed into layered data in order of priority; that is, JPEG 2000 bit streams are created. JPEG 2000 offers four basic scalability dimensions in each bit stream: resolution (R), SNR (signal-to-noise ratio) quality (L), spatial location (P), and component (C). Different scalability levels are achieved by ordering packets within the JPEG 2000 bit stream. We assume that text data consist of handwritten notes on a bulletin board set up in an evacuation center; the handwritten information is captured by smartphones and/or tablet PCs. We use Motion JPEG 2000 (MJ2) for the layered video coding. It specifies the use of the JPEG 2000 format for timed sequences of images (video). Other layered video codecs such as Scalable Video Coding (SVC), a scalable extension of H.264/MPEG-4 AVC*3, or Scalable High Efficiency Video Coding (SHVC), a scalable extension of H.265/HEVC*4, can also be used. We selected MJ2 because of its transmission robustness and low complexity compared with SVC and SHVC.

*3 H.264/MPEG-4 AVC (Advanced Video Coding): A video coding standard developed by the JVT (Joint Video Team) of the ITU-T (International Telecommunication Union - Telecommunication Standardization Sector) and the ISO/IEC (International Organization for Standardization / International Electrotechnical Commission) in 2003. It covers common video applications ranging from mobile services and video conferencing to IPTV (Internet protocol television), high-definition (HD) TV, and HD video.
*4 H.265/HEVC (High Efficiency Video Coding): The most recent standardized video compression standard, developed by the JCT-VC (Joint Collaborative Team on Video Coding) of the ITU-T and the ISO/IEC. It reduces the bit rate by around 50% while maintaining the same subjective video quality relative to its predecessor, H.264/MPEG-4 AVC.

(2) Mapping the layered code stream to layered storage

The layered storage system consists of random-access memory, SSD, and HDD devices. The JPEG 2000 bit streams are mapped to the layered storage as shown in Fig. 3. PL0, PL1, and PL2 are priority-mapping layers and are mapped to random-access memory, SSD, and HDD, respectively. The progression order of the JPEG 2000 bit stream is determined in advance. Different scalability levels are achieved by ordering packets and can be arbitrarily assigned to each form of storage. If MDRU power consumption becomes a problem, we can turn off the HDD devices, and if additional energy-saving measures are necessary, the SSD devices can be deactivated.

(3) Transmission of layered code stream

JPEG 2000 is attractive because lower priority parts of the bit stream can be truncated to leave a valid partial stream. This allows simple responses to changes in network bandwidth and MDRU power consumption. The truncation process consists simply of extracting the required layers from the encoded bit stream; there is no additional processing of the stream itself. The MDRU forms a wireless access network around itself to reach user-held equipment. The truncation consists of extracting the required layers based on the bandwidth between the MDRU and the user. An example of layered transmission when resolution is selected as the priority is shown in Fig. 3. One usage example is that the user can view a specific full-resolution image after overviewing multiple thumbnail-sized images. Sending only the higher resolution components of an image specified by a user results in more efficient use of network bandwidth resources.

(Figure: priority-layered data and layered LDGM redundant data mapped to distributed storage.)
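As a concrete illustration of steps (2) and (3), the following sketch assigns the priority layers PL0 to PL2 of an already layer-ordered code stream to the memory, SSD, and HDD tiers and then truncates the stream to fit an estimated MDRU-to-user bandwidth. Splitting the JPEG 2000 code stream into a list of layers (highest priority first) is assumed to have been done by the encoder; no real JPEG 2000 library is used, and the function and parameter names are illustrative only.

def map_layers_to_storage(layers, storage):
    """Map priority layers to tiers: PL0 -> memory, PL1 -> SSD, PL2 -> HDD.

    layers: list of byte strings ordered from highest priority (PL0, e.g. a
    low-resolution base layer) to lowest (PL2, full-resolution refinements).
    storage: an object with put(name, data, tier), such as the hypothetical
    LayeredStorage sketch shown earlier.
    """
    for i, layer in enumerate(layers):
        tier = min(i, 2)                      # PL0->0 (RAM), PL1->1 (SSD), PL2->2 (HDD)
        storage.put(f"PL{i}", layer, tier)


def truncate_for_bandwidth(layers, bandwidth_bps, duration_s=1.0):
    """Keep only the leading layers that fit the estimated MDRU-to-user bandwidth.

    Truncation is just dropping trailing (lower-priority) layers; the kept
    layers are sent as-is, with no re-encoding of the stream.
    """
    budget = int(bandwidth_bps * duration_s / 8)  # byte budget for the transfer
    selected, used = [], 0
    for layer in layers:
        if used + len(layer) > budget and selected:
            break                              # always send at least the base layer
        selected.append(layer)
        used += len(layer)
    return b"".join(selected)

If resolution is chosen as the progression order, PL0 would hold thumbnail-sized versions served from memory, and the full-resolution components in PL2 would be read from HDD only when a user requests a specific image, matching the usage example described above.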