
Embedded Middleware

Chun-Jen Tsai
National Chiao Tung University
3/21/2011

• “Middleware” is a very popular terminology in system design
  - Just like “cloud,” it is merely a new name for old technologies
• We can further divide embedded middleware into two layers
  - System middleware – an API that provides system abstraction to increase application portability
  - Integration middleware – an API specification that acts as a hardware abstraction layer for chip-level integration

Why System Middleware

• The purpose of system middleware is to provide a homogeneous environment for applications
  - An OS does the same thing, but due to the variety of embedded hardware and competition among OS companies, we need something beyond the OS

[Figure: a middleware layer, defined by an international standard organization, lets the same application run on different OS/hardware combinations (OS 1/Hardware 1, OS 2/Hardware 1, OS 1/Hardware 2) designed by each system manufacturer]

System Middleware Components

• A system middleware must provide the following abstractions for applications (e.g., a DVB-MHP application):
  - Application model
  - GUI subsystem
    - Graphics, video, audio rendering
  - User event handling
  - Timing control
  - File system

DVB-MHP Architecture Revisited

• Multimedia Home Platform functional blocks:

[Figure: MHP functional blocks. Native applications and data Xlets (based on GEM) run on top of the MHP middleware, which comprises the Application Manager (Navigator), Sun Java TV (application model and transport abstraction over MPEG-2 TS and RTSP/RTP), JMF 1.1 (media playback), DVB services (SI abstraction), HAVi (UI abstraction), DAVIC (event and FS abstraction), and platform-dependent modules such as the MPEG-2 demux and A/V codecs; the middleware sits on the Java standard classes (CDC/PBP) and the operating system, running on an SoC with a RISC CPU (< 300 MHz), graphics/audio/video accelerators, a Java processor, and I/O devices]

Java-based Middleware

• DVB-MHP 1.0 (aka DVB-J) system abstraction is based on the Java runtime environment, just like many other middleware
  - In particular, CDC/PBP is adopted for MHP
• Google Android is also based on the Java model
  - Google defined their own “profile” and VM (not JVM byte-code compatible)
• Rationale: Java is “virtually a CPU plus an OS!”

[Figure: the real computer model vs. the Java model. Applications vs. Applets/Xlets/MIDlets; OS and API vs. class libraries and GC; CPU vs. the VM execution engine]

Native vs. Java-based Middleware

• We can use native code to design system middleware. For example, BREW from Qualcomm is a native middleware for mobile handsets:
  - A C/C++ development environment for ARM-based platforms
  - Application: BREW “Applet” model; threading via the IThread API
  - Graphics API: OpenGL/ES
  - Multimedia API: IMedia
• Supposedly, BREW is faster than Java, but with a properly designed system, this may not be true

MHP Application Model Abstraction

• An application implements the javax.tv.xlet.Xlet interface
  - Similar to java.applet.Applet, but simpler
• Methods are called in the following sequence:

  - initXlet()

  - startXlet(): resource allocation begins here!

  - destroyXlet()
• Xlet life cycle (initiated by the MHP application manager, which sets up an XletContext for the Xlet):

[State diagram: initXlet() takes the Xlet from Loaded to Paused; startXlet() takes Paused to Active; pauseXlet() returns Active to Paused; destroyXlet() takes Loaded, Paused, or Active to Destroyed]
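A minimal Xlet skeleton illustrating these entry points (a sketch against the javax.tv.xlet API; the class name and comments are illustrative, not from the slides):

import javax.tv.xlet.Xlet;
import javax.tv.xlet.XletContext;
import javax.tv.xlet.XletStateChangeException;

public class HelloXlet implements Xlet {
    private XletContext context;

    /* Loaded -> Paused: called once by the application manager */
    public void initXlet(XletContext ctx) throws XletStateChangeException {
        this.context = ctx;  /* keep the context for later lookups */
    }

    /* Paused -> Active: allocate scarce resources (screen, decoders) here */
    public void startXlet() throws XletStateChangeException {
    }

    /* Active -> Paused: release scarce resources but keep internal state */
    public void pauseXlet() {
    }

    /* Any state -> Destroyed; if 'unconditional' is true, the Xlet
       must not refuse by throwing XletStateChangeException */
    public void destroyXlet(boolean unconditional) throws XletStateChangeException {
    }
}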

Middleware Architecture for DVB-J

• MHP middleware layers†:

[Layer diagram: at the top, the Application Manager and Xlet API; below them, Service Selection, Conditional Access, Return Channel, Inter-Xlet Communication, JMF, DVB UI, HAVi, and AWT; below those, Tuning, UI Events, DSM-CC, and SI; then Section Filtering over the Transport Stream; all resting on MPEG and Java]

†S. Morris and A. Smith-Chaigneau, Interactive TV Standards, Elsevier, 2005

DVB-J

• Java Language API
• Application Model (Life Cycle) API
  - Based on Java TV Xlets
• Graphical API
  - Based on AWT and HAVi UI level 2
• Data Access API
• Multimedia API
  - Based on JMF
• Service and User Settings API

MHP Application UI Construction

• An MHP application can allocate a screen resource, called an HScene, using the HAVi API
  - An HScene represents a region on the TV display

[Figure: an HScene region inside the HScreen]

• UI components are mainly drawn using
  - java.awt.Graphics primitives
  - HAVi widget classes

MHP Display Model Abstraction

• The MHP display model is defined in Sun’s Java TV middleware:

[Figure: the MHP display pipeline. Background data feeds a background decoder onto the HBackgroundDevice; video data feeds a JMF-controlled video decoder, with format conversion and scaling/manipulation, onto the HVideoDevice; graphics data (primitives created with java.awt and HAVi level 2 UI) is composed onto the HGraphicsDevice; the three planes are combined on the HScreen, with behavior controlled by broadcast TV metadata and format signaling, user preferences, display information, and the application]

MHP Display Model Implementation

• If MHP adopts Sun’s PBP reference implementation:

[Figure: widgets are add()-ed to an org.havi.ui.HScene, which builds on java.awt.Container and java.awt.Component from the portable Java PBP class libraries (the two layers should arguably be merged for efficiency); rendering goes through a native graphics library (e.g., X11 or the fbdev driver in the OS) to the video frame buffer and VGA/DVI controller; an HScreen has an HVideoDevice, an HGraphicsDevice, and an HBackgroundDevice, all derived from HScreenDevice]

Building a GUI Application

• During initXlet()
  - Configure the required HScene and retrieve the handle to the HScene object
• During startXlet()
  - Create widget components under the HScene
  - Display the HScene using setVisible(true)
  - Request the focus
• During destroyXlet()
  - Hide the HScene
  - Release the allocated resources (components, images, etc.)
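A sketch of these three steps using the HAVi UI API (the helper class, widget choice, and coordinates are illustrative):

import org.havi.ui.HScene;
import org.havi.ui.HSceneFactory;
import org.havi.ui.HStaticText;

public class UiLifecycleHelper {
    private HScene scene;

    /* initXlet(): acquire the screen region from the HAVi factory */
    public void init() {
        scene = HSceneFactory.getInstance().getDefaultHScene();
    }

    /* startXlet(): build the widget tree, show it, grab the focus */
    public void start() {
        HStaticText label = new HStaticText("Hello MHP");
        label.setBounds(100, 100, 400, 80);  /* position within the HScene */
        scene.add(label);
        scene.setVisible(true);
        scene.requestFocus();                /* receive remote-control events */
    }

    /* destroyXlet(): hide the scene and free its resources */
    public void destroy() {
        scene.setVisible(false);
        scene.removeAll();
        HSceneFactory.getInstance().dispose(scene);
    }
}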

Controlling Media from Xlets

• Xlets can control audio/video media using the Java Media Framework
  - Based on JMF 1.0
  - Use org.davic.net.dvb.DVBLocator to select what to show
• MHP provides extra JMF controls for
  - Video scaling
  - Subtitle control and service components
  - Audio playback fine control (for local playback)
  - Notification of changes in incoming TV signals
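A minimal playback sketch along these lines (the dvb:// locator string is illustrative; DAVIC locators extend javax.media.MediaLocator, so they can be handed to the JMF Manager):

import javax.media.Manager;
import javax.media.Player;
import org.davic.net.dvb.DVBLocator;

public class MediaHelper {
    private Player player;

    public void play() throws Exception {
        /* select a broadcast service by its DVB locator */
        DVBLocator locator = new DVBLocator("dvb://1.2.3");
        player = Manager.createPlayer(locator);
        player.start();  /* realize, prefetch, and present asynchronously */
    }

    public void stop() {
        if (player != null) {
            player.stop();
            player.close();  /* release the (scarce) media decoder */
        }
    }
}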

JMF Class Architecture

• JMF is used as the media playback interface in MHP
• Only the JMF 1.1 APIs are used:
  - Player
  - DataSource
  - Control

[Class diagram: the Manager creates Players; Player extends Controller and MediaHandler, and Controller extends Clock and Duration; a Clock has a TimeBase, and a Player has a DataSource; JMF sits on the Java class library (AWT, networking, JNI, lang) above the JVM]

Video-Graphics Integration† for MHP

[Class diagram: an HScreen returns HScreenConfigurations via getBestConfiguration() and its HVideoDevice via getDefaultVideoDevice(); the HVideoDevice has HVideoConfigurations, which yield an HVideoComponent via getVideoComponent(); a Java TV ServiceContext, driven by select(locator) and a MediaSelector, connects through getVideoDevice()/getSource() to a JMF Player constructed with new DataSource(MediaLocator), which implements the presentation]

†iTV Handbook, Prentice Hall 2003

DVB File System Model†

[Figure: the DVB broadcast chain. At the head station, MPEG video/audio/data encoders feed per-program MUXes whose transport streams (about 38 Mbps) are multiplexed, channel-coded, and modulated onto the up-link or cable network; the set-top box tuner demodulates and channel-decodes the signal, the MPEG transport DEMUX selects a program (subject to user input and conditional access), and the program DEMUX feeds the video/audio/data decoders; the carousel appears as a remote file system at the head end and a local file system at the receiver]

†L. Atzori et al., “Multimedia Information Broadcasting Using Digital TV Channels,” IEEE T-Broadcasting, Sep. 1997

DVB File System Abstraction

• DVB applications (Xlets, etc.) and data can be transmitted to STBs using the DSM-CC Object Carousel†

[Figure: the broadcast object carousel is mounted as a local virtual file system on the STB]

†D.-H. Park et al., “Real-Time Carousel Caching and Monitoring in Data Broadcasting,” IEEE T-Consumer Electronics, Feb. 2006

Extraction of Xlets from DSM-CC

• One possible implementation of an Xlet extractor:

[Figure: a section filter feeds the incoming data stream to a DSM-CC parser, which fills DII, object, and Xlet caches in the DSM-CC cache space; a version manager with monitor and download threads (a real-time thread pool) keeps the caches current, and the application manager, driven by user input, moves Xlets into the running application space]

Synchronization

• Synchronization of applications with media is facilitated by the following API functions:
  - DSM-CC stream events
    - org.dvb.dsmcc.DSMCCStreamEvent.subscribe()
    - org.davic.media.StreamEventControl.subscribeStreamEvent()
  - Media time (NPT, normal play time)
    - org.davic.media.MediaTimeEventControl
  - Private section events
    - org.davic.mpeg.sections

Java for 2.5/3G Mobile Services

• 3GPP adopts the concept of middleware for CPE
  - For application middleware, the Java runtime environment has been selected (CLDC/MIDP 2.0)
  - For multimedia device integration middleware, OpenMAX has been proposed (by ARM, Motorola, Nokia, TI, etc.)
• Google also selects a Java runtime environment for the G-phone, but uses their own VM and profile libraries
  - This avoids Sun’s J2ME licensing fee

Google Android Architecture

[Architecture diagram:
Applications: Home, Contacts, Phone, Browser, ...
Application Framework: Activity Manager, Window Manager, Content Providers, View System, Notification Manager, Package Manager, Telephony Manager, Resource Manager, Location Manager, XMPP Service
System Libraries: Surface Manager, Media Framework, SQLite, OpenGL/ES, FreeType, WebKit, SGL, SSL, C Library; Android runtime: Core Libraries, Dalvik Virtual Machine
Linux Kernel]

Linux’s Role in Android

• The Java model does not have an I/O subsystem, so Linux is only (hopefully) used to provide
  - A hardware abstraction layer (for different drivers)
  - Multi-threading management
  - A shared library interface
• The following “infamous” Linux components are not included
  - A native windowing system (e.g. X11)
  - glibc

Android Applications

• An Android application package (APK) is a collection of components which usually run in one thread
  - Services or ContentProviders sometimes need another thread

[Figure: APK Package 1 holds several Activities, a ContentProvider, and a Service in one process; APK Package 2 holds its own Activity, ContentProvider, and Service split across two processes]

Android Application Model

• An Android application has four building blocks
  - Activity: an operation optionally associated with a UI component (window), similar to an execution context in MHP
  - Service: a background task
  - Content Provider: a file system abstraction
  - Intent Receiver: an event handler
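For illustration, minimal sketches of two of these building blocks (the class names are invented; an Activity skeleton follows the life-cycle slide below):

import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.os.IBinder;

/* Service: a long-lived background task with no UI of its own */
class DownloadService extends Service {
    @Override
    public IBinder onBind(Intent intent) {
        return null;  /* no client binding in this sketch */
    }
}

/* Intent receiver: an event handler woken up by broadcast Intents */
class WakeupReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        /* e.g., start the background service when the event arrives */
        context.startService(new Intent(context, DownloadService.class));
    }
}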

[Figure: an Activity’s UI thread runs a Looper over a message queue; local service calls, UI events, external service calls, and system events are delivered as messages to the Activity, the Service, and the Intent Receiver]

Activities Life Cycle

• Application activities (e.g. UI objects) have the following entry points:
  - The full life cycle begins with onCreate() and ends with onDestroy()
  - The visible life cycle begins with onStart() and ends with onStop()
  - The foreground (in-focus) life cycle begins with onResume() and ends with onPause()

[State diagram: the activity starts with onCreate(), then onStart() and onResume() bring it to the running state; another activity coming in front triggers onPause(), and when the activity is no longer visible, onStop(); the user navigating back leads to onRestart() and onStart() again; when other applications need memory, the process may be killed; onDestroy() precedes activity shutdown]
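A skeleton Activity wiring up these callbacks (a sketch; the class name and comments are illustrative):

import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

public class HelloActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);  /* full life cycle starts */
        TextView view = new TextView(this);
        view.setText("Hello Android");
        setContentView(view);
    }

    @Override protected void onStart()   { super.onStart();   /* becoming visible */ }
    @Override protected void onResume()  { super.onResume();  /* now in the foreground */ }
    @Override protected void onPause()   { super.onPause();   /* losing the foreground */ }
    @Override protected void onStop()    { super.onStop();    /* no longer visible */ }
    @Override protected void onDestroy() { super.onDestroy(); /* full life cycle ends */ }
}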

Booting an Android Platform

• The init process starts the zygote process:
  - Zygote spawns and initializes Dalvik VM instances based on requests; it also loads classes

[Figure: init starts the system daemons, the runtime, and Zygote; Zygote hosts the Dalvik VM; the runtime brings up the Service Manager and the Android managed services: Window Manager, Package Manager, Activity Manager, Audio Flinger, and Surface Flinger]

Invoking Media Accelerators

• Invocation of the audio device from a Java application:

[Figure: the application calls the Java MediaPlayer class in the application framework; a JNI bridge leads to the native MediaPlayer in the C library, which talks over Binder IPC to the media framework and Audio Flinger; libaudio.so is dynamically loaded and drives ALSA in the Linux kernel]
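The Java-side call that kicks off this chain might look as follows (a sketch; the file path is illustrative):

import android.media.MediaPlayer;
import java.io.IOException;

public class AudioDemo {
    public static void playClip() throws IOException {
        MediaPlayer mp = new MediaPlayer();
        mp.setDataSource("/sdcard/clip.mp3");  /* handed to the native framework via JNI */
        mp.prepare();   /* decoder set up through Binder IPC */
        mp.start();     /* decoded PCM routed to Audio Flinger, then ALSA */
    }
}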

System Middleware Issues

• Although system middleware can ideally solve the application portability problem, it does not (yet)
  - Java games may not work on all Java handsets
  - MHP Xlets may not run on all MHP set-top boxes
• However, as a system designer, the holy grail of middleware (application portability) should still be honored

HW/SW Integration Issues

• In the past, IC/IP designers tended to define their own HW interfaces, including IC pinout specs, IC control registers, and subsystem behaviors
• For complex systems, the partitioning of a system into sub-behaviors is not standardized

[Figure: a video decoder datapath. The bitstream passes through VLD, inverse DC/AC prediction, Q-1, and IDCT; when MC is used, motion compensation (with bilinear interpolation from the reference image) is added, producing the decoded image for the display output. Should this be combined into one circuit, or broken down into three circuits?]

Example Code to Control ICs/IPs

• When a software function needs to interact with the hardware circuit, how do we invoke† the circuit from a program?

/* Version 1: no synchronization; the CPU may read the output
   before the accelerator finishes, and the compiler may
   optimize away accesses to the control register */
/* HW accelerator interface */
short *input     = (short *) 0x0F0000;
short *output    = (short *) 0x0F0200;
short *call_idct = (short *) 0x0F0400;

void idct(short *block)
{
    int unit = sizeof(short);
    memcpy(input, block, 64*unit);
    *call_idct = 1;
    memcpy(block, output, 64*unit);
}

/* Version 2: call_idct is declared volatile so the store is not
   optimized away, and the CPU busy-waits until the accelerator
   clears the flag before copying the result back */
/* HW accelerator interface */
short *input     = (short *) 0x0F0000;
short *output    = (short *) 0x0F0200;
volatile short *call_idct = (short *) 0x0F0400;

void idct(short *block)
{
    int unit = sizeof(short);
    memcpy(input, block, 64*unit);
    *call_idct = 1;
    while (*call_idct != 0) /* idle */;
    memcpy(block, output, 64*unit);
}

†Here, memory-mapped I/O is used

Chip vs. System Designers

• Large IC design houses want to sell chips to as many system manufacturers as possible
• Small IC design houses want to compete with the large IC design houses without stepping on patents/copyrights
• System manufacturers want to use the lowest-priced chip (given the same specification) available without changing their application software

[Figure: applications / middleware / HW (chips) stack; an open standard at the middleware-hardware boundary enables a third-party chip market!]

OpenMax Middleware

• The OpenMAX† standard was originally conceived as a method of enabling portability of components and media applications throughout the mobile device landscape
• The proposal of OpenMax was brought to the Khronos Group in mid-2004 by a handful of key mobile hardware companies
  - Khronos Group is a member-funded industry consortium focused on the creation of open standard APIs for multimedia applications

†Check http://www.khronos.org/openmax for the spec.

Khronos Multimedia APIs

[Figure: the Khronos media API stack. Applications or middleware libraries (JSR 184 engines, Flash players, media players, etc.) sit on OpenMAX AL, the playback and recording interface to platform media frameworks, alongside enhanced audio, small-footprint 3D for embedded systems, and a low-level 2D vector acceleration API; OpenMAX IL offers component interfaces for codec integration; EGL manages graphics surfaces for efficient mixed 2D/3D/video rendering by the image libraries, video codecs, and sound libraries; OpenMAX DL supplies accelerated media primitives for codec development on top of the media engines (CPUs, DSPs, hardware accelerators, etc.)]

†Jim van Welzan, “OpenMax Overview”, 2007

Why OpenMax?

• Goal:
  - Solve the problem of supporting expanding multimedia standards and the increasing complexity of solutions for system and chip manufacturers
• Solution:
  - Standardize an acceleration API for multimedia kernels
    - Focused on key ‘hotspot’ functions
  - Standardize an acceleration API for multimedia codecs
  - Abstract the system architecture for increased OS portability
  - Provide reference designs across a wide variety of architectures and OSes
• Question:
  - Who shall write (port) the middleware?

OpenMax Layers†

• OpenMAX DL – Development Layer
  - For codec authors that need to leverage low-level hardware-accelerated primitives
• OpenMAX IL – Integration Layer
  - For system integrators that need to build solutions from mid-level component pieces (e.g. codecs), potentially from multiple sources
• OpenMAX AL – Application Layer
  - For application writers that need high-level access to multimedia functionality (e.g. player, recorder, input, and output objects)

†Jim van Welzan, “OpenMax Overview”, 2007

Integration and Development Layers†

[Figure: how the OpenMAX integration and development layers fit together]

†Neil Trevett, “Introduction to OpenMax”, SIGGRAPH presentation, 2004

OpenMax DL Domains

• Audio Coding
  - MP3, AAC
• Image Coding
  - JPEG (encode and decode)
• Image Processing
  - Color space conversion
  - Pixel packing/unpacking
  - De-blocking / de-ringing
  - Rotation, scaling, compositing, etc.
• Signal Processing
  - FIR, IIR, FFT, Dot Product
• Video Coding
  - MPEG-4 SP/H.263 BL (encode and decode)
  - H.264 (encode and decode)

Example: Video DL

• Provides a common building-block API to handle
  - MPEG-4 SP/H.263 BL (encode and decode)
  - H.264 (encode and decode)
  - Java Video DL (under discussion)
  - Hardware accelerators (under discussion)
• The building blocks include:
  - 8x8 SAD and 16x16 SAD
  - 8x8 DCT+Q and 8x8 IDCT+Q
  - MPEG-4 Variable Length Decode
  - Multiple 8x8 SADs in one function
  - Multiple 8x8 DCTs in one function
  - ...

Example: Image DL

• Provides a common building-block API to handle
  - JPEG / JPEG 2000 (encode and decode)
  - Color space conversion and packing/unpacking
  - De-blocking / de-ringing
  - Simple rotation and scaling
• The building blocks include:
  - JPEG 8x8 DCT and 8x8 IDCT
  - JPEG quantization
  - JPEG Huffman encoding and decoding
  - Color conversion
  - Multiple 8x8 DCTs and quantizations in one function
  - ...

OpenMax Integration Layer (IL)

• Abstracts the hardware/software architecture
  - Provides a uniform interface for framework integration across many architectures
• Integrates with major frameworks (OSes, systems):
  - Microsoft DirectShow, MDF, GStreamer, Java MMAPI

[Figure: three integration scenarios, each an application/framework/IL stack: the codec running entirely on the host processor; part of the codec on the host and part on an accelerator, with IPC below the IL; and the codec running entirely on the accelerator, with IPC between the framework and the IL]

OpenMax IL Architecture

[Figure: IL clients inside the multimedia framework issue commands through the IL core API; the IL core dispatches them to OMX components and returns callbacks; components connect port-to-port through a data tunnel on the platform]

OpenMax IL Concept

• OpenMax IL is based on the visual programming model (similar to DirectShow)
• Example: a DVD player can use the OpenMax IL core API to set up a component network:

[Figure: a File Reader/Demux component feeds audio data to an Audio Decoder and then an Audio Renderer, and video data to a Video Decoder, a Video Scheduler, and a Video Renderer; a Clock component supplies time to the Audio Renderer and the Video Scheduler]

• The system behavior must specify†
  - Data
  - Timing (of processing)

†More on this topic will be discussed in “system formalism” in future classes

OpenMax Core API

• The core API is used to build a “component graph”
  - OMX_Init(), OMX_Deinit(): core initialization
  - OMX_ComponentNameEnum(), OMX_GetComponentsOfRole(), OMX_GetRolesOfComponent(): component enumeration
  - OMX_GetHandle(), OMX_FreeHandle(): component invocation
  - OMX_SetupTunnel(): component connection
• There are many types of components:
  - Source, filter, sink, mixer, scheduler, …, etc.

Relationship Between IL and DL

[Figure: applications use a multimedia framework/middleware and system media driver; the OpenMax Integration Layer exposes a component interface for each codec; the codecs are implemented on the OpenMax Development Layer’s DL primitives, which run on the hardware]

OpenMAX AL Concept

• OpenMax IL is too complex for many developers
  - AL is designed for simplified streaming media applications
• AL adopts an object-oriented media approach (similar to the concept in MPEG-21)
  - Media Objects enable PLAY and RECORD of media

[Figure: an OpenMAX AL media object connects inputs (camera, audio input, a URI, or memory via a content pipe) to outputs (display window, audio mix, a URI, or memory via a content pipe) for playback and recording]

Discussions

• OpenMax is not the only portable framework for multimedia applications
  - MPEG has MPEG-21, MPEG-B/C Reconfigurable Video Coding, and MPEG Multimedia Middleware (M3W)
  - Microsoft has the DirectX/DirectShow framework
  - Many, many other efforts …
• If there are too many “standard” frameworks for the developers to support, these “standards” are as bad as no standard at all
