Open standards accelerating next-generation multimedia device delivery

April 01, 2008

Consumers are expecting more and more from their multimedia devices, pressuring application developers to keep up. Fortunately, middleware frameworks with standard components are emerging to help designers. Fakhir introduces the OpenMAX multimedia framework and illustrates how it is changing multimedia device development.

A gradual yet revolutionary change is transforming the way multimedia is used in software applications today. Not long ago, most multimedia vendors maintained their own proprietary implementations, and code interoperability and portability were generally not major requirements. But now, with more powerful hardware and increasing demand from end users, the multimedia domain has expanded in all directions.

This expansion has now reached a level where a single vendor can no longer address all the requirements. Accelerated hardware, codecs, container formats, network streaming, and other highly specialized subdomains have emerged. This growth has triggered a major shift in the way multimedia services are perceived.

From services to frameworks

To understand this shift, developers must examine traditional multimedia libraries. These libraries normally have a static structure and provide a fixed set of services. The services provided are fixed and narrowly defined, such as “playing a WAV file” or “playing an MP3 file.” The API itself is vendor-specific, and applications written for one multimedia library are usually not portable to another. The library implementation is kept opaque, limiting options for customization or extension.

To address the growing demands of the expanding multimedia domain, software vendors have shifted their focus to multimedia frameworks, illustrated in Figure 1. A framework is a heterogeneous mixture of software from different sources. The key feature of multimedia frameworks is a flexible and extensible architecture that allows the services provided by the framework to evolve with the changing requirements of the industry.

Figure 1

Multimedia framework flexibility is achieved through the concept of a component. Components act as simple building blocks that fit together to form more complex systems. A framework API does not provide access to particular services; instead, it allows the developer to assemble different components according to design requirements. The framework is independent of what these components actually do and how they do it.

Why has the framework paradigm worked so well for multimedia? The answer lies in the nature of multimedia processing, which inherently involves a linear flow of data through different stages. Each stage is well defined and logically independent of the others. A linear arrangement of components in the form of a pipeline therefore suits multimedia naturally. Figure 2 shows a sample pipeline for audio playback: multimedia data flows in at one end, is processed by successive components, and exits the pipeline at the other end.

Figure 2

Multimedia framework advantages

Realizing the power of this concept can be difficult without examples. Frameworks normally contain a rich library of components. Table 1 categorizes four types of components. A framework user will usually select one component from each column of the table and connect the resulting four components into a pipeline. Numerous configurations are possible using these sample components. For example, the MP4 demultiplexer, MPEG4 decoder, video scaling, and video output components can be connected together to display video. Adding subtitle support to this video is then as simple as inserting the subtitles component into the pipeline.
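
As a rough sketch of what this assembly looks like in code, the following fragment connects those four components using OpenMAX IL calls. The component name strings and port numbers are hypothetical (each vendor defines its own), the callbacks are left empty, and error handling, buffer negotiation, and the required state transitions are omitted for brevity.

    #include <OMX_Core.h>   /* OpenMAX IL core API */

    /* Connect an MP4 demultiplexer, MPEG4 decoder, video scaler, and video
       output into a playback pipeline. Names and port numbers are illustrative;
       real component names are defined by each vendor. */
    static void build_video_pipeline(void)
    {
        OMX_HANDLETYPE hDemux, hDecoder, hScaler, hOutput;
        OMX_CALLBACKTYPE cb = { 0 };   /* event/buffer callbacks omitted */

        OMX_Init();
        OMX_GetHandle(&hDemux,   "OMX.vendor.demuxer.mp4",         NULL, &cb);
        OMX_GetHandle(&hDecoder, "OMX.vendor.video_decoder.mpeg4", NULL, &cb);
        OMX_GetHandle(&hScaler,  "OMX.vendor.video_scaler",        NULL, &cb);
        OMX_GetHandle(&hOutput,  "OMX.vendor.video_render",        NULL, &cb);

        /* Tunnel output port 1 of each component into input port 0 of the next. */
        OMX_SetupTunnel(hDemux,   1, hDecoder, 0);
        OMX_SetupTunnel(hDecoder, 1, hScaler,  0);
        OMX_SetupTunnel(hScaler,  1, hOutput,  0);

        /* Buffer allocation and the Loaded -> Idle -> Executing state
           transitions (via OMX_SendCommand) are omitted here. */
    }

Adding subtitles would mean acquiring one more component handle and inserting one more tunnel into the chain; nothing else in the application changes.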

One important characteristic of a framework is that each component is loosely coupled with the others and is therefore easily replaceable. For example, a standard video decoder can be replaced with a hardware-accelerated video decoder. Enhancing an existing application is equally straightforward: the user simply adds new components or replaces existing ones with enhanced versions.
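
In OpenMAX IL terms, such a swap can amount to nothing more than requesting a different component name from the core, reusing the handle and callback variables from the earlier sketch; the rest of the pipeline is untouched because it talks to the decoder only through the standard component interface. The names below are again hypothetical.

    /* Choose a software or hardware-accelerated MPEG4 decoder at build time.
       Only the name string changes; the tunnels, ports, and application code
       stay exactly the same. (Names are illustrative, not real vendor strings.) */
    #ifdef USE_HW_DECODER
    #define MPEG4_DECODER_NAME "OMX.vendor.video_decoder.mpeg4.hw"
    #else
    #define MPEG4_DECODER_NAME "OMX.vendor.video_decoder.mpeg4.sw"
    #endif

        OMX_GetHandle(&hDecoder, MPEG4_DECODER_NAME, NULL, &cb);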

Standardization ensures interoperability

Each component’s internal logic is encapsulated in a standard component definition. This standardization and the aforementioned loose coupling provide an excellent platform to ensure interoperability among components written by different software vendors. Several software vendors may contribute to a single framework, and all of their components fit and work together seamlessly. A framework also serves as a software integration tool.

The more popular multimedia frameworks in use today are usually platform dependent. Examples include DirectShow for Microsoft Windows and GStreamer for Linux. But standardization has risen a notch higher: cross-industry groups such as Khronos have standardized the framework definition itself. An open, royalty-free framework definition by a neutral group has encouraged collaboration among software vendors. The multimedia framework defined by Khronos is called OpenMAX (www.khronos.org/openmax/). Although this is a new standard, several companies have already embraced it.

The OpenMAX standard is made up of three layers, as shown in Figure 3. What has been discussed up to now corresponds to the OpenMAX Integration Layer (IL), which defines a component-based framework. The other two layers, above and below the IL, address equally important aspects of the framework: implementation and usage.

Figure 3

Writing components for multimedia frameworks

The component library is the largest functional area of a multimedia framework and involves the most effort from software and silicon vendors. Vendors usually specialize in certain services; for example, a software vendor may specialize in providing video codecs like MPEG4. This particular codec can become part of a multimedia framework once embedded in the framework’s components. Vendors encapsulate services into components to make them standardized and easily pluggable into existing software, opening up opportunities for widespread use of their products.

Another notable characteristic of a multimedia framework is the ease with which it allows third-party services to be integrated into components. Frameworks provide special tools and techniques for this very purpose.

Because these tools usually vary from one framework to another, this discussion focuses on OpenMAX-related features, particularly the Nucleus Multimedia Framework implementation from Mentor Graphics.

Multimedia data processing is extremely time-critical. Data must be compressed, decompressed, or converted to other formats in real time, using computationally intensive algorithms that must be highly optimized. The OpenMAX Development Layer (DL) addresses this vital area of optimization, providing an API to a large set of commonly used multimedia processing algorithms.

Service providers will not have to worry about implementing and optimizing these algorithms; they simply use the OpenMAX DL API in their software. The actual implementation of these APIs is then provided by another stakeholder in such systems – silicon vendors. A silicon vendor implements all OpenMAX DL-defined algorithms, which are specifically optimized for the vendor’s hardware platform. This benefits software vendors by allowing their software to run efficiently on hardware and helps silicon vendors by ensuring that software written for their platforms utilizes the hardware to its full potential.
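
As a rough sketch of how a codec writer might use DL, the fragment below runs a fixed-point FFT, the kind of inner-loop operation an audio codec performs constantly, through the DL signal-processing (omxSP) domain. The function, type, and header names follow the published omxSP conventions, but the exact signatures here are quoted from memory and vary between DL releases and vendor packages, so treat them as illustrative rather than definitive.

    #include <stdlib.h>
    #include <omxtypes.h>   /* OMX_SC16, OMX_INT, ... (header names vary by vendor) */
    #include <omxSP.h>      /* OpenMAX DL signal-processing domain */

    #define FFT_ORDER 8     /* 2^8 = 256-point transform */

    /* Illustrative DL usage: the codec writer calls one optimized primitive
       instead of hand-writing and tuning the FFT for each target CPU. */
    static void transform_block(const OMX_SC16 *in, OMX_SC16 *out)
    {
        OMX_INT specSize = 0;
        OMXFFTSpec_C_SC16 *spec;

        omxSP_FFTGetBufSize_C_SC16(FFT_ORDER, &specSize);   /* query spec size */
        spec = (OMXFFTSpec_C_SC16 *)malloc(specSize);
        omxSP_FFTInit_C_SC16(spec, FFT_ORDER);              /* one-time setup */

        /* The silicon vendor's DL library decides how to use SIMD or DSP
           hardware here; the codec writer only sees this single call. */
        omxSP_FFTFwd_CToC_SC16_Sfs(in, out, spec, 0 /* scale factor */);

        free(spec);
    }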

Framework components carry out many common operations, such as managing buffers, maintaining the component state, and protecting data. Some frameworks simplify the task of component writers by allowing component hierarchies: one generic base component provides all the common functionality, and other components can be derived from this base component, as shown in Figure 4. Using object-oriented design principles, a derived component inherits the properties of the base component, minimizing redundancy and helping component writers focus solely on their specific services.
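
In C, such a hierarchy is typically expressed with structure embedding and function pointers. The sketch below is a simplified, hypothetical illustration of the idea; it is not the actual Nucleus Multimedia Framework or OpenMAX IL component interface.

    #include <stddef.h>

    /* Hypothetical base component: owns the chores every component shares
       (state handling, buffer queues, locking) and exposes one hook that
       derived components override with their specific processing. */
    typedef struct BaseComponent {
        int   state;                                    /* e.g. loaded/idle/executing */
        void (*process)(struct BaseComponent *self,
                        const void *in, size_t in_len,
                        void *out, size_t *out_len);    /* overridden by derived type */
    } BaseComponent;

    /* Derived MP3 decoder: embeds the base as its first member, so a pointer
       to Mp3Decoder can be used wherever a BaseComponent pointer is expected,
       and only the decoding logic remains to be written. */
    typedef struct Mp3Decoder {
        BaseComponent base;        /* inherited common functionality */
        void         *codec_ctx;   /* vendor codec context, details hidden */
    } Mp3Decoder;

    static void mp3_process(BaseComponent *self,
                            const void *in, size_t in_len,
                            void *out, size_t *out_len)
    {
        Mp3Decoder *dec = (Mp3Decoder *)self;   /* safe: base is the first member */
        /* ... call the vendor's MP3 decode routine on dec->codec_ctx ... */
        (void)dec; (void)in; (void)in_len; (void)out; (void)out_len;
    }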

Figure 4

Because a framework is a heterogeneous mixture of software from different sources, a component writer may not always be familiar with components written by others. This is where the additional debug and development tools provided by the framework come into play. Debug tools are the most crucial, as they help visualize the multimedia pipeline and locate problems. Figure 5 shows a component pipeline being visualized in real time by the Nucleus Multimedia Framework debugger.

Figure 5

Using multimedia frameworks in software applications

Despite the advantages of component-based frameworks, these APIs are not readily accepted by application developers, who are accustomed to plain APIs such as “play an MP3 file.” Having to create components, connect them together, and then use them for even the simplest operation does not provide a sufficient level of abstraction to justify the effort.

The OpenMAX Application Layer (AL) seeks to address these concerns, providing an easy-to-use API that hides the mechanisms of the underlying framework. This also makes user applications more portable, as they rely on an open standard that is consistent across hardware platforms instead of on a proprietary API.
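
The sketch below shows roughly what AL-style playback of a single file looks like. The AL specification was still quite new when this article was written; the calls follow the general shape of the published API, with error handling omitted, and the exact signatures should be checked against the specification rather than taken from this fragment.

    #include <OpenMAXAL.h>   /* OpenMAX AL header; exact name varies by vendor */
    #include <stddef.h>

    /* Play one file through the application layer: no components, ports, or
       tunnels are visible to the application. */
    static void play_one_file(void)
    {
        XAObjectItf engineObj, mixObj, playerObj;
        XAEngineItf engine;
        XAPlayItf   play;

        /* A URI data source: the framework chooses the components internally. */
        XADataLocator_URI loc  = { XA_DATALOCATOR_URI, (XAchar *)"file:///music/song.mp3" };
        XADataFormat_MIME mime = { XA_DATAFORMAT_MIME, NULL, XA_CONTAINERTYPE_UNSPECIFIED };
        XADataSource      src  = { &loc, &mime };

        xaCreateEngine(&engineObj, 0, NULL, 0, NULL, NULL);
        (*engineObj)->Realize(engineObj, XA_BOOLEAN_FALSE);
        (*engineObj)->GetInterface(engineObj, XA_IID_ENGINE, &engine);

        /* Default audio output, wrapped in an output mix object. */
        (*engine)->CreateOutputMix(engine, &mixObj, 0, NULL, NULL);
        (*mixObj)->Realize(mixObj, XA_BOOLEAN_FALSE);
        XADataLocator_OutputMix outLoc = { XA_DATALOCATOR_OUTPUTMIX, mixObj };
        XADataSink              sink   = { &outLoc, NULL };

        (*engine)->CreateMediaPlayer(engine, &playerObj, &src, NULL, &sink,
                                     NULL, NULL, NULL, 0, NULL, NULL);
        (*playerObj)->Realize(playerObj, XA_BOOLEAN_FALSE);
        (*playerObj)->GetInterface(playerObj, XA_IID_PLAY, &play);

        (*play)->SetPlayState(play, XA_PLAYSTATE_PLAYING);
    }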

Recently, some frameworks have moved to an even higher level of abstraction. Instead of calling a programming-language API, the developer creates an application by describing it in simple XML. This technique is catching on in user interface applications. Integrating a multimedia framework at such a high level has allowed multimedia to be used in ways that were not possible before.
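
The fragment below gives a flavor of the idea. It is purely hypothetical: the element and attribute names are invented for illustration and do not correspond to any particular product's schema.

    <!-- Hypothetical declarative definition of a small media player screen.
         Element and attribute names are invented for illustration only. -->
    <application name="music_player">
      <screen id="now_playing">
        <image  id="album_art" src="cover.png"/>
        <button id="play" onpress="player.play"/>
        <button id="stop" onpress="player.stop"/>
      </screen>
      <media id="player" source="file:///music/song.mp3" output="default_audio"/>
    </application>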

APIs easing integration

The embedded industry is accelerating efforts to establish royalty-free APIs, which enable media authoring and promote adoption across a wide variety of platforms and devices. The Khronos Group is closely involved with these efforts, and its OpenMAX standard for media library portability is gaining serious momentum.

The OpenMAX cross-platform API enables accelerated multimedia components from different software vendors to be developed, integrated, and programmed across multiple operating systems and silicon platforms. With this approach, embedded device integrators can take advantage of library and codec components from any software vendor, as long as they are built on OpenMAX APIs, while realizing the full acceleration potential of new silicon platforms. The result will be devices with the most advanced multimedia capabilities delivered into the hands of consumers at the pace of new silicon.

Fakhir Ansari is a technical lead for multimedia software development at Mentor Graphics Corporation, based in Wilsonville, Oregon. Fakhir has nearly five years of experience in embedded software development, with particular focus on cryptography, networking protocols, and multimedia systems for handheld devices. He is also a software development hobbyist with 10 years of programming experience. Fakhir holds a BS in Computer Science from the National University of Computing and Emerging Sciences in Lahore, Pakistan.

Mentor Graphics
251-208-3400
[email protected]
www.mentor.com/embedded

Fakhir Ansari (Mentor Graphics Corporation)