Create an HD Video Player With HDR Tech

A walkthrough implementing HDR into a video player.

By Jackson Jiang · Nov. 27, 22 · Tutorial

What Is HDR and Why Does It Matter

Streaming technology has improved significantly, giving rise to ever-higher video resolutions: from 480p and below (known as standard definition, or SD) to 720p and above (high definition, or HD).

Video resolution is vital for any app that plays video. Research I recently came across backs this up: 62% of people are more likely to perceive a brand negatively after a poor-quality video experience, and 57% are less likely to share a poor-quality video. With this in mind, it's no wonder there are so many emerging solutions for enhancing video quality.

One such solution is HDR, or high dynamic range: a post-processing technique used in imaging and photography that mimics what the human eye can see by bringing out detail in dark areas and improving contrast. When used in a video player, HDR can deliver richer, more detailed images.

Many HDR solutions, however, are let down by annoying restrictions: a lack of unified technical specifications, a high level of implementation difficulty, and a requirement for ultra-high-definition source videos. I looked for a solution without such restrictions and, luckily, found one: the HDR Vivid SDK from HMS Core Video Kit. This solution is packed with image-processing features such as the optoelectronic transfer function (OETF), tone mapping, and HDR2SDR. With these features, the SDK can equip a video player with richer colours, a higher level of detail, and more.

I used the SDK together with the HDR Ability SDK (which can also be used independently) to try the latter's brightness adjustment feature, and found that together they delivered an even better HDR video playback experience. On that note, I'd like to share how I used these two SDKs to create a video player.

Before Development

1. Configure the app information as needed in AppGallery Connect.

2. Integrate the HMS Core SDK.

For Android Studio, the SDK can be integrated via the Maven repository. This needs to be done before development begins.
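As a sketch, the integration boils down to two build-file changes. The repository URL is Huawei's public Maven repository; the dependency coordinates below are placeholders, so take the exact artifact name and version from the Video Kit documentation:

```groovy
// Project-level build.gradle: make Huawei's Maven repository available.
repositories {
    maven { url 'https://developer.huawei.com/repo/' }
}

// App-level build.gradle: declare the SDK dependency.
// 'video-hdr' and 'x.y.z' are placeholders for the real artifact and version.
dependencies {
    implementation 'com.huawei.hms:video-hdr:x.y.z'
}
```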

3. Configure the obfuscation scripts.

4. Add permissions, including those for accessing the Internet, obtaining the network status, accessing the Wi-Fi network, writing data into the external storage, reading data from the external storage, reading device information, checking whether a device is rooted, and obtaining the wake lock. (The last three permissions are optional.)
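In the manifest, that list corresponds roughly to the declarations below. Note that checking whether a device is rooted is not a standard Android permission, so it has no entry here; the names shown are the standard Android permissions and are my own mapping, not taken from the SDK documentation:

```xml
<!-- AndroidManifest.xml: permissions for the list above. -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Optional: device information and wake lock. -->
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
```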

App Development

Preparations

1. Check whether the device is capable of decoding an HDR Vivid video. If the device has such a capability, the following function will return true.

Java
 
public boolean isSupportDecode() {
    // Check whether the device supports MediaCodec.
    MediaCodecList mcList = new MediaCodecList(MediaCodecList.ALL_CODECS);
    MediaCodecInfo[] mcInfos = mcList.getCodecInfos();


    for (MediaCodecInfo mci : mcInfos) {
        // Filter out the encoder.
        if (mci.isEncoder()) {
            continue;
        }
        String[] types = mci.getSupportedTypes();
        String typesArr = Arrays.toString(types);
        // Filter out the non-HEVC decoder.
        if (!typesArr.contains("hevc")) {
            continue;
        }
        for (String type : types) {
            // Check whether 10-bit HEVC decoding is supported.
            MediaCodecInfo.CodecCapabilities codecCapabilities = mci.getCapabilitiesForType(type);
            for (MediaCodecInfo.CodecProfileLevel codecProfileLevel : codecCapabilities.profileLevels) {
                if (codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10
                    || codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10
                    || codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10Plus) {
                    // true means supported.
                    return true;
                }
            }
        }
    }
    // false means unsupported.
    return false;
}


2. Parse a video to obtain information about its resolution, OETF, colour space, and colour format, and save that information in a custom class. In the example below, the class is named VideoInfo.

Java
 
public class VideoInfo {
    private int width;
    private int height;
    private int tf;
    private int colorSpace;
    private int colorFormat;
    private long durationUs;
}
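As a sketch of how these values might be obtained: Android's MediaExtractor and MediaFormat can supply the resolution, duration, and (on API 24+) colour information of the first video track. This assumes VideoInfo exposes setters, and the raw MediaFormat values may still need mapping to the SDK's own OETF and colour-space constants:

```java
// Sketch: populate VideoInfo from the first video track of a file.
// Assumes VideoInfo has setters; error handling omitted for brevity.
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(filePath);
for (int i = 0; i < extractor.getTrackCount(); i++) {
    MediaFormat format = extractor.getTrackFormat(i);
    String mime = format.getString(MediaFormat.KEY_MIME);
    if (mime == null || !mime.startsWith("video/")) {
        continue;
    }
    VideoInfo videoInfo = new VideoInfo();
    videoInfo.setWidth(format.getInteger(MediaFormat.KEY_WIDTH));
    videoInfo.setHeight(format.getInteger(MediaFormat.KEY_HEIGHT));
    videoInfo.setDurationUs(format.getLong(MediaFormat.KEY_DURATION));
    if (format.containsKey(MediaFormat.KEY_COLOR_TRANSFER)) {
        // Transfer function, e.g. COLOR_TRANSFER_HLG or COLOR_TRANSFER_ST2084.
        videoInfo.setTf(format.getInteger(MediaFormat.KEY_COLOR_TRANSFER));
    }
    if (format.containsKey(MediaFormat.KEY_COLOR_STANDARD)) {
        videoInfo.setColorSpace(format.getInteger(MediaFormat.KEY_COLOR_STANDARD));
    }
    if (format.containsKey(MediaFormat.KEY_COLOR_FORMAT)) {
        videoInfo.setColorFormat(format.getInteger(MediaFormat.KEY_COLOR_FORMAT));
    }
    break; // The first video track is enough for this sketch.
}
extractor.release();
```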


3. Create a SurfaceView object that will be used by the SDK to process the rendered images.

Java
 
// surface_view is defined in a layout file.
SurfaceView surfaceView = (SurfaceView) view.findViewById(R.id.surface_view);


4. Create a thread to parse video streams from a video.
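The work in that thread can follow a simple producer/consumer structure: the parser thread queues encoded samples, and the decode loop drains the queue. Below is a minimal, platform-neutral sketch of that structure; SampleFeeder and its byte-array samples are hypothetical stand-ins for the real MediaExtractor/MediaCodec plumbing:

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of the parser thread: a producer pulls encoded
// samples from a source and queues them for the decoder to consume.
class SampleFeeder implements Runnable {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>(16);
    private final List<byte[]> source; // stands in for MediaExtractor
    private volatile boolean running = true;

    SampleFeeder(List<byte[]> source) {
        this.source = source;
    }

    @Override
    public void run() {
        try {
            for (byte[] sample : source) {
                if (!running) {
                    break;
                }
                queue.put(sample); // blocks when the decoder falls behind
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Called from the decode loop; blocks until a sample is available.
    byte[] nextSample() {
        try {
            return queue.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }

    void stopFeeding() {
        running = false;
    }
}
```

In the real player, the source list would be replaced by readSampleData calls on a MediaExtractor, and nextSample would feed the decoder's input buffers.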

Rendering and Transcoding a Video

1. Create and then initialize an instance of HdrVividRender.

Java
 
HdrVividRender hdrVividRender = new HdrVividRender();
hdrVividRender.init();


2. Configure the OETF and resolution for the video source.

Java
 
// Configure the OETF.
hdrVividRender.setTransFunc(2);
// Configure the resolution.
hdrVividRender.setInputVideoSize(3840, 2160);


When the SDK is used on an Android device, only the rendering mode for input is supported.

3. Configure the brightness for the output. This step is optional.

Java
 
hdrVividRender.setBrightness(700);


4. Create a Surface object to serve as the input. This method is used when HdrVividRender works in the rendering mode for input, and the created Surface object is passed to the SDK as the inputSurface parameter of configure.

Java
 
Surface inputSurface = hdrVividRender.createInputSurface();

5. Configure the output parameters.
  • Set the dimensions of the rendered Surface object. This step is necessary in the rendering mode for output.
Java
 
// surfaceView is the video playback window.
hdrVividRender.setOutputSurfaceSize(surfaceView.getWidth(), surfaceView.getHeight());
  • Set the colour space for the buffered output video, which can be set in the transcoding mode for output. This step is optional. However, when no colour space is set, BT.709 is used by default.
Java
 
hdrVividRender.setColorSpace(HdrVividRender.COLORSPACE_P3);
  • Set the colour format for the buffered output video, which can be set in the transcoding mode for output. This step is optional. However, when no colour format is specified, R8G8B8A8 is used by default.
Java
 
hdrVividRender.setColorFormat(HdrVividRender.COLORFORMAT_R8G8B8A8);


6. When the rendering mode is used as the output mode, the following APIs are required.

Java
 
hdrVividRender.configure(inputSurface, new HdrVividRender.InputCallback() {
    @Override
    public int onGetDynamicMetaData(HdrVividRender hdrVividRender, long pts) {
        // Set the static metadata, which needs to be obtained from the video source.
        HdrVividRender.StaticMetaData lastStaticMetaData = new HdrVividRender.StaticMetaData();
        hdrVividRender.setStaticMetaData(lastStaticMetaData);
        // Set the dynamic metadata, which also needs to be obtained from the video source.
        ByteBuffer dynamicMetaData = ByteBuffer.allocateDirect(10);
        hdrVividRender.setDynamicMetaData(20000, dynamicMetaData);
        return 0;
    }
}, surfaceView.getHolder().getSurface(), null);


7. When the transcoding mode is used as the output mode, call the following APIs.

Java
 
hdrVividRender.configure(inputSurface, new HdrVividRender.InputCallback() {
    @Override
    public int onGetDynamicMetaData(HdrVividRender hdrVividRender, long pts) {
        // Set the static metadata, which needs to be obtained from the video source.
        HdrVividRender.StaticMetaData lastStaticMetaData = new HdrVividRender.StaticMetaData();
        hdrVividRender.setStaticMetaData(lastStaticMetaData);
        // Set the dynamic metadata, which also needs to be obtained from the video source.
        ByteBuffer dynamicMetaData = ByteBuffer.allocateDirect(10);
        hdrVividRender.setDynamicMetaData(20000, dynamicMetaData);
        return 0;
    }
}, null, new HdrVividRender.OutputCallback() {
    @Override
    public void onOutputBufferAvailable(HdrVividRender hdrVividRender, ByteBuffer byteBuffer,
        HdrVividRender.BufferInfo bufferInfo) {
            // Process the buffered data.
    }
});


HdrVividRender.OutputCallback processes the returned buffered data asynchronously. If a callback is not used, the read method can be used instead to fetch the buffered data. For example:

Java
 
hdrVividRender.read(new BufferInfo(), 10); // 10 is a timestamp, which is determined by your app.


8. Start the processing flow.

Java
 
hdrVividRender.start();


9. Stop the processing flow.

Java
 
hdrVividRender.stop();


10. Release the resources that have been occupied.

Java
 
hdrVividRender.release();
hdrVividRender = null;


During these steps, I noticed that whenever the dimensions of the Surface change, setOutputSurfaceSize has to be called to re-configure the dimensions of the Surface output.

Besides, in the rendering mode for output, when WisePlayer is switched between the background and the foreground, the Surface object is destroyed and then re-created. In this case, the HdrVividRender instance may not have been destroyed; if so, the setOutputSurface API needs to be called to set the new Surface output.
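One way to handle this is a SurfaceHolder.Callback that re-attaches the output when the Surface comes back. setOutputSurface is the API mentioned above (its exact signature may differ from this sketch), and hdrVividRender is assumed to be in scope:

```java
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface was re-created (e.g., the app returned to the
        // foreground); point the surviving HdrVividRender instance at it.
        if (hdrVividRender != null) {
            hdrVividRender.setOutputSurface(holder.getSurface());
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // Dimensions changed: re-configure the output size.
        if (hdrVividRender != null) {
            hdrVividRender.setOutputSurfaceSize(width, height);
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Nothing to do in this sketch.
    }
});
```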

Setting up HDR Capabilities

HDR capabilities are provided by the HdrAbility class, which can be used to adjust brightness while the HDR Vivid SDK is rendering or transcoding an HDR Vivid video.

1. Initialize the brightness adjustment function.
Java
 
HdrAbility.init(getApplicationContext());


2. Enable the HDR feature on the device. This increases the maximum brightness of the device screen.

Java
 
HdrAbility.setHdrAbility(true);


3. Set the alternative maximum brightness of white points in the output video image data.

Java
 
HdrAbility.setBrightness(600);


4. Highlight the video layer.

Java
 
HdrAbility.setHdrLayer(surfaceView, true);


5. Configure highlighting for the subtitle layer or the bullet comment layer.

Java
 
HdrAbility.setCaptionsLayer(captionView, 1.5f);

Summary

Video quality is an important influencer of the user experience in mobile apps. HDR is often used to post-process video, but it is held back by a number of restrictions, which the HDR Vivid SDK from Video Kit resolves.

The SDK is loaded with image-processing features such as the OETF, tone mapping, and HDR2SDR, allowing it to mimic what human eyes can see and deliver immersive videos, which can be enhanced even further with the help of the HDR Ability SDK from the same kit. The functionality and straightforward integration process of these SDKs make them ideal for implementing HDR in a mobile app.
