This is how your PC hardware plays audio and video


Nowadays, computers of any size can play video in different formats and resolutions without problems, but specialized hardware is what makes that playback possible on screen. Have you ever wondered how hardware video codecs work? In this article, we are going to explain it to you.

The first multimedia computers needed a separate card to be able to decode video; some may even remember how the first DVD drives for PC were sold bundled with decoder cards.

Little by little, thanks to the benefits of Moore's law, these decoders shrank until they were integrated into graphics processors, saving us from having to buy additional hardware to watch movies and series on our PCs. Today we can enjoy that content anywhere.

What is a digital signal processor?

Digital signal processors, known by the acronym DSP, take an input signal, apply an algorithm to it, and generate an output signal or data stream. They can be used to decode multimedia file formats, turning an input data stream in one format or another into usable output.

They are not to be confused with fixed-function units: DSPs execute a program, and that program can be modified. Normally, however, the program is not accessible at the user level; only the DSP manufacturer has access to the memory that contains it, and it is usually updated through firmware updates issued by the manufacturer itself.

That is, DSPs are a type of processor, like CPUs, GPUs, and so on. In recent years they have been increasingly integrated into other kinds of processors to speed up certain multimedia tasks, especially the real-time decoding of certain multimedia formats.
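The input-signal, algorithm, output-signal idea described above can be sketched in a few lines. This is a toy illustration, not any real DSP's firmware: the "algorithm" here is a simple moving-average filter, a classic signal-processing task.

```python
# Toy sketch of the DSP model: a signal goes in, a program runs over
# it, and a processed signal comes out. The filter below is purely
# illustrative, not code from any real DSP.

def moving_average(signal, taps=3):
    """Apply a simple low-pass filter, as a DSP program might."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - taps + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(moving_average(noisy))  # smoothed version of the input
```

The key point is that the program is just data in the DSP's memory: the manufacturer can swap in a different algorithm via firmware without changing the silicon.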

How do hardware codecs work on our PCs?

A decoder is nothing more than a DSP running a program that converts the data blocks of a multimedia file into the succession of images and sound we see on screen. But wouldn't it be enough to store and play the images as-is? The answer is that this would be extremely inefficient: handling the data uncompressed would require an enormous amount of storage space and bandwidth.

That is why multimedia files are compressed into different formats; the principle is no different from compressing or decompressing any other file. Compression can be based on things like assigning short values to common elements, storing only the color variation from frame to frame, and so on.
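The two ideas just mentioned can be shown with a toy example (this is not a real video codec, just an illustration of the principle): store only the change from one frame to the next, then collapse the resulting runs of identical values.

```python
# Toy illustration of two compression ideas: frame-to-frame deltas and
# run-length encoding. Real codecs such as H.264 use far more elaborate
# versions of both.

def frame_delta(prev, curr):
    """Store each pixel as its difference from the previous frame."""
    return [c - p for p, c in zip(prev, curr)]

def run_length_encode(values):
    """Collapse runs of equal values into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

frame1 = [10, 10, 10, 10, 50, 50]
frame2 = [10, 10, 10, 10, 52, 52]  # only two pixels changed slightly
delta = frame_delta(frame1, frame2)
print(run_length_encode(delta))  # the long run of zeros compresses well
```

Because most of the frame did not change, the delta is mostly zeros, and six pixels collapse into two (value, count) pairs.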

The most difficult part is encoding: converting a raw image, movie, or audio stream into one of these formats. The process requires far more computational power than decoding, which is why new formats often enjoy hardware decoders long before hardware encoders for that same format.

Why are new Video Codecs being developed?

The reason is that new forms of consumption keep appearing. For example, the video codec for DVD-Video was H.262, better known as MPEG-2, which was good enough to play video on a conventional tube television. But when the leap to Blu-ray was made, it became clear that this codec was not the best fit for the new storage format's transfer speeds and resolutions, so the creation of H.264 was necessary.

Currently, the era of optical formats is behind us and content providers have to transmit over the network. Although a fiber-optic connection is faster than a Blu-ray drive, for content providers it is much better to fit as much content as possible within a given bandwidth, since that means savings in their server infrastructure, both in servers and in communication.

The trade-off at the user level? The more data a video or audio codec compresses, the greater the computing power needed to decode it, since the number of steps required to rebuild the original data grows much larger, demanding more powerful DSPs.

Where are the hardware video codecs found?

On a PC, the video codec is normally one of the GPU's accelerators, connected to the GPU's own private northbridge. In SoCs, because CPU and GPU share the same northbridge, the codecs are connected to that shared general northbridge.

In the case of CPUs, it is not normal to find specialized hardware for video encoding and decoding. There is no impediment to integrating it as a coprocessor, but it is unusual because the codec needs access to video memory: the display controller reads the image decoded by the video codec as a frame buffer.

In SoCs, codecs have a direct relationship with other DSPs and accelerators, such as the ISP in charge of digitizing the images captured by the camera, and even with specialized AI neural processors. They work side by side, whether to turn what we capture with the camera into video, or to upscale the resolution and reduce the noise in the image and sound of videos.
