Media Heresy: Compression is becoming redundant

Posted by Unknown, Wednesday, 14 January 2004
Yesterday, Ross sent Bambi Francisco to talk to me, and she asked an interesting question: "Is there a Moore's Law for compression?" My answer didn't all make it into her article, so here is an expanded version for my 'media heresies' series.

"Is there a Moore's Law for compression?"

In the sense of compression getting uniformly better over time? No.

Compression operates under different constraints - it is primarily about fooling human perceptual systems into accepting less information. Compression generally has two phases: a lossy phase, where the data is transformed into a less accurate version by exploiting limitations of human vision or hearing, and a lossless phase, where redundancy is squeezed out mathematically (this phase is like using .zip).
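
To make the two phases concrete, here is a minimal sketch in Python - my illustration, not how MP3 or MPEG actually work. The lossy phase throws away the low-order bits of each audio sample; the lossless phase hands the result to zlib, the same DEFLATE machinery behind .zip.

```python
import zlib
import numpy as np

def toy_compress(samples: np.ndarray) -> bytes:
    """Two-phase toy codec for 16-bit PCM audio samples."""
    # Phase 1 (lossy): keep only the top 8 bits of each sample,
    # discarding low-order detail on the theory the ear won't miss it.
    quantised = (samples >> 8).astype(np.int8)
    # Phase 2 (lossless): squeeze out remaining redundancy with DEFLATE.
    return zlib.compress(quantised.tobytes())

def toy_decompress(blob: bytes) -> np.ndarray:
    """Reverses the lossless phase; the lossy phase is gone for good."""
    quantised = np.frombuffer(zlib.decompress(blob), dtype=np.int8)
    return quantised.astype(np.int16) << 8  # rescale to the 16-bit range
```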

With more computing power, more elaborate transformations can be done in the first phase, and more complex mathematical compression in the second, while still leaving the computer enough time to achieve a useful frame rate. Overall, though, compression standards do not improve at anything like Moore's Law speed.

I'd say video compression is maybe 2-4 times as efficient (in quality per bit) as it was in 1990 or so, when MPEG was standardised, despite computing power and storage having improved a thousandfold since then.

However, what does happen is that the Moore's Law effects on computing power, and the Moore's Law cubed effect on storage capacity mean that compression becomes less relevant over time.

You can now buy an off-the-shelf computer that can edit uncompressed High Definition TV for under 10% of the cost of an HD tape deck.

Consider that the iPod has gone from 5GB to 40GB in under 18 months - a factor of 8. The MP3 compression iTunes uses is about 8:1, so you could fill the new iPod with uncompressed audio and store as much as you did in the old one. Apply that rate of doubling another few times and think about pocket TiVos. Far-fetched? I'm not so sure - computer users have been watching DVDs on laptops for a while now, and hand-held DVD players are being bought for children in the back seat and for people who travel. According to my friends at Best Buy, they sold out of all the portable DVD players they had this Christmas - they had hit a sensible price point.
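
A back-of-the-envelope check on that iPod claim, using CD audio figures (44.1 kHz, 16-bit stereo) and the rough 8:1 MP3 ratio as assumptions:

```python
# Back-of-the-envelope check (figures are approximate assumptions).
CD_BYTES_PER_HOUR = 44_100 * 2 * 2 * 3600   # 44.1 kHz, 16-bit, stereo: ~635 MB/hour

old_gb, new_gb, mp3_ratio = 5, 40, 8

hours_old_mp3 = old_gb * 1e9 * mp3_ratio / CD_BYTES_PER_HOUR   # ~63 hours
hours_new_raw = new_gb * 1e9 / CD_BYTES_PER_HOUR               # ~63 hours

print(f"5GB iPod with 8:1 MP3:   {hours_old_mp3:.0f} hours of music")
print(f"40GB iPod, uncompressed: {hours_new_raw:.0f} hours of music")
```

Both come out at around 63 hours - the new iPod really does hold uncompressed what the old one held compressed.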

The deeper point is a trend-based one. If storage continues to improve in capacity per dollar at three times the rate of computing power, compression becomes wholly redundant - the CPU running the bit-manipulation becomes the bottleneck, so the fastest path is to move raw bits without touching them. The HD editing computers work this way: they have DMA (direct memory access) hardware in the disk interface and the screen interface, and the computer's job is to get out of the way.
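
To see why raw storage plus DMA is plausible, here is the rough arithmetic for streaming uncompressed HD video; the frame size, frame rate, and colour depth are assumed figures for illustration:

```python
# Rough bandwidth to stream uncompressed HD video; the frame size,
# frame rate, and colour depth here are assumptions for illustration.
width, height, fps, bits_per_pixel = 1920, 1080, 30, 24

bits_per_second = width * height * fps * bits_per_pixel
print(f"{bits_per_second / 1e9:.2f} Gbit/s "
      f"({bits_per_second / 8 / 1e6:.0f} MB/s)")
# ~1.49 Gbit/s (~187 MB/s): a striped disk array with DMA can feed the
# screen directly, with no CPU cycles spent decoding pixels.
```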

The other reason compression is a bad idea in the long run is precisely because of its success in removing redundancy. With uncompressed audio or video, a single bit error will likely go unnoticed. If you are unlucky and it hits the high bit of a sample, you will get a transient click in the sound, or a brightly coloured dot in the wrong place in the video, but it will soon pass, covered by correct data.

If you have a single bit error in a compressed stream, it will corrupt the rest of the frame, or possibly many frames. In the worst case it can destroy the rest of the file from that point onwards.
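
A quick way to see the difference, using zlib as a stand-in for the lossless phase of any codec (the exact error message will vary):

```python
import zlib

data = b"all work and no play makes jack a dull boy. " * 200

# One flipped bit in the RAW stream: exactly one byte is wrong.
raw = bytearray(data)
raw[4000] ^= 0x01
print(sum(a != b for a, b in zip(raw, data)), "byte(s) damaged of", len(data))

# One flipped bit in the COMPRESSED stream: the damage is no longer local.
packed = bytearray(zlib.compress(data))
packed[len(packed) // 2] ^= 0x01
try:
    zlib.decompress(bytes(packed))   # almost always fails outright
except zlib.error as err:
    print("compressed stream unrecoverable:", err)
```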

For archival content this kind of fragility is not what you want.