
Saturday, May 29, 2010

First Look: H.264 and VP8 Compared

By Jan Ozer
VP8 is now free, but if the quality is substandard, who cares? Well, it turns out that the quality isn't substandard, so that's not an issue, but neither is it twice the quality of H.264 at half the bandwidth. See for yourself.

[In response to reader questions and comments, this article was updated at 6:20 a.m. on Monday, May 24. See author's comments at the end of the article.—Ed]

To set the table, Sorenson Media was kind enough to encode these comparison files to both H.264 and VP8 for me using its Squish encoding tool. Sorenson encoded a standard SD test file that I've been using for years. I'll do more testing once I have access to a VP8 encoder, but wanted to share these quick and dirty results.

Here are the specs; VP8 on the left, H.264 on the right:

Ozer VP8 Tables

You can download and play the files themselves, though you'll need to download a browser from http://www.webmproject.org/users/ to play the webm file. Click here to download the H.264 file, and here for the VP8 file.

What about frame comparisons? Here you go; you can click on each to see a larger version that reveals more detail.

Low motion videos like talking heads are easy to compress, so you'll see no real difference.

Ozer VP8 Figure 1

In another low motion video with a terrible background for encoding (finely detailed wallpaper), the VP8 video retains much more detail than H.264. Interesting result.

Ozer VP8 Figure 2

Moving to a higher motion video, VP8 holds up fairly well in this martial arts video.

Ozer VP8 Figure 3

In higher motion videos, though, H.264 seems superior. In this pita video, blocks are visible in the pita where the H.264 video is smooth. The pin-striped shirt in the right background is also sharper in the H.264 video, as is the striped shirt on the left.

Ozer VP8 Figure 4

In this very high motion skateboard video, H.264 also looks clearer, particularly in the highlighted areas in the fence, where the VP8 video looks slightly artifacted.

Ozer VP8 Figure 5

In the final comparison, I'd give a slight edge to VP8, which was clearer and showed fewer artifacts.

Ozer VP8 Figure 6

What's this add up to? I'd say H.264 still offers better quality, but the difference wouldn't be noticeable in most applications.

Update

We've received some comments about the comparisons, and I wanted to address them en masse.

In my haste to post the article, I didn't use the exact same frames in the comparison images. Though the frames I used accurately showed comparative quality, we've updated the images to include the same frames—a better presentation with the same conclusion. In defense of this quick and dirty analysis, we did include links to the encoded source files with the original article, so anyone wishing to dig a bit deeper could certainly have done so and still can (H.264 here; VP8 here). Still, we should have updated to the identical frames sooner.

Some readers have questioned why we used the baseline H.264 profile and the MainConcept codec instead of x264 for our H.264 encoding. As stated in the article, the VP8 and H.264 files were encoded by Sorenson Media using its Squish tool, since Sorenson had apparently been working with Google for some time to get the tool up and running, and could produce the VP8 and H.264 encoded files soon after Google made the WebM announcement. Lots of websites use Squish, and even more use Squeeze; I was comfortable that their encoding of both formats would be representative of true quality. We used non-configurable Squish presets in the interest of time, so we couldn't change the profile from baseline to main, and obviously couldn't substitute x264 for the MainConcept codec that Sorenson deploys.

In my tests, MainConcept has been a consistent leader in H.264 quality, and is certainly a very solid choice for any commercial encoding tool. We have an x264 evaluation on the editorial calendar—I know it's quite good, and I look forward to seeing how it stacks up with others in terms of quality and downstream compatibility. I didn't ask Sorenson how long it took to encode the files, but Google's FAQ does indicate that encoding can be quite slow at the highest quality configurations, though they're working to optimize that.

As I said in the article, "I'll do more testing once I have access to a VP8 encoder, but wanted to share these quick and dirty results." So, hang on for a few weeks, and I'll try to get the next comparison right the first time.

Some other questions:

>> hmmm, do you think VP8 is trying to determine who the primary subject is, and then focusing on providing that region more detail? or it's redistributing the bits in a different way? maybe if you took a look at each frame, and see how much data is being given to each frame

I'm not aware of a tool that provides this information; if you know of one, please let me know.

>>I think it's a bit misleading to point out background flaws in h264 when the overall quality is higher.

The overall quality of that frame did not appear higher. I look at standard frames each time I compare encoding tools or codecs, and I was simply passing along my observation. I'm not sure how that can be misleading.

>> only 24-bit PNG files are acceptable for frame comparisons

I've never seen a case where high-quality JPEGs were visually distinguishable from PNGs, and PNGs slow web page load times immensely, especially for mobile viewers. I'm open to reconsidering this approach, though; if you want to take PNG frame grabs from the files, compare them to the JPEGs, and show that there's a difference, I'll post the PNGs.

>> The VP8 video is put in a WebM (which is just MKV) container and the H.264 video is put in a mp4 container. Not a big deal, but in a video codec comparison, it only makes sense to keep everything else constant. MKV is able to handle both formats, so why the difference?

This is pretty much irrelevant, and you could equally make the case that it's more accurate to present the H.264 file the way most viewers would actually watch it.

>> Why is the audio even included? Furthermore, why is the audio encoded differently? The fact that the two files had similar filesizes means nothing when you attach different audio streams.

Audio is included because I like to check synchronization at the end, and folks seem to prefer watching video with audio. You've no doubt noted that I made no audio-related conclusions. Both audio streams are 64Kbps as reported by MediaInfo, so they shouldn't impact the data rate of the video stream or the overall file size comparison. This is also what Sorenson reported; if you have other information about the respective file sizes, please let me know.

The fact that one file is mono and the other stereo is irrelevant, particularly given the lack of audio-related conclusions and the fact (as far as I can tell) that the audio configuration had the same bit rate.
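For anyone who wants to sanity-check that point, the arithmetic is simple. This sketch is my own, with an assumed two-minute duration and an illustrative video bitrate rather than the actual clip's numbers; it just shows that two audio tracks at the same 64Kbps contribute identically to each file, so they cancel out of the comparison:

```javascript
// Stream size in kilobytes: bitrate (kilobits/s) * duration (s) / 8.
// The helper name, the 120s duration, and the 768Kbps video rate are
// illustrative assumptions, not the actual test-clip figures.
function streamSizeKB(kbps, seconds) {
  return (kbps * seconds) / 8;
}

var duration = 120;                          // assume a two-minute clip
var audioKB = streamSizeKB(64, duration);    // same for both files
var videoKB = streamSizeKB(768, duration);   // hypothetical SD video stream

console.log(audioKB);   // 960 KB of audio in either file
console.log(videoKB);   // 11520 KB of video at the assumed rate
```

Since both files carry the same 960KB of audio under these assumptions, any file-size difference comes from the video streams alone.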

>>Finally, take a look at http://x264dev.multimedia.cx/?p=377

Very useful analysis, though I'm glad I focused on a more streaming-oriented configuration. While I wasn't aware of it when I wrote this article, Dan Rayburn let me know about it later that day, and I blogged about it (and quoted it extensively) here.

Sorry for any confusion, I'll try to do better next time.



Wednesday, March 17, 2010

Internet Explorer 9: A Platform Preview

source: extremetech.com
At the Mix developer conference today in Las Vegas, Microsoft launched preview code for the most critical element of its upcoming Internet Explorer 9 Web browser: the underlying rendering engine.

Key goals for the new browser engine are few and simple: improved speed and support for new Web standards, particularly HTML5 and SVG (both overseen by the World Wide Web Consortium (W3C), to which Microsoft contributes test code and other resources).

IE9 isn't a full-fledged Web browser yet; so far, it's merely the rendering subsystems wrapped in a simple user interface that does nothing more than allow Web address entry and provide developer tools. IE9's director Dean Hachamovitch told a select group of reporters (myself and PCMag Editor-in-Chief Lance Ulanoff among them) last week at Microsoft's Redmond, Washington, campus that big changes were also coming to the browser's user interface, but that this release was aimed at giving developers a target for their Web site code.

Internet Explorer has been something of an object of derision among browsers when it comes to speed, particularly JavaScript performance. SunSpider JavaScript Benchmark results are often cited as evidence, but Microsoft has long maintained that it has put development effort into other parts of the pipeline needed to render Web pages, such as layout and display. This newest version of Internet Explorer addresses both JavaScript and other speed bottlenecks.

Speed Through GPU Acceleration

Because much of what a browser does involves rendering graphical images and drawing them to the display, it seems logical that such operations would be performed on the hardware optimized for them: the video card. Current browsers, however, use only the CPU for these operations. IE9 moves graphics processing to the GPU.

IE9 will be able to take advantage of both high-end gamers' powerhouse video cards and the more modest models found in low-powered machines. Though JavaScript performance plays a role in Web browser performance (as we'll discuss below), there are other factors, too: networking, HTML parsing, CSS, data collections, DOM, COM marshalling, layout, and display rendering. The IE dev team has profiled the most common performance patterns among thousands of the world's most popular sites and found that rendering actually accounted for a bigger part of the pipeline than JavaScript. Clearly, handling the display-rendering steps on the GPU will have a significant positive impact on Web performance.

JavaScript Speed through Compilation on a Second Core

Browsers handle JavaScript through on-the-fly interpretation which, just like human language interpretation, requires steps between the code and the execution on the CPU. To be fair, what Firefox, Chrome, and Opera use is a bit better than straight-up interpretation: They use a JIT (just-in-time) compiler for some JavaScript, which delivers a palpable speed boost over interpreting the code.

But IE9 will go beyond the JIT idea by taking advantage of the fact that nearly all PCs bought in the last few years have, in effect, more than one CPU—they're dual- or quad-core machines these days (and Intel just released the first six-core CPU), and can run up to twice as many threads. IE9 uses one core to run the JavaScript the old way while a second core compiles it to run directly as machine code on the hardware, with no translation required. Any programmer knows that the speed difference between interpreted and compiled code is enormous, so it follows that in some cases the performance gain will be game-changing.

Jason Weber, the Principal Program Manager lead for Internet Explorer, showed a demo of spinning 3D icons that dramatically illustrated the difference both JavaScript compilation and GPU acceleration can make. As icons were added and spun faster, all current browsers topped out the CPU (or "pegged" it) and the spinning demo slowed down to a painful 5 frames per second or fewer, with a dozen icons spinning slowly. But spinning and zooming 256 icon globes at warp speed in IE9 left the primary CPU core with processing time to burn. Said Weber, "We're only using a quarter of the first core of the CPU. This is enabling developers to create a completely new class of applications on the Web."


The SunSpider benchmark only tests one element of browser performance, and it actually doesn't even test some of the most frequently used JavaScript commands. Microsoft has set up a test of the most frequently used ones on the test site for the new browser platform and, though engineers were quick to deny the term "benchmark" for these, the results were impressive. For these top 15 JavaScript actions, IE9 came out twice as fast as the current SunSpider-leading Opera browser. Of course, we'll need to do our own tests to validate Microsoft's results—and we will—but it made for an effective demonstration nonetheless.



IE has a history of frustrating developers because of the need to fork parallel code, especially for earlier versions. Hachamovitch wanted to send the message that sites should just deliver one code base for all browsers that adhere to real standards. This comes back to what has been a programmer's Holy Grail for years: the concept of write-once, run-anywhere. One of the recurring themes at the press preview was IE9's goal of "browser interoperability."

IE hasn't been alone in needing its own special code. Firefox has had the "-moz" prefix for some commands that work only in that browser, and WebKit-based browsers have required a "-webkit" prefix for some. Microsoft Program Manager Tony Ross demonstrated how two simple lines of code to get rounded corners on a rectangle quickly grew far more cumbersome when all the alternate code for these different browsers was added. "Ultimately, running the same markup and having complete interoperability is a two-way street: There's the part that the browser has to play, and the part developers have to play. We encourage them to detect for capabilities rather than for browsers."
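Ross's advice to detect capabilities rather than browsers can be sketched in a few lines of JavaScript. This is an illustrative example, not Microsoft's code; the supportsStyleProperty helper and the stand-in style objects are my own assumptions, and in a real page you would test against document.documentElement.style:

```javascript
// Capability detection: ask whether the style object exposes the
// property (unprefixed or vendor-prefixed) instead of sniffing the
// user-agent string for a browser name.
function supportsStyleProperty(style, prop) {
  var prefixes = ['', 'Moz', 'Webkit', 'ms', 'O'];
  var capitalized = prop.charAt(0).toUpperCase() + prop.slice(1);
  return prefixes.some(function (p) {
    return (p ? p + capitalized : prop) in style;
  });
}

// Plain objects stand in for a browser's style object here.
var geckoStyle = { MozBorderRadius: '' };  // Firefox circa 2010
var modernStyle = { borderRadius: '' };    // unprefixed support

console.log(supportsStyleProperty(geckoStyle, 'borderRadius'));  // true
console.log(supportsStyleProperty(modernStyle, 'borderRadius')); // true
```

The page then branches on what the browser can do, so a future IE that adds the unprefixed property is picked up automatically, with no user-agent list to maintain.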

Ross and other Microsoft engineers have been extremely active in the W3C, the official standards body of the Internet. Though tests like Acid3 have purported to indicate support for "standards," it turns out that many of the capabilities it tests aren't official W3C specs. Hachamovitch did note that "as we support more of the markup that Web sites are using, our Acid score will go up." But he went on to decry the test's claim to represent true standards: "As standards and browsers change, you see a lot of variance." That said, the new IE9 browser engine's Acid3 score is up from 20 to 55 out of 100, and that could still climb by the time we see a beta.



A couple of specific HTML5 features that the IE9 platform will support are the video and audio markup tags. And where Firefox hopes content providers will switch to the open-source Ogg formats, IE9 will support the industry-standard MPEG-4 and H.264 for video and AAC and MP3 for audio.

New Support for SVG

Sure, you can offer a PDF download for complex content like floor plans and org charts, but why not make these decipherable right inside the webpage? SVG allows just that. SVG is a W3C standard for animated, interactive graphics based on paths rather than bitmaps. No matter how much you zoom in on an SVG image, edges remain razor sharp—unlike bitmapped image formats such as JPEG, which show degradation as you enlarge them. This holds true for text, too; that's key for the org chart example mentioned above.

In fact, John Hrvatin, Senior Program Manager Lead, Internet Explorer, told the press group that IE9 is the first browser that natively supports SVG inline with HTML; previously, XHTML was required. SVG is the descendant of VML, which came out of the Visio drawing tool. Other browsers use SVG for the popular map sites, while IE, until version 9, used VML.

Said Visio veteran and now Internet Explorer Partner Program Manager Ted Johnson, "SVG is a huge standard. We're not doing all of it in the first release, but we're doing a tremendous amount of it. When we ship, we're going to be covering all the way through what we consider 'static SVG.' Filter effects, animation, and fonts won't be included. A lot of these are still in flux. What's exciting about SVG is that you can easily see the markup in view source alongside its graphical result in the browser."

Will Microsoft Get It Right with IE9?

After years of criticism about IE's lack of support for standards and slower Web page loading than its competitors', it looks like Microsoft is taking some concrete, drastic steps to address these issues. Only time will tell, but Hachamovitch's plea to developers to stop coding separately for Internet Explorer is a good sign, as are the rendering speed initiatives. Taking advantage of today's ubiquitous multicore CPUs and discrete graphics hardware is both radical and logical, and could completely change the game when it comes to page-rendering speed. It will be interesting to see if other browser makers take up the gauntlet and implement these techniques.

One factor, though, is IE9's release schedule. Microsoft reps were completely noncommittal about when the browser would be released, in either a beta or final version. Given the frequency with which we see new editions of Firefox and Google Chrome, it's conceivable that another browser could come out with hardware acceleration before anyone gets a chance to use IE9.

For now, you can take the platform for a test drive by visiting ie.microsoft.com/testdrive in your current browser.

Friday, June 26, 2009

Facebook slams AMD & Intel

source: fudzilla.com
New server chips not as good as expected

Facebook vice president of technical operations Jonathan Heiliger says he's disappointed by performance gains offered by the latest Intel and AMD server CPUs.

Facebook has a massive and still rapidly growing user base, so it is constantly upgrading and expanding its infrastructure to keep up. However, the latest server parts simply don't seem to deliver advertised performance gains.

"The biggest thing that surprised us is ... less-than-anticipated performance gains from new microarchitectures - so, new CPUs from guys like Intel and AMD. The performance gains they're touting in the press, we're not seeing in our applications," said Heiliger. "And we're, literally in real time right now, trying to figure out why that is."

Heiliger thinks chipmakers and OEMs are missing the point when it comes to power efficiency and server functionality needed for homogeneous applications.

"You guys don't get it," Heiliger said. "To build servers for companies like Facebook, and Amazon, and other people who are operating fairly homogeneous applications, the servers have to be cheap, and they have to be super power-efficient."

Despite working with server makers to improve their products, Facebook says they still continue to fall short.

Friday, June 5, 2009

Bing, another Microsoft flop?

In seeking to make the new search engine Bing as much a part of the popular culture as “bada bing,” Bing Crosby or Stanley Bing, Microsoft is buying prominent placement for bing.com inside television shows and the online video hub Hulu.

The effort to weave advertising for Bing into content, known as branded entertainment, is intended to complement an elaborate traditional campaign, which began on Wednesday with commercials created by the JWT unit of WPP.

The Microsoft Corporation is estimated to be spending $80 million to $100 million on ads to help establish Bing as a viable alternative to the 800-pound gorilla of search, Google. It is the most recent of several attempts by Microsoft — all flops — to become a significant factor in search, where ad spending has held up better than in most other media.

“It’s a very tall marketing challenge and a very tall product challenge,” acknowledged Yusuf Mehdi, senior vice president for the online services division of Microsoft in Redmond, Wash.

“It’s going to take multiple steps to get where we want to go,” he added, “and this is the first step.”

Bing has two goals, Mr. Mehdi said: “Win a fan base and start to grow share.” The latter refers to the fact that “every other provider” of search-engine services “has lost market share in the last five years,” he added, “except for the leader” — that being, of course, google.com from Google.

“The key will be whether we deliver a product and connect with people emotionally in the advertising,” Mr. Mehdi said. To achieve the second point, “you have to do something a little bit more surprising,” he added.

First up is a program-style commercial on Hulu, scheduled for 8 p.m. (Eastern time) on Monday. The hourlong spiel, a first for Hulu, is being styled like a telethon and carries the title “Bing-a-thon.” It was developed for Microsoft by the Creative Artists Agency in Los Angeles.

The cast of the faux show on Hulu — a joint venture of the NBC Universal division of General Electric and the News Corporation — will include Jason Sudeikis of “Saturday Night Live,” Olivia Munn of the G4 cable channel and the comedian Fred Willard.

Those Hulu users who watch the “Bing-a-thon” will receive a reward: the ability to watch TV shows or movies on hulu.com without commercial interruptions. (Yes, you have to watch a commercial to avoid watching other commercials.)

After that will come integrations of Bing into shows on networks that are part of NBC Universal as well as on cable channels that are units of the MTV Networks division of Viacom.

The NBC Universal networks include NBC, with segments on “Late Night With Jimmy Fallon,” beginning next Friday, and integrations of Bing into episodes of a summer series, “The Philanthropist,” which starts on June 24.

As a seller of technology products and services, Microsoft “is in a highly competitive space,” said Ben Silverman, co-chairman at the NBC Entertainment unit of NBC Universal in Los Angeles, so it needs “innovation marketing” to break through the clutter.

For instance, the segments on “Late Night” will present Mr. Fallon as a quiz master, asking contestants to use bing.com to search for answers to questions in categories like travel, health and shopping.

“ ‘Bing’ sounds like a Jimmy Fallon word,” Mr. Silverman said, laughing. “The alignment is great.”



On “The Philanthropist,” in which James Purefoy portrays a globetrotting do-gooder, the Bing Maps feature will establish where in the world the character is; other characters will use bing.com to seek information.

And viewers will be prompted as commercial breaks begin to visit bing.com to learn more about subjects discussed during “The Philanthropist,” scheduled to run for eight episodes (which makes it, in the vocabulary of the drum-beating Mr. Silverman, “a summer maxi-series”).

The sponsorship deal with MTV Networks is to start on Thursday on “The Daily Show With Jon Stewart” on Comedy Central, and continue through June 17 on “Top 20 Countdown” on CMT, “The George Lopez Show” on Nick at Nite, “Charm School” on VH1 and “Real World — Road Rules Challenge Duel II Reunion Special” on MTV.

The Bing sponsorship will be centered on offering viewers about two minutes of additional content for each show by reducing the number of commercials. (Advertisers like Philips have done that before, enabling programs like “NBC Nightly News With Brian Williams” to report more news.)

The MTV Networks shows will carry a Bing spot, created by JWT, called “Fast Forward,” which makes it look as if viewers are using the fast-forward feature on a DVR or VCR to zip through 2 1/2 minutes of commercials in 30 seconds. The intended message, the spot declares, is that bing.com is about “getting what you want.”

“What’s great about ‘Fast Forward’ is that it flips traditional TV advertising on its head,” Judy McGrath, chairwoman and chief executive at MTV Networks in New York, wrote in an e-mail message, “to the benefit of the marketer and the consumer.”

“We’re delivering stronger exposure for the brand,” she added, “and more show for the fan.”

The risk with all branded entertainment is that it comes across to consumers as too much brand and not enough entertainment.

In searching for ways to be “baked into the shows,” said Eric Hadley, general manager of worldwide marketing for search and MSN at Microsoft, the goal must be to get consumers to say “Oh, wow” and not “That’s it?”