Wednesday, December 19, 2012
Google Invites: Wallace and Gromit
We are rendering and encoding the invites this week on the Google homepage!
http://www.google.com/
Click on "Invite the whole family to hang out" and invite your friends!
Saturday, December 1, 2012
Wednesday, November 28, 2012
Phantom Flex Cheetah
A National Geographic film crew captured stunning slow motion footage of cheetahs running in excess of 60 MPH using a Phantom high speed camera filming at 1200 frames per second.
Wednesday, September 26, 2012
Bits vs. Bytes
BIT
In computing, a bit refers to a binary state of 0 or 1: information processed by an electrical circuit flipping between on and off states. Bits were first used to record data in 1725, when Basile Bouchon used perforated paper rolls to control textile looms in France; Jean-Baptiste Falcon later made the system more robust by replacing the rolls with punched cards. This technique was greatly improved by Joseph Marie Jacquard for his Jacquard loom in 1801.
BYTE
The byte is a unit of digital information in computing that consists of eight bits. Historically, a byte was the number of bits used to encode a single character of text in a computer. The standard of eight bits is a convenient power of two, permitting the values 0 through 255 in one byte. It is also the standard unit used to render the 256 levels of color in 8-bit computer graphics, such as the "sprite" objects for Nintendo's NES gaming console released in 1985. In modern YUV 4:4:4 video, three 8-bit channels (24 bits per pixel) define each color in terms of one luma (Y) and two chroma (U, V) samples.
USAGE
8 bits = 1 byte
KB (uppercase "B") refers to kilobytes @ 1024 KB = 1 MB
MB (uppercase "B") refers to megabytes @ 1024 MB = 1 GB
GB (uppercase "B") refers to gigabytes @ 1024 GB = 1 TB
kbps (lowercase "b") refers to kilobits per second @ 1000 kbps = 1 mbps
mbps (lowercase "b") refers to megabits per second @ 1000 mbps = 1 gbps
gbps (lowercase "b") refers to gigabits per second @ 1000 gbps = 1 tbps
Note that bit rates conventionally use decimal (1000x) prefixes, unlike the binary (1024x) convention commonly applied to file sizes.
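Since the difference between an uppercase B (bytes) and a lowercase b (bits) is a factor of eight, a quick sanity check helps. A minimal Python sketch (illustrative only, not from the original post):

def megabytes_to_megabits(mb):
    # 1 byte = 8 bits, so a file's size in MB times 8 is its size in megabits
    return mb * 8

# A 35 MB video is 280 megabits on the wire; at 5 mbps
# it takes about 56 seconds to download.
print(megabytes_to_megabits(35))      # 280
print(megabytes_to_megabits(35) / 5)  # 56.0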
EDIT "LOSSLESS" CODECS
Generally, video codecs fall into two categories. Professional edit codecs are designed to be high bitrate (25 mbps ~ 250 mbps), preserving image quality through multiple generations of editing, compositing, and color correction. "Lossless" is a relative term, since these are mostly compressed YUV 4:4:4 color spaces (see PSNR). Also, many professional cameras and codecs use YUV 4:2:2 to cut chroma sampling in half.
Avid DNxHD @ 36 ~ 220 mbps (270 ~ 1650 MB/minute)
Apple ProRes @ 50 ~ 250 mbps (375 ~ 1875 MB/minute)
Apple Animation (32-bit RLE) @ 90 ~ 320 mbps (675 ~ 2400 MB/minute)
Sony HDCAM @ 135 mbps (1012 MB/minute)
Sony XDCAM @ 18 ~ 50 mbps (135 ~ 375 MB/minute)
Panasonic DVCProHD @ 40 mbps ~ 110 mbps (300 ~ 825 MB/minute)
REDcode @ 80 mbps ~ 336 mbps (600 ~ 2520 MB/minute)
8-bit Uncompressed 720p30 4:2:2 @ 442 mbps (3315 MB/minute)
10-bit Uncompressed 1080p30 4:4:4 @ 1866 mbps (14000 MB/minute)
Sample math:
30 bits/pixel x 1920 x 1080 pixels = 62,208,000 bits per frame
62.208 megabits/frame x 30 frames/second = 1866.24 mbps
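The same math as a Python sketch (just restating the arithmetic above):

# 10-bit 4:4:4 = 3 samples x 10 bits = 30 bits per pixel
bits_per_pixel = 30
width, height, fps = 1920, 1080, 30

bits_per_frame = bits_per_pixel * width * height  # 62,208,000
mbps = bits_per_frame * fps / 1_000_000           # 1866.24
mb_per_minute = mbps * 60 / 8                     # ~14,000 MB/minute
print(bits_per_frame, mbps, mb_per_minute)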
DELIVERY "LOSSY" CODECS
Delivery codecs are typically used once all post-production is complete and a deliverable is needed for VOD streaming playback on mobile devices, web browsers, cable, DVD, Blu-ray, and DVRs. They are very lossy and low bitrate (300 kbps ~ 6 mbps). Color space is usually constrained to YUV 4:2:0 subsampling, since the image will no longer be manipulated and only needs to be viewed by the human eye.
MP4 H.264/AVC is the delivery codec used for modern video applications. It is decoded in hardware by just about every video device made today, from phones to HD televisions. It is transported in an MP4 (.mp4) file container and may be embedded using HTML5, Flash, or Java players. Common H.264 streaming video targets include:
LD 240p 3G Mobile @ H.264 baseline profile 350 kbps (3 MB/minute)
LD 360p 4G Mobile @ H.264 main profile 700 kbps (6 MB/minute)
SD 480p WiFi @ H.264 main profile 1200 kbps (10 MB/minute)
HD 720p @ H.264 high profile 2500 kbps (20 MB/minute)
HD 1080p @ H.264 high profile 5000 kbps (35 MB/minute)
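To estimate your own storage budget from a target bitrate, the conversion is bitrate x seconds / 8. A Python sketch (the MB/minute figures above are rounded, so expect small differences):

def mb_per_minute(kbps):
    # kilobits/second -> decimal megabytes/minute
    return kbps * 1000 * 60 / 8 / 1_000_000

for label, kbps in [("240p", 350), ("360p", 700), ("480p", 1200),
                    ("720p", 2500), ("1080p", 5000)]:
    print(label, round(mb_per_minute(kbps), 1), "MB/minute")
# 1080p at 5000 kbps works out to 37.5 MB of video per minute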
H.264 FRAMESIZE - square pixel examples
4:3 (1.33) standard: 320x240, 384x288, 480x360, 576x432, 640x480, 768x576
16:9 (1.77) widescreen: 432x240, 512x288, 640x360, 768x432, 854x480, 1024x576, 1280x720, 1920x1080
H.264 PROFILE - based on vertical resolution "p" value
Baseline - for low definition (LD) 240p to 288p, compatible with older 3G mobiles
Main - for standard definition (SD) 360p to 480p, good for 4G smartphones and tablets
High - for high definition (HD) 720p to 1080p, best quality for hardware with a good decoder
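As a concrete (hypothetical) example, an encode targeting the 720p high-profile spec above could be run with a recent ffmpeg build and libx264; the file names are placeholders:

ffmpeg -i source.mov -c:v libx264 -profile:v high -b:v 2500k \
  -vf scale=1280:720 -c:a aac -b:a 128k -movflags +faststart out_720p.mp4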
THE HUMAN EYE
The human eye contains two major types of light-sensitive photoreceptor cells used for vision. On average, the human retina contains about 90 million rods and 4.5 million cones.
Rod cells detect luminance (brightness). They are responsible for low-light (scotopic) and monochrome (black-and-white) vision. Rod density is greater in the peripheral retina.
Cone cells detect chrominance (color), and they require brighter light than rods to function. In humans, there are three types of cones, maximally sensitive to long-wavelength "red" (564 nm ~ 580 nm), medium-wavelength "green" (534 nm ~ 545 nm), and short-wavelength "blue" (420 nm ~ 440 nm) light. Cones are mostly concentrated in and near the fovea. Only a few are present at the sides of the retina.
Tuesday, September 25, 2012
Friday, September 14, 2012
Friday, August 31, 2012
Wednesday, August 29, 2012
8 Milestones in Recorded Sound
From Edison to T-Pain: cdza co-founder Matt McCorkle goes over the ways in which recording methods have changed, as we hear how music sounded in each era.
Tuesday, August 28, 2012
Friday, August 24, 2012
Monday, August 20, 2012
Thursday, August 16, 2012
Closed Captioning
The first thing to understand is how closed captions are delivered, stored, and read. There are two main approaches today.
1. Embedded within a video: CEA-608, CEA-708, DVB-T, DVB-S, WST. These caption formats are written directly in a video file, either as a data track or embedded into a video stream itself. Broadcast television uses this approach, as does iOS.
2. Stored as a separate file: DFXP, SAMI, SMPTE-TT, TTML, EBU-TT (XML), WebVTT, SRT (text), SCC, EBU-STL (binary). These formats pass caption information to a player alongside the video, rather than being embedded in the video itself. This approach is usually used by browser-based video playback (Flash, HTML5).
Formats and standards
CEA-608 (also called Line 21) captions are the NTSC standard, used by analog television in the United States and Canada. Line 21 captions are encoded directly into a hidden area of the video stream by broadcast playout devices. If you’ve ever seen white bars and dots at the top of a program, that’s Line 21 captioning.
SCC files contain captions in Scenarist Closed Caption format. The file contains SMPTE timecodes with the corresponding encoded caption data as a representation of CEA-608 data.
CEA-708 is the standard for closed captioning for ATSC digital television (DTV) streams in the United States and Canada. There is currently no standard file format for storing CEA-708 captions apart from a video stream.
TTML stands for Timed Text Markup Language. TTML describes the synchronization of text and other media such as audio or video. See the W3C TTML Recommendation for more.
DFXP is a profile of TTML defined by W3C. DFXP files contain TTML that defines when and how to display caption data. DFXP stands for Distribution Format Exchange Profile. DFXP and TTML are often used synonymously.
SMPTE-TT (Society of Motion Picture and Television Engineers Timed Text) is an extension of the DFXP profile that adds support for three extensions found in other captioning formats but not in DFXP: #data, #image, and #information. See the SMPTE-TT standard for more. SMPTE-TT is also the FCC Safe Harbor format: if a video content producer provides captions in this format to a distributor, they have satisfied their obligation to provide captions in an accessible format. However, video content producers and distributors are free to agree upon a different format.
SAMI (Synchronized Accessible Media Interchange) is based on HTML and was developed by Microsoft for products such as Microsoft Encarta Encyclopedia and Windows Media Player. SAMI is supported by a number of desktop video players.
EBU-STL is a binary format used by the EBU standard, stored in separate .STL files.
EBU-TT is a newer format supported by the EBU, based on TTML. EBU-TT is a strict subset of TTML, which means that EBU-TT documents are valid TTML documents, but some TTML documents are not valid EBU-TT documents because they include features not supported by EBU-TT.
SRT is a format created by SubRip, a Windows-based open source tool for extracting captions or subtitles from a video. SRT is widely supported by desktop video players.
WebVTT is a text format that is similar to SRT. The Web Hypertext Application Technology Working Group (WHATWG) has proposed WebVTT as the standard for HTML5 video closed captioning.
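To make the similarity concrete, here is the same made-up cue in both formats. SRT numbers each cue and uses a comma before the milliseconds; WebVTT adds a WEBVTT header and uses a period:

SRT:
1
00:00:01,000 --> 00:00:04,000
Hello, and welcome to the program.

WebVTT:
WEBVTT

00:00:01.000 --> 00:00:04.000
Hello, and welcome to the program.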
Hard subtitles (hardsubs) are, by definition, not closed captioning. Hard subtitles are overlaid text that is encoded into the video itself, so they cannot be turned on or off, unlike closed captions or soft subtitles. Whenever possible, soft subtitles or closed captions are generally preferred, but hard subtitles can be useful when targeting a device or player that does not support closed captioning.
Monday, August 6, 2012
Vidly test for Vince
test embed code
<iframe frameborder="0" width="640" height="360" name="vidly-frame" src="http://s.vid.ly/embeded.html?link=9i0p6s&autoplay=false">
<a target="_blank" href="http://vid.ly/9i0p6s"><img src="http://cf.cdn.vid.ly/9i0p6s/poster.jpg" /></a></iframe>
Friday, July 20, 2012
Thursday, July 12, 2012
Silverlight Test Streams
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803_300k.ismv
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803_600k.ismv
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803_1200k.ismv
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803_2400k.ismv
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803.enc
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803.ism
http://d2db0x7wiq0z0s.cloudfront.net/silverlight/gopro_hdhero2_33008803.ismc
Sample XML (Encoding.com AddMedia request):
<?xml version="1.0"?>
<query>
<action>AddMedia</action>
<userid></userid>
<userkey></userkey>
<source>http://markusbucket.s3.amazonaws.com/source/gopro_hdhero2.mp4?nocopy</source>
<notify/>
<region>us-east-1</region>
<format>
<output>smooth_streaming</output>
<destination></destination>
<audio_bitrate>128k</audio_bitrate>
<audio_sample_rate>48000</audio_sample_rate>
<audio_channels_number>2</audio_channels_number>
<framerate>24</framerate>
<keep_aspect_ratio>yes</keep_aspect_ratio>
<video_codec>libx264</video_codec>
<profile>smooth_streaming</profile>
<audio_codec>dolby_aac</audio_codec>
<turbo>yes</turbo>
<keyframe>48</keyframe>
<audio_volume>100</audio_volume>
<bitrates>300k,600k,1200k,2400k</bitrates>
<sizes>224x0,448x0,768x0,1280x0</sizes>
</format>
</query>
Wednesday, June 27, 2012
Thursday, June 21, 2012
Blu-ray authoring in Encore CS6
Dave Helmly guides you through the basics of Blu-ray authoring with Adobe Encore.
More of Dave's videos here:
http://blogs.adobe.com/davtechtable/
http://tv.adobe.com/show/davtechtable/
Tuesday, May 15, 2012
Encoding.com at NAB
Yesterday, Encoding.com announced that Revision3 (which was recently acquired by Discovery Channel) is replacing its in-house encoding infrastructure with Encoding.com.
Jeff Malkin, Encoding.com's president, discusses why over 3,000 companies across multiple industries have elected to work with the company for their encoding needs. Encoding.com is also moving into TV Everywhere, to support longer-form video encoding for multiple device delivery.
Monday, April 30, 2012
Silverlight Smooth Streaming
For output, Silverlight Smooth Streaming creates the following file types:
*.ismv
MP4 container files that contain MP4 video fragments (and audio fragments if the video source also contains audio). Expression Encoder creates one .ismv file per bit rate, and the number of bit rates depends on the IIS Smooth Streaming preset that you select. For example, if you select a preset that specifies that the video be encoded using nine different bit rates, Expression Encoder creates nine .ismv files.
*.isma
MP4 container files that contain only audio fragments. If you encode an audio-only source, this is the file format that results. As with .ismv files, the number of .isma files that are created can vary depending on your output choice.
*.ism
An XML-based server manifest file that describes the available bit rates in the encoded presentation. A Smooth Streaming-enabled server uses this file.
*.ismc
An XML-based client manifest file that includes important information about the presentation, such as the available bit rates, the codecs that are used, and other information required by Smooth Streaming-compatible clients to view the presentation.
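For illustration, here is a trimmed sketch of what a server manifest (.ism) can look like; it is SMIL-based XML, and the file names here are hypothetical:

<?xml version="1.0" encoding="utf-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <meta name="clientManifestRelativePath" content="demo.ismc" />
  </head>
  <body>
    <switch>
      <video src="demo_300k.ismv" systemBitrate="300000" />
      <video src="demo_600k.ismv" systemBitrate="600000" />
      <audio src="demo_300k.ismv" systemBitrate="64000" />
    </switch>
  </body>
</smil>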
Thursday, April 26, 2012
Thursday, April 19, 2012
Wednesday, April 11, 2012
HTTP pseudo streaming vs. HTTP adaptive streaming
HTTP pseudo streaming
Both MP4 and FLV videos can be played back with a mechanism called HTTP pseudo streaming. This mechanism allows your viewers to seek (aka "trick play") to not-yet-downloaded parts of a video by referencing the timescale in the file header. YouTube is one site that offers this functionality.
HTTP pseudo streaming combines the advantages of straight HTTP "progressive download" (it passes through any firewall, and viewers on poor connections can simply wait for the download) with the ability to seek to non-downloaded parts. The drawbacks compared to RTSP/RTMP are reduced security (HTTP is easier to sniff than RTMP) and long loading times when seeking in large videos (durations over 15 minutes).
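For example, with the Apache/Lighttpd H.264 streaming module linked at the bottom of this post, a player that seeks to the 2:00 mark simply re-requests the file with a start offset in seconds (parameter name per that module; other servers differ, and the URL here is hypothetical):

http://example.com/videos/demo.mp4?start=120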
HTTP adaptive streaming
HTTP pseudo streaming should not be confused with HTTP adaptive streaming. Pioneered by Apple for iOS, adaptive streaming uses master index files (.m3u8) and segmented MPEG-2 transport stream files (.ts) carrying the H.264 video. Apple recommended 10-second segments at 30 fps (300 frames each), per Apple Technical Note TN2224.
https://developer.apple.com/library/ios/#technotes/tn2010/tn2224.html
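For illustration (not from Apple's technote), a minimal master index file referencing two bitrates might look like this, with hypothetical paths:

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=350000,RESOLUTION=432x240
low/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,RESOLUTION=854x480
mid/index.m3u8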
With HTTP adaptive streaming, 5 different bitrates will output 30 segment (.ts) files per minute of video, so a 10-minute video will be 300 files. Because of this, you will probably want to encode each video to a separate destination directory to keep your server organized.
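The file-count arithmetic as a quick Python sketch:

segment_seconds = 10
bitrates = 5
minutes = 10
# 6 segments per minute per bitrate, times 5 bitrates = 30 files/minute
segments = bitrates * (minutes * 60 // segment_seconds)
print(segments)  # 300 .ts files, plus the .m3u8 index files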
HTTP adaptive streaming goes by different brand names: Apple HTTP Live Streaming (HLS), Microsoft Smooth Streaming (Silverlight), and Adobe HTTP Dynamic Streaming (HDS).
Wowza server fully supports both HTTP pseudo streaming and HTTP adaptive streaming.
More details:
http://phpmotionwiz.com/info/what-is-pseudo-streaming
http://h264.code-shop.com/trac
http://flowplayer.org/plugins/streaming/pseudostreaming.html
http://www.longtailvideo.com/support/jw-player/jw-player-for-flash-v5/12534/video-delivery-http-pseudo-streaming
Monday, April 9, 2012
7561 Pro HTML5 Test
Embed code
<video controls width="640" height="480">
<source src="http://www.pixelgoat.net/encoding/demos/7561_pro.mp4" type='video/mp4' />
<source src="http://www.pixelgoat.net/encoding/demos/7561_pro.webm" type='video/webm' />
</video>
Thursday, April 5, 2012
Monday, March 19, 2012
Sunday, March 11, 2012
Thursday, February 23, 2012
Monday, January 30, 2012
Thursday, January 26, 2012
Halo "Landfall" HLS Embed Test (Safari-only)
<video controls width="512" height="288">
<source src="http://markusbucket.s3.amazonaws.com/watch_apple_output/halo_landfall_shorts_2925.m3u8" type='application/x-mpegURL' /></video>