This lesson is designed to introduce basic information about streaming technologies. It covers the following topics:
· Streaming Defined [[link to single html page on this]].
· Streaming: How It Works [[link to single page on this]].
· Standards Efforts [[link to single html page on this]].
· Streaming Technology Issues [[link to single html page on this]].
· Unicast vs. Multicast [[link to single html page on this]].
· Streaming Protocols [[link to single html page on this]].
· Tips on Preparing Video Streaming [[link to single html page on this]].
· Streaming Technology Survey: Players and Servers [[link to single html page on this]].
· Activity [[link to single html page on this]].
· Streaming Summary [[link to single html page on this]].
· Streaming Media News (http://www.internetnews.com/streaming-news/)
· Streaming Media World (http://www.streamingmediaworld.com/)
· StreamingMedia.com (http://www.streamingmedia.com/)
· IcanStream (http://www.icanstream.com)
· Focus on Web3D (http://web3d.about.com/compute/web3d/msubstream.htm)
Streaming is a technique developed for transferring data between two computers in a steady and continuous stream. Typically, streaming technologies are fashioned on a client-server model. Streaming technologies have gained popularity because most internet users do not have fast enough connections for downloading large multimedia files. In the streaming scenario, the client browser or plug-in starts playing dynamic data as soon as a sufficient amount of data has arrived from the streaming server. This directly contrasts with a static model of data delivery, where all the data is delivered to the client machine prior to actual playback.
[[Graphic Here Showing Streaming Process]]
For this whole process to work properly, the client browser must receive the data from the server and pass it to the streaming application for processing. The streaming application converts the data into pictures and sounds. An important factor in the success of this process is the ability of the client to receive data faster than the application can display it. Excess data is stored in a buffer, an area of memory reserved for data storage within the application. If the data transfer between the two systems is delayed, the buffer empties and the presentation of the material will not be smooth.
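The buffering behavior described above can be sketched in a few lines of Python. This is a simplified simulation rather than real player code; the chunk sizes, playback rate, and threshold are made-up numbers chosen to illustrate the idea:

```python
def simulate_playback(arrivals, playback_rate, start_threshold):
    """Toy model of a client-side playback buffer.

    arrivals: bytes delivered by the network in each time step
    playback_rate: bytes the player consumes per time step
    start_threshold: bytes that must be buffered before playback begins
    Returns the number of time steps in which playback stalled.
    """
    buffered = 0
    playing = False
    stalls = 0
    for chunk in arrivals:
        buffered += chunk                      # network fills the buffer
        if not playing and buffered >= start_threshold:
            playing = True                     # enough data buffered: start playing
        if playing:
            if buffered >= playback_rate:
                buffered -= playback_rate      # player drains the buffer
            else:
                stalls += 1                    # buffer ran dry: playback stutters
    return stalls

# Steady delivery slightly faster than playback never stalls:
print(simulate_playback([12] * 10, playback_rate=10, start_threshold=20))   # 0
# A three-step network hiccup drains the buffer and playback stutters twice:
print(simulate_playback([12, 12, 0, 0, 0, 12, 12, 12, 12, 12], 10, 20))     # 2
```

The trade-off the sketch exposes is the one real players face: a bigger start threshold delays the beginning of playback but protects against delivery hiccups.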
After reading the above information and answering the following reflective questions, you should be able to:
· Identify streaming technologies that are commonly used.
· Describe difficulties in using streaming technologies.
· Explain the probable cause of jerky movie clips on the web.
1. Based on the definition of streaming, can you identify some of the streaming technologies that you are using on a regular basis? Give some examples.
2. What are some of the difficulties that you have had trying to use these applications?
3. Have you ever watched a movie clip on the web that played in a jerky, start-stop fashion? What was the probable cause?
Streaming: How It Works
To understand why streaming technologies have evolved, it is important to understand how the World Wide Web functions. Web servers are often described as "stateless," which means a Web server processes a request for information, sends the data that completes the request, disconnects, and goes on to the next task. On the browser or client side, the browser takes the incoming data, assembles it in the browser window, and disconnects from the server until a new request is initiated by a mouse click. In summary, a stateless connection is a quick exchange of information, after which the transaction is complete.
[[Graphic or animation showing how Stateless connections work]]
This approach works very well for static media like graphics and text. Dynamic data (sound, animation, video, or dynamic graphics), however, is more problematic. These types of elements tend to be larger in size and have the dimension of time incorporated into their delivery. Because of the larger file sizes and the slow connection speeds most users have, stateless approaches are typically unacceptable for most dynamic media applications. In the stateless scenario, a video file would have to be downloaded completely before it could be used. This could take several minutes or more, and users are typically not willing to wait while these large media files are transferred.
Because the Web was primarily designed for a stateless connection approach, new technologies were needed for the display of dynamic data (video, sound, and animation). Thus, continuous connection strategies evolved for “streaming” of dynamic content. Most of these technologies were developed to bypass the limitation imposed by the specifications used for the World Wide Web.
In the continuous connection approach, the client and the server stay connected while content is delivered to the application displaying the dynamic content. Data is displayed as it is delivered to the client application: data flows into the buffer, and when a sufficient amount has accumulated, the application begins displaying it. As data continues to flow into the buffer, the application manages this material, making sure it is fed to the display window at an even rate.
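The contrast between the two delivery models can be sketched with Python generators. This is an illustrative toy, assuming a fake 20-byte "media file" delivered in 4-byte chunks; no actual network code is involved:

```python
def download_then_play(source):
    # Stateless model: fetch the whole file first, then play it.
    data = b"".join(source)          # the entire transfer completes up front
    return [data]                    # playback sees one big blob at the end

def stream_and_play(source, buffer_threshold):
    # Continuous model: start playing once the buffer holds enough data.
    buffer = b""
    for chunk in source:
        buffer += chunk
        if len(buffer) >= buffer_threshold:
            yield buffer             # hand buffered data to the player
            buffer = b""
    if buffer:
        yield buffer                 # flush whatever remains at the end

chunks = [b"abcd"] * 5               # a 20-byte "media file" in 4-byte pieces
print(len(download_then_play(iter(chunks))[0]))             # 20: play only after full download
print([len(p) for p in stream_and_play(iter(chunks), 8)])   # [8, 8, 4]: play as data arrives
```

The stateless version returns nothing until the whole file is present; the streaming version yields playable pieces while the transfer is still in progress, which is exactly the behavior the buffer description above calls for.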
[[Graphic Here. Show how continuous connections work]]
A major difference between the stateless and continuous approaches is what happens to the data once it is used. In the stateless approach, when a video clip is downloaded, it actually resides on your computer; it is a file you can save and store. In the continuous approach, on the other hand, the material is stored only in the buffer, so the only content ever stored on your machine is whatever is currently in the buffer, nothing more. Once the buffer is depleted, all content is removed from your machine.
After reading the above information and answering the following reflective questions, you should be able to:
· Explain the difference between stateless and continuous connections in streaming.
· Evaluate which types of data are best used for stateless or continuous connections.
1. Describe the difference between stateless and continuous connections.
2. What types of data are best suited for a stateless connection? Why? For continuous connections? Why?
Standards Efforts

What sits behind all this streaming technology? As with most Internet technologies, standards have been developed to provide common ways of doing things. Standards generally develop out of a need for content (in this case, dynamic media) to interoperate between applications and vendor solutions. The following materials describe standards efforts, underway or completed, that deal with dynamic media.
Advanced Streaming Format (ASF)
Advanced Streaming Format is a streaming multimedia file format developed by Microsoft. ASF has been submitted to the ISO and IETF for standardization. It is expected to eventually replace the older AVI format.
H.323
A standard approved by the International Telecommunication Union (ITU) that defines how audiovisual conferencing data is transmitted across networks. In theory, H.323 should enable users to participate in the same conference even though they are using different videoconferencing applications. Although most videoconferencing vendors have announced that their products will conform to H.323, it is too early to say whether such adherence will actually result in interoperability.
H.324
A suite of standards approved by the International Telecommunication Union (ITU) that defines videoconferencing over analog (POTS) telephone lines. One of the main components of H.324 is the V.80 protocol, which specifies how modems should handle streaming audio and video data.
Real-Time Transport Protocol (RTP)
Real-Time Transport Protocol is an Internet protocol for transmitting real-time data such as audio and video. RTP itself does not guarantee real-time delivery of data, but it does provide mechanisms for the sending and receiving applications to support streaming data. Typically, RTP runs on top of the UDP protocol, although the specification is general enough to support other transport protocols.
Real Time Streaming Protocol (RTSP)
Real Time Streaming Protocol is a proposed standard for controlling streaming data over the World Wide Web. RTSP grew out of work done by Columbia University, Netscape and RealNetworks, and has been submitted to the IETF for standardization. Like H.323, RTSP uses RTP (Real-Time Transport Protocol) to format packets of multimedia content. But whereas H.323 is designed for videoconferencing of moderately-sized groups, RTSP is designed to efficiently broadcast audio-visual data to large groups.
Synchronized Multimedia Integration Language (SMIL)
Synchronized Multimedia Integration Language is a new markup language being developed by the World Wide Web Consortium (W3C) that would enable Web developers to divide multimedia content into separate files and streams (audio, video, text, and images), send them to a user's computer individually, and then have them displayed together as if they were a single multimedia stream. The ability to separate out the static text and images should make the multimedia content much smaller so that it doesn't take as long to travel over the Internet. SMIL is based on the eXtensible Markup Language (XML). Rather than defining the actual formats used to represent multimedia data, it defines the commands that specify whether the various multimedia components should be played together or in sequence.
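Because SMIL is XML-based, a presentation can be inspected with any XML parser. The following sketch uses Python's standard library to walk a minimal, hypothetical SMIL fragment (the file names are made up): `<par>` children are intended to play together, while `<seq>` children play one after another.

```python
import xml.etree.ElementTree as ET

# A minimal SMIL fragment: <par> plays its children in parallel,
# <seq> plays its children in sequence. File names are invented.
smil = """
<smil>
  <body>
    <seq>
      <par>
        <audio src="intro.rm"/>
        <img src="title.gif"/>
      </par>
      <video src="lecture.rm"/>
    </seq>
  </body>
</smil>
"""

root = ET.fromstring(smil)
# List every media reference in document order.
for element in root.iter():
    src = element.get("src")
    if src:
        print(element.tag, src)   # audio intro.rm / img title.gif / video lecture.rm
```

Note that the SMIL file names separate streams, as the text above describes: the audio, image, and video are individual resources that the player fetches and synchronizes according to the `<par>`/`<seq>` timing structure.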
After reading the above information and answering the following reflective questions, you should be able to:
· Explain why standards efforts for dynamic media are important.
· Evaluate the impact of developers using and not using standards.
1. Why are standards efforts important?
2. What happens if developers use standards? If they don’t?
Streaming Technology Issues

The real issue with providing dynamic media via the Web comes back to file size. Digital audio and video files are very large, huge by net standards. Thus, streaming technologies have focused primarily on streaming audio and postage-stamp-sized video. This is changing, though. Advancements in compression technologies and the widespread growth of higher-bandwidth services are making the quality of streamed materials much more acceptable.
There are several issues that must be addressed when considering streaming media. The discussion generally comes down to one of bandwidth versus quality. The following discussion will give you a greater understanding of the issues related to streaming.
Determining which direction you take with streaming technologies requires that you have some knowledge about the audience for your content and the technology infrastructure in which it will be delivered. If you have a user community that is primarily dial-up, then you have limited options and a low bandwidth solution might be in order. If you have a closed intranet community within your company and have high speed data networks to the desktop, you more than likely can stream much higher quality content.
These audience characteristics all bear on how you prepare the material you will stream and how it must be optimized for the community of users you serve. The situation gets much more muddled when you do not have a clear understanding of the audience. In that situation, you might develop a strategy that provides different resources at varying degrees of quality.
Now that you know your audience and the infrastructure, you need to reduce file sizes so they work over a modem or a LAN, and this is where compression comes in. The goal of streaming compression is to throw away data you don't need, which makes the file much smaller. But be careful: if it is reduced too far, image and sound quality begin to degrade. Software compression/decompression is a rapidly evolving and competitive industry, so much of the work has taken place outside the standards bodies and remains proprietary in nature. Because of this, advancements are still radically changing the character of the marketplace, and the quality of compression continues to improve dramatically as a result.
The delivery of dynamic content (audio, video, and graphics) is generally achieved through the use of codecs. The term codec is short for compressor/decompressor. Codecs can be implemented in a variety of ways: in hardware, software, or both. Some popular codecs for digital video are MPEG, Indeo, and Cinepak. On the Web, codecs are typically implemented as software plug-ins that add functionality to your browser. Apple's QuickTime Player, Microsoft's Media Player, Real's RealPlayer, and Macromedia's Shockwave Player are examples of major plug-ins that currently have wide acceptance on the Internet. Generally, these players are built so that they support a variety of different formats. See the Lesson on Video for more information [[link to T7L3]].
As mentioned earlier, bandwidth also plays an important role in how you develop content for streaming. Generally, the more bandwidth available, the better the quality. If you know your typical user has a 28.8 Kbps modem connection, the quality of the streamed content, particularly video, will be quite low: so much data must be thrown away in the compression process that the resulting stream will be a very small file. A stream delivered over a low-bandwidth connection of 28.8 Kbps or less would be considered a low-quality stream.
On the other side of the equation, if you have dedicated networks with plenty of capacity, you will be able to serve high-bandwidth streams. High-bandwidth, or high-quality, stream files are typically those above 300 Kbps. With high-capacity networks, it is possible to serve MPEG-1 files, which give full-screen, full-motion video comparable to standard television.
The type of content also determines how successfully you will be able to stream it. Video is typically the most problematic because of the sheer size of its data files. Other types of content, such as high-quality audio and vector graphics, may yield perfectly acceptable high-quality streams at 28.8 Kbps connection speeds.
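A quick back-of-the-envelope calculation shows why so much data must be discarded for modem-speed video. The numbers below (frame size, frame rate, color depth, usable fraction of the link) are illustrative assumptions, not measurements:

```python
def required_compression(width, height, fps, bits_per_pixel, link_kbps, overhead=0.8):
    """Rough compression ratio needed to fit raw video onto a link.

    overhead: fraction of the link usable for media payload
              (assumed 80% here to allow for protocol overhead).
    """
    raw_bps = width * height * bits_per_pixel * fps   # uncompressed bitrate
    usable_bps = link_kbps * 1000 * overhead          # usable link capacity
    return raw_bps / usable_bps

# 160x120 "postage-stamp" video at 10 fps in 24-bit color over a 28.8 Kbps modem:
ratio = required_compression(160, 120, 10, 24, 28.8)
print(f"compression ratio needed: about {ratio:.0f}:1")   # about 200:1
```

Even at this small frame size and low frame rate, the codec must discard roughly 199 of every 200 bits, which is why modem-speed video looks the way it does.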
Another area where streaming media delivery has a major impact is Quality of Service (QoS). Quality of Service refers to the ability to get service without interruption. Logically, the more streams you serve, the more saturated your networks become. Likewise, the higher the quality, the bigger the file sizes, and again the more saturated the networks become. QoS becomes an issue as your streaming applications impact other mission-critical services. With this in mind, the highest-quality streaming video may not be the best solution to implement if network capacity is already stressed. Alternative delivery strategies sacrifice the highest quality for lesser quality to support a larger community of use. It might also be advisable to consider strategies that conserve network capacity (see below).
How Do I Store My Files?
As mentioned earlier, if your audience is fragmented and has different access capabilities, you might consider a streaming strategy that stores multiple copies of each file at varying degrees of quality. That way, users can select the file best suited to their particular access environment. There is a problem with this approach, though: it requires much more storage capacity to keep all these duplicate files. Remember, as quality increases for those with higher-speed access, so does the amount of space required to store the file. For example, if you store the same video clip for three different types of users at low, medium, and high quality, you now need considerably more storage capacity to hold all three files.
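The storage cost of the multiple-file strategy is easy to estimate from the encoding bitrates. The bitrates below are hypothetical examples of low-, medium-, and high-quality encodings of a 10-minute clip:

```python
def storage_bytes(duration_s, bitrate_kbps):
    # File size of a clip encoded at a constant bitrate.
    return duration_s * bitrate_kbps * 1000 // 8

clip = 600  # a 10-minute clip, in seconds
for label, kbps in [("low (modem)", 28), ("medium (ISDN)", 112), ("high (LAN)", 300)]:
    mb = storage_bytes(clip, kbps) / 1_000_000
    print(f"{label:14s} {mb:5.1f} MB")

total = sum(storage_bytes(clip, k) for k in (28, 112, 300)) / 1_000_000
print(f"total          {total:5.1f} MB")   # 33.0 MB vs 2.1 MB for the low copy alone
```

Serving all three audiences costs roughly fifteen times the storage of the low-quality copy alone, which is the trade-off the paragraph above describes.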
Current Problems with Streaming Technologies
There are a number of issues in the current streaming marketplace. It would be good to review these.
1. No Standards
First, there is no single standard at the moment. In an effort to establish market dominance, the different vendors all have developed proprietary technologies. Thus, it is difficult to exchange content between vendors. For the market to really mature, standards must be established so that the vendors are able to exchange various kinds of streaming content.
2. Server Licensing Fees
Several of the server vendors have a business model built on the number of continuous streams being served. The fee escalates as the number of streams increases, so it becomes expensive to serve large numbers of continuous connections. Several vendors are now basing their business model on the cost of the server alone, and this model will challenge the existing vendors to rethink their pricing structures.
3. Support for Multiple Platforms
Currently, most vendors support either Windows or the Mac. This is changing, but it has been a stumbling block to more widespread acceptance of streaming technologies. To be successful, vendors will have to support all the major operating systems with their player products. Expect to see this in the next year.
4. Saturated Networks
Networks continue to see more and more use, and streaming video, for example, consumes large amounts of bandwidth. As a result, networks have become saturated. If you are streaming over the Internet, it is common for connections to drop information as it is being delivered, causing a poor-quality experience. Expect strategies like multicast streaming, higher-bandwidth connections, and higher-capacity networks to improve the quality of the experience in the coming years.
After reading the above information and answering the following reflective questions, you should be able to:
· Analyze the reasons why there is a growing demand for streaming applications.
· Identify and justify the most important considerations in deciding when to use streaming technologies.
1. What is fueling the growing demand for streaming applications?
2. What do you feel is the most important consideration in choosing to use streaming technologies? Why?
Codec Central (http://www.codeccentral.com)
Unicast vs. Multicast
Because streaming media content can consume large amounts of bandwidth, several strategies have evolved to conserve precious network resources. The two popular techniques currently in use are 1) unicast and 2) multicast. These strategies are generally used where network administrators can monitor the saturation of network capacity.
Multicast streaming techniques have gained popularity in recent years because they are the most efficient way to stream content. Multicast works much like the TV broadcast model; in other words, users "tune in" to a "webcast" of a particular program or channel. Like TV channels, programs are cast in real time to the network, and viewers tune in to the program. This technique conserves resources, since a single stream is sent out and multiple people can connect to it. In other words, multicast sends a single packet to a subset of destinations on the network without replication. Multicast streaming has grown in popularity for uses such as live concerts, TV network programming, and radio programming.
A benefit of multicasting live events is that you do not need huge storage capacity. Often content is digitized live using specialized real-time digitizing equipment, which allows content providers to push streams directly to the network. Content providers will often loop this digitized stream back to very fast storage devices for later playback in a store-and-forward unicast streaming model. Keep your eye on these technologies: they will become more and more pervasive as high-speed networks and services become less and less expensive.
Receiving multicast programming can sometimes be problematic, though. ISPs and network administrators often want to control bandwidth usage on their networks, so they frequently will not enable multicast features in network hubs and routers, in order to avoid network congestion.
Unicast streaming techniques send multiple point-to-point streams, one to each destination. In unicast, the network is used inefficiently, as packets are replicated throughout the network. Unicast applications have found a niche primarily in closed, high-capacity networks. In the corporate sector, where private networks have large amounts of available bandwidth, media servers can store large amounts of content in digital repositories for use in a store-and-forward model. Because these networks have large capacity, high-quality video can be streamed to the desktop. Typically, this material is used for corporate training, marketing, and communications applications. The benefit of unicast streaming is that the viewer can receive very high-quality video and audio.
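The efficiency difference between the two models reduces to simple arithmetic: unicast replicates the stream once per viewer, while multicast sends it once regardless of audience size. A minimal sketch, with an assumed audience of 500 viewers of a 300 Kbps stream:

```python
def server_bandwidth_kbps(viewers, stream_kbps, multicast):
    # Multicast: one stream is sent regardless of audience size.
    # Unicast: one full copy of the stream is sent per viewer.
    return stream_kbps if multicast else viewers * stream_kbps

viewers, stream = 500, 300   # hypothetical: 500 viewers of a 300 Kbps stream
print(server_bandwidth_kbps(viewers, stream, multicast=True))    # 300 Kbps
print(server_bandwidth_kbps(viewers, stream, multicast=False))   # 150000 Kbps (150 Mbps)
```

The multicast figure stays flat as the audience grows, while the unicast figure scales linearly with viewers, which is why unicast is practical mainly on closed, high-capacity networks.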
After reading the above information and answering the following reflective questions, you should be able to:
· Explain the difference between multicast and unicast.
· Evaluate the benefits of multicast and unicast.
1. Describe the difference between multicast and unicast.
2. What are the benefits of each?
Streaming Protocols

HTTP is the predominant way in which documents are linked on the Internet. The client makes a connection to the server containing the file to be streamed, the file is retrieved, and the connection is closed. The HTTP server communicates to the browser the type of file being transferred.
Benefits of Using HTTP
When streaming a file using HTTP, a special streaming server is not required. As long as your browser understands MIME types (see the Browsers lesson for more information [[link to T2L4]]), it can receive a streaming file from an HTTP server. One of the distinct advantages of streaming files using HTTP is that the stream can pass through firewalls and utilize proxy servers.
HTTP streaming uses TCP/IP (Transmission Control Protocol and Internet Protocol) to ensure reliable delivery of files. This process checks for missing packets and asks for them to be retransmitted. This becomes problematic in the streaming scenario, where you want lost data to be disregarded so that dynamic files keep playing. HTTP also cannot detect modem speed, so server administrators must purposefully produce files at different compression rates to serve users with different types of connections. Streaming files from HTTP servers is not recommended for high-demand situations.
RTSP is the standard protocol used by most streaming server vendors. RTSP servers use UDP (User Datagram Protocol) to transfer media files. UDP does not continually check that packets have arrived at their destination. This is an advantage for streaming applications because it allows a file transfer to be interrupted, as long as the delay is not too long. The result is occasional data loss, but files continue to play if the delay is small.
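The behavioral difference between the two protocols can be sketched as a toy delivery model (not real socket code); the set of "lost" packet numbers is an arbitrary example:

```python
def deliver(packets, lost, retransmit):
    """Toy model of TCP-style vs UDP-style delivery of numbered packets.

    lost: sequence numbers whose first transmission is dropped in transit.
    retransmit=True  (TCP / HTTP streaming): lost packets are resent, so
                     nothing is missing but extra round trips are spent.
    retransmit=False (UDP / RTSP streaming): lost packets are skipped, so
                     playback continues with small gaps in the data.
    """
    received, sends = [], 0
    for seq in packets:
        sends += 1
        if seq in lost:
            if retransmit:
                sends += 1               # resend once; assume the retry succeeds
                received.append(seq)
            # else: drop it and keep playing
        else:
            received.append(seq)
    return received, sends

lost = {3, 7}                            # packets 3 and 7 are dropped in transit
print(deliver(range(10), lost, retransmit=True))    # all 10 arrive, at the cost of 12 sends
print(deliver(range(10), lost, retransmit=False))   # only 10 sends, but packets 3 and 7 are gone
```

For a file download, the retransmitting behavior is essential; for a live stream, the extra delay of a retransmission is often worse than the small gap left by a dropped packet, which is why streaming servers favor UDP.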
After reading the above information and answering the following reflective questions, you should be able to:
· Evaluate the advantages and disadvantages of using the two types of streaming protocols.
· Describe streaming protocols and their function.
1. Describe the advantages and disadvantages of using RTSP and HTTP streaming.
2. How would you describe protocols and what they do?
Tips on Preparing Video Streaming

1. Optimize your source material when possible
* Video considerations
1) Shoot tighter shots
2) Minimize camera movements such as zooms and pans
3) Try to shoot against plain backgrounds with little activity
4) Always use a tripod for steady shots
5) Turn auto focus off to avoid unintended focusing on non-essential elements
6) Try to use a high-quality tape format such as MiniDV, S-VHS or Hi8
7) Use new tapes to avoid glitches, dropouts, and other image degradation
* Audio considerations
1) Record in a quiet environment with little or no background noise if possible
2) Use high-quality recording equipment to minimize noise
2. Choosing the format
· Tutorials on Production, Encoding, Server Side, and Web Integration (http://www.streamingmedia.com/tutorials/index.asp)
· Streaming Media 101: A series of articles on aspects of streaming media (http://builder.cnet.com/Graphics/StreamingMedia/)
Streaming Technology Survey: Players and Servers

We have established that streaming is a client-server application, so it is now appropriate to look at the various technologies that are currently part of the streaming landscape. We will look first at the major player technologies, then at the server technologies.
Streaming players are needed to support the streaming of content to the end user. Players come in a variety of flavors. Since you can stream almost any kind of dynamic data, many proprietary players have been developed to support these media types, and it is difficult to cover all the different players used to stream the various media. The following discussion of players and their underlying technologies gives an overview of the technologies that are most popular and dominate the market.
Microsoft ActiveMovie
A new multimedia streaming technology developed by Microsoft. ActiveMovie is already built into the Internet Explorer browser and will be part of future versions of the Windows operating system. Supporting most multimedia formats, including MPEG, ActiveMovie enables users to view multimedia content distributed over the Internet, an intranet, or CD-ROM. ActiveMovie's main competition is the QuickTime standard developed by Apple Computer.
Apple QuickTime Player
QuickTime is Apple's technology for handling video, sound, animation, graphics, text, music, and even 360-degree virtual reality (VR) scenes. Using a single player, QuickTime allows you to play more than 200 kinds of digital media by supporting the playback of multiple file formats. The player also supports streaming of MPEG, QuickTime, MP3, and other media types via technologies such as RTSP.
IBM VideoCharger Player
A proprietary software player designed to work with the IBM Content Manager VideoCharger server. The player has all the functionality of a VCR remote and allows access to streams sent from the Content Manager VideoCharger.
Macromedia Shockwave Player
This software player enables the playback of Director interactive content over the Internet. It is freely distributed and widely used. With this player, interactive multimedia projects developed in Director can be incorporated and streamed into most popular Web browsers.
Microsoft Windows Media Player
Microsoft's Windows Media Player is a player application that works with your browser to enable the streaming of dynamic media. Windows Media Player allows end users to play a variety of formats, such as Windows Media (WMA), MP3, WAV, AVI, MPEG, and others. The Media Player has all the functions for stopping, starting, and pausing your stream, as well as the ability to resize the player window to receive streaming video at various sizes.
Real Networks RealPlayer
The RealPlayer is RealNetworks' current media player technology. This player evolved from the RealAudio and RealVideo Players, which were early iterations of the technology. The current RealPlayer is built upon the proprietary G2 technology, which allows the streaming of various kinds of media content from the RealServer platform. The following are short descriptions of Real's earlier technologies.
RealAudio
The de facto standard for streaming audio data over the World Wide Web. RealAudio was developed by RealNetworks and supports FM-stereo-quality sound. To hear a Web page that includes a RealAudio sound file, you need the RealAudio player or plug-in, a program that is freely available from a number of places. It is included in current versions of both Netscape Navigator and Microsoft Internet Explorer.
RealVideo
A streaming technology developed by RealNetworks for transmitting live video over the Internet. RealVideo uses a variety of data compression techniques and works with both normal IP connections and IP Multicast connections.
IBM Content Manager VideoCharger
IBM's media server product is called the VideoCharger. Like most of the other servers, it manages the delivery of simultaneous streams to a media player. IBM has a proprietary player, the VideoCharger Player, that is used in conjunction with your Web browser and streams a variety of formats, including MPEG-1, MPEG-2, QuickTime, and others.
RealNetworks RealServer
The RealServer is the guts behind RealNetworks' streaming services. Built upon proprietary streaming technology, the RealServer manages the continuous streams of video and audio sent to the RealPlayer client application or plug-in on the end user's computer. RealServer comes in a variety of configurations based on how many simultaneous sessions you would like to stream to clients.
Microsoft NetShow (Microsoft Media Player)
A specification developed by Microsoft for streaming multimedia content over the World Wide Web. A competing specification, backed by Netscape, is RTSP. NetShow has both a client (NetShow client) and a server (NetShow Server). In recent releases of the client, its name has been changed to Microsoft Media Player. The NetShow server uses the Advanced Streaming Format (ASF) as its method of delivering streaming content to the client. ASF adds application-level bandwidth reservation to any data stream. NetShow also has a high-end server product called the NetShow Theatre Server, developed for streaming very high-quality video and audio content in controlled network environments.
Overview of NetShow and ASF
NetShow Theatre Server
QuickTime Streaming Server
Bundled as part of the OS X Server, the QuickTime Streaming Server allows for the management and delivery of simultaneous streams of video to any QuickTime Player client. The QuickTime Player is supported on Macintosh, Windows, and some Unix-based computers. Unlike most of its competitors, the QuickTime Streaming Server is bundled with the OS X Server at no additional charge, regardless of how many simultaneous streams you serve.
Streaming Summary

This lesson was designed to introduce basic information about streaming technologies. The following is a short summary of the topics covered. If you are still having problems understanding these topics, consider reviewing the lessons again. If you are still having difficulty, consider some of the additional resources listed as sources of further information on each topic.
1. Streaming is a technique developed for transferring data between two computers in a steady and continuous stream. Typically, streaming technologies are fashioned on a client-server model. In streaming applications, the two computers remain connected until the stream has been completed.
2. Stateless connections are the foundation of the World Wide Web. In this model, a request for data is made from a client to a server; the server then opens a connection and transfers the data. Once the transfer is finished, the connection is closed and the transaction is complete. In the continuous connection model, used by streaming technologies, a continuous connection is maintained between the client and the server until the stream has completed.
3. As with most Internet technologies, standards have been developed to provide common ways of doing things. Standards generally develop out of a need for content (in this case, dynamic media) to interoperate between applications and vendor solutions. Standards efforts related to streaming are still in their infancy, so interoperability remains a big issue for end users.
4. One should not blindly stumble into streaming. There are a number of considerations that must be weighed before you implement streaming technologies. You should consider your audience, your technology infrastructure, the type of content, compression strategies, Quality of Service, and your available bandwidth.
5. "Should I multicast or unicast?" is a question you should ask yourself. It is a good idea to do your homework before you implement a plan for streaming.