A terabyte is big: imagine 60 stacks of paper as tall as the Eiffel Tower. The first thing Netflix does is spend a lot of time validating the video, looking for digital artifacts, color changes, or missing frames that may have been caused by previous transcoding attempts or data-transmission problems. After validation, the video is fed into the media pipeline. A pipeline is simply a series of steps data is put through to make it ready for use, much like an assembly line in a factory. More than 70 different pieces of software have a hand in creating every video.
The video is broken into chunks, and the chunks are then put through the pipeline so they can be encoded in parallel. In parallel simply means the chunks are processed at the same time. Imagine you had a hundred dirty dogs to wash. Which would be faster: one person washing the dogs one after another, or hiring a hundred dog washers and washing them all at the same time?
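The chunked, parallel encoding idea can be sketched in a few lines. This is only an illustration: `encode_chunk` is a stand-in for real encoder work (an actual pipeline would invoke an encoder on a few seconds of video, likely across many machines rather than threads on one box).

```python
from concurrent.futures import ThreadPoolExecutor

def encode_chunk(chunk):
    # Stand-in for real encoding work on one chunk of video.
    return f"encoded({chunk})"

def encode_in_parallel(chunks, workers=8):
    # Encode all chunks at once. map() returns results in input order,
    # so the encoded chunks can be stitched back together correctly.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(encode_chunk, chunks))
```

The key property is the last comment: even though chunks finish in any order, results come back in submission order, so reassembly is trivial.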
Netflix needs a lot of servers to process these huge video files in parallel, and it works: Netflix says a source media file can be encoded and pushed to its CDN in as little as 30 minutes. The encoding process creates a lot of files, because the end goal for Netflix is to support every internet-connected device. Netflix started streaming video in 2007 on Microsoft Windows. In all, Netflix supports more than 2,200 different devices, and each device has a video format that looks best on it.
Netflix also creates files optimized for different network speeds. There are also files for different audio formats. Audio is encoded into different levels of quality and in different languages. There are also files included for subtitles. A video may have subtitles in a number of different languages. There are a lot of different viewing options for every video.
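All those per-device, per-bitrate, per-language files mean the player has to pick one combination for each session. Here is a minimal sketch of that selection; the stream table and field names are invented for illustration and are not Netflix's actual schema.

```python
# Hypothetical pre-encoded variants for one title.
STREAMS = [
    {"resolution": 1080, "bitrate_kbps": 5800, "audio_lang": "en"},
    {"resolution": 720,  "bitrate_kbps": 3000, "audio_lang": "en"},
    {"resolution": 480,  "bitrate_kbps": 1050, "audio_lang": "en"},
    {"resolution": 1080, "bitrate_kbps": 5800, "audio_lang": "fr"},
]

def pick_stream(streams, bandwidth_kbps, max_resolution, lang):
    """Pick the highest-bitrate variant the network, device/plan
    resolution cap, and language choice allow."""
    candidates = [s for s in streams
                  if s["audio_lang"] == lang
                  and s["resolution"] <= max_resolution
                  and s["bitrate_kbps"] <= bandwidth_kbps]
    return max(candidates, key=lambda s: s["bitrate_kbps"]) if candidates else None
```

A real player re-runs a decision like this constantly as measured bandwidth changes, which is what makes the stream "adaptive".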
What you see depends on your device, your network quality, your Netflix plan, and your language choice. Stranger Things season 2 has even more files. It was shot in 8K and has nine episodes; the source video files were many, many terabytes of data, and it took 190,000 CPU hours to encode just one season.

A CDN is a content distribution network. Content, for Netflix, is of course the video files we discussed in the previous section. Distribution means video files are copied from a central location over a network and stored on computers located all over the world.
The idea behind a CDN is simple: put video as close as possible to users by spreading computers throughout the world. When a user wants to watch a video, find the nearest computer with the video on it and stream to the device from there. Without a CDN, a stream served from one faraway central location must pass through a lot of networks, possibly including an undersea cable, so the connection will be slow and unreliable.
By moving video content as close as possible to the people watching it, the viewing experience becomes as fast and reliable as possible. Each location with a computer storing video content is called a PoP, or point of presence. Each PoP is a physical location that provides access to the internet; it houses servers, routers, and other telecommunications equipment. A few years after Netflix debuted its streaming service, it had 36 million members in 50 countries, watching more than a billion hours of video each month and streaming multiple terabits of content per second.
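The "find the nearest computer with the video" step can be sketched with a simple great-circle distance calculation. The PoP list and coordinates below are made up, and a real system would steer on network measurements (latency, routing data), not raw geography.

```python
import math

# Hypothetical PoP coordinates: (latitude, longitude).
POPS = {
    "london":   (51.5, -0.1),
    "new_york": (40.7, -74.0),
    "tokyo":    (35.7, 139.7),
}

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_pop(user_lat, user_lon, pops=POPS):
    # Pick the PoP with the smallest distance to the user.
    return min(pops, key=lambda name: haversine_km(user_lat, user_lon, *pops[name]))
```

A viewer in Paris, for example, would be mapped to the London PoP rather than New York or Tokyo.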
To support the streaming service, Netflix built its own simple CDN in five different locations within the United States. The Netflix video catalog was small enough at the time that each location contained all of its content. Netflix then decided to switch to 3rd-party CDNs; around that time, their pricing was coming down.
Using 3rd-party CDNs made perfect sense for Netflix. Why spend all the time and effort building a CDN of your own when you can instantly reach the globe using existing CDN services? In fact, pretty much every company does. For example, the NFL has used Akamai to stream live football games.
By not building out its own CDN, Netflix had more time to work on other higher priority projects. Netflix put a lot of time and effort into developing smarter clients.
Netflix created algorithms to adapt to changing network conditions. Even in the face of errors, overloaded networks, and overloaded servers, Netflix wants members always viewing the best picture possible. One technique Netflix developed is switching to a different video source, say another CDN or a different server, to get a better result. At the same time, Netflix was also devoting a lot of effort to all the AWS services we talked about earlier.
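The source-switching technique can be sketched as a simple failover loop. The source names here are hypothetical, and the `download` callable is injected so the sketch stays self-contained and testable; a real client would also weigh measured throughput, not just hard failures.

```python
def fetch_segment(sources, download):
    """Try each video source in order; fall back to the next on failure."""
    last_error = None
    for source in sources:
        try:
            return download(source)
        except IOError as err:
            last_error = err  # remember the failure, try the next source
    # Every source failed; surface the last error to the caller.
    raise last_error

def fake_download(source):
    # Toy downloader: the primary CDN is "overloaded" in this example.
    if source == "cdn_primary":
        raise IOError("overloaded")
    return f"segment-from-{source}"
```

With this in place, an overloaded primary source degrades into a retry elsewhere instead of a stalled player.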
Netflix calls the services in AWS its control plane. Control plane is a telecommunications term identifying the part of the system that controls everything else. In your body, your brain is the control plane; it controls everything else. Eventually, Netflix realized that at its scale it needed a dedicated CDN solution to maximize network efficiency. Video distribution is a core competency for Netflix and could be a huge competitive advantage. Its own CDN, Open Connect, launched in 2012.

A 3rd-party CDN must support users accessing any kind of content from anywhere in the world.
Netflix has a much simpler job. Netflix knows exactly who its users are, because they must subscribe, and it knows exactly which videos it needs to serve. Netflix also knows a lot about its members: which videos they like to watch and when they like to watch them. With this kind of knowledge, Netflix built a really high-performing CDN. Netflix developed its own computer system for video storage, called the Open Connect Appliance (OCA). Figure: Many OCAs deployed at a site.
OCAs are grouped into clusters of multiple servers. Each OCA is a fast server, highly optimized for delivering large files, with lots and lots of hard disks or flash drives for storing video. There are several different kinds of OCAs for different purposes. Smaller OCAs are filled with video every day, during off-peak hours, using a process Netflix calls proactive caching. The hardware isn't exotic; you could buy the same computers if you wanted to, though Netflix had its machines specially made to match its logo color. Yes, every OCA has a web server. The number of OCAs at a site depends on how reliable Netflix wants the site to be, the amount of Netflix traffic bandwidth delivered from that site, and the percentage of traffic the site allows to be streamed. For the best possible viewing experience, what Netflix would really like to do is cache video in your house. The next best thing is to put a mini-Netflix as close to your house as it can.
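Since every OCA runs a web server, players typically pull video segments with HTTP range requests ("give me bytes 0 through 1023 of this file"). Here is a deliberately simplified sketch of parsing such a header; it assumes the single-range `bytes=start-end` form only, and a real server must also handle suffix ranges, multi-range requests, and malformed input.

```python
def parse_range(header, file_size):
    """Parse a simple single-range header like 'bytes=0-1023' into
    an inclusive (start, end) byte pair, clamped to the file size.
    Suffix ranges ('bytes=-500') and multi-range headers are not
    handled in this sketch."""
    unit, _, spec = header.partition("=")
    if unit != "bytes":
        raise ValueError("unsupported range unit")
    start_s, _, end_s = spec.partition("-")
    start = int(start_s)
    # An open-ended range like 'bytes=500-' means "to end of file".
    end = int(end_s) if end_s else file_size - 1
    return start, min(end, file_size - 1)
```

Serving ranges rather than whole files is what lets a player seek, buffer a few seconds at a time, and switch bitrates mid-stream.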
Netflix delivers huge amounts of video traffic from thousands of servers in more than 1,000 locations around the world. Figure: Map of Netflix video-serving locations. Other video services, like YouTube and Amazon, deliver video over their own backbone networks.
These companies literally built their own global networks for delivering video to users. Netflix instead connects its OCAs directly to ISPs and to IXPs (internet exchange points), the physical locations where networks meet to exchange traffic. An ISP is your internet service provider; it might be Verizon, Comcast, or one of thousands of other services. Picture the networks as boxes joined by wires: each wire connects one network to another network. For Netflix, this is another win, because IXPs are all over the world. So Netflix has all this video sitting in S3, and all these video-serving computers spread throughout the world. How does the video get from one to the other?
Netflix uses a process it calls proactive caching to efficiently copy video to OCAs. A cache is a hiding place, especially one in the ground, for ammunition, food, and treasures. Think of squirrels burying nuts all over the place before winter: each location where they bury nuts is a cache.
During the winter, any squirrel can find a nut cache and chow down. Arctic explorers sent small teams ahead to cache food, fuel, and other supplies along the route they were taking. The larger team following behind would stop at every cache location and resupply. Both the squirrels and the Arctic explorers were being proactive; they were doing something ahead of time to prepare for later.
Everywhere in the world, Netflix knows to a high degree of accuracy what its members like to watch and when they like to watch it. Remember how we said Netflix is a data-driven company?
Netflix uses its popularity data to predict which videos members will probably want to watch tomorrow in each location, then copies the predicted videos to one or more OCAs at each location. This is called prepositioning: video is placed on OCAs before anyone even asks for it. This gives great service to members, because the video they want to watch is already close to them, ready and available for streaming. The smaller OCAs are too small to contain the entire Netflix catalog of videos. Other locations have big OCAs that do contain the entire catalog; these get their videos directly from S3. Each OCA is in charge of making sure it has all the videos on its list.

With 5G, 4K videos can start with no delay or time wasted on buffering. But how can we achieve fast transmission of these 4K videos, which bring with them petabytes of data, effectively?
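Stepping back to prepositioning: the nightly fill can be sketched as ranking titles by predicted demand, keeping what fits on the appliance, and fetching whatever is missing. The function names and data below are invented for illustration; Netflix's actual predictor is far more sophisticated.

```python
def plan_preposition(predicted_views, capacity):
    """Rank titles by tomorrow's predicted view count and keep
    the top `capacity` titles for a small OCA."""
    ranked = sorted(predicted_views, key=predicted_views.get, reverse=True)
    return ranked[:capacity]

def fill_oca(wanted, local_store, fetch_origin):
    """During off-peak hours, fetch any wanted video the OCA doesn't
    already hold. `fetch_origin` stands in for a pull from S3 or a
    larger OCA, injected to keep the sketch self-contained."""
    for video_id in wanted:
        if video_id not in local_store:
            local_store[video_id] = fetch_origin(video_id)
    return local_store
```

The point of doing this off-peak is that the expensive copying happens when the network is idle, so member traffic never competes with cache fills.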
There is a lot going on in the background to manage how such large amounts of data are stored and transmitted over thousands of miles to create a smooth user experience, making Netflix and similar services work the way they should. Simply put, there are organisations and businesses which specialise in storing data smartly, enabling faster, lag-free access whenever it is needed. How, then, can AWS manage this data storage and transmission so efficiently that Netflix, in turn, is able to provide a seamless user experience to subscribers?
This is where data centres come into play. How does Netflix communicate quickly and securely with its customers on a global scale? The original source material of the movies or series we watch consists of sizable, cinema-quality files. These are compressed and encoded into much smaller files, which can then be stored on a centralised hosting server in a data centre. However, if a viewer in India wants to watch the latest season of Stranger Things and it is stored on a server in Canada, for example, you would expect a lot of lag and buffering, right?
Multiple copies of the compressed, encoded file are distributed to hundreds of servers across the globe, forming a wider CDN for the likes of Netflix around the world. Figure: Netflix content distribution flow. The CDN, in the case of Netflix, simply brings the content closer to you for a better streaming experience.
But how does the copying and fetching of data between different data centres and servers happen so smoothly and instantly? There are lots of reasons for Netflix to operate its own CDN.
With its service accounting for such a high proportion of ISP traffic, it's better for it to have a direct relationship with them than work through companies like Akamai. It also gives Netflix "end-to-end control" of its network, providing more opportunities to optimize the system. Its servers are purpose built for streaming movies, for instance, with the spinning disks carefully laid out to minimize "heat spots," or areas of overheating.
It also does lots of intelligent mapping in the network to figure out the best location to stream each movie from. Netflix has close to 50 million streaming customers in North America, South America, and parts of Western Europe, and it is likely to expand further in the future.
Netflix also uses Amazon Web Services for tasks like running its website and its recommendation engine. Movie studios upload their content to the Amazon cloud, where Netflix encodes it into its formats before distributing it to its network. Only about 40 people work on the CDN, according to Netflix, with half working on software, 10 network engineers, and 10 in operations. Building the CDN wasn't without hiccups.

In 2021, Netflix led the pack with 36 Oscar nominations, 10 of which belonged to its film Mank.
From early on, Netflix required massive amounts of fast storage and fast networking for streaming. At the time, it stored many terabytes of video per server, a figure that has undoubtedly grown massively since then. Netflix users have watched an average of more than three hours of content per day. Based on data collected from our Time Spent Streaming tool, which allows Netflix users to upload their Netflix data to get information about their viewing habits, the 1995 hit Clueless was the top-watched film available on Netflix. But when it came to TV series, most people turned to the controversial 13 Reasons Why.
Our Time Spent Watching tool uncovered that the average Netflix user has put in over 1,176 hours, or about 49 days, of content watching since opening their account. And the biggest binge-watchers?
In a letter to shareholders, Netflix revealed that it had gained millions of additional subscribers. A key fact: Netflix was forced to reduce streaming quality in several countries to help reduce strain on overtaxed bandwidth.
Meanwhile, Google searches for Netflix surged in March as more people were forced into quarantine. According to a Streaming Observer report, Netflix viewers across all of its available countries watched around 140 million hours of content per day. Netflix has not released similar numbers since, but has started releasing total viewing stats for select original content.
While Netflix is available in almost every country in the world, its home base is still its most important one. Netflix has not been shy about informing the public of its usage of Amazon Web Services; the streaming giant fully migrated its content to AWS in 2016, a project that took seven years to complete. The streaming giant continues to earn big each year. Netflix also still earns a healthy, though diminishing, profit from DVD sales.
Disney, in turn, likely used some of that revenue to build its own service. However, Netflix's stock price soon jumped back up. Although the company earns billions each year, its net income has been comparatively small, as most of that money went right back out the door: no service is spending as much on content, and especially original content, as Netflix is. Recent budgets mark a slight reduction in spending from the prior year, which is understandable given how uncertain the period has been for international markets.
This may sound like a bad thing, but at least one industry observer notes that Netflix is doing just fine financing its growth with debt.
The company has seen its value grow rapidly over the years. Although Netflix content libraries shift in size almost daily, some countries regularly maintain larger libraries than others. The US currently has the largest library of Netflix content, with over 5,000 titles at last count. Differences in library sizes mean Netflix subscribers pay vastly different amounts on a per-title basis. At its most recent year-end count, Netflix employed over 9,000 people. In its biggest single year of hiring, Netflix added over 1,000 employees, the largest number of new employees brought in by the company over the course of 12 months in its history.
Netflix continues to be the single most ubiquitous streaming service in the US and abroad.