Why is HD 1080p? Where did that come from? (video, 21 min)
In his latest video, Nostalgia Nerd traces the history of television technology back to 1940, when the National Television System Committee was established in the United States to resolve conflicts between the private companies trying to deliver television across the country. In 1939, RCA had begun broadcasting its NBC network in New York and Los Angeles, relying on bulky television receivers produced by General Electric. DuMont, another pioneering firm, pushed to improve picture quality by increasing the number of scan lines, which ultimately led to the US television standard being set at 525 lines. That was the moment black-and-white television took its formal shape, with enormous consequences for television's development over the following decades.
The video goes on to review the changes and challenges that arrived when new technology, such as home computers, started plugging into television sets. In the 1970s and early 1980s, machines such as the VIC-20 and Sinclair ZX80 used ordinary televisions as displays, which forced them to work within the constraints of broadcast picture standards. Computers and televisions had to cooperate under awkward conditions, producing new ways of presenting images on screen. Broadcast standards such as NTSC and PAL are examined, and their limitations become increasingly apparent in the context of computing requirements.
The video then jumps to the 1980s, when new graphics standards appeared. In 1987, IBM introduced the VGA monitor, which revolutionised display capabilities on personal computers. These standards quickly took hold, and further resolutions such as 800x600 and 1024x768 were established. Listening to this, it becomes clear how much technical and standardisation agreements shaped the technology's further development. Cooperation between organisations such as SMPTE and the EBU led to harmonised standards intended to provide a common platform for different systems around the world.
Finally, Nostalgia Nerd highlights the support he receives from his viewers and points to his book, Nostalgia Nerd's Gadgets, Gizmos and Gimmicks. He keeps the conversation going and invites viewers to subscribe to the channel so as not to miss what is coming next. The video shows that the journey from analogue signals to digital solutions was a long process, but it ultimately led to the standardisation that plays a key role in television production and viewing today.
At the time of writing, Nostalgia Nerd's video had 410,423 views and 16,737 likes, which speaks to its popularity. The community shows great interest in substantive content that presents the history of technology in an accessible way. As technology evolves, Nostalgia Nerd remains one of the leading sources of knowledge about technologies past and present, making his channel a remarkable resource.
Timeline summary
- Introduction, questioning where to start the discussion.
- Establishment of the National Television System Committee in 1940.
- Conflicts between broadcasting companies and their impact on television standards.
- Overview of the NTSC colour television standard introduced in 1953.
- Different standards developed in the UK and Europe to counter the limitations of NTSC.
- The arrival of home computers in the 1970s and their impact on television technology.
- The emergence of dedicated computer monitors in contrast to traditional television sets.
- Introduction of VGA resolution standards in the 1980s.
- Development of high-definition television proposals, beginning in the 1970s.
- Standardisation of the 16:9 format and introduction of HD resolutions.
- Formal approval of ITU-R BT.709, establishing the 1920x1080 resolution.
- Discussion of the adoption of HD displays and resolutions from the millennium to 2010.
- Summary of how television broadcasting led to resolution standardisation.
- Conclusion on the influence of the HD standard on later resolutions such as 4K and 8K.
- Wrap-up and thanks for watching.
Transcription
Ai yai yai, where do we start on this one? The beginning. Always the beginning. In this instance, that beginning is 1940. Honestly, why do I do this to myself? I could just say it's 1080p because of this. But no no, I have to create an entire bloody history lesson. But before I get started, there are a couple of ways you can support the channel. The first is to grab my latest book, Nostalgia Nerd's Gadgets, Gizmos and Gimmicks, an essential guide to personal tech through the ages. The second is Patreon, where you'll find various tiers, including early access, producer credits and a whole lot more. Of course, just watching is support enough, so without further ado, let's get back to it.

In 1940, the National Television System Committee was established by the United States Federal Communications Commission, and it had one purpose: to resolve the conflicts between private companies who were attempting to deliver television across the nation. In 1939, RCA had already begun broadcasting their NBC TV network across New York and Los Angeles, which relied on bulky television receivers produced by General Electric and designed to receive the 441 lines broadcast by NBC. However, DuMont, one of the pioneers in extending the life of cathode ray tubes, had their own television station, W2XVT, and were campaigning to increase the number of broadcast scan lines to between 605 and 800, greatly improving the resolution of TV pictures. The committee would eventually settle for a compromise suggested by the Radio Manufacturers Association of 525 scan lines, 486 of them visible, split into two interlaced fields of 262.5 lines, 30 times per second. Adopted nationwide, this would remain the black and white television standard until 1953, when the NTSC colour television standard was introduced, reducing the scanning frequency to 29.970 frames per second, with the remaining bandwidth used to carry the colour signal.

Other parts of the world would have their own stuff going on. For example, here in the UK, and other parts of Europe, our PAL system was devised to improve on the shortcomings of NTSC, with one of those improvements being an increased vertical resolution of 625 lines, with 576 visible lines, and like NTSC, the rest were reserved for retrace and sync data. Although, due to our 50Hz electrical system, this would broadcast at a slightly lower frame rate of 25 per second, which is half of our 50Hz rate. These standards would persist, pushing signals to our 4x3 televisions for the remainder of the century.

However, in the 70s, a new form of technology arose, and it had the arduous task of interfacing with television sets that were not really designed for its arrival. Ah yes, the home computer. Humble, yet powerful. Limited, yet mind blowing in its potential. Machines like the VIC-20, Sinclair ZX80 and even game consoles like the Atari VCS had to effectively act as mini TV broadcasting stations, so they could display pictures on your television set, but they had to rely on a few tricks to make this possible, and importantly, usable. Although standards like NTSC and PAL had a set number of scan lines, they really didn't have any fixed horizontal resolution. It was mainly limited by the signal bandwidth and the ability of the TV receiver. For practical purposes, there were technical targets, which equated to about 338 TVL for NTSC, and 403 TVL for PAL, TVL standing for Television Lines, defined as the number of vertical lines that can be discerned, usually.
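To make the interlacing arithmetic above concrete, here is a minimal sketch (plain Python, not from the video) that works out the field and line rates implied by the 525/30 and 625/25 figures quoted in the transcript.

```python
# Interlaced scanning arithmetic for the two analogue standards described above.
# Total lines, visible lines and frame rates come from the transcript;
# the derived field and line rates are shown purely for illustration.

def describe(name, total_lines, visible_lines, frame_rate):
    field_rate = frame_rate * 2                 # two interlaced fields per frame
    lines_per_field = total_lines / 2           # e.g. 262.5 lines for 525-line NTSC
    line_rate_hz = total_lines * frame_rate     # horizontal scan frequency
    print(f"{name}: {lines_per_field} lines/field, "
          f"{field_rate:.3f} fields/s, line rate ~ {line_rate_hz / 1000:.3f} kHz, "
          f"{visible_lines} visible lines")

describe("NTSC (B&W, 1941)", total_lines=525, visible_lines=486, frame_rate=30)
describe("NTSC (colour, 1953)", total_lines=525, visible_lines=486, frame_rate=30000 / 1001)  # 29.970
describe("PAL", total_lines=625, visible_lines=576, frame_rate=25)
```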
This TVL figure is affected by the broadcast signal bandwidth, fidelity, the TV receiver and the grill the electron gun fires through. The lines you see going up and down here are just that. The aperture grill, or shadow mask behind the glass, used to focus the electron beams onto the phosphor. This dot pitch does affect the picture quality, giving a forced horizontal resolution, but it's not tied to the broadcast signal. Interestingly, the human eye is more sensitive to vertical resolution than horizontal, so although this is lower than you might expect, it was helped in part by the natural anti-aliasing effect of CRT screens.

Although analogue signals, or CRT displays for that matter, don't have a fixed horizontal resolution, a computer, being digital, can. However, a computer, especially at the time, had a limited amount of RAM, and therefore the number of pixels, and thus the resolution it could display, were limited. Many computers of the time would therefore display in something like 160x200 or 320x200, lower than could be discerned on your TV set, whilst also forgoing interlaced scanning to save memory and processing time, which also meant they could only access half the screen's vertical resolution, each pixel effectively taking up two horizontal lines. In the sense of a computer then, a pixel is a logical rather than physical unit, the smallest addressable block on screen, as defined by the machine's ROM and screen mode. You could in theory have a 1x1 resolution screen, but it would still take up the entire screen. It's just one single colour addressable block.

As more powerful computers came along, with more memory and faster processors, their ability to display higher resolutions increased, and although machines like the Amiga could display pretty high resolutions through a TV set, the number of horizontal scan lines, and thus the vertical resolution, was a hard limit, which when interlaced would be particularly prone to flicker, and the horizontal fidelity was tied to bandwidth, the dot pitch of the shadow mask and phosphor density, which could vary from set to set. Therefore dedicated computer monitors would become more widely adopted, something that professional machines such as the IBM PC compatibles had already been using since their inception. Unlike television signals, monitor signals are progressive, meaning each line is drawn in order with no interlacing, giving a more stable image. Also unlike televisions, monitors could have their own refresh rates and the ability to scan the image onto the screen at different rates, allowing for different resolutions. Compared to our modern LCD monitors, CRT monitors didn't have a set number of physical pixels, but they still had a maximum number of pixels they could draw on screen.

Early PC graphics standards would more or less mimic the resolutions of TV based systems, but by 1987, something more usable was required, and so IBM introduced the VGA monitor and VGA graphics card with their PS/2 line of computers, offering a resolution of 640x480. Within 3 years, this was the standard display type among PC users, which was quickly improved upon by third party manufacturers into a de facto standard that became known as Super VGA. But, just like the TV broadcasters before, lack of compatibility between these graphics cards and screens meant something needed to be done to ensure they could be universally programmed for. This time, it was the Video Electronics Standards Association that would create a common software interface for all cards, which conformed to their VBE specification.
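As a rough illustration of the memory constraint described above, here is a small sketch of how much RAM a plain bitmap framebuffer would need at the early modes mentioned and at the VGA/Super VGA modes that followed. The resolutions come from the transcript; the bit depths are assumptions for illustration, since real machines mixed character, tile and bitmap modes.

```python
# Rough framebuffer-size arithmetic: pixels x bits per pixel, converted to bytes.
# Bit depths here are illustrative assumptions, not figures from the video.

def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

examples = [
    ("160x200, 4 colours (2 bpp)",      160, 200, 2),
    ("320x200, 4 colours (2 bpp)",      320, 200, 2),
    ("640x480 VGA, 16 colours (4 bpp)", 640, 480, 4),
    ("800x600 SVGA, 256 colours (8 bpp)", 800, 600, 8),
]

for label, w, h, bpp in examples:
    kb = framebuffer_bytes(w, h, bpp) / 1024
    print(f"{label}: ~{kb:.1f} KB of video memory")

# On a machine with only a few KB of RAM, even 320x200 is out of reach as a
# straight bitmap, which is why low resolutions and display tricks were used.
```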
The VBE and subsequent specifications would standardise the following resolutions: 800x600, 1024x768, also known as XGA, and finally 1280x1024, also known as Super XGA. A natural step up in resolution, that all conformed to the 4x3 viewing ratio of monitors at the time. Well, apart from 1280x1024, that was for the 5x4 crew who emerged a bit later.

However, whilst computers were doing their own thing, the TV boffins were not at rest, and anyone who has watched my video on aspect ratios will know where this part of the tale begins. Way back in 1972, Japan proposed a new CCIR study programme to work on an analogue, high definition television system. Ultimately utilising 2D filtering, dot interlacing, motion vector compensation, line sequential colour encoding and time compression, it managed to cram a 20MHz analogue bandwidth source signal into just 8.1MHz. However, creating all this took time and discussion, and signals wouldn't begin broadcasting until December 1988. The initial aspect ratio for this Hi-Vision system was 2 to 1.

At around the same time, Sony released this. This is the HDM 3830 colour monitor, and it's a beast. This CRT not only has a 38 inch visible picture, but it can take a 1080i input. Yep, you could plug a PS5 into this thing and play to your heart's content. It also has a 16x9 aspect ratio that... OK, let's just stop there a second. I've covered aspect ratios, but this TV could take 1080i in 1988. That's crazy, right? Well yes, and no.

In 1974, work on the CCIR study programme for high definition television had led to a number of groups and other international initiatives, one of which was that in March 1977, the Society of Motion Picture and Television Engineers in America had begun development of a digital television interface standard. By 1979, a standard had been drafted that allowed the NTSC television signal to be sampled as a single composite colour signal. Over in Europe, the European Broadcasting Union issued a document at around the same time, recommending a component television production standard, and it was quickly realised that the community would be best served by a single set of standards that could be applied to NTSC, PAL and SECAM. By January 1980, an SMPTE task force was set up to ensure compatibility across all systems.

Of course, to do that, we'd need to take our analogue signals and convert them. Now remember our television lines number? Well, this needed to be sampled to make it digital. This sampling rate is known as the number of samples per active line, and how many samples we take depends on the signal sampling frequency. For example, a signal sampled at 6.75MHz would give 360 samples per active line. A signal sampled at double that rate would give double the samples per line. At this point in negotiations, the stipulations were that the European community wanted a luminance signal sampling frequency lower than 14.318MHz, whilst America wanted more than 12MHz. To accommodate all European standards for active line periods, it was further put forward that the number of samples per line should be greater than 715.5. With 720 being 6 factorial, allowing for many small factors, it seemed the ideal number. This is much higher than the TVL numbers I mentioned earlier, but remember, TVL doesn't relate to how much data there is, it's simply what is discernible on screen, and really shows how much of the image was lost due to analogue transmission bandwidth and TV receivers.
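The relationship between sampling frequency and samples per active line quoted above can be sanity-checked with a few lines of arithmetic. This is a minimal sketch, assuming the roughly 53.3 microsecond active-line window implied by the video's own figure of 360 samples at 6.75 MHz.

```python
# Samples per active line = sampling frequency x active line duration.
# The ~53.3 us active window is inferred from the transcript's "360 samples
# at 6.75 MHz" figure; it is an assumption for illustration.

ACTIVE_LINE_US = 360 / 6.75  # about 53.33 microseconds

for freq_mhz in (6.75, 12.0, 13.5, 14.318):
    samples = freq_mhz * ACTIVE_LINE_US  # MHz x us cancels to a plain count
    print(f"{freq_mhz:>6.3f} MHz  ->  {samples:7.1f} samples per active line")

# 13.5 MHz (3 x 4.5 MHz) lands exactly on 720 samples, clearing the ">715.5"
# requirement, and 720 = 6! factors nicely, which is why it won out.
```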
That loss is in part why a DVD image looks so much clearer than an analogue image of the same source material. So in the business of converting an analogue signal to digital, we're pulling a defined number of horizontal samples, which is then normalised to our number of scan lines, so as to retain the 4x3 image format. It was therefore suggested that a 3 x 4.5MHz sampling frequency should be used, equating to 13.5MHz. This makes sense, as both NTSC and PAL line frequencies can be synchronised to 4.5MHz, and therefore any multiple thereof. It was this frequency which delivered 720 samples per active line, accommodating the needs of both European and American systems, and allowing our TV lines to be defined in acceptable digital clarity.

By February 1981, this component recording technology was effectively born, with the EBU demonstrating component coded systems in January 1981 to the International Broadcasting Authority, followed by the Bureau of the Technical Committee in San Francisco a month later. These demonstrations were supported by companies including ABC Television, CBS, RCA Laboratories and Sony Corporation. By Autumn 1981, NHK in Japan, the EBU in Europe and the SMPTE in America had agreed on the 13.5MHz 4:2:2 component digital recording standard. By 1982, this had been put forward in SMPTE 125 and ITU Recommendation 601, which defined how to encode interlaced analogue video signals into digital video form. It was also recommended that the horizontal resolution for HDTV should be twice that of conventional television systems. If you take 720 samples per active line, that gives us a horizontal sample rate, or resolution, of 1440.

Fast forward a few years to 1984, and a gentleman by the name of Kerns H. Powers was working on the problem of standardising the image formats used in our homes and cinemas. Ultimately, he laid out all the main existing film and broadcasting ratios in front of him, overlapping their centre points, to give us the aspect ratio of 16 to 9. Now expanding a 4x3 screen out to 16x9 requires one third more samples than a 4x3 picture ratio. So if we take our 1440 and adjust the count for a 16x9 ratio, we get 1920 samples per active line. Yeah, things are starting to take shape. Literally.

In 1987, the ITU were still not entirely convinced, and were faffing about with screen ratios, having defined the following parameters for their first HDTV recommendation: active lines 1152, field rate 50Hz, scanning method progressive, aspect ratio 16x9, samples per line 1920 for luminance. So when Sony released their HDM 3830, they were pretty much taking a punt. Based on the SMPTE's work, the technology for component connections had been finalised, but globally, high definition TV had not been completely defined, nor had digital. Japan was pushing out its experimental Hi-Vision HDTV system, which delivered 1035 lines of visible screen data, with 1920 samples per line, but there was no guarantee this would stick around. The fact that this thing can take 1080i signals is really testament to the work that the ITU and SMPTE had done on standardisation.

It wouldn't be until 1990, at the 17th CCIR Plenary Assembly, that the ITU recommendation 709 was approved, which defined a picture ratio of 16x9, along with our 1920 samples per active line. However, at this stage, the total number of agreed vertical lines still varied to allow for international variations. Something worth considering at this point, is that neither NTSC nor PAL pixels are square.
If you digitise our analogue signal, and take the sampled horizontal width, place that against our line height, you don't get a square. NTSC pixels are narrower than they are high, allowing a 720x486 frame to fill the screen, whilst PAL pixels are wider than they are high, allowing a 720x576 frame to do the same. It's what allows both formats to have different frame proportions, yet fit the same 4x3 screen ratio.

It's here where we jump back to the computing community, who had one wish. Well, I imagine they had hundreds, but in terms of pixels, they wanted one thing. Square pixels. And that's because computers, well, they already used square pixels. Your average PC display in the States was the same as your average PC display in the UK. No fuss. Just perfection, which allowed our fabulous resolutions to exist in harmony and abundance. It's also because editing NTSC and PAL video on a PC without standard pixel sizes is a menace.

So, if you take our 1920 samples per line, and convert them into the world of PCs, you effectively get 1920 pixels. If you work out how many square pixels you'd then need to fill the vertical space of a 16x9 screen, you get 1080. BINGO. 1920x1080. Our HD resolution is born, and although it wasn't commonplace anywhere yet, it made sense from a manufacturer's point of view, and it made sense from a standardisation point of view. Especially when you consider that the LCD screens about to hit the market had a defined number of physical pixels, as opposed to the more versatile whims of a CRT, meaning that a pixel in your computer's memory can actually equate to a physical pixel on screen. It wouldn't be until the year 2000 that a revision of Recommendation ITU-R BT.709 was approved that set our HD resolution and our screen ratio, allowing manufacturers to produce equipment at a lower cost, and for broadcasters to work towards common HDTV programme production standards.

Our 720p screens also followed the same rule. 720 horizontal pixels, times 4 thirds resolution improvement, adjusted for 16x9, equated to 1280x720, offering a cost effective step towards the world of Full HD, and a noticeable step up in quality from standard definition video. As for 1080i, well again, it worked as a stopgap, and did the whole interlaced thing that our standards had been used to up until this point. But progressive scanning was always the future.

It would take about 10 years for HD 16x9 displays to be adopted universally by both the computing world and the TV world. PC owners went through a variety of resolutions between the millennium and 2010, with 1024x768 being the most common at the dawn, before an abundance of 5x4 LCD screens made 1280x1024 more popular, followed by 1440x900 on 16x10 displays, which were more prevalent until economies of scale meant that standardisation between computer screens and TVs made much more sense. Ultimately, it also allowed graphics cards and their cost to catch up to the point where a 1920x1080 desktop and gaming resolution made much more sense. But it will always be strange to me that although computer resolutions led the way for so long, it would be the weird, cumbersome and disparate world of television broadcasting that ultimately took the lead and steered us on. It may have taken 70 years, but standardisation finally came. Just about. Everything else that came after, 4K, 8K, well, it's all born from this original HD standard. Until next time, I've been Nostalgia Nerd. Toodaloo! Thanks for watching!
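To recap the arithmetic the video walks through, here is a short worked sketch (plain Python, not from the video) of how 720 samples per line become 1920x1080 with square pixels, where 1280x720 fits in, and why standard definition pixels were never square. The pixel aspect ratios are derived from the frame sizes the video quotes, so they are the simplified values rather than anything more formal.

```python
from fractions import Fraction

# Recap of the derivation described in the transcript.
SD_SAMPLES = 720                      # Rec. 601 samples per active line
hd_samples_4x3 = SD_SAMPLES * 2       # HDTV: twice conventional horizontal resolution
hd_samples_16x9 = hd_samples_4x3 * Fraction(16, 9) / Fraction(4, 3)
print(hd_samples_4x3, hd_samples_16x9)             # 1440, 1920

# Square pixels: the height follows from the width and the 16:9 ratio.
print(int(hd_samples_16x9 * Fraction(9, 16)))      # 1080

# 720p follows the same square-pixel rule at 720 lines.
print(int(720 * Fraction(16, 9)))                  # 1280

# Standard definition pixels are not square: the pixel aspect ratio (PAR) is
# what stretches a 720-wide frame to fill a 4:3 display.
for name, lines in (("NTSC", 486), ("PAL", 576)):
    par = Fraction(4, 3) / Fraction(720, lines)    # display ratio / storage ratio
    print(f"{name}: 720x{lines}, PAR = {par} ~ {float(par):.3f}")
# NTSC PAR = 9/10 (tall, narrow pixels); PAL PAR = 16/15 (short, wide pixels).
```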