The uncrowded 60GHz spectrum promises to greatly extend Wi-Fi, provided designers can meet the challenges of building inexpensive, small, low-power modules.
By John Blyler, Editorial Director
One of the bright consumer stars at this year’s International CES was 4K Ultra-High Definition (UHD) television. While current implementations use wired connections, support for 60GHz wireless is coming up fast. UHD covers two digital video formats approved by the International Telecommunication Union (ITU): 4K UHD (2160p) and 8K UHD (4320p). UHD video has four times the number of pixels of regular 1080p high definition (HD), giving a much clearer picture on large-screen TVs. Major panel vendors and content providers, including Samsung, Sony, Warner Brothers and Disney, predict affordable products in the very near future.
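The four-times-the-pixels claim is easy to verify from the ITU resolutions; here is a quick Python check (the variable names are illustrative):

```python
# Pixel counts: 1080p full HD vs. 4K UHD (2160p)
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels

print(uhd_4k / full_hd)  # exactly 4x the pixels
```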
Currently, 4K content is delivered over two wired formats, HDMI and DisplayPort. But a few semiconductor vendors, such as nVidia, Nitero, and Silicon Image, are offering chipsets that boast cable-quality wireless connectivity between 4K displays and sources. Equally important is the claim that these wireless connections will not suffer interference from current Wi-Fi technology and will extend the reach of wireless network traffic.
What challenges will SoC engineers face when designing 4K wireless streaming applications using 60GHz technology? Will these devices play a part in the Internet-of-Things (IoT) market? What standards exist for 60GHz? These are some of the questions I put to experts, including Sven Mesecke, Vice President of Marketing at Nitero; and Andre Bourdoux and Piet Wambacq, Principal Scientists at Imec. What follows are portions of their responses. – JB
Blyler: What challenges will SoC designers face with 4K wireless streaming using 60GHz technology?
Mesecke from Nitero: 60GHz works best when the antenna is co-located with the RF circuitry. As a result, antenna placement within a device is more constrained. A laptop can have the module resident in the base/keyboard portion of the clamshell, while antennas are positioned strategically throughout the display. Mobile-focused 60GHz solutions were designed with this in mind, but it’s fair to say that the lack of antenna placement options is a limitation. In the small form factor of a mobile device such as a smartphone, however, designing for this challenge from the beginning, as Nitero did, is a benefit. In addition, the mobile solution translates well to recipients of 4K streams such as displays, since the solution can be placed in an HDMI dongle or in the front face of the display itself. Beamforming takes care of the rest.
Bourdoux and Wambacq from Imec: 4K UHD is a resolution of 3840 pixels × 2160 lines (8.3 megapixels). Assuming a progressive (i.e., non-interlaced) frame rate of 60Hz and a depth of 8 bits per color, we end up with a rate of 11.9Gbps. For 10-bit color, it becomes 14.93Gbps. These rates are for uncompressed video. They cannot be supported by the current 60GHz 802.11ad standard, which provides 4.62Gbps with the SC PHY or 6.76Gbps with the OFDM PHY. These 802.11ad PHYs were conceived to support uncompressed 2K (full HD, 1920 pixels × 1080 lines), which requires roughly four times less bit rate. Hence, if the requirement for uncompressed 4K streaming exists, it will call for a new standard with at least four times higher rates. These rates will of course impact the SoC: higher bandwidth, faster ADCs and DACs, and more parallelism in the digital part. Beamforming will again be needed to ensure communication over a few meters. Power consumption (alternatively, the energy per bit) will be the key requirement. Of course, if video compression is allowed, the above numbers show that a compression ratio of 5:1 or better would enable streaming of 4K video over 802.11ad links.
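Imec's rate arithmetic can be reproduced with a short Python sketch; the function name and defaults below are illustrative, not taken from any standard:

```python
def uncompressed_rate_gbps(width=3840, height=2160, fps=60,
                           bits_per_color=8, colors=3):
    """Raw (uncompressed) progressive video bit rate in Gbps."""
    return width * height * fps * bits_per_color * colors / 1e9

rate_8bit = uncompressed_rate_gbps()                    # ~11.94 Gbps
rate_10bit = uncompressed_rate_gbps(bits_per_color=10)  # ~14.93 Gbps

# Minimum raw compression ratio to fit the 802.11ad SC PHY peak of 4.62 Gbps;
# the 5:1 figure quoted above leaves additional headroom for protocol overhead.
ratio_10bit = rate_10bit / 4.62  # ~3.2
```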
Blyler: How will 60GHz applications play into the Internet-of-Things? Will stacked dies with MEMS be required for IoT implementations?
Mesecke from Nitero: There are a number of reasons that 60GHz will be critical for IoT. One is spectrum congestion. We live in a congested world within the 2.4 and 5GHz spectrums. The explosion of IoT devices into the enterprise, home, and manufacturing/hospital spaces in the next three years will only intensify the competition between everyday Wi-Fi applications and IoT applications for this limited, congested spectrum. 60GHz, using 802.11ad, is a great alternative for many IoT applications. IoT is about choosing the best socket for the application based on the current networking environment. This could be Wi-Fi, Bluetooth, Zigbee, etc. When significant throughput is needed, we typically look to 2.4/5GHz. For critical applications that 60GHz can serve, protecting the spectrum from these new (and existing) high-bandwidth applications will be essential for satisfying whole-home Wi-Fi and critical enterprise/manufacturing applications, for both IoT and legacy Wi-Fi. One example of an IoT application that will require high bandwidth is a 4K video camera security system.
Other reasons why 60GHz will be critical for IoT include latency (beamforming and spatial reuse help here), good power conservation, small form factor and cost (similar to Wi-Fi).
Bourdoux and Wambacq from Imec: It is not obvious that 60GHz is the technology of choice for IoT, because 60GHz is attractive when large bandwidth (high rates) is needed. 60GHz is not an easy technology for IoT: it requires line-of-sight, does not penetrate even thin walls, and is by nature very directive. Our view is that 60GHz will not be a key technology for IoT.
Blyler: Is 802.11ad the definitive standard for multi-gigabit data throughput?
Mesecke from Nitero: Yes. Early ratification, in record time for an 802.11 baseline, was huge: ISO via IEEE. The Wi-Fi Alliance declaring it the reference spec sealed the deal. The roadmap for the future will be based on 802.11ad, scaling to tens of Gbps, so the future interoperability of the first generation is secure. There will be no mainstream proprietary 60GHz.
Bourdoux and Wambacq from Imec: Yes and no. The whole world is still waiting for 60GHz (WiGig or 802.11ad) to become a success. For three years we have heard and read that “it is for next year.” As of today, it is offered in only a minority of high-end consumer devices, and the take-off is very slow. 60GHz is still very far from the worldwide adoption we have seen with technologies such as Bluetooth, 802.11a/b/g/n/ac, GSM, and LTE. There is already, within 802.11, a new standardization activity to boost the capabilities of 11ad by a factor of 5 to 10: the “Next Generation 60GHz” Study Group (NG60 SG). If the current 11ad takes off this year or next, we will surely see this new NG60 standard deployed in five or six years. If 11ad does not become a worldwide success, NG60 will be a stillborn standard.
I must add that other mm-wave standards might be developed for outdoor use. Millimeter-wave is seen today as one of the key technologies for 5G cellular systems. It will probably not be at 60GHz; carrier frequencies of 28, 33, or 38GHz are often mentioned.
Blyler: Why is RF at 28nm node important? Will CMOS or SOI dominate?
Mesecke from Nitero: Integration is still some years off. Manufacturers prefer to be agnostic about the apps processor and comms silicon. Only when integration becomes a big part of determining who the stakeholders in comms will be will IP implementation become a hot item. It’s certainly a discussion on the table, but the initial goal is to launch without true integration. This goes back to the earlier discussion of SoC challenges.
Bourdoux and Wambacq from Imec: 28nm planar bulk is an attractive technology for 60GHz applications, as it is affordable and features fast devices as well as very compact digital standard cells. Compared to less-scaled planar bulk technologies, the radio part can be designed with better performance and/or lower power consumption. Fully depleted SOI, on the other hand, features smaller device parasitic capacitances; the wafer cost, however, is higher.
Blyler: Thank you.