{"id":3484,"date":"2020-11-01T20:46:54","date_gmt":"2020-11-01T20:46:54","guid":{"rendered":"https:\/\/techclot.com\/index.php\/2020\/11\/01\/this-ultrahigh-resolution-display-could-paint-stunning-vr-worlds\/"},"modified":"2020-11-01T20:46:54","modified_gmt":"2020-11-01T20:46:54","slug":"this-ultrahigh-resolution-display-could-paint-stunning-vr-worlds","status":"publish","type":"post","link":"https:\/\/techclot.com\/index.php\/2020\/11\/01\/this-ultrahigh-resolution-display-could-paint-stunning-vr-worlds\/","title":{"rendered":"This Ultrahigh Resolution Display Could Paint Stunning VR Worlds"},"content":{"rendered":"<p><a href=\"https:\/\/www.google.com\/url?rct=j&#038;sa=t&#038;url=https:\/\/singularityhub.com\/2020\/11\/01\/this-ultrahigh-resolution-display-could-paint-stunning-vr-worlds\/&#038;ct=ga&#038;cd=CAIyHGQzYWQwNmI0YTFiYjA3MmU6Y28udWs6ZW46R0I&#038;usg=AFQjCNHL7fu2sxz_PUW5seKXAkXwnXJmgQ\">This Ultrahigh Resolution Display Could Paint Stunning VR Worlds<\/a><\/p>\n<p><p>You can plausibly say that today\u2019s virtual reality is a descendant of smartphones. The affordable sensors, chips, and high-resolution displays critical to rendering a decent VR experience were engineered for iPhones and Galaxys, not Rifts and Vives. Early on, VR pioneer <a href=\"https:\/\/www.wired.com\/2014\/05\/oculus-rift-4\/\">Oculus built prototypes with 1080p AMOLED displays<\/a> from Samsung Galaxy S4 smartphones.<\/p>\n<p>But after Facebook\u2019s $2 billion acquisition, the team had the wherewithal to begin dreaming up and ordering custom components. And of course, displays were first on their list.<\/p>\n<p>Early Oculus Rift developer kits were like looking through a coarsely patterned screen door. But the ideal experience is one in which the eye discerns nary a pixel on the screen, a heavenly state referred to as retina resolution.<\/p>\n<p>The best VR displays are somewhere between super-screen-door and retina resolution. 
High-end, tethered headsets offer higher resolution than early versions of the Rift. Yet the image still isn\u2019t so crisp that the eye detects no pixelation at all. Not bad, not ideal.<\/p>\n<p>Retina resolution depends on a number of factors, one of which is how close the display is to your eyes. The closer it is, the more pixels you need. That means for VR to hit retina resolution, we\u2019ll need displays with way more, way smaller pixels.<\/p>\n<p>Luckily, science is on it, and with retina-resolution laptops and phones yawn-worthy at this point, VR and AR are now key technologies driving cutting-edge research in high-res displays.<\/p>\n<p>In a recent example, a team of scientists led by Samsung\u2019s Won-Jae Joo and Stanford\u2019s Mark Brongersma <a href=\"https:\/\/science.sciencemag.org\/content\/370\/6515\/459\">published a paper in <em>Science<\/em><\/a> describing a new meta-OLED display that can pack in 10,000 pixels per inch with room to scale. In comparison, today\u2019s smartphone and VR displays pack fewer than 1,000 pixels per inch.<\/p>\n<p>The team says current displays, sufficient for TVs or smartphones, can\u2019t meet the pixel density needs of near-eye VR and AR applications. 
They\u2019re looking beyond headsets too, writing, \u201cAn ultrahigh density of 10,000 pixels per inch readily meets the requirements for the next-generation microdisplays that can be fabricated on glasses or contact lenses.\u201d<\/p>\n<p>Of course, they aren\u2019t alone in their quest for ultra-high-def, and the display is still firmly in the research phase, but it hints at what the future holds for stunning AR\/VR experiences.<\/p>\n<h3><strong>Very Meta: From <\/strong><strong>Solar Panels to Virtual Reality<\/strong><\/h3>\n<p>The new display was <a href=\"https:\/\/news.stanford.edu\/2020\/10\/22\/future-vr-employ-new-ultrahigh-res-display\/\">born from a breakthrough in solar cells<\/a>, where Brongersma\u2019s lab used optical metasurfaces\u2014these are surfaces with built-in nanoscale structures to control a material\u2019s properties\u2014to manipulate light. Joo, who was visiting Stanford at the time, learned of Brongersma\u2019s approach in a presentation by graduate student Majid Esfandyarpour.<\/p>\n<p>He realized the same approach could be useful in organic light-emitting diode (OLED) displays too.<\/p>\n<p>Some of the top displays in the world\u2014like the ones in high-end televisions and iPhones\u2014use OLEDs because they\u2019re very thin and flexible and known for their deep, pure colors.<\/p>\n<p>Currently, there are two ways to make OLED displays. For small screens like smartphones, the pixels are split into subpixels that emit red, green, or blue light. These are laid down by spraying dots of each material through a fine mesh. But the <a href=\"https:\/\/spectrum.ieee.org\/tech-talk\/consumer-electronics\/audiovideo\/metasurface-oled-display\">method has limitations<\/a> both in how small the subpixels (and therefore pixels) can be and how large the display can go. 
If the mesh is too big, it has a tendency to sag.<\/p>\n<p>So, for larger displays like televisions, manufacturers opt for white OLEDs with red, blue, and green filters sitting on top of them. The thing is, the filters absorb 70% of the light, thus requiring more power to keep them bright. Filtered OLEDs are also limited in how small you can make them.<\/p>\n<h3><strong>A Nanoscale Skyline to Filter Light Into Pixels<\/strong><\/h3>\n<p>The new Stanford and Samsung display solves both problems at the same time.<\/p>\n<p>Instead of filters or color-specific OLED materials, the new display makes use of a surface bristling with tiny silver nanopillars 80 nanometers high.<\/p>\n<p>When bathed in white light, the <em>spacing<\/em> between the pillars determines which wavelengths are transmitted. Each subpixel contains pillars of different widths. The widest pillars with the least spacing give off red light, the next widest green, and the narrowest blue. Larger squares\u2014containing red, blue, and green subpixels\u2014make up the display\u2019s pixels. 
Each is 2.4 microns wide or roughly 1\/10,000th of an inch.<\/p>\n<figure id=\"attachment_137118\" aria-describedby=\"caption-attachment-137118\" class=\"wp-caption aligncenter\"><img data-recalc-dims=\"1\" decoding=\"async\" class=\"wp-image-137118 size-full lazyload\" data-src=\"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2020\/11\/ENq95w.jpg?resize=640%2C296&#038;ssl=1\" alt width=\"640\" height=\"296\" data-srcset=\"https:\/\/techclot.com\/wp-content\/uploads\/2020\/11\/ENq95w.jpg 1193w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-300x139.jpg 300w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-900x416.jpg 900w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-768x355.jpg 768w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-696x322.jpg 696w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-1068x494.jpg 1068w, https:\/\/singularityhub.com\/wp-content\/uploads\/2020\/11\/oled-metasurface-vr-ar-display-tech-samsung-stanford_1-908x420.jpg 908w\" data-sizes=\"(max-width: 1193px) 100vw, 1193px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 640px; --smush-placeholder-aspect-ratio: 640\/296;\"><figcaption id=\"caption-attachment-137118\" class=\"wp-caption-text\">Illustration of meta-OLED display highlighting the metaphotonic layer, in which grids of nanopillars manipulate light to produce the desired color (red, green, or blue). 
Image credit: <a href=\"https:\/\/news.stanford.edu\/2020\/10\/22\/future-vr-employ-new-ultrahigh-res-display\/\">Samsung Advanced Institute of Technology<\/a><\/figcaption><\/figure>\n<p>As the pixels are no longer shaded by filters\u2014and due to a curious property of the metasurface that allows light to build up and resonate, a bit like sound in a musical instrument\u2014the display\u2019s color is very pure and achieves greater brightness with less power.<\/p>\n<p>The team estimates the approach could yield pixel density up to 20,000 pixels per inch. But as the pixels go below a micron, you begin to sacrifice brightness. The next step is to make a full-size display, something Samsung is currently working to make happen.<\/p>\n<p>Of course, other research is going after ultrahigh resolution displays too.<\/p>\n<p><a href=\"https:\/\/www.nature.com\/articles\/s41377-020-0268-1\">MicroLED tech<\/a>, a leading candidate, has achieved 30,000 pixels per inch. However, displays over 1,000 pixels per inch are monochromatic, and full-color displays remain challenging.<\/p>\n<p>Whichever approach wins, computing power will have to scale too. More pixels equals more processing. One potential solution for AR\/VR is called foveated rendering, where devices <a href=\"https:\/\/www.wired.com\/story\/eye-tracking-vr\/\">track our eyes and only serve up the highest resolution image<\/a> where the gaze is directed. This makes use of the fact our peripheral vision is blurry, regardless of medium, thus saving precious processing power.<\/p>\n<h3><strong>More Immersion = Better?<\/strong><\/h3>\n<p>Of course, visuals are only part of the equation. Truly immersive, sci-fi-like VR will supply a sense of touch, muscular resistance, and maybe even direct manipulation of virtual worlds. And if we\u2019re to avoid only enjoying VR in huge warehouses, we\u2019ll need to figure out how to move in place.<\/p>\n<p>Practical solutions to these problems have been slowly emerging. 
Early, weird, and\/or expensive peripherals include <a href=\"https:\/\/www.theverge.com\/2020\/10\/7\/21504797\/virtuix-omni-one-vr-treadmill-announce-crowdfunding\">at-home VR treadmills<\/a>, <a href=\"https:\/\/singularityhub.com\/2020\/10\/02\/these-robotic-virtual-reality-boots-make-it-feel-like-youre-walking-while-you-stay-in-place\/\">robotic boots<\/a>, and even <a href=\"https:\/\/www.eurekalert.org\/pub_releases\/2020-10\/vt-vtl100520.php\">robotic exoskeletons<\/a>.<\/p>\n<p>One might wonder (quite rationally) if we really need to be more immersed in our devices. Many people have been spending a whole lot more time on screens, and it can be pretty draining. It\u2019s amazing what an afternoon spent sitting by a good old-fashioned stream, rendered unfiltered by the laws of physics, can do for your sanity.<\/p>\n<p>Still, quality virtual and augmented reality interfaces may solve a few of the less palatable side effects of 2D displays because they\u2019re better tailored to <a href=\"https:\/\/singularityhub.com\/2019\/01\/10\/making-superhumans-through-radical-inclusion-and-cognitive-ergonomics\/\">how our brains operate in the world<\/a>. We very consciously have to teach ourselves to type and use a trackpad, but we learn to navigate the physical world almost by accident (just watch a baby move into toddlerhood).<\/p>\n<p>Perhaps, as we spend more time in the digital world, making it immersive will also help make it a better experience (in some ways, at least). And regardless, the next round of interfaces is coming. 
So, wouldn\u2019t it be nice if they looked especially pretty?<\/p>\n<p><em>Image credit: <a href=\"https:\/\/unsplash.com\/@lucahuter?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText\">Luca Huter<\/a> \/ <a href=\"https:\/\/unsplash.com\/?utm_source=unsplash&amp;utm_medium=referral&amp;utm_content=creditCopyText\">Unsplash<\/a><\/em><\/p>\n<p>Published at Sun, 01 Nov 2020 18:00:00 +0000<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This Ultrahigh Resolution Display Could Paint Stunning VR Worlds You can plausibly say, today\u2019s virtual&#8230;<\/p>\n","protected":false},"author":12,"featured_media":3483,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[61],"tags":[],"class_list":["post-3484","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-augmented-virtual-reality"],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/techclot.com\/wp-content\/uploads\/2020\/11\/ENq95w.jpg?fit=1193%2C552&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p3orZX-Uc","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/3484","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts"
}],"about":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/users\/12"}],"replies":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/comments?post=3484"}],"version-history":[{"count":0,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/posts\/3484\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media\/3483"}],"wp:attachment":[{"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/media?parent=3484"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/categories?post=3484"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techclot.com\/index.php\/wp-json\/wp\/v2\/tags?post=3484"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}